WorldWideScience

Sample records for large-scale resource sharing

  1. Global resource sharing

    CERN Document Server

    Frederiksen, Linda; Nance, Heidi

    2011-01-01

    Written from a global perspective, this book reviews sharing of library resources on a global scale. With expanded discovery tools and massive digitization projects, the rich and extensive holdings of the world's libraries are more visible now than at any time in the past. Advanced communication and transmission technologies, along with improved international standards, present a means for the sharing of library resources around the globe. Despite these significant improvements, a number of challenges remain. Global Resource Sharing provides librarians and library managers with a comprehensive

  2. Agri-Environmental Resource Management by Large-Scale Collective Action: Determining KEY Success Factors

    Science.gov (United States)

    Uetake, Tetsuya

    2015-01-01

    Purpose: Large-scale collective action is necessary when managing agricultural natural resources such as biodiversity and water quality. This paper determines the key factors to the success of such action. Design/Methodology/Approach: This paper analyses four large-scale collective actions used to manage agri-environmental resources in Canada and…

  3. A Philosophy Research Database to Share Data Resources

    Directory of Open Access Journals (Sweden)

    Jili Cheng

    2007-12-01

    Philosophy research used to rely mainly on traditional published journals and newspapers for collecting and communicating data. However, because of financial limits or a lack of capacity for collecting data, required published materials, and even restricted materials and emerging information from research projects, often could not be obtained. The rise of digital techniques and the Internet has made data resource sharing possible for philosophy research. However, although several ICPs (Internet content providers) in China run large-scale comprehensive commercial databases in the field, no real non-profit professional database for philosophy researchers exists. Therefore, in 2002, the Philosophy Institute of the Chinese Academy of Social Sciences (CASS) began a project to build "The Database of Philosophy Research." By March 2006 the number of subsets had reached 30, with more than 30,000 records; retrieval services had reached 6,000 uses, and article readings 30,000. Because of intellectual property constraints, the database's services are currently limited to the information held at CASS. Nevertheless, this is the first academic database for philosophy research, and its orientation is towards resource sharing, leading users to data, and serving the large number of demands from other provinces and departments.

  4. Competition over personal resources favors contribution to shared resources in human groups

    DEFF Research Database (Denmark)

    Barker, Jessie; Barclay, Pat; Reeve, H. Kern

    2013-01-01

    Members of social groups face a trade-off between investing selfish effort for themselves and investing cooperative effort to produce a shared group resource. Many group resources are shared equitably: they may be intrinsically non-excludable public goods, such as vigilance against predators, or so large that there is little cost to sharing, such as cooperatively hunted big game. However, group members' personal resources, such as food hunted individually, may be monopolizable. In such cases, an individual may benefit by investing effort in taking others' personal resources, and in defending one's own. We use a game theoretic "tug-of-war" model to predict that when such competition over personal resources is possible, players will contribute more towards a group resource, and also obtain higher payoffs from doing so. We test and find support for these predictions in two laboratory economic games with humans, comparing people's investment decisions in games with and without the options to compete over personal resources or invest in a group resource. Our results help explain why people cooperatively contribute to group resources, suggest how a tragedy of the commons may be avoided, and highlight unifying features in the evolution of cooperation and competition in human and non-human societies.

  5. Competition over personal resources favors contribution to shared resources in human groups.

    Directory of Open Access Journals (Sweden)

    Jessica L Barker

    Members of social groups face a trade-off between investing selfish effort for themselves and investing cooperative effort to produce a shared group resource. Many group resources are shared equitably: they may be intrinsically non-excludable public goods, such as vigilance against predators, or so large that there is little cost to sharing, such as cooperatively hunted big game. However, group members' personal resources, such as food hunted individually, may be monopolizable. In such cases, an individual may benefit by investing effort in taking others' personal resources, and in defending one's own resources against others. We use a game theoretic "tug-of-war" model to predict that when such competition over personal resources is possible, players will contribute more towards a group resource, and also obtain higher payoffs from doing so. We test and find support for these predictions in two laboratory economic games with humans, comparing people's investment decisions in games with and without the options to compete over personal resources or invest in a group resource. Our results help explain why people cooperatively contribute to group resources, suggest how a tragedy of the commons may be avoided, and highlight unifying features in the evolution of cooperation and competition in human and non-human societies.

  6. Why is data sharing in collaborative natural resource efforts so hard and what can we do to improve it?

    Science.gov (United States)

    Volk, Carol J; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was a lack of clarity in the data request (including the format of the data requested). When receiving data, respondents reported various insufficiencies in the documentation describing the data (e.g., no description of the data collection or protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle in efficient data handling, we suggest that documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata is essential to any large-scale research program. We advocate that all researchers, but especially those on distributed teams, alleviate these problems with several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.
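
    The remedies listed above are concrete enough to sketch. Below is a minimal, hypothetical example of one of them: shipping a machine-readable data dictionary with explicit null-value documentation alongside a dataset. The column names, null codes, and the write_data_dictionary helper are invented for illustration, not taken from the study.

```python
import csv

# Hypothetical data dictionary for a fish-survey dataset: one row per column,
# with an explicit statement of how missing values are encoded.
DATA_DICTIONARY = [
    {"column": "site_id", "type": "str", "units": "",
     "nulls": "never null",
     "description": "Unique site identifier assigned by the field crew."},
    {"column": "water_temp", "type": "float", "units": "deg C",
     "nulls": "-999 = sensor failure; blank = not measured",
     "description": "Water temperature at time of sampling."},
    {"column": "count_juv", "type": "int", "units": "fish",
     "nulls": "blank = site not surveyed (NOT zero fish)",
     "description": "Juvenile fish counted in one pass."},
]

def write_data_dictionary(path):
    """Write the dictionary as a CSV file that travels with the dataset."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(DATA_DICTIONARY[0]))
        writer.writeheader()
        writer.writerows(DATA_DICTIONARY)

write_data_dictionary("fish_survey_data_dictionary.csv")
```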

  7. Why is Data Sharing in Collaborative Natural Resource Efforts so Hard and What can We Do to Improve it?

    Science.gov (United States)

    Volk, Carol J.; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issue identified by our survey respondents (n = 118) when providing data was a lack of clarity in the data request (including the format of the data requested). When receiving data, respondents reported various insufficiencies in the documentation describing the data (e.g., no description of the data collection or protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle in efficient data handling, we suggest that documenting metadata through data dictionaries, protocols, read-me files, explicit null-value documentation, and process metadata is essential to any large-scale research program. We advocate that all researchers, but especially those on distributed teams, alleviate these problems with several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.

  8. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    … large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples, to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data acquisition … platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored …

  9. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  10. THE DEVELOPMENT OF GIS EDUCATIONAL RESOURCES SHARING AMONG CENTRAL TAIWAN UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    T.-Y. Chou

    2012-09-01

    Moreover, e-learning platforms provide solutions and resources. Different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in Central Taiwan integrate academic research resources among CTTLRC partners, establishing a decision-making support mechanism for teaching and learning and accelerating communication, cooperation and sharing among academic units.

  11. A Stream Tilling Approach to Surface Area Estimation for Large Scale Spatial Data in a Shared Memory System

    Directory of Open Access Journals (Sweden)

    Liu Jiping

    2017-12-01

    Surface area estimation is a widely used tool for resource evaluation in the physical world. When processing large-scale spatial data, the input/output (I/O) can easily become the bottleneck in parallelizing the algorithm, due to limited physical memory resources and the very slow disk transfer rate. In this paper, we proposed a stream tilling approach to surface area estimation that first decomposed a spatial data set into tiles with topological expansions. With these tiles, the one-to-one mapping relationship between the input and the computing process was broken. Then, we realized a streaming framework for the scheduling of the I/O processes and computing units. Herein, each computing unit encapsulated the same copy of the estimation algorithm, and multiple asynchronous computing units could work individually in parallel. Finally, the performed experiment demonstrated that our stream tilling estimation can efficiently alleviate the heavy pressure from I/O-bound work, and after optimization the measured speedup greatly outperformed that of the directly parallel versions in shared memory systems with multi-core processors.
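
    The decoupling the abstract describes, in which tile-reading I/O is separated from multiple asynchronous computing units, follows a familiar producer/consumer shape. The sketch below illustrates that general pattern only; the tile generator, the per-tile area stand-in, and the queue size are all invented, and the paper's actual implementation will differ.

```python
import concurrent.futures
import queue
import threading

def read_tiles(num_tiles, tile_queue):
    """Producer: stream tiles from storage one at a time (simulated here)."""
    for i in range(num_tiles):
        # A real reader would load one spatial tile (plus a halo of border
        # cells for the topological expansion) per iteration.
        tile = [[float(i + r + c) for c in range(4)] for r in range(4)]
        tile_queue.put(tile)
    tile_queue.put(None)  # sentinel: no more tiles

def estimate_tile_area(tile):
    """Computing unit: stand-in for the per-tile surface-area estimate."""
    return sum(sum(row) for row in tile) * 1e-3

def streamed_area(num_tiles=100, workers=4):
    # A bounded queue keeps memory flat: I/O can never outrun the computing
    # units by more than 8 tiles, breaking the one-to-one input/compute tie.
    tile_queue = queue.Queue(maxsize=8)
    threading.Thread(target=read_tiles, args=(num_tiles, tile_queue),
                     daemon=True).start()
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = []
        while (tile := tile_queue.get()) is not None:
            futures.append(pool.submit(estimate_tile_area, tile))
        return sum(f.result() for f in futures)

print(streamed_area())
```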

  12. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)]

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
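
    To make the detect-then-mitigate split concrete, here is a toy baseline-comparison detector: flag a job whose observed rate falls well below its historical baseline, leaving mitigation to a separate control layer. The baselines, tolerance, and check_health function are invented illustrations, not part of the actual PHM framework.

```python
# Toy performance-fault detector: compare a job's observed progress rate
# against its historical baseline and flag likely degradation/contention.
BASELINES = {"app_a": 120.0, "app_b": 45.0}  # iterations/sec, hypothetical
TOLERANCE = 0.75  # flag anything below 75% of baseline

def check_health(job, observed_rate):
    baseline = BASELINES[job]
    if observed_rate < TOLERANCE * baseline:
        # A real control layer would now isolate the fault (bad node? shared
        # filesystem? noisy neighbour?) and take remedial action such as
        # rescheduling; this sketch only reports the detection.
        return f"{job}: DEGRADED ({observed_rate:.1f}/s vs {baseline:.1f}/s)"
    return f"{job}: healthy"

print(check_health("app_a", 80.0))   # well below baseline -> DEGRADED
print(check_health("app_b", 44.0))   # within tolerance -> healthy
```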

  13. Analysis of Utilization of Fecal Resources in Large-scale Livestock and Poultry Breeding in China

    Directory of Open Access Journals (Sweden)

    XUAN Meng

    2018-02-01

    The purpose of this paper is to systematically investigate the serious pollution problems of large-scale livestock and poultry breeding in China and the technical demand for promoting the utilization of manure. Based on the status quo of large-scale livestock and poultry farming in typical areas of China, the modes and proportions of manure resource utilization were statistically analyzed. The statistics covered country-identified large-scale farms whose total pollutant reduction met the "12th Five-Year Plan" standards. The results showed differences in the modes of resource utilization of livestock and poultry manure at different scales and types: (1) hogs, dairy cattle and beef cattle together accounted for more than 75% of the agricultural manure storage; (2) laying hens and broiler chickens accounted for about 65% of the total organic manure production. The major modes of resource utilization of dung and urine were related to the natural characteristics, agricultural production methods, farming scale and economic development level of the area. It was concluded that unreasonable planning, lack of cleansing during breeding, and poor selection of manure utilization modes were the major problems in China's large-scale livestock and poultry fecal resource utilization.

  14. Attention and Visuospatial Working Memory Share the Same Processing Resources

    Directory of Open Access Journals (Sweden)

    Jing Feng

    2012-04-01

    Attention and visuospatial working memory (VWM) share very similar characteristics; both have the same upper bound of about four items in capacity, and they recruit overlapping brain regions. We examined whether attention and visuospatial working memory share the same processing resources using a novel dual-task-costs approach based on a load-varying dual-task technique. With sufficiently large loads on attention and VWM, considerable interference between the two processes was observed. A further load increase on either process produced reciprocal increases in interference on both processes, indicating that attention and VWM share common resources. More critically, comparison among four experiments on the reciprocal interference effects, as measured by the dual-task costs, demonstrates no significant contribution from additional processing other than the shared processes. These results support the notion that attention and VWM share the same processing resources.

  15. Sharing Resources in Educational Communities

    Directory of Open Access Journals (Sweden)

    Anoush Margaryan

    2010-06-01

    The paper explores the implications of mobility within educational communities for sharing and reuse of educational resources. The study begins by exploring individuals’ existing strategies for sharing and reusing educational resources within localised and distributed communities, with particular emphasis on the impact of geographic location on these strategies. The results indicate that the geographic distribution of communities has little impact on individuals’ strategies for resource management, since many individuals are communicating via technology tools with colleagues within a localised setting. The study points to few major differences in the ways in which individuals within the localised and distributed communities store, share and collaborate around educational resources. Moving beyond the view of individuals being statically involved in one or two communities, mobility across communities, roles and geographic location are formulated and illustrated through eight scenarios. The effects of mobility across these scenarios are outlined and a framework for future research into mobility and resource sharing within communities discussed.

  16. Duality between resource reservation and proportional share resource allocation

    Science.gov (United States)

    Stoica, Ion; Abdel-Wahab, Hussein; Jeffay, Kevin

    1997-01-01

    We describe a new framework for resource allocation that unifies the well-known proportional share and resource reservation policies. Each client is characterized by two parameters: a weight that represents the rate at which the client 'pays' for the resource, and a share that represents the fraction of the resource that the client should receive. A fixed rate corresponds to a proportional share allocation, while a fixed share corresponds to a reservation. Furthermore, rates and shares are duals of each other: once one parameter is fixed, the other becomes fixed as well. If a client asks for a fixed share, then the level of competition for the resource determines the rate at which it has to pay, while if the rate is fixed, the level of competition determines the service time the client receives. To implement this framework we use a new proportional share algorithm, called earliest eligible virtual deadline first, that achieves optimal accuracy in the rates at which processes execute. This makes it possible to provide support for highly predictable, real-time services. As a proof of concept we have implemented a prototype CPU scheduler under the FreeBSD operating system. The experimental results show that our scheduler achieves the goal of providing integrated support for batch and real-time applications.
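
    The rate/share duality invites a small worked example. The sketch below implements a stride-style proportional-share scheduler, a simpler relative of the earliest eligible virtual deadline first algorithm the paper actually proposes: each client's virtual time advances at a rate inversely proportional to its weight, and the client with the smallest virtual time runs next, so long-run CPU shares converge to the weight ratios.

```python
from collections import Counter

def proportional_share_schedule(weights, quanta):
    """weights: client -> rate of 'payment'; returns quanta run per client."""
    vtime = {c: 0.0 for c in weights}  # per-client virtual time ('pass')
    ran = Counter()
    for _ in range(quanta):
        client = min(vtime, key=vtime.get)      # smallest virtual time runs
        ran[client] += 1
        vtime[client] += 1.0 / weights[client]  # heavier weight, slower clock
    return ran

# Weights 3:2:1 should yield CPU shares close to 300:200:100 over 600 quanta.
print(proportional_share_schedule({"A": 3, "B": 2, "C": 1}, quanta=600))
```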

  17. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  18. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    Science.gov (United States)

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software first became a utilitarian interest and is now a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as for the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community on the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  19. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  20. Sharing resources@CERN

    CERN Multimedia

    Maximilien Brice

    2002-01-01

    The library is launching a 'sharing resources@CERN' campaign, aiming to increase the library's utility by including the thousands of books bought by individual groups at CERN. This will improve sharing of information among CERN staff and users. Photo 01: L. to r. Eduardo Aldaz, from the PS division, Corrado Pettenati, Head Librarian, and Isabel Bejar, from the ST division, read their divisional copies of the same book.

  1. Cross-Jurisdictional Resource Sharing in Changing Public Health Landscape: Contributory Factors and Theoretical Explanations.

    Science.gov (United States)

    Shah, Gulzar H; Badana, Adrian N S; Robb, Claire; Livingood, William C

    2016-01-01

    Local health departments (LHDs) are striving to meet public health needs within their jurisdictions amidst fiscal constraints and a complex, dynamic environment. Resource sharing across jurisdictions is a critical opportunity for LHDs to continue to enhance effectiveness and increase efficiency. This research examines the extent of cross-jurisdictional resource sharing among LHDs, the programmatic areas and organizational functions for which LHDs share resources, and the LHD characteristics associated with resource sharing. Data from the National Association of County & City Health Officials' 2013 National Profile of LHDs were used. Descriptive statistics and multinomial logistic regression were performed for the 5 implementation-oriented outcome variables of interest, each with 3 levels of implementation. More than 54% of LHDs shared resources such as funding, staff, or equipment with 1 or more other LHDs on a continuous, recurring basis. Results from the multinomial regression analysis indicate that economies of scale (population size and metropolitan status) had significant positive influences (at P ≤ .05) on resource sharing. Engagement in accreditation, community health assessment, community health improvement planning, quality improvement, and use of the Community Guide were associated with lower levels of engagement in resource sharing. A doctoral degree held by the top executive and having 1 or more local boards of health had positive influences on resource sharing. Cross-jurisdictional resource sharing is a viable and commonly used process for overcoming the challenges of new and emerging public health problems within the constraints of restricted budgets. LHDs, particularly smaller LHDs with limited resources, should consider increased resource sharing to address emerging challenges.
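
    For readers unfamiliar with the method, the sketch below shows what a multinomial logistic regression over a 3-level implementation outcome looks like in Python with statsmodels. The data are synthetic stand-ins whose predictors merely echo the kinds of variables the study names (population size, metropolitan status, accreditation engagement); they are not the Profile data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins for LHD characteristics (not the Profile data).
pop_log = rng.normal(11, 1.5, n)      # log jurisdiction population
metro = rng.integers(0, 2, n)         # metropolitan status (0/1)
accredited = rng.integers(0, 2, n)    # accreditation engagement (0/1)

# 3-level outcome: 0 = no sharing, 1 = some, 2 = continuous/recurring.
logits = np.column_stack([
    np.zeros(n),
    0.4 * (pop_log - 11) + 0.3 * metro,
    0.8 * (pop_log - 11) + 0.6 * metro - 0.2 * accredited,
])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])

X = sm.add_constant(np.column_stack([pop_log, metro, accredited]))
print(sm.MNLogit(y, X).fit(disp=False).summary())
```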

  2. Sharing resources@CERN

    CERN Multimedia

    2002-01-01

    The library is launching a 'sharing resources@CERN' campaign, aiming to increase the library's utility by including the thousands of books bought by individual groups at CERN. This will improve sharing of information among CERN staff and users. Until now many people were unaware that copies of the same book (or standard, or journal) are often held not only by the library but by different divisions. (Here Eduardo Aldaz, from the PS division, and Isabel Bejar, from the ST division, read their divisional copies of the same book.) The idea behind the library's new 'sharing resources@CERN' initiative is not at all to collect the books in individual collections at the CERN library, but simply to register them in the Library database. Those not belonging to the library will in principle be unavailable for loan, but should be able to be consulted by anybody at CERN who is interested. "When you need a book urgently and it is not available in the library," said PS Division engineer Eduardo Aldaz Carroll, "it is a sham...

  3. Assessing Programming Costs of Explicit Memory Localization on a Large Scale Shared Memory Multiprocessor

    Directory of Open Access Journals (Sweden)

    Silvio Picano

    1992-01-01

    We present detailed experimental work involving a commercially available large-scale shared memory multiple instruction stream-multiple data stream (MIMD) parallel computer having a software-controlled cache coherence mechanism. To make effective use of such an architecture, the programmer is responsible for designing the program's structure to match the underlying multiprocessor's capabilities. We describe the techniques used to exploit our multiprocessor (the BBN TC2000) on a network simulation program, showing the resulting performance gains and the associated programming costs. We show that an efficient implementation relies heavily on the user's ability to explicitly manage the memory system.

  4. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  5. Large-scale resource sharing at public funded organizations. e-Human "Grid" Ecology.

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); V. Baumgärtner (Volkmar); K.E. Egger (Kurt)

    2008-01-01

    With ever-new technologies emerging, the amount of information to be stored and processed is growing exponentially and is believed to be always at the limit. In contrast, however, huge resources are available in the IT sector, as in e.g. the renewable energy sector, which are often

  6. Large-scale resource sharing at public funded organizations. e-Human "Grid" Ecology.

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); V. Baumgärtner (Volkmar); L.V. de Zeeuw (Luc); F.G. Grosveld (Frank); K.E. Egger (Kurt)

    2009-01-01

    With ever-new technologies emerging, the amount of information to be stored and processed is growing exponentially and is believed to be always at the limit. In contrast, however, huge resources are available in the IT sector, as in e.g. the renewable energy sector, which are often

  7. Resource sharing in libraries concepts, products, technologies, and trends

    CERN Document Server

    Breeding, Marshall

    2014-01-01

    Supplementing your local collection through resource sharing is a smart way to ensure your library has the resources to satisfy the needs of your users. Marshall Breeding's new Library Technology Report explores technologies and strategies for sharing resources, helping you streamline workflows and improve resource-sharing services by covering key strategies like interlibrary loan, consortial borrowing, document delivery, and shared collections. You'll also learn about such trends and services as: OCLC WorldCat Resource Sharing, and other systems that facilitate cooperative, reciprocal lending …

  8. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population growth and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Also, irrigation management in response to subseasonal variability in weather and crop response varies for each region and each crop. To deal with such variations, we used the Markov chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consisted of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model was based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model was input to the river routing sub-model of the H08 model. The part of regional water resources available for agriculture, simulated by the H08 model, was input as irrigation water to the land surface sub-model. The timing and amount of irrigation water were simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed, and it accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
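
    The coupling described, in which land-surface runoff feeds a routing/reservoir model that in turn caps the irrigation returned to the land surface each day, can be reduced to a toy daily loop. Everything below is schematic: the bucket coefficients, withdrawal cap, and demand rule are invented, not taken from SWAT or H08.

```python
def run_coupled_season(days=120, rain=None):
    """Toy daily coupling of a land-surface and a water-resources model."""
    rain = rain or [5.0 if d % 7 == 0 else 0.0 for d in range(days)]  # mm/day
    soil, river = 50.0, 200.0      # soil moisture (mm), river store (mm-eq)
    yield_proxy = 0.0
    for d in range(days):
        # Land surface: a simple bucket; excess becomes runoff to the river.
        soil += rain[d]
        runoff = max(0.0, soil - 100.0)
        soil -= runoff + 2.0       # flat 2 mm/day evapotranspiration
        river += runoff
        # Water resources: irrigation demand capped by river availability.
        demand = max(0.0, 60.0 - soil) * 0.5
        irrigation = min(demand, 0.1 * river)
        river -= irrigation
        soil += irrigation         # withdrawal feeds back to the land surface
        # Crop growth proxy: accumulates while soil moisture is adequate.
        yield_proxy += max(0.0, min(soil, 80.0) - 30.0) / 1000.0
    return yield_proxy, soil, river

print(run_coupled_season())
```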

  9. Scaling Law of Urban Ride Sharing

    Science.gov (United States)

    Tachet, R.; Sagarra, O.; Santi, P.; Resta, G.; Szell, M.; Strogatz, S. H.; Ratti, C.

    2017-03-01

    Sharing rides could drastically improve the efficiency of car and taxi transportation. Unleashing such potential, however, requires understanding how urban parameters affect the fraction of individual trips that can be shared, a quantity that we call shareability. Using data on millions of taxi trips in New York City, San Francisco, Singapore, and Vienna, we compute the shareability curves for each city, and find that a natural rescaling collapses them onto a single, universal curve. We explain this scaling law theoretically with a simple model that predicts the potential for ride sharing in any city, using a few basic urban quantities and no adjustable parameters. Accurate extrapolations of this type will help planners, transportation companies, and society at large to shape a sustainable path for urban growth.

  10. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost … and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous …
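
    A baseline version of grouping "closeby" cab requests is easy to state: greedily merge requests whose pickups fall within a distance and time window, subject to cab capacity. The sketch below is that naive baseline (the paper's contribution is precisely making such grouping scale far beyond it); the distance rule and thresholds are invented.

```python
import math

def group_requests(requests, max_dist_km=1.0, max_wait_min=10, capacity=4):
    """Greedy grouping; requests are (lat, lon, minute_of_day) tuples."""
    def close(a, b):
        dist = math.hypot(a[0] - b[0], a[1] - b[1]) * 111.0  # rough deg -> km
        return dist <= max_dist_km and abs(a[2] - b[2]) <= max_wait_min

    groups = []
    for req in sorted(requests, key=lambda r: r[2]):  # process by time
        for g in groups:
            if len(g) < capacity and all(close(req, other) for other in g):
                g.append(req)   # join the first compatible group
                break
        else:
            groups.append([req])  # no compatible group: start a new cab
    return groups

reqs = [(59.330, 18.060, 100), (59.331, 18.061, 104), (59.400, 18.100, 105)]
print(group_requests(reqs))  # first two share a cab, the third rides alone
```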

  11. Sharing a common resource with concave benefits

    OpenAIRE

    Ambec, S.

    2006-01-01

    A group of agents enjoy concave and single-peaked benefit functions from consuming a shared resource. They also value money (transfers). The resource is scarce in the sense that not everybody can consume its peak. The paper characterizes the unique (resource and money) allocation that is efficient, incentive compatible, and equal-sharing individually rational. It can be implemented (i) by selling the resource or taxing extraction and redistributing the money collected equally, or (ii) by assigning...

  12. The resource curse: Analysis of the applicability to the large-scale export of electricity from renewable resources

    International Nuclear Information System (INIS)

    Eisgruber, Lasse

    2013-01-01

    The “resource curse” has been analyzed extensively in the context of non-renewable resources such as oil and gas. More recently, commentators have expressed concerns that renewable electricity exports, too, can have adverse economic impacts on exporting countries. My paper analyzes to what extent the resource curse applies in the case of large-scale renewable electricity exports. I develop a “comprehensive model” that integrates previous works and provides a consolidated view of how non-renewable resource abundance impacts economic growth. Deploying this model, I analyze, through case studies on Laos, Mongolia, and the MENA region, to what extent exporters of renewable electricity run the danger of the resource curse. I find that renewable electricity exports avoid some disadvantages of non-renewable resource exports, including (i) shocks after resource depletion; (ii) macroeconomic fluctuations; and (iii) competition for a fixed amount of resources. Nevertheless, renewable electricity exports bear some of the same risks as conventional resource exports, including (i) crowding-out of the manufacturing sector; (ii) incentives for corruption; and (iii) reduced government accountability. I conclude with recommendations for managing such risks. - Highlights: ► Study analyzes whether the resource curse applies to renewable electricity export. ► I develop a “comprehensive model of the resource curse” and use cases for the analysis. ► Renewable electricity export avoids some disadvantages compared to other resources. ► Renewable electricity bears some of the same risks as conventional resources. ► Study concludes with recommendations for managing such risks

  13. Word Sense Disambiguation Based on Large Scale Polish CLARIN Heterogeneous Lexical Resources

    Directory of Open Access Journals (Sweden)

    Paweł Kędzia

    2015-12-01

    Lexical resources can be applied in many different natural language engineering tasks, but the most fundamental task is the recognition of word senses used in text contexts. The problem is difficult, not yet fully solved, and different lexical resources provide varied support for it. Polish CLARIN lexical semantic resources are based on plWordNet — a very large wordnet for Polish — as a central structure that forms a basis for linking together several resources of different types. In this paper, several word sense disambiguation (henceforth WSD) methods developed for Polish that utilise plWordNet are discussed. Textual sense descriptions in a traditional lexicon can be compared with text contexts using Lesk’s algorithm in order to find the best matching senses. In the case of a wordnet, lexico-semantic relations provide the main description of word senses. Thus, we first adapted and applied to Polish a WSD method based on PageRank: text words are mapped onto their senses in the plWordNet graph, and the PageRank algorithm is run to find the senses with the highest scores. The method’s results are lower than, but comparable to, those reported for English. The error analysis showed that the main problems are fine-grained sense distinctions in plWordNet and the limited number of connections between words of different parts of speech. In the second approach, plWordNet expanded with a mapping onto SUMO ontology concepts was used. Two scenarios for WSD were investigated: two-step disambiguation and disambiguation based on the combined networks of plWordNet and SUMO. In the former scenario, words are first assigned SUMO concepts and next plWordNet senses are disambiguated; in the latter, plWordNet and SUMO are combined into one large network that is then used for the disambiguation of senses. The additional knowledge sources used in WSD improved the performance
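
    The Lesk step mentioned above has a classic, compact form: pick the sense whose gloss shares the most words with the target word's context. A minimal version, with toy glosses in place of real wordnet data, looks like this.

```python
def simplified_lesk(context, senses):
    """Return the sense whose gloss overlaps most with the context words."""
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in senses.items():
        overlap = len(context_words & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

# Toy inventory for Polish "zamek" (castle / lock / zipper), glossed in English.
senses = {
    "zamek_1": "a large fortified building or castle with walls and towers",
    "zamek_2": "a mechanism or lock that fastens a door with a key",
    "zamek_3": "a zipper fastener used on clothing and bags",
}
print(simplified_lesk("the king lived in a fortified building with towers", senses))
```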

  14. A theoretical bilevel control scheme for power networks with large-scale penetration of distributed renewable resources

    DEFF Research Database (Denmark)

    Boroojeni, Kianoosh; Amini, M. Hadi; Nejadpak, Arash

    2016-01-01

    In this paper, we present a bilevel control framework to achieve a highly-reliable smart distribution network with large-scale penetration of distributed renewable resources (DRRs). We assume that the power distribution network consists of several residential/commercial communities. In the first ...

  15. Improving Large-scale Storage System Performance via Topology-aware and Balanced Data Placement

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feiyi [ORNL]; Oral, H Sarp [ORNL]; Vazhkudai, Sudharshan S [ORNL]

    2014-01-01

    With the advent of big data, the I/O subsystems of large-scale compute clusters are becoming a center of focus, with more applications putting greater demands on end-to-end I/O performance. These subsystems are often complex in design, comprising multiple hardware and software layers to cope with the increasing capacity, capability and scalability requirements of data-intensive applications. The shared nature of storage resources and the intrinsic interactions across these layers make realizing user-level, end-to-end performance gains a great challenge. We propose a topology-aware resource load balancing strategy to improve per-application I/O performance. We demonstrate the effectiveness of our algorithm on an extreme-scale compute cluster, Titan, at the Oak Ridge Leadership Computing Facility (OLCF). Our experiments with both synthetic benchmarks and a real-world application show that, even under congestion, our proposed algorithm can improve large-scale application I/O performance significantly, resulting in both reduced application run times and higher-resolution simulation runs.
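
    The core idea, choosing storage targets that are both lightly loaded and spread across the topology rather than striping blindly, fits in a few lines. The scoring rule below is an invented stand-in for illustration, not the algorithm actually deployed at OLCF.

```python
def place_stripes(targets, stripe_count):
    """targets: list of (target_id, topology_group, current_load in [0, 1])."""
    chosen, used_groups = [], set()
    for _ in range(stripe_count):
        def score(t):
            _, group, load = t
            # Prefer lightly loaded targets; penalise reusing a topology
            # group (e.g. the same I/O router or rack) already placed on.
            return load + (0.5 if group in used_groups else 0.0)
        best = min((t for t in targets if t[0] not in chosen), key=score)
        chosen.append(best[0])
        used_groups.add(best[1])
    return chosen

targets = [("ost0", "rackA", 0.20), ("ost1", "rackA", 0.25),
           ("ost2", "rackB", 0.30), ("ost3", "rackC", 0.60)]
print(place_stripes(targets, stripe_count=3))  # spreads across three racks
```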

  16. Implementation of a Shared Resource Financial Management System

    Science.gov (United States)

    Caldwell, T.; Gerlach, R.; Israel, M.; Bobin, S.

    2010-01-01

    CF-6 Norris Cotton Cancer Center (NCCC), an NCI-designated Comprehensive Cancer Center at Dartmouth Medical School, administers 12 Life Sciences Shared Resources. These resources are diverse and offer multiple products and services. Previous methods for tracking resource use, billing, and financial management were time consuming, error prone, and lacked appropriate financial management tools. To address these problems, we developed and implemented a web-based application with a built-in authorization system that uses Perl, ModPerl, Apache2, and Oracle as the software infrastructure. The application uses a role-based system to differentiate administrative users from those requesting services and includes many features requested by users and administrators. To begin development, we chose a resource that had an uncomplicated service, a large number of users, and required the use of all of the application's features. The Molecular Biology Core Facility at NCCC fit these requirements and was used as a model for developing and testing the application. After model development, institution-wide deployment followed a three-stage process. The first stage was to interview the resource manager and staff to understand day-to-day operations. At the second stage, we generated and tested customized forms defining resource services. During the third stage, we added new resource users and administrators to the system before final deployment. Twelve months after deployment, resource administrators reported that the new system performed well for internal and external billing and for tracking resource utilization. Users preferred the application's web-based system for distribution of DNA sequencing and other data. The sample tracking features have enhanced day-to-day resource operations, and an on-line scheduling module for shared instruments has proven a much-needed utility. Principal investigators now are able to restrict user spending to specific accounts and have final approval of the

  17. Resource sharing in wireless networks: The SAPHYRE approach

    NARCIS (Netherlands)

    Jorswieck, E.A.; Badia, L.; Fahldieck, T.; Gesbert, D.; Gustafsson, S.; Haardt, M.; Ho, K.-M.; Karipidis, E.; Kortke, A.; Larsson, E.G.; Mark, H.; Nawrocki, M.; Piesiewicz, R.; Römer, F.; Schubert, M.; Sykora, J.; Trommelen, P.H.; Ende, B.D. van; Zorzi, M.

    2010-01-01

    Physical resource sharing between wireless operators and service providers is necessary in order to support efficient, competitive, and innovative wireless communication markets. By sharing resources such as spectrum or infrastructure, which are usually exclusively allocated, interference is created

  18. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  19. Assured Resource Sharing in Ad-Hoc Collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Gail-Joon [Arizona State Univ., Tempe, AZ (United States)]

    2015-12-19

    The project seeks an innovative framework to enable users to access and selectively share resources in distributed environments, enhancing the scalability of information sharing. We have investigated secure sharing & assurance approaches for ad-hoc collaboration, focused on Grids, Clouds, and ad-hoc network environments.

  20. Learning about water resource sharing through game play

    Directory of Open Access Journals (Sweden)

    T. Ewen

    2016-10-01

    Games are an optimal way to teach about water resource sharing, as they allow real-world scenarios to be enacted. Both students and professionals learning about water resource management can benefit from playing games, through the process of understanding both the complexity of sharing of resources between different groups and decision outcomes. Here we address how games can be used to teach about water resource sharing, through both playing and developing water games. An evaluation of using the web-based game Irrigania in the classroom setting, supported by feedback from several educators who have used Irrigania to teach about the sustainable use of water resources, and decision making, at university and high school levels, finds Irrigania to be an effective and easy tool to incorporate into a curriculum. The development of two water games in a course for masters students in geography is also presented as a way to teach and communicate about water resource sharing. Through game development, students learned soft skills, including critical thinking, problem solving, team work, and time management, and overall the process was found to be an effective way to learn about water resource decision outcomes. This paper concludes with a discussion of learning outcomes from both playing and developing water games.

  1. Learning about water resource sharing through game play

    Science.gov (United States)

    Ewen, Tracy; Seibert, Jan

    2016-10-01

    Games are an optimal way to teach about water resource sharing, as they allow real-world scenarios to be enacted. Both students and professionals learning about water resource management can benefit from playing games, through the process of understanding both the complexity of sharing of resources between different groups and decision outcomes. Here we address how games can be used to teach about water resource sharing, through both playing and developing water games. An evaluation of using the web-based game Irrigania in the classroom setting, supported by feedback from several educators who have used Irrigania to teach about the sustainable use of water resources, and decision making, at university and high school levels, finds Irrigania to be an effective and easy tool to incorporate into a curriculum. The development of two water games in a course for masters students in geography is also presented as a way to teach and communicate about water resource sharing. Through game development, students learned soft skills, including critical thinking, problem solving, team work, and time management, and overall the process was found to be an effective way to learn about water resource decision outcomes. This paper concludes with a discussion of learning outcomes from both playing and developing water games.

  2. Delivering the Goods: Scaling out Results of Natural Resource Management Research

    Directory of Open Access Journals (Sweden)

    Larry Harrington

    2002-01-01

    To help integrated natural resource management (INRM) research "deliver the goods" for many of the world's poor over a large area and in a timely manner, the authors suggest a problem-solving approach that facilitates the scaling out of relevant agricultural practices. They propose seven ways to foster scaling out: (1) develop more attractive practices and technologies through participatory research; (2) balance supply-driven approaches with resource user demands; (3) use feedback to redefine the research agenda; (4) encourage support groups and networks for information sharing; (5) facilitate negotiation among stakeholders; (6) inform policy change and institutional development; and (7) make sensible use of information management tools, including models and geographic information systems (GIS). They also draw on experiences in Mesoamerica, South Asia, and southern Africa to describe useful information management tools, including site similarity analyses, the linking of simulation models with GIS, and the use of farmer and land type categories.

  3. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems … the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its …

  4. Group Clustering Mechanism for P2P Large Scale Data Sharing Collaboration

    Institute of Scientific and Technical Information of China (English)

    DENG Qianni; LU Xinda; CHEN Li

    2005-01-01

    Research shows that a P2P scientific collaboration network will exhibit small-world topology, as do a large number of social networks for which the same pattern has been documented. In this paper we propose a topology building protocol to benefit from the small-world feature. We find that the idea of Freenet resembles the dynamic pattern of social interactions in scientific data sharing, and the small-world characteristic of Freenet is propitious to improving file-locating performance in scientific data sharing. But the LRU (least recently used) datastore cache replacement scheme of Freenet is not suitable for a scientific data sharing network. Based on the group locality of scientific collaboration, we propose an enhanced group clustering cache replacement scheme. Simulation shows that this scheme improves the request hit ratio dramatically while keeping the small average hops per successful request comparable to LRU.
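
    The flavour of such a scheme is easy to sketch: instead of evicting the globally least-recently-used item, evict from the group showing the least activity, exploiting the group locality the abstract mentions. The class below is an invented stand-in for illustration, not the authors' algorithm.

```python
from collections import OrderedDict

class GroupClusteringCache:
    """Toy datastore cache: evict the LRU item of the least-active group,
    rather than the globally least-recently-used item (as plain LRU would)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()   # key -> group, kept in recency order
        self.group_hits = {}         # group -> accumulated activity

    def access(self, key, group):
        if key in self.items:
            self.items.move_to_end(key)          # refresh recency on a hit
        elif len(self.items) >= self.capacity:
            present = set(self.items.values())
            coldest = min(present, key=lambda g: self.group_hits.get(g, 0))
            victim = next(k for k, g in self.items.items() if g == coldest)
            del self.items[victim]               # evict from the cold group
        self.items[key] = group
        self.group_hits[group] = self.group_hits.get(group, 0) + 1

cache = GroupClusteringCache(capacity=3)
for key, grp in [("a", "g1"), ("b", "g1"), ("c", "g2"), ("d", "g1")]:
    cache.access(key, grp)
print(list(cache.items))  # 'c' (sole member of colder group g2) was evicted
```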

  5. Sharing Resources in Open Educational Communities

    Directory of Open Access Journals (Sweden)

    Paolo Tosato

    2014-06-01

    The spread of the Internet and the latest Web developments have promoted relationships between teachers, learners and institutions, as well as the creation and sharing of new Open Educational Resources (OERs). Despite this, many projects and research efforts have paid more attention to content distribution, focusing on format and description and omitting the relationship between these materials and online communities of teachers. In this article we emphasize the importance of sharing resources in open educational communities (OEC), analysing the role of OERs and OEC in teachers' lifelong learning. Investigating their current usage, we aim to discover whether their interweaving could be an effective approach to support the sharing of resources among teachers and to promote new educational practices. Through two surveys which involved more than 300 teachers from across Europe, it was possible to highlight that it is not simple to stimulate collaboration among teachers, both online and face to face; nevertheless, when this happens, it seems to be a good way to promote formal and informal learning for teachers, as well as innovation in their professional practices.

  6. The natural resources supply indexes study of the pig breeding scale in China

    Science.gov (United States)

    Leng, Bi-Bin; Zhang, Qi-Zhen; Ji, Xue-Qiang; Xu, Yue-Feng

    2017-08-01

    For the pollution problem of large-scale pig breeding, we took three indexes as evaluation criteria: arable land per capita, water resources per capita, and per capita share of grain. SPSS was then used to synthesize the natural resources supply indexes of large-scale pig breeding. The results show that, with the fast development of technology and the steadily rising grain production, the natural resources supply indexes of large-scale pig breeding are rising constantly.
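
    The synthesis step is presumably a standard composite-index construction: standardize each indicator, then combine with weights. The generic z-score, equal-weight version below illustrates the idea only; it is not necessarily the procedure the authors ran in SPSS, and the numbers are invented.

```python
import numpy as np

def composite_index(indicators, weights=None):
    """indicators: 2-D array, rows = years (or regions), cols = indicators."""
    X = np.asarray(indicators, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score standardization
    w = (np.full(X.shape[1], 1.0 / X.shape[1])
         if weights is None else np.asarray(weights, dtype=float))
    return Z @ w

# Columns: arable land per capita (ha), water per capita (m^3), grain per
# capita (kg) -- toy values, not the paper's data.
data = [[0.09, 2100, 400],
        [0.09, 2050, 420],
        [0.08, 2000, 445],
        [0.08, 1980, 470]]
print(composite_index(data))
```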

  7. Design principles of a resource sharing real-time-system

    International Nuclear Information System (INIS)

    Gliss, B.

    1978-01-01

    Criteria for developing a resource sharing real time system are given. Resource sharing necessitates extra precautions for guaranteeing stable operating conditions. Some relevant measures to insure reliability and maintainability of the system are discussed. (Auth.)

  8. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    Science.gov (United States)

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high-quality medical services, an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate of the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large-scale information exchange and sharing will be particularly useful for telemedicine applications.

  9. The BioLexicon: a large-scale terminological resource for biomedical text mining

    Directory of Open Access Journals (Sweden)

    Thompson Paul

    2011-10-01

    Background: Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. Results: This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. In order to foster interoperability, the BioLexicon is

  10. The BioLexicon: a large-scale terminological resource for biomedical text mining

    Science.gov (United States)

    2011-01-01

    Background Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. Results This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. In order to foster interoperability, the BioLexicon is modelled using the Lexical
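
    As an illustration of the kind of lookup such a lexical resource supports, the following minimal Python sketch maps surface variants to canonical concept identifiers; the variant table and identifiers are invented for illustration and are not actual BioLexicon entries.

    ```python
    # Minimal sketch of variant-to-concept normalization, in the spirit of a
    # terminological lexicon. Variants and identifiers below are invented for
    # illustration; they are not actual BioLexicon entries.
    from typing import Optional

    VARIANTS = {
        "p53": "CONCEPT:0001",
        "tp53": "CONCEPT:0001",
        "tumor protein 53": "CONCEPT:0001",
        "il-2": "CONCEPT:0002",
        "interleukin 2": "CONCEPT:0002",
    }

    def normalize(term: str) -> Optional[str]:
        """Return the canonical concept ID for a known term variant."""
        return VARIANTS.get(term.strip().lower())

    for mention in ["TP53", "Interleukin 2", "BRCA1"]:
        print(mention, "->", normalize(mention))
    ```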

  11. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications, and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
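
    The dispatch/collect pattern described above can be sketched in a few lines; this minimal in-memory Python queue stands in for the platform's JavaScript clients and relational database, and the task contents are invented.

    ```python
    # Minimal sketch of a work-unit queue for volunteer computing. The real
    # platform uses JavaScript clients and a relational database; this
    # in-memory version only illustrates the dispatch/collect pattern.
    import queue
    import uuid

    class WorkServer:
        def __init__(self, tasks):
            self.pending = queue.Queue()
            self.results = {}
            for task in tasks:
                self.pending.put((str(uuid.uuid4()), task))

        def checkout(self):
            """Hand the next small work unit to a volunteer node."""
            return None if self.pending.empty() else self.pending.get()

        def submit(self, task_id, result):
            """Collect a finished result from a volunteer node."""
            self.results[task_id] = result

    # Toy sub-basin runoff tasks with an invented runoff coefficient.
    server = WorkServer([{"subbasin": i, "rain_mm": 10 * i} for i in range(3)])
    while (unit := server.checkout()) is not None:
        task_id, task = unit
        server.submit(task_id, task["rain_mm"] * 0.4)
    print(server.results)
    ```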

  12. General-purpose computer networks and resource sharing in ERDA. Volume 3. Remote resource-sharing experience and findings

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-15

    The investigation focused on heterogeneous networks in which a variety of dissimilar computers and operating systems were interconnected nationwide. Homogeneous networks, such as MFE net and SACNET, were not considered since they could not be used for general purpose resource sharing. Issues of privacy and security are of concern in any network activity. However, consideration of privacy and security of sensitive data arise to a much lesser degree in unclassified scientific research than in areas involving personal or proprietary information. Therefore, the existing mechanisms at individual sites for protecting sensitive data were relied on, and no new protection mechanisms to prevent infringement of privacy and security were attempted. Further development of ERDA networking will need to incorporate additional mechanisms to prevent infringement of privacy. The investigation itself furnishes an excellent example of computational resource sharing through a heterogeneous network. More than twenty persons, representing seven ERDA computing sites, made extensive use of both ERDA and non-ERDA computers in coordinating, compiling, and formatting the data which constitute the bulk of this report. Volume 3 analyzes the benefits and barriers encountered in actual resource sharing experience, and provides case histories of typical applications.

  13. Optimal defense resource allocation in scale-free networks

    Science.gov (United States)

    Zhang, Xuejun; Xu, Guoqiang; Xia, Yongxiang

    2018-02-01

    The robustness research of networked systems has drawn widespread attention in the past decade, and one of the central topics is to protect the network from external attacks through allocating appropriate defense resource to different nodes. In this paper, we apply a specific particle swarm optimization (PSO) algorithm to optimize the defense resource allocation in scale-free networks. Results reveal that PSO-based resource allocation shows a higher robustness than other resource allocation strategies such as uniform, degree-proportional, and betweenness-proportional allocation strategies. Furthermore, we find that assigning less resource to middle-degree nodes under small-scale attack while more resource to low-degree nodes under large-scale attack is conducive to improving the network robustness. Our work provides an insight into the optimal defense resource allocation pattern in scale-free networks and is helpful for designing a more robust network.
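
    A toy version of the approach, with PSO searching over allocations on a Barabasi-Albert (scale-free) graph, is sketched below; the fitness function (expected surviving degree under a degree-proportional attack) is a simplified stand-in for the robustness measure used in the paper.

    ```python
    # Toy PSO for defense resource allocation on a scale-free network. The
    # fitness (expected surviving degree under a degree-proportional attack)
    # is a simplified stand-in for the paper's robustness measure.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    G = nx.barabasi_albert_graph(50, 2, seed=0)
    deg = np.array([d for _, d in G.degree()], dtype=float)
    BUDGET = 50.0

    def project(x):
        # Keep allocations non-negative and on the budget simplex.
        x = np.clip(x, 0.0, None)
        return x * BUDGET / (x.sum() + 1e-9)

    def fitness(alloc):
        attack = deg / deg.sum() * BUDGET        # degree-proportional attack
        survive = alloc / (alloc + attack + 1e-9)
        return float((survive * deg).sum())      # expected surviving degree

    n, particles, iters = len(deg), 30, 200
    X = np.array([project(x) for x in rng.random((particles, n))])
    V = np.zeros_like(X)
    pbest, pbest_f = X.copy(), np.array([fitness(x) for x in X])
    gbest = pbest[pbest_f.argmax()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, particles, 1))
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
        X = np.array([project(x) for x in X + V])
        f = np.array([fitness(x) for x in X])
        better = f > pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        gbest = pbest[pbest_f.argmax()].copy()

    print("best fitness:", round(fitness(gbest), 2))
    ```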

  14. Disaster and Contingency Planning for Scientific Shared Resource Cores.

    Science.gov (United States)

    Mische, Sheenah; Wilkerson, Amy

    2016-04-01

    Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores ("cores") to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution's overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy.

  15. Disaster and Contingency Planning for Scientific Shared Resource Cores

    Science.gov (United States)

    Wilkerson, Amy

    2016-01-01

    Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores (“cores”) to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution’s overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy. PMID:26848285

  16. Gender differences and social ties effects in resource sharing

    NARCIS (Netherlands)

    d'Exelle, Ben; Riedl, Arno

    2016-01-01

    In rural areas in developing countries, gender inequality tends to be severe, which might have substantial welfare implications if it determines how scarce economic resources are shared between men and women. Therefore, it is important to know how gender influences resource sharing and - given the

  17. Tight Temporal Bounds for Dataflow Applications Mapped onto Shared Resources

    NARCIS (Netherlands)

    Alizadeh Ara, H.; Geilen, M.; Basten, T.; Behrouzian, A.R.B.; Hendriks, M.; Goswami, D.

    2016-01-01

    We present an analysis method that provides tight temporal bounds for applications modeled by Synchronous Dataflow Graphs and mapped to shared resources. We consider the resource sharing effects on the temporal behaviour of the application by embedding worst-case resource availability curves in the

  18. Application of cooperative and non-cooperative games in large-scale water quantity and quality management: a case study.

    Science.gov (United States)

    Mahjouri, Najmeh; Ardestani, Mojtaba

    2011-01-01

    In this paper, two cooperative and non-cooperative methodologies are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models having economic objectives with respect to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total obtained economic benefit, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in rivers satisfies the standards. Comparing the results of the two mentioned approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while the river water quantity and quality issues are addressed.
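
    The cooperative versus non-cooperative comparison can be illustrated with a stripped-down allocation model; the benefit functions and supply figure below are invented, and the paper's water-quality constraints are omitted.

    ```python
    # Toy comparison of cooperative vs. non-cooperative allocation of a shared
    # water resource. Benefit functions and the supply figure are invented;
    # the paper's models also include water-quality constraints, omitted here.
    import numpy as np
    from scipy.optimize import minimize

    SUPPLY = 100.0                      # total available water (arbitrary units)
    a = np.array([4.0, 3.0, 2.0])       # concave benefit b_i(w) = a_i * sqrt(w)

    def total_benefit(w):
        return float((a * np.sqrt(np.maximum(w, 0.0))).sum())

    # Cooperative: maximize joint benefit subject to the shared supply.
    res = minimize(
        lambda w: -total_benefit(w),
        x0=np.full(3, SUPPLY / 3),
        bounds=[(0, SUPPLY)] * 3,
        constraints={"type": "ineq", "fun": lambda w: SUPPLY - w.sum()},
    )
    coop = total_benefit(res.x)

    # Non-cooperative baseline: fixed equal shares, no trading.
    noncoop = total_benefit(np.full(3, SUPPLY / 3))
    print(f"cooperative: {coop:.1f}, non-cooperative: {noncoop:.1f}")
    ```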

  19. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in a strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  20. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    Science.gov (United States)

    Her, Y. G.

    2017-12-01

    Fine-resolution large-scale (FL) modeling can provide the overall picture of the hydrological cycle and transport while taking into account unique local conditions in the simulation. It can also help develop water resources management plans consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to be common in the near future as global-scale remotely sensed data are emerging, and computing resources have been advanced rapidly. There are several spatially distributed models available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM), which require excessive computing resources (implicit scheme) to manipulate large matrices or small simulation time intervals (explicit scheme) to maintain the stability of the solution, to describe two-dimensional overland processes. Others make unrealistic assumptions such as constant overland flow velocity to reduce the computational loads of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds in different landscapes and sizes from 3.5 km2 to 2,800 km2 at the spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs including the maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges were discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved by employing advanced computing techniques and hydrological understandings, by using remotely sensed hydrological
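
    The stability constraint mentioned above for explicit schemes is commonly expressed through the Courant-Friedrichs-Lewy (CFL) condition; a minimal sketch with illustrative numbers:

    ```python
    # Minimal illustration of why explicit overland-flow schemes need small
    # time steps: the Courant-Friedrichs-Lewy (CFL) condition limits dt by
    # grid size and flow celerity. The numbers are invented for illustration.
    def cfl_timestep(dx_m: float, celerity_ms: float, courant: float = 0.7) -> float:
        """Largest stable time step (s) for an explicit scheme."""
        return courant * dx_m / celerity_ms

    # A 30 m grid (the resolution cited above) and a 2 m/s flood-wave celerity
    # force roughly 10-second steps, even though outputs are reported hourly.
    print(f"dt <= {cfl_timestep(30.0, 2.0):.1f} s")
    ```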

  1. Task sharing in Zambia: HIV service scale-up compounds the human resource crisis

    Directory of Open Access Journals (Sweden)

    Simbaya Joseph

    2010-09-01

    Full Text Available Abstract Background Considerable attention has been given by policy makers and researchers to the human resources for health crisis in Africa. However, little attention has been paid to quantifying health facility-level trends in health worker numbers, distribution and workload, despite growing demands on health workers due to the availability of new funds for HIV/AIDS control scale-up. This study analyses and reports trends in HIV and non-HIV ambulatory service workloads on clinical staff in urban and rural district level facilities. Methods Structured surveys of health facility managers, and health services covering 2005-07 were conducted in three districts of Zambia in 2008 (two urban and one rural), to fill this evidence gap. Intra-facility analyses were conducted, comparing trends in HIV and non-HIV service utilisation with staff trends. Results Clinical staff (doctors, nurses and nurse-midwives, and clinical officers) numbers and staff population densities fell slightly, with lower ratios of staff to population in the rural district. The ratios of antenatal care and family planning registrants to nurses/nurse-midwives were highest at baseline and increased further at the rural facilities over the three years, while daily outpatient department (OPD) workload in urban facilities fell below that in rural facilities. HIV workload, as measured by numbers of clients receiving antiretroviral treatment (ART) and prevention of mother to child transmission (PMTCT) per facility staff member, was highest in the capital city, but increased rapidly in all three districts. The analysis suggests evidence of task sharing, in that staff designated by managers as ART and PMTCT workers made up a higher proportion of frontline service providers by 2007. Conclusions This analysis of workforce patterns across 30 facilities in three districts of Zambia illustrates that the remarkable achievements in scaling-up HIV/AIDS service delivery has been on the back of

  2. Task sharing in Zambia: HIV service scale-up compounds the human resource crisis

    LENUS (Irish Health Repository)

    Walsh, Aisling

    2010-09-17

    Abstract Background Considerable attention has been given by policy makers and researchers to the human resources for health crisis in Africa. However, little attention has been paid to quantifying health facility-level trends in health worker numbers, distribution and workload, despite growing demands on health workers due to the availability of new funds for HIV/AIDS control scale-up. This study analyses and reports trends in HIV and non-HIV ambulatory service workloads on clinical staff in urban and rural district level facilities. Methods Structured surveys of health facility managers, and health services covering 2005-07 were conducted in three districts of Zambia in 2008 (two urban and one rural), to fill this evidence gap. Intra-facility analyses were conducted, comparing trends in HIV and non-HIV service utilisation with staff trends. Results Clinical staff (doctors, nurses and nurse-midwives, and clinical officers) numbers and staff population densities fell slightly, with lower ratios of staff to population in the rural district. The ratios of antenatal care and family planning registrants to nurses/nurse-midwives were highest at baseline and increased further at the rural facilities over the three years, while daily outpatient department (OPD) workload in urban facilities fell below that in rural facilities. HIV workload, as measured by numbers of clients receiving antiretroviral treatment (ART) and prevention of mother to child transmission (PMTCT) per facility staff member, was highest in the capital city, but increased rapidly in all three districts. The analysis suggests evidence of task sharing, in that staff designated by managers as ART and PMTCT workers made up a higher proportion of frontline service providers by 2007. Conclusions This analysis of workforce patterns across 30 facilities in three districts of Zambia illustrates that the remarkable achievements in scaling-up HIV/AIDS service delivery has been on the back of sustained non

  3. Task sharing in Zambia: HIV service scale-up compounds the human resource crisis.

    Science.gov (United States)

    Walsh, Aisling; Ndubani, Phillimon; Simbaya, Joseph; Dicker, Patrick; Brugha, Ruairí

    2010-09-17

    Considerable attention has been given by policy makers and researchers to the human resources for health crisis in Africa. However, little attention has been paid to quantifying health facility-level trends in health worker numbers, distribution and workload, despite growing demands on health workers due to the availability of new funds for HIV/AIDS control scale-up. This study analyses and reports trends in HIV and non-HIV ambulatory service workloads on clinical staff in urban and rural district level facilities. Structured surveys of health facility managers, and health services covering 2005-07 were conducted in three districts of Zambia in 2008 (two urban and one rural), to fill this evidence gap. Intra-facility analyses were conducted, comparing trends in HIV and non-HIV service utilisation with staff trends. Clinical staff (doctors, nurses and nurse-midwives, and clinical officers) numbers and staff population densities fell slightly, with lower ratios of staff to population in the rural district. The ratios of antenatal care and family planning registrants to nurses/nurse-midwives were highest at baseline and increased further at the rural facilities over the three years, while daily outpatient department (OPD) workload in urban facilities fell below that in rural facilities. HIV workload, as measured by numbers of clients receiving antiretroviral treatment (ART) and prevention of mother to child transmission (PMTCT) per facility staff member, was highest in the capital city, but increased rapidly in all three districts. The analysis suggests evidence of task sharing, in that staff designated by managers as ART and PMTCT workers made up a higher proportion of frontline service providers by 2007. This analysis of workforce patterns across 30 facilities in three districts of Zambia illustrates that the remarkable achievements in scaling-up HIV/AIDS service delivery has been on the back of sustained non-HIV workload levels, increasing HIV workload and stagnant
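
    The study's workload indicator (ART plus PMTCT clients per clinical staff member) is straightforward to compute; the counts in this sketch are invented for illustration and are not the study's data.

    ```python
    # Sketch of the workload indicator described above: HIV clients (ART +
    # PMTCT) per clinical staff member, per facility. All counts are invented
    # placeholders, not the study's data.
    facilities = [
        {"name": "urban A", "art": 1200, "pmtct": 300, "clinical_staff": 40},
        {"name": "rural B", "art": 250, "pmtct": 90, "clinical_staff": 6},
    ]
    for fac in facilities:
        hiv_clients = fac["art"] + fac["pmtct"]
        ratio = hiv_clients / fac["clinical_staff"]
        print(f'{fac["name"]}: {ratio:.1f} HIV clients per clinical staff member')
    ```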

  4. Shared-resource computing for small research labs.

    Science.gov (United States)

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  5. Large-Scale Unsupervised Hashing with Shared Structure Learning.

    Science.gov (United States)

    Liu, Xianglong; Mu, Yadong; Zhang, Danchen; Lang, Bo; Li, Xuelong

    2015-09-01

    Hashing methods are effective in generating compact binary signatures for images and videos. This paper addresses an important open issue in the literature, i.e., how to learn compact hash codes by enhancing the complementarity among different hash functions. Most prior studies solve this problem either by adopting time-consuming sequential learning algorithms or by generating hash functions subject to some deliberately designed constraints (e.g., enforcing hash functions orthogonal to one another). We analyze the drawbacks of past works and propose a new solution to this problem. Our idea is to decompose the feature space into a subspace shared by all hash functions and its complementary subspace. On one hand, the shared subspace, corresponding to the common structure across different hash functions, conveys most relevant information for the hashing task. Similar to data de-noising, irrelevant information is explicitly suppressed during hash function generation. On the other hand, in case the complementary subspace also contains useful information for specific hash functions, the final form of our proposed hashing scheme is a compromise between these two kinds of subspaces. To make hash functions not only preserve the local neighborhood structure but also capture the global cluster distribution of the whole data, an objective function incorporating spectral embedding loss, binary quantization loss, and shared subspace contribution is introduced to guide the hash function learning. We propose an efficient alternating optimization method to simultaneously learn both the shared structure and the hash functions. Experimental results on three well-known benchmarks CIFAR-10, NUS-WIDE, and a-TRECVID demonstrate that our approach significantly outperforms state-of-the-art hashing methods.
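
    The decomposition idea, a projection shared by all hash functions plus per-function components, can be sketched as follows; plain PCA and random hyperplanes stand in here for the paper's learned shared structure and alternating optimization.

    ```python
    # Simplified sketch of hashing with a shared subspace: a common projection
    # (plain PCA, standing in for the paper's learned shared structure) is
    # combined with per-function random hyperplanes to produce binary codes.
    # This illustrates the decomposition idea, not the paper's algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 64))          # toy feature matrix
    X -= X.mean(axis=0)

    # "Shared" subspace: top-k principal directions common to all hash functions.
    k, bits = 16, 32
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    shared = Vt[:k].T                        # (64, k)

    # Each hash function: a random hyperplane in the shared subspace.
    W = rng.normal(size=(k, bits))
    codes = (X @ shared @ W > 0).astype(np.uint8)   # (1000, 32) binary codes

    # Hamming distance between the first two items' codes.
    print("hamming:", int(np.count_nonzero(codes[0] != codes[1])))
    ```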

  6. Assessing Knowledge Sharing Among Academics: A Validation of the Knowledge Sharing Behavior Scale (KSBS).

    Science.gov (United States)

    Ramayah, T; Yeap, Jasmine A L; Ignatius, Joshua

    2014-04-01

    There is a belief that academics tend to hold on tightly to their knowledge and intellectual resources. However, not much effort has been put into the creation of a valid and reliable instrument to measure knowledge sharing behavior among the academics. To apply and validate the Knowledge Sharing Behavior Scale (KSBS) as a measure of knowledge sharing behavior within the academic community. Respondents (N = 447) were academics from arts and science streams in 10 local, public universities in Malaysia. Data were collected using the 28-item KSBS that assessed four dimensions of knowledge sharing behavior namely written contributions, organizational communications, personal interactions, and communities of practice. The exploratory factor analysis showed that the items loaded on the dimension constructs that they were supposed to represent, thus proving construct validity. A within-factor analysis revealed that each set of items representing their intended dimension loaded on only one construct, therefore establishing convergent validity. All four dimensions were not perfectly correlated with each other or organizational citizenship behavior, thereby proving discriminant validity. However, all four dimensions correlated with organizational commitment, thus confirming predictive validity. Furthermore, all four factors correlated with both tacit and explicit sharing, which confirmed their concurrent validity. All measures also possessed sufficient reliability (α > .70). The KSBS is a valid and reliable instrument that can be used to formally assess the types of knowledge artifacts residing among academics and the degree of knowledge sharing in relation to those artifacts. © The Author(s) 2014.
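
    The reliability criterion reported above (α > .70) is Cronbach's alpha; a minimal computation on simulated Likert-style responses:

    ```python
    # Minimal computation of Cronbach's alpha, the reliability coefficient the
    # validation above reports as exceeding .70. Responses are simulated here;
    # they are not KSBS data.
    import numpy as np

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(447, 1))                  # common factor
    items = latent + 0.8 * rng.normal(size=(447, 7))    # 7 Likert-style items

    def cronbach_alpha(data: np.ndarray) -> float:
        k = data.shape[1]
        item_var = data.var(axis=0, ddof=1).sum()
        total_var = data.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    print(f"alpha = {cronbach_alpha(items):.2f}")
    ```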

  7. Indirect Reciprocity, Resource Sharing, and Environmental Risk: Evidence from Field Experiments in Siberia

    Science.gov (United States)

    Howe, E. Lance; Murphy, James J.; Gerkey, Drew; West, Colin Thor

    2016-01-01

    Integrating information from existing research, qualitative ethnographic interviews, and participant observation, we designed a field experiment that introduces idiosyncratic environmental risk and a voluntary sharing decision into a standard public goods game. Conducted with subsistence resource users in rural villages on the Kamchatka Peninsula in Northeast Siberia, we find evidence consistent with a model of indirect reciprocity and local social norms of helping the needy. When participants are allowed to develop reputations in the experiments, as is the case in most small-scale societies, we find that sharing is increasingly directed toward individuals experiencing hardship, good reputations increase aid, and the pooling of resources through voluntary sharing becomes more effective. We also find high levels of voluntary sharing without a strong commitment device; however, this form of cooperation does not increase contributions to the public good. Our results are consistent with previous experiments and theoretical models, suggesting that strategic risks tied to rewards, punishments, and reputations are important. However, unlike studies that focus solely on strategic risks, we find the effects of rewards, punishments, and reputations are altered by the presence of environmental factors. Unexpected changes in resource abundance increase interdependence and may alter the costs and benefits of cooperation, relative to defection. We suggest environmental factors that increase interdependence are critically important to consider when developing and testing theories of cooperation. PMID:27442434
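
    A toy simulation in the spirit of this design, a public goods game with idiosyncratic shocks and a voluntary sharing stage, is sketched below; the parameters and the sharing rule are invented and are not those of the field experiment.

    ```python
    # Toy public goods game with idiosyncratic environmental shocks and a
    # voluntary sharing stage. Parameters and the sharing rule are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    N, ROUNDS, ENDOW, MULT = 8, 20, 10.0, 1.6

    payoffs = np.zeros(N)
    for _ in range(ROUNDS):
        contrib = rng.uniform(0, ENDOW, N)          # contributions to the pool
        earnings = ENDOW - contrib + MULT * contrib.sum() / N
        shock = rng.random(N) < 0.25                # idiosyncratic bad harvest
        earnings[shock] *= 0.3
        # Voluntary sharing: unshocked players transfer 10% of earnings,
        # split equally among shocked players (norm of helping the needy).
        if shock.any() and (~shock).any():
            gift = 0.10 * earnings[~shock].sum()
            earnings[~shock] *= 0.90
            earnings[shock] += gift / shock.sum()
        payoffs += earnings

    print("mean payoff per round:", (payoffs / ROUNDS).round(2))
    ```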

  8. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  9. Governance of global health research consortia: Sharing sovereignty and resources within Future Health Systems.

    Science.gov (United States)

    Pratt, Bridget; Hyder, Adnan A

    2017-02-01

    Global health research partnerships are increasingly taking the form of consortia that conduct programs of research in low and middle-income countries (LMICs). An ethical framework has been developed that describes how the governance of consortia comprised of institutions from high-income countries and LMICs should be structured to promote health equity. It encompasses initial guidance for sharing sovereignty in consortia decision-making and sharing consortia resources. This paper describes a first effort to examine whether and how consortia can uphold that guidance. Case study research was undertaken with the Future Health Systems consortium, which performs research to improve health service delivery for the poor in Bangladesh, China, India, and Uganda. Data were thematically analysed and revealed that proposed ethical requirements for sharing sovereignty and sharing resources are largely upheld by Future Health Systems. Facilitating factors included having a decentralised governance model, LMIC partners with good research capacity, and firm budgets. Higher labour costs in the US and UK and the funder's policy of allocating funds to consortia on a reimbursement basis prevented full alignment with guidance on sharing resources. The lessons described in this paper can assist other consortia to more systematically link their governance policy and practice to the promotion of health equity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. The demands and resources arising from shared office spaces.

    Science.gov (United States)

    Morrison, Rachel L; Macky, Keith A

    2017-04-01

    The prevalence of flexible and shared office spaces is increasing significantly, yet the socioemotional outcomes associated with these environments are under-researched. Utilising the job demands-resources (JD-R) model, we investigate both the demands and the resources that can accrue to workers as a result of shared work environments and hot-desking. Data were collected from work-experienced respondents (n = 1000), assessing the extent to which they shared their office space with others, along with demands comprising distractions, uncooperative behaviours, distrust, and negative relationships, and resources from co-worker friendships and supervisor support. We found that, as work environments became more shared (with hot-desking being at the extreme end of the continuum), not only were there increases in demands, but co-worker friendships were not improved and perceptions of supervisory support decreased. Findings are discussed in relation to employee well-being and recommendations are made regarding how best to ameliorate negative consequences of shared work environments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Human resource management practices stimulating knowledge sharing

    Directory of Open Access Journals (Sweden)

    Matošková Jana

    2017-12-01

    Full Text Available The major goal of the paper was to develop a theoretical framework that conceptualizes the indirect impact of human resource management practices on knowledge sharing in the organization. In the current competitive environment, the ability to use knowledge assets and to continuously renew them is required for organizational success. Therefore, the field of human resource management should dedicate great effort to understanding how to enhance the knowledge flows within the organization. Theoretical indications were provided about HRM practices that influence the quality and quantity of knowledge sharing within an organization. Further, a conceptual model of relations between HRM practices and factors influencing knowledge sharing within an organization was introduced. It is supposed that HRM practices have direct impacts on personality traits of employees, organizational culture, characteristics of managers, and instruments used for knowledge sharing. Subsequently, these factors have direct effects on the perceived intensity of knowledge sharing. The paper offers 12 testable propositions for the indirect relation between HRM practices and knowledge sharing in the organization. The suggested model could assist future research to examine the influence of HRM practices upon managing knowledge in a more complex way. Via a theoretical contribution to the debate on the influence of HRM practices upon managing knowledge, the study contributes to further research development in this field.

  12. Data management strategies for multinational large-scale systems biology projects.

    Science.gov (United States)

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. By the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust therefore have developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs) which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects for example in systems biology are hard to find. Here, we give an overview of a selection of open-source data management systems proved to be employed successfully in large-scale projects.

  13. The Climate-G testbed: towards a large scale data sharing environment for climate change

    Science.gov (United States)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

    The Climate-G testbed provides an experimental large-scale data environment for climate change, addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists, and it currently involves several partners, such as: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, the Live Access Server, as well as the OGC-compliant Web Map Service, and they can be downloaded, visualized, and accessed in the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2 TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: a wide data access/sharing environment for climate change; a P2P/grid metadata approach; a production-level Climate-G DDC; high quality tools for

  14. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaoliang [World Resources Inst. (WRI), Washington, DC (United States); Stauffer, Philip H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-25

    This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  15. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the coming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. 38 CFR 17.240 - Sharing specialized medical resources.

    Science.gov (United States)

    2010-07-01

    ..., agreements may be entered into for sharing medical resources with other hospitals, including State or local, public or private hospitals or other medical installations having hospital facilities or organ banks... medical resources, incidental hospital care or other needed services, supplies used, and normal...

  17. Resource Sharing in the Logistics of the Offshore Wind Farm Installation Process based on a Simulation Study

    Directory of Open Access Journals (Sweden)

    Thies Beinke

    2017-06-01

    Full Text Available This contribution uses a discrete-event, agent-based simulation to examine the potential of joint resource use in the installation phase of offshore wind energy. To this end, wind farm projects to be installed simultaneously are examined, the impact of weather restrictions on the processes of loading, transport and installation is taken into consideration, and both wind-farm-specific resource allocation and the approach of a resource pool, or resource sharing, are implemented. This study is motivated by the large number of wind farms that will be installed in the future and by the potential savings that might be realized through resource sharing. While, so far, the main driver of the resource sharing approach has been the end consumer market, it has been applied in more and more areas, even in relatively conservative industries such as logistics. After presenting the background and underlying methodology and describing the prior art in this context, the network of the offshore wind energy installation phase is described. This is the basis for the subsequent determination of the savings potential of shared resource utilization, which is measured by performance indicators such as the total installation time and the degree of utilization of the resources. The results of the simulation show that weather restrictions have a significant effect on the installation times and the usage times of the resources as well as on their degree of utilization. In addition, the resource sharing approach has been identified to have significant savings potential for the offshore wind energy installation.
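
    The dedicated-versus-pooled comparison can be sketched with a small discrete-event model using the simpy library; the durations, weather delays and fleet sizes below are invented and far simpler than the study's model.

    ```python
    # Discrete-event sketch (simpy) comparing dedicated installation vessels
    # per wind farm against a shared two-vessel pool serving both farms.
    # Durations, weather delays and fleet sizes are invented placeholders.
    import random
    import simpy

    TURBINES_PER_FARM = 10

    def install_farm(env, name, fleet, done):
        for _ in range(TURBINES_PER_FARM):
            with fleet.request() as req:
                yield req
                yield env.timeout(random.uniform(2, 4))  # load, transit, install
                if random.random() < 0.3:                # missed weather window
                    yield env.timeout(random.uniform(1, 3))
        done[name] = env.now

    def run(shared_pool):
        random.seed(0)  # same weather draws for both setups
        env, done = simpy.Environment(), {}
        if shared_pool:
            pool = simpy.Resource(env, capacity=2)
            fleets = {"farm A": pool, "farm B": pool}
        else:
            fleets = {name: simpy.Resource(env, capacity=1)
                      for name in ("farm A", "farm B")}
        for name, fleet in fleets.items():
            env.process(install_farm(env, name, fleet, done))
        env.run()
        return max(done.values())

    print("dedicated vessels:", round(run(False), 1), "days")
    print("shared pool:      ", round(run(True), 1), "days")
    ```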

  18. Preparing laboratory and real-world EEG data for large-scale analysis: A containerized approach

    Directory of Open Access Journals (Sweden)

    Nima eBigdely-Shamlo

    2016-03-01

    Full Text Available Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface (BCI) models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a containerized approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data Levels, each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).

  19. HOKES/POKES : Light-weight resource sharing

    NARCIS (Netherlands)

    Bos, Herbert; Samwel, Bart

    2003-01-01

    In this paper, we explain mechanisms for providing embedded network processors and other low-level programming environments with light-weight support for safe resource sharing. The solution consists of a host part, known as HOKES, and a network processor part, known as POKES. As common operating

  20. Backup flexibility classes in emerging large-scale renewable electricity systems

    International Nuclear Information System (INIS)

    Schlachtberger, D.P.; Becker, S.; Schramm, S.; Greiner, M.

    2016-01-01

    Highlights: • Flexible backup demand in a European wind and solar based power system is modelled. • Three flexibility classes are defined based on production and consumption timescales. • Seasonal backup capacities are shown to be only used below 50% renewable penetration. • Large-scale transmission between countries can reduce fast flexible capacities. - Abstract: High shares of intermittent renewable power generation in a European electricity system will require flexible backup power generation on the dominant diurnal, synoptic, and seasonal weather timescales. The same three timescales are already covered by today’s dispatchable electricity generation facilities, which are able to follow the typical load variations on the intra-day, intra-week, and seasonal timescales. This work aims to quantify the changing demand for those three backup flexibility classes in emerging large-scale electricity systems, as they transform from low to high shares of variable renewable power generation. A weather-driven modelling is used, which aggregates eight years of wind and solar power generation data as well as load data over Germany and Europe, and splits the backup system required to cover the residual load into three flexibility classes distinguished by their respective maximum rates of change of power output. This modelling shows that the slowly flexible backup system is dominant at low renewable shares, but its optimized capacity decreases and drops close to zero once the average renewable power generation exceeds 50% of the mean load. The medium flexible backup capacities increase for modest renewable shares, peak at around a 40% renewable share, and then continuously decrease to almost zero once the average renewable power generation becomes larger than 100% of the mean load. The dispatch capacity of the highly flexible backup system becomes dominant for renewable shares beyond 50%, and reaches its maximum around a 70% renewable share. For renewable shares
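
    A rough sketch of the decomposition: the positive residual load (load minus wind and solar) is split into slow, medium and fast components by timescale; moving averages stand in here for the paper's maximum-rate-of-change classification, and the data are synthetic.

    ```python
    # Sketch of splitting the positive residual load into slowly, medium and
    # highly flexible backup components by timescale. Moving averages stand
    # in for the paper's rate-of-change classification; data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(24 * 7 * 8)                 # hourly, 8 weeks
    load = 1.0 + 0.2 * np.sin(2 * np.pi * t / 24)
    renew = 0.6 + 0.4 * np.sin(2 * np.pi * t / (24 * 4)) \
        + 0.1 * rng.normal(size=t.size)
    residual = np.maximum(load - renew, 0.0)

    def smooth(x, hours):
        return np.convolve(x, np.ones(hours) / hours, mode="same")

    slow = smooth(residual, 24 * 7)           # seasonal / weekly component
    medium = smooth(residual, 24) - slow      # synoptic, multi-day component
    fast = residual - slow - medium           # diurnal and faster component

    for name, part in [("slow", slow), ("medium", medium), ("fast", fast)]:
        print(f"{name:6s} peak dispatch: {part.max():.2f}")
    ```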

  1. Biodiversity, extinctions, and evolution of ecosystems with shared resources

    Science.gov (United States)

    Kozlov, Vladimir; Vakulenko, Sergey; Wennergren, Uno

    2017-03-01

    We investigate the formation of stable ecological networks where many species share the same resource. We show that such a stable ecosystem naturally occurs as a result of extinctions. We obtain an analytical relation for the number of coexisting species, and we find a relation describing how many species may become extinct as a result of a sharp environmental change. We introduce a special parameter that is a combination of species traits and resource characteristics used in the model formulation. This parameter describes the pressure on the system to converge through extinctions. When this stress parameter is large, the species traits become concentrated at certain values. The stress parameter thereby determines the final level of biodiversity of the system. Moreover, we show that the dynamics of this limit system can be described by simple differential equations.
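
    A minimal numerical sketch of many species on one shared resource, with a self-limitation term so that several species can coexist; this chemostat-like system is a stand-in and its parameters are invented rather than taken from the paper.

    ```python
    # Chemostat-like sketch: n species share one resource R, with
    # self-limitation so that several species can coexist.
    #   dN_i/dt = N_i * (b_i * R - d_i - a * N_i)
    #   dR/dt   = S - R * (m + sum_i c_i * N_i)
    # All parameters are invented for illustration.
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(2)
    n = 10
    b = rng.uniform(0.5, 1.5, n)   # uptake efficiencies
    d = rng.uniform(0.4, 0.6, n)   # death rates
    c = np.full(n, 0.05)           # per-capita consumption
    a, S, m = 0.05, 1.0, 0.1       # self-limitation, supply, resource decay

    def rhs(t, y):
        N, R = y[:n], y[n]
        dN = N * (b * R - d - a * N)
        dR = S - R * (m + (c * N).sum())
        return np.append(dN, dR)

    sol = solve_ivp(rhs, (0.0, 500.0), np.append(np.full(n, 1.0), 1.0))
    survivors = int((sol.y[:n, -1] > 1e-3).sum())
    print(f"{survivors} of {n} species persist on the shared resource")
    ```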

  2. The scale concept and sustainable development: implications on the energetics and water resources

    International Nuclear Information System (INIS)

    Demanboro, Antonio Carlos; Mariotoni, Carlos Alberto

    1999-01-01

    The relationships between demographic growth and water and energy resources are examined. The planet's scale and carrying capacity are discussed, starting from the concepts of maximum and optimum sustainability, both anthropocentric and biocentric. Two scenarios, termed 'sustainable agriculture' and 'sharing-water', are elaborated from the available resources of water and fertile land, energy consumption, and population trends. (author)

  3. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a breakeven price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...
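
    The breakeven figure such an analysis estimates is essentially a levelized cost of hydrogen (LCOH); a back-of-the-envelope sketch with placeholder inputs, not the thesis's numbers:

    ```python
    # Back-of-the-envelope levelized cost of hydrogen (LCOH). All inputs are
    # invented placeholders, not the thesis's figures.
    def lcoh(capex, opex_per_year, kg_per_year, lifetime_years, discount=0.07):
        """Levelized cost per kg: discounted costs over discounted output."""
        factor = sum(1.0 / (1.0 + discount) ** y
                     for y in range(1, lifetime_years + 1))
        return (capex + opex_per_year * factor) / (kg_per_year * factor)

    # Example: a 200 MEUR plant producing 15 kt H2/yr for 20 years.
    print(f"LCOH ~ {lcoh(200e6, 5e6, 15e6, 20):.2f} EUR/kg")
    ```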

  4. Institutional shared resources and translational cancer research

    Directory of Open Access Journals (Sweden)

    De Paoli Paolo

    2009-06-01

    Full Text Available Abstract The development and maintenance of adequate shared infrastructures is considered a major goal for academic centers promoting translational research programs. Among infrastructures favoring translational research, centralized facilities characterized by shared, multidisciplinary use of expensive laboratory instrumentation, or by complex computer hardware and software and/or by high professional skills, are necessary to maintain or improve institutional scientific competitiveness. The success or failure of a shared resource program also depends on the choice of appropriate institutional policies and requires effective institutional governance regarding decisions on staffing, the existence and composition of advisory committees, policies, and defined mechanisms of reporting, budgeting and financial support of each resource. Shared Resources represent a widely diffused model to sustain cancer research; in fact, web sites from an impressive number of research institutes and universities in the U.S. contain pages dedicated to the SRs that have been established in each center, making a complete view of the situation impossible. However, a nation-wide overview of how Cancer Centers develop SR programs is available on the web site for NCI-designated Cancer Centers in the U.S., while in Europe, information is available for individual Cancer centers. This article will briefly summarize the institutional policies, the organizational needs, the characteristics, scientific aims, and future developments of SRs necessary to develop effective translational research programs in oncology. In fact, the physical build-up of SRs per se is not sufficient for the successful translation of biomedical research. Appropriate policies to improve the academic culture in collaboration, the availability of educational programs for translational investigators, the existence of administrative facilitations for translational research and an efficient organization

  5. Institutional shared resources and translational cancer research.

    Science.gov (United States)

    De Paoli, Paolo

    2009-06-29

    The development and maintenance of adequate shared infrastructures is considered a major goal for academic centers promoting translational research programs. Among infrastructures favoring translational research, centralized facilities characterized by shared, multidisciplinary use of expensive laboratory instrumentation, or by complex computer hardware and software and/or by high professional skills, are necessary to maintain or improve institutional scientific competitiveness. The success or failure of a shared resource program also depends on the choice of appropriate institutional policies and requires effective institutional governance regarding decisions on staffing, the existence and composition of advisory committees, policies, and defined mechanisms of reporting, budgeting and financial support of each resource. Shared Resources represent a widely diffused model to sustain cancer research; in fact, web sites from an impressive number of research institutes and universities in the U.S. contain pages dedicated to the SRs that have been established in each center, making a complete view of the situation impossible. However, a nation-wide overview of how Cancer Centers develop SR programs is available on the web site for NCI-designated Cancer Centers in the U.S., while in Europe, information is available for individual Cancer centers. This article will briefly summarize the institutional policies, the organizational needs, the characteristics, scientific aims, and future developments of SRs necessary to develop effective translational research programs in oncology. In fact, the physical build-up of SRs per se is not sufficient for the successful translation of biomedical research. Appropriate policies to improve the academic culture in collaboration, the availability of educational programs for translational investigators, the existence of administrative facilitations for translational research and an efficient organization supporting clinical trial recruitment.

  6. Macro-economic impact of large-scale deployment of biomass resources for energy and materials on a national level—A combined approach for the Netherlands

    International Nuclear Information System (INIS)

    Hoefnagels, Ric; Banse, Martin; Dornburg, Veronika; Faaij, André

    2013-01-01

    Biomass is considered one of the most important options in the transition to a sustainable energy system with reduced greenhouse gas (GHG) emissions and increased security of energy supply. In order to facilitate this transition with targeted policies and implementation strategies, it is of vital importance to understand the economic benefits, uncertainties and risks of this transition. This article presents a quantification of the economic impacts on value added, employment shares and the trade balance, as well as the required biomass and avoided primary energy and greenhouse gases, related to large-scale biomass deployment on a country level (the Netherlands) for different future scenarios to 2030. This is done by using the macro-economic computable general equilibrium (CGE) model LEITAP, capable of quantifying direct and indirect effects of a bio-based economy, combined with a spreadsheet tool to address underlying technological details. Although the combined approach has limitations, the results of the projections show that substitution of fossil energy carriers by biomass could have positive economic effects, as well as reducing GHG emissions and fossil energy requirements. Key factors to achieve these targets are enhanced technological development and the import of sustainable biomass resources to the Netherlands. - Highlights: • We analyse large-scale production of bioenergy and biochemicals in the Netherlands. • The scenarios include up to 30% substitution of fossil fuels by biomass in 2030. • Resulting in strong greenhouse gas savings and positive macro-economic effects. • Large amounts of imported biomass are required to meet the domestic demand. • This requires high rates of technological change and strict sustainability criteria

  7. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and far exceed the raw instrument outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" in order to be productive and to produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and on the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway to address these issues.

  8. Language influences music harmony perception: Effects of shared syntactic integration resources beyond attention

    NARCIS (Netherlands)

    Kunert, R.; Willems, R.M.; Hagoort, P.

    2016-01-01

    Many studies have revealed shared music–language processing resources by finding an influence of music harmony manipulations on concurrent language processing. However, the nature of the shared resources has remained ambiguous. They have been argued to be syntax specific and thus due to shared

  9. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Full Text Available Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from

  10. Survival and growth of epiphytic ferns depend on resource sharing

    Directory of Open Access Journals (Sweden)

    Hua-Zheng Lu

    2016-03-01

    Full Text Available Locally available resources can be shared within clonal plant systems through physiological integration, thus enhancing their survival and growth. Most epiphytes exhibit a clonal growth habit, but few studies have tested the effects of physiological integration (resource sharing) on the survival and growth of epiphytes, and whether such effects vary with species. We conducted two experiments, one on individuals (single ramets) and another on groups (several ramets within a plot), with severed and intact rhizome treatments (without and with physiological integration) on two dominant epiphytic ferns (Polypodiodes subamoena and Lepisorus scolopendrium) in a subtropical montane moist forest in Southwest China. Rhizome severing (preventing integration) significantly reduced ramet survival in the individual experiment and the number of surviving ramets in the group experiment, and it also decreased biomass of both species in both experiments. However, the magnitude of such integration effects did not vary significantly between the two species. We conclude that resource sharing may be a general strategy for clonal epiphytes to adapt to forest canopies, where resources are limited and heterogeneously distributed in space and time.

  11. Livelihood Implications and Perceptions of Large Scale Investment in Natural Resources for Conservation and Carbon Sequestration : Empirical Evidence from REDD+ in Vietnam

    NARCIS (Netherlands)

    Bayrak, Mucahid Mustafa; Marafa, Lawal Mohammed

    2017-01-01

    The complex relationship between local development and current large scale investments in natural resources in the Global South for the purpose of conservation and carbon sequestration is not fully understood yet. The Reducing Emissions from Deforestation and Forest Degradation programme (REDD+) is

  12. Towards open sharing of task-based fMRI data: The OpenfMRI project

    Directory of Open Access Journals (Sweden)

    Russell A Poldrack

    2013-07-01

    Full Text Available The large-scale sharing of task-based functional neuroimaging data has the potential to allow novel insights into the organization of mental function in the brain, but the field of neuroimaging has lagged behind other areas of bioscience in the development of data sharing resources. This paper describes the OpenFMRI project (accessible online at http://www.openfmri.org, which aims to provide the neuroimaging community with a resource to support open sharing of task-based fMRI studies. We describe the motivation behind the project, focusing particularly on how this project addresses some of the well-known challenges to sharing of task-based fMRI data. Results from a preliminary analysis of the current database are presented, which demonstrate the ability to classify between task contrasts with high generalization accuracy across subjects, and the ability to identify individual subjects from their activation maps with moderately high accuracy. Clustering analyses show that the similarity relations between statistical maps have a somewhat orderly relation to the mental functions engaged by the relevant tasks. These results highlight the potential of the project to support large-scale multivariate analyses of the relation between mental processes and brain function.
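
    As a rough illustration of the kind of cross-subject classification reported above (not the OpenfMRI project's actual pipeline), the sketch below trains a classifier on synthetic stand-in "activation maps" and scores it with leave-subjects-out cross-validation; all sizes and data are invented.

        # Hedged sketch: synthetic stand-in data, not real fMRI activation maps.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import GroupKFold, cross_val_score

        rng = np.random.default_rng(0)
        n_subjects, n_contrasts, n_voxels = 20, 4, 500
        prototypes = rng.normal(size=(n_contrasts, n_voxels))
        # One noisy "activation map" per subject per task contrast.
        X = np.vstack([prototypes[c] + rng.normal(scale=2.0, size=n_voxels)
                       for s in range(n_subjects) for c in range(n_contrasts)])
        y = np.tile(np.arange(n_contrasts), n_subjects)          # contrast labels
        groups = np.repeat(np.arange(n_subjects), n_contrasts)   # subject labels

        # Grouped CV holds out whole subjects, measuring cross-subject generalization.
        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                                 groups=groups, cv=GroupKFold(n_splits=5))
        print("mean cross-subject accuracy:", scores.mean())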

  13. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers for producing hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the currently available electrolysis modules was drawn up. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost of large-scale electrolysis was evaluated. (authors)

  14. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  15. GIGGLE: a search engine for large-scale integrated genome analysis.

    Science.gov (United States)

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
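
    As a concept-level illustration of ranking shared loci (this is not GIGGLE's implementation or API), one can score the overlap between a query interval set and an annotation track with a 2x2 Fisher's exact test over genome bins; all numbers below are invented.

        # Hedged sketch of an interval-overlap enrichment statistic, not GIGGLE itself.
        from scipy.stats import fisher_exact

        def overlap_significance(n_query, n_overlap, n_annotation, genome_bins):
            """2x2 table of query vs. annotation membership over genome_bins units."""
            a = n_overlap                       # bins in both query and annotation
            b = n_query - n_overlap             # bins in query only
            c = n_annotation - n_overlap        # bins in annotation only
            d = genome_bins - a - b - c         # bins in neither
            odds, p = fisher_exact([[a, b], [c, d]], alternative="greater")
            return odds, p

        print(overlap_significance(n_query=1000, n_overlap=400,
                                   n_annotation=5000, genome_bins=3_000_000))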

  16. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  17. Resource allocation in shared spectrum access communications for operators with diverse service requirements

    Science.gov (United States)

    Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki

    2016-12-01

    In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to both the inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two capable approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme was found to be more efficient in terms of achieved throughput, while fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider the resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a non-convex mixed-integer non-linear programming problem, which is computationally very expensive, and whose complexity grows exponentially with the number of integer variables. We propose a less complex and efficient suboptimal solution based on exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. An extensive simulation performance analysis validates the efficiency of the proposed solution.
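
    The gain-based idea can be caricatured in a few lines: each subcarrier goes to the operator with the best channel gain that still has quota. This is only a sketch under simplifying assumptions (equal quotas, known gains) and omits the paper's sharing rules and traffic demands.

        # Hedged sketch of gain-based subcarrier sharing; gains and quotas invented.
        import numpy as np

        def gain_based_sharing(gains, quota):
            """gains: (operators x subcarriers) channel gains; quota: max subcarriers
            per operator. Greedily assign each subcarrier to the best operator with
            remaining quota, visiting subcarriers in order of their best gain."""
            n_ops, n_sub = gains.shape
            remaining = [quota] * n_ops
            assignment = [-1] * n_sub
            order = np.argsort(-gains.max(axis=0))      # strongest subcarriers first
            for s in order:
                for op in np.argsort(-gains[:, s]):     # operators by gain on s
                    if remaining[op] > 0:
                        assignment[s] = int(op)
                        remaining[op] -= 1
                        break
            return assignment

        rng = np.random.default_rng(1)
        gains = rng.rayleigh(size=(3, 12))              # 3 operators, 12 subcarriers
        print(gain_based_sharing(gains, quota=4))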

  18. Protocol-transparent resource sharing in hierarchically scheduled real-time systems

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Bril, R.J.; Lukkien, J.J.

    2010-01-01

    Hierarchical scheduling frameworks (HSFs) provide means for composing complex real-time systems from well-defined, independently analyzed subsystems. To support resource sharing within two-level HSFs, three synchronization protocols based on the stack resource policy (SRP) have recently been

  19. GIGGLE: a search engine for large-scale integrated genome analysis

    Science.gov (United States)

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061

  20. 'Y' a distributed resource sharing system in nuclear research environment

    International Nuclear Information System (INIS)

    Popescu-Zeletin, R.

    1986-01-01

    The paper outlines the rationale for the transition from HMINET-2 to a distributed resource sharing system at the Hahn-Meitner-Institute for Nuclear Research, together with the architecture of the planned new distributed resource system (Y) at HMI. The introduction of a distributed operating system is a prerequisite for a resource-sharing system. Y will provide not only the integration of networks of different qualities (a high-speed backbone, LANs of different technologies, and ports to the national X.25 network and satellite) at the hardware level, but also an integrated global user view of the whole system. This will be designed and implemented by decoupling the user view from the hardware topology through a network-wide distributed operating system. (Auth.)

  1. Hydrometeorological variability on a large French catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, or provide future scenarios of water resources. With the aim of better understanding hydrological changes, it is of crucial importance to determine how, and to what extent, the trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and of the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
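
    A schematic of the wavelet multiresolution step, assuming the PyWavelets package and synthetic series in place of the Seine and SLP data: decompose predictand and predictor into per-scale components and inspect the correlation at each scale.

        # Hedged sketch; the series here are random stand-ins, not the study's data.
        import numpy as np
        import pywt

        rng = np.random.default_rng(2)
        n_months = 512
        streamflow = np.cumsum(rng.normal(size=n_months))   # stand-in predictand
        slp_index = np.cumsum(rng.normal(size=n_months))    # stand-in predictor

        def mra_components(signal, wavelet="db4", level=4):
            """Reconstruct one time series per scale from a discrete wavelet transform."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            comps = []
            for i in range(len(coeffs)):
                kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                comps.append(pywt.waverec(kept, wavelet)[:len(signal)])
            return comps    # [approximation, coarsest detail, ..., finest detail]

        for i, (q, p) in enumerate(zip(mra_components(streamflow),
                                       mra_components(slp_index))):
            print(f"scale component {i}: r = {np.corrcoef(q, p)[0, 1]:+.2f}")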

  2. The Alberta dilemma: optimal sharing of a water resource by an agricultural and an oil sector

    NARCIS (Netherlands)

    Gaudet, G.; Moreaux, M.; Withagen, C.A.A.M.

    2006-01-01

    We fully characterize the optimal time paths of production and water usage by an agricultural and an oil sector that share a limited water resource. We show that for any given water stock, if the oil stock is sufficiently large, it will become optimal to have a phase during which the agricultural

  3. Fair Access to and Benefit Sharing of Genetic Resources : National ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Fair Access to and Benefit Sharing of Genetic Resources : National Policy Development (China, Jordan, Nepal, Peru). Local practices pertaining to biodiversity conservation, crop improvement and natural resource management are under stress. Existing laws and mechanisms - such as intellectual property rights (IPRs) ...

  4. Information partnerships--shared data, shared scale.

    Science.gov (United States)

    Konsynski, B R; McFarlan, F W

    1990-01-01

    How can one company gain access to another's resources or customers without merging ownership or management, or plotting a takeover? The answer is found in new information partnerships, enabling diverse companies to develop strategic coalitions through the sharing of data. The key to cooperation is a quantum improvement in the hardware and software supporting relational databases: new computer speeds, cheaper mass-storage devices, the proliferation of fiber-optic networks, and networking architectures. Information partnerships mean that companies can distribute the technological and financial exposure that comes with huge investments. For the customer's part, partnerships inevitably lead to greater simplification on the desktop and more common standards around which vendors have to compete. The most common types of partnership are: joint marketing partnerships, such as American Airlines' award of frequent flyer miles to customers who use Citibank's credit card; intraindustry partnerships, such as the insurance value-added network service (which links insurance and casualty companies to independent agents); customer-supplier partnerships, such as Baxter Healthcare's electronic channel to hospitals for medical and other equipment; and IT vendor-driven partnerships, exemplified by ESAB (a European welding supplies and equipment company), whose expansion strategy was premised on a technology platform offered by an IT vendor. Partnerships that succeed have shared vision at the top, reciprocal skills in information technology, concrete plans for an early success, persistence in the development of usable information for all partners, coordination on business policy, and a new and imaginative business architecture.

  5. Assessment of renewable energy resources potential for large scale and standalone applications in Ethiopia

    NARCIS (Netherlands)

    Tucho, Gudina Terefe; Weesie, Peter D.M.; Nonhebel, Sanderine

    2014-01-01

    This study aims to determine the contribution of renewable energy to large scale and standalone application in Ethiopia. The assessment starts by determining the present energy system and the available potentials. Subsequently, the contribution of the available potentials for large scale and

  6. Sharing Economy vs Sharing Cultures? Designing for social, economic and environmental good

    Directory of Open Access Journals (Sweden)

    Ann Light

    2015-05-01

    Full Text Available This paper explores the story behind a crowdfunding service as an example of sharing technology. Research in a small neighborhood of London showed how locally-developed initiatives can differ in tone, scale, ambition and practice to those getting attention in the so-called sharing economy. In local accounts, we see an emphasis on organizing together to create shared spaces for collaborative use of resources and joint ownership of projects and places. Whereas, many global business models feature significant elements of renting, leasing and hiring and focus only on resource management, sometimes at the expense of community growth. The service we discuss is based in the area we studied and has a collective model of sharing, but hopes to be part of the new global movement. We use this hybridity to problematize issues of culture, place and scalability in developing sharing resources and addressing sustainability concerns. We relate this to the motivation, rhetoric and design choices of other local sharing enterprises and other global sharing economy initiatives, arguing, in conclusion, that there is no sharing economy, but a variety of new cultures being fostered.

  7. Working memory resources are shared across sensory modalities.

    Science.gov (United States)

    Salmela, V R; Moisala, M; Alho, K

    2014-10-01

    A common assumption in the working memory literature is that the visual and auditory modalities have separate and independent memory stores. Recent evidence on visual working memory has suggested that resources are shared between representations, and that the precision of representations sets the limit for memory performance. We tested whether memory resources are also shared across sensory modalities. Memory precision for two visual (spatial frequency and orientation) and two auditory (pitch and tone duration) features was measured separately for each feature and for all possible feature combinations. Thus, only the memory load was varied, from one to four features, while keeping the stimuli similar. In Experiment 1, two gratings and two tones, each containing two varying features, were presented simultaneously. In Experiment 2, two gratings and two tones, each containing only one varying feature, were presented sequentially. The memory precision (delayed discrimination threshold) for a single feature was close to the perceptual threshold. However, as the number of features to be remembered was increased, the discrimination thresholds increased more than twofold. Importantly, the decrease in memory precision did not depend on the modality of the other feature(s), or on whether the features were in the same or in separate objects. Hence, simultaneously storing one visual and one auditory feature had an effect on memory precision equal to that of simultaneously storing two visual or two auditory features. The results show that working memory is limited by the precision of the stored representations, and that working memory can be described as a resource pool that is shared across modalities.

  8. A Resource Sharing Mechanism for Sustainable Production in the Garment Industry

    Directory of Open Access Journals (Sweden)

    Ke Ma

    2017-12-01

    Full Text Available With the development of mass customization, the traditional garment production model needs to be optimized towards a more sustainable structure. To meet demands for flexibility, low cost, and high efficiency, an innovative resource sharing mechanism is proposed in this paper to form a new, sustainable type of garment production. Unlike the individual production of traditional models, the new mechanism involves resources being shared among various manufacturers. The trade-off between the positive and negative effects of the proposed mechanism is a key issue for sustainable production. In the present study, an overall sustainability index, integrating four production performance indicators, was defined on the basis of an Analytical Network Process to assess various production scenarios. According to the discrete-event simulation results for the different scenarios, we found that garment manufacturers could obtain comprehensive improvements in sustainable production by implementing the proposed resource sharing mechanism under the threshold of an increasing production failure rate.
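
    The kind of scenario comparison described above can be prototyped with a discrete-event simulation; the toy below, assuming the SimPy package and invented arrival and service rates, models orders competing for a pooled (shared) machine capacity.

        # Hedged sketch: a generic pooled-resource queue, not the paper's model.
        import random
        import simpy

        random.seed(0)

        def order(env, machines, mean_service, log):
            arrive = env.now
            with machines.request() as req:                 # queue for a machine
                yield req
                yield env.timeout(random.expovariate(1.0 / mean_service))
                log.append(env.now - arrive)                # waiting + service time

        def run(capacity, n_orders=2000, mean_interarrival=1.0, mean_service=1.8):
            env = simpy.Environment()
            machines = simpy.Resource(env, capacity=capacity)
            log = []
            def source():
                for _ in range(n_orders):
                    env.process(order(env, machines, mean_service, log))
                    yield env.timeout(random.expovariate(1.0 / mean_interarrival))
            env.process(source())
            env.run()
            return sum(log) / len(log)

        print("pooled (shared) mean flow time:", round(run(capacity=2), 2))

    Comparing this against two separate single-machine shops, each fed half the arrivals, would reproduce the classic pooling gain that motivates resource sharing.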

  9. A METHOD OF AND A SYSTEM FOR CONTROLLING ACCESS TO A SHARED RESOURCE

    DEFF Research Database (Denmark)

    2006-01-01

    A method and a system of controlling access of data items to a shared resource, wherein the data items each is assigned to one of a plurality of priorities, and wherein, when a predetermined number of data items of a priority have been transmitted to the shared resource, that priority...
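
    The record above is truncated, so the exact mechanism is unknown; the sketch below shows one plausible reading, in which each priority level may transmit a predetermined quota of data items per round before lower priorities are served.

        # Hedged sketch of one possible quota-per-priority rule; all details invented.
        from collections import deque

        def schedule(queues, quotas):
            """queues: dict priority -> deque of items (0 = highest priority).
            quotas: dict priority -> max items per round. Yields items in the
            order they would reach the shared resource."""
            while any(queues.values()):
                for prio in sorted(queues):
                    sent = 0
                    while queues[prio] and sent < quotas[prio]:
                        yield queues[prio].popleft()
                        sent += 1

        queues = {0: deque("AAAAA"), 1: deque("BBBBB"), 2: deque("CC")}
        print("".join(schedule(queues, quotas={0: 3, 1: 2, 2: 1})))  # AAABBCAABBCB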

  10. Dynamic Control of FACTS Devices to Enable Large Scale Penetration of Renewable Energy Resources

    Science.gov (United States)

    Chavan, Govind Sahadeo

    This thesis focuses on some of the problems caused by large-scale penetration of renewable energy resources within EHV transmission networks, and investigates approaches to resolving these problems. In chapter 4, a reduced-order model of the 500 kV WECC transmission system is developed by estimating its key parameters from phasor measurement unit (PMU) data. The model was then implemented in RTDS and investigated for its accuracy with respect to the PMU data. Finally, it was used to observe the effects of various contingencies, such as transmission line loss, generation loss and large-scale penetration of wind farms, on EHV transmission systems. Chapter 5 introduces Static Synchronous Series Compensators (SSSCs), series-connected converters that can control real power flow along a transmission line. A new application of SSSCs in mitigating the Ferranti effect on unloaded transmission lines is demonstrated in PSCAD. A new control scheme for SSSCs based on the Cascaded H-bridge (CHB) converter configuration is proposed and demonstrated using PSCAD and RTDS. A new centralized controller is developed for distributed SSSCs based on some of the concepts used in the CHB-based SSSC, and its efficacy is demonstrated using RTDS. Finally, chapter 6 introduces the problem of power oscillations induced by renewable sources in a transmission network. A power oscillation damping (POD) controller is designed using distributed SSSCs in NYPA's 345 kV three-bus AC system and its efficacy is demonstrated in PSCAD. A similar POD controller is then designed for the CHB-based SSSC in the IEEE 14-bus system in PSCAD. Both controllers were noted to have significantly damped power oscillations in the transmission networks.

  11. Distributed Sharing of Functionalities and Resources in Survivable GMPLS-controlled WSONs

    DEFF Research Database (Denmark)

    Fagertun, Anna Manolova; Cerutti, I.; Muñoz, R.

    2012-01-01

    Sharing of functionalities and sharing of network resources are effective solutions for improving the cost-effectiveness of wavelength-switched optical networks (WSONs). Such cost-effectiveness should be pursued together with the objective of ensuring the requested level of performance at the physical layer.

  12. ability in Large Scale Land Acquisitions in Kenya

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    Kenya's national planning strategy, Vision 2030. Agriculture, natural resource exploitation, and infrastruc- ... sitions due to high levels of poverty and unclear or insecure land tenure rights in Kenya. Inadequate social ... lease to a private company over the expansive Yala Swamp to undertake large-scale irrigation farming.

  13. A 2-layer and P2P-based architecture on resource location in future grid environment

    International Nuclear Information System (INIS)

    Pei Erming; Sun Gongxin; Zhang Weiyi; Pang Yangguang; Gu Ming; Ma Nan

    2004-01-01

    Grid and Peer-to-Peer (P2P) computing are two distributed resource sharing environments that have been developing rapidly in recent years. The final objective of Grid, as well as of P2P technology, is to pool large sets of resources effectively to be used in a more convenient, fast and transparent way. We can speculate that, though many differences exist, Grid and P2P environments will converge into a large-scale resource sharing environment that combines the characteristics of the two: large diversity, high heterogeneity (of resources), dynamism, and lack of central control. Resource discovery in this future Grid environment is a basic but important problem. In this article, we propose a two-layer and P2P-based architecture for resource discovery and design a detailed algorithm for resource request propagation in the computing environment discussed above. (authors)
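
    The abstract does not give the algorithm's details, so the sketch below illustrates only the generic building block of P2P resource discovery: a TTL-limited flood of a resource request over an overlay network.

        # Hedged sketch of TTL-limited request flooding; overlay and names invented.
        def propagate(overlay, start, resource, ttl):
            """overlay: dict node -> {'neighbours': [...], 'resources': set(...)}.
            Returns the nodes holding the resource reached within ttl hops."""
            found, visited, frontier = set(), {start}, [(start, ttl)]
            while frontier:
                node, hops = frontier.pop()
                if resource in overlay[node]["resources"]:
                    found.add(node)
                if hops == 0:
                    continue
                for nb in overlay[node]["neighbours"]:
                    if nb not in visited:
                        visited.add(nb)
                        frontier.append((nb, hops - 1))
            return found

        overlay = {
            "A": {"neighbours": ["B", "C"], "resources": set()},
            "B": {"neighbours": ["A", "D"], "resources": {"cpu"}},
            "C": {"neighbours": ["A"], "resources": set()},
            "D": {"neighbours": ["B"], "resources": {"cpu", "storage"}},
        }
        print(propagate(overlay, "A", "cpu", ttl=2))    # finds B and D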

  14. Quantitative analysis on the environmental impact of large-scale water transfer project on water resource area in a changing environment

    Directory of Open Access Journals (Sweden)

    D. H. Yan

    2012-08-01

    Full Text Available The interbasin long-distance water transfer project is a key means of supporting the reasonable allocation of water resources over a large-scale area, optimizing the spatio-temporal distribution of water resources to secure the amount of water available. Large-scale water transfer projects have a deep influence on ecosystems; moreover, global climate change adds uncertainty to, and compounds, the environmental impact of water transfer projects. Therefore, how to assess the ecological and environmental impact of such megaprojects in both the construction and operation phases has attracted much attention. The water-output area of the western route of China's South-North Water Transfer Project was taken as the study area of the present article. According to relevant evaluation principles, and on the basis of background analysis, we identified the influencing factors and established a diagnostic index system. A coupled climate-hydrology-ecology simulation model was used to simulate and predict the ecological and environmental responses of the water resource area in a changing environment. The emphasis of the impact evaluation was placed on reservoir construction and operation scheduling, representative river corridors and wetlands, natural reserves, and the water environment below the dam sites. Finally, an overall evaluation of the comprehensive influence of the project was conducted. The research results were as follows: the environmental impacts of the western route project on the water resource area were concentrated in two aspects: the permanent destruction of vegetation during the phase of dam construction and river impoundment, and the significant influence on the hydrological regime of natural river corridors after the start of water extraction. The impact on local climate, vegetation ecology, typical wetlands, natural reserves and the water environment of river basins below the dam sites was small.

  15. Quantitative analysis on the environmental impact of large-scale water transfer project on water resource area in a changing environment

    Science.gov (United States)

    Yan, D. H.; Wang, H.; Li, H. H.; Wang, G.; Qin, T. L.; Wang, D. Y.; Wang, L. H.

    2012-08-01

    The interbasin long-distance water transfer project is a key means of supporting the reasonable allocation of water resources over a large-scale area, optimizing the spatio-temporal distribution of water resources to secure the amount of water available. Large-scale water transfer projects have a deep influence on ecosystems; moreover, global climate change adds uncertainty to, and compounds, the environmental impact of water transfer projects. Therefore, how to assess the ecological and environmental impact of such megaprojects in both the construction and operation phases has attracted much attention. The water-output area of the western route of China's South-North Water Transfer Project was taken as the study area of the present article. According to relevant evaluation principles, and on the basis of background analysis, we identified the influencing factors and established a diagnostic index system. A coupled climate-hydrology-ecology simulation model was used to simulate and predict the ecological and environmental responses of the water resource area in a changing environment. The emphasis of the impact evaluation was placed on reservoir construction and operation scheduling, representative river corridors and wetlands, natural reserves, and the water environment below the dam sites. Finally, an overall evaluation of the comprehensive influence of the project was conducted. The research results were as follows: the environmental impacts of the western route project on the water resource area were concentrated in two aspects: the permanent destruction of vegetation during the phase of dam construction and river impoundment, and the significant influence on the hydrological regime of natural river corridors after the start of water extraction. The impact on local climate, vegetation ecology, typical wetlands, natural reserves and the water environment of river basins below the dam sites was small.

  16. Living in a network of scaling cities and finite resources.

    Science.gov (United States)

    Qubbaj, Murad R; Shutters, Shade T; Muneepeerakul, Rachata

    2015-02-01

    Many urban phenomena exhibit remarkable regularity in the form of nonlinear scaling behaviors, but their implications on a system of networked cities has never been investigated. Such knowledge is crucial for our ability to harness the complexity of urban processes to further sustainability science. In this paper, we develop a dynamical modeling framework that embeds population-resource dynamics-a generalized Lotka-Volterra system with modifications to incorporate the urban scaling behaviors-in complex networks in which cities may be linked to the resources of other cities and people may migrate in pursuit of higher welfare. We find that isolated cities (i.e., no migration) are susceptible to collapse if they do not have access to adequate resources. Links to other cities may help cities that would otherwise collapse due to insufficient resources. The effects of inter-city links, however, can vary due to the interplay between the nonlinear scaling behaviors and network structure. The long-term population level of a city is, in many settings, largely a function of the city's access to resources over which the city has little or no competition. Nonetheless, careful investigation of dynamics is required to gain mechanistic understanding of a particular city-resource network because cities and resources may collapse and the scaling behaviors may influence the effects of inter-city links, thereby distorting what topological metrics really measure.
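
    A toy, single-city version of such population-resource dynamics, a Lotka-Volterra-style system with a superlinear scaling exponent and all parameter values invented, can be integrated with SciPy to see growth, overshoot or collapse depending on resource access:

        # Hedged sketch; functional forms and parameters are illustrative only.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta = 1.15                  # superlinear urban scaling exponent (invented)
        r, c, g, K = 0.03, 0.02, 0.05, 100.0

        def rhs(t, y):
            n, s = max(y[0], 0.0), max(y[1], 0.0)     # keep states non-negative
            dn = r * n ** beta * s / K - c * n        # growth fed by resource access
            extraction = 0.01 * n ** beta if s > 0 else 0.0
            ds = g * s * (1.0 - s / K) - extraction   # regrowth minus extraction
            return [dn, ds]

        sol = solve_ivp(rhs, (0.0, 400.0), [1.0, 80.0], max_step=1.0)
        print("final population and resource stock:",
              round(float(sol.y[0, -1]), 2), round(float(sol.y[1, -1]), 2))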

  17. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    The present trend of investing in renewable means of producing electricity, to the detriment of conventional fossil fuel-based plants, will lead to a point where renewable plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest growth among all renewable energies and has reached high penetration levels, creating instabilities which at the moment are corrected by conventional generation. This paradigm will change in future scenarios where most of the power is supplied by large-scale renewable plants and part of the ancillary services has to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of providing active power reserves in large-scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should
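
    The storage-free reserve idea reduces, in its simplest form, to operating the plant below the estimated available power so the margin can be released for frequency support; a back-of-the-envelope sketch with invented numbers:

        # Hedged sketch of a curtailment-based reserve, not the paper's controller.
        def dispatch_with_reserve(p_available_mw, reserve_fraction=0.10):
            """Curtailed operating point and reserve margin of a PV plant."""
            p_set = (1.0 - reserve_fraction) * p_available_mw
            return p_set, p_available_mw - p_set

        for p_avail in [40.0, 55.0, 60.0]:        # varying irradiance over a day
            p_set, reserve = dispatch_with_reserve(p_avail)
            print(f"available {p_avail:5.1f} MW -> "
                  f"setpoint {p_set:5.1f} MW, reserve {reserve:4.1f} MW")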

  18. Resolving and Prevention of Shared Water Resources Conflicts ...

    African Journals Online (AJOL)

    Learning from experiences in other parts of the world, it was recommended to incorporate game theory techniques into the study of water resources conflicts and cooperation in African river basins, for equitable and fair utilization and management of shared water. Journal of Civil Engineering Research and Practice Vol.1(1) 2004: 51- ...

  19. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers with underdense targets, promises the production of high-quality electron beams over short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10^6 cores and sustained performance of over ∼2 PFlops are demonstrated, opening the way for large-scale modelling of LWFA scenarios. (paper)
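
    As a toy illustration of the vectorization theme (OSIRIS itself is a Fortran/SIMD production code, not NumPy), a leapfrog particle push can be written as whole-array operations instead of a per-particle loop:

        # Hedged sketch: a 1D toy push in a prescribed field, not a real PIC code.
        import numpy as np

        rng = np.random.default_rng(6)
        n, dt, qm = 100_000, 0.01, -1.0
        x = rng.uniform(0.0, 2 * np.pi, size=n)     # particle positions
        v = rng.normal(0.0, 0.1, size=n)            # particle velocities

        def efield(x):
            return 0.05 * np.sin(x)                 # prescribed wakefield-like field

        for _ in range(100):                        # leapfrog: kick, then drift,
            v += qm * efield(x) * dt                # applied to whole arrays at once
            x = (x + v * dt) % (2 * np.pi)          # periodic box

        print("mean kinetic energy:", 0.5 * float(np.mean(v ** 2)))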

  20. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
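
    A toy weighted-overlay score in the spirit of the multi-criteria tool described above (not NREL's actual algorithm): normalize each criterion layer and combine them with user-defined weights; higher scores suggest better sites.

        # Hedged sketch; layers, weights and grid size are invented.
        import numpy as np

        rng = np.random.default_rng(3)
        solar = rng.uniform(4, 8, size=(5, 5))      # kWh/m2/day resource layer
        slope = rng.uniform(0, 20, size=(5, 5))     # percent slope (lower is better)
        dist_tx = rng.uniform(0, 50, size=(5, 5))   # km to transmission (lower is better)

        def normalize(layer, higher_is_better=True):
            z = (layer - layer.min()) / (layer.max() - layer.min())
            return z if higher_is_better else 1.0 - z

        weights = {"solar": 0.5, "slope": 0.2, "tx": 0.3}   # user-defined criteria
        score = (weights["solar"] * normalize(solar)
                 + weights["slope"] * normalize(slope, higher_is_better=False)
                 + weights["tx"] * normalize(dist_tx, higher_is_better=False))
        best = np.unravel_index(np.argmax(score), score.shape)
        print("best cell:", best, "score:", round(float(score[best]), 3))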

  1. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is manifested not only by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first-mile candidate to accommodate the data tsunami to be generated by the IoT. However, in the cellular paradigm, IoT devices are required to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
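
    A small calculation shows why contention-based random access bottlenecks at IoT scale: with N devices contending in a slot and M preambles, a device succeeds only if no other device picks its preamble. M = 54 below is roughly the number of contention preambles in LTE; the device counts are illustrative.

        # Hedged sketch of slotted-contention collision math, not the article's model.
        def success_probability(n_devices, n_preambles):
            # P(no other device picks my preamble) in one contention slot.
            return (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

        for n in [10, 100, 1000, 10000]:
            print(f"N={n:6d}, M=54 -> P(success) = {success_probability(n, 54):.4f}")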

  2. Scalable multi-objective control for large scale water resources systems under uncertainty

    Science.gov (United States)

    Giuliani, Matteo; Quinn, Julianne; Herman, Jonathan; Castelletti, Andrea; Reed, Patrick

    2016-04-01

    The use of mathematical models to support the optimal management of environmental systems has been rapidly expanding in recent years due to advances in scientific knowledge of natural processes, the efficiency of optimization techniques, and the availability of computational resources. However, ongoing changes in climate and society introduce additional challenges for controlling these systems, ultimately motivating the emergence of complex models to explore key causal relationships and dependencies on uncontrolled sources of variability. In this work, we contribute a novel implementation of the evolutionary multi-objective direct policy search (EMODPS) method for controlling environmental systems under uncertainty. The proposed approach combines direct policy search (DPS) with hierarchical parallelization of multi-objective evolutionary algorithms (MOEAs) and offers a threefold advantage: the DPS simulation-based optimization can be combined with any simulation model and does not add any constraint on modeled information, allowing the use of exogenous information in conditioning the decisions. Moreover, the combination of DPS and MOEAs prompts the generation of a Pareto-approximate set of solutions for up to 10 objectives, thus overcoming the decision biases produced by cognitive myopia, where narrow or restrictive definitions of optimality strongly limit the discovery of decision-relevant alternatives. Finally, the use of large-scale MOEA parallelization improves the ability of the designed solutions to handle the uncertainty due to severe natural variability. The proposed approach is demonstrated on a challenging water resources management problem: the optimal control of a network of four multipurpose water reservoirs in the Red River basin (Vietnam). As part of the medium- to long-term energy and food security national strategy, four large reservoirs have been constructed on the Red River tributaries, which are mainly operated for hydropower
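
    A condensed sketch of the direct policy search ingredient: the control policy is a parameterized function (here, Gaussian radial basis functions) of normalized observations, and an MOEA would search the parameter vector against the objectives. The structure and values below are illustrative, not the Red River study's.

        # Hedged sketch of an RBF-parameterized release policy for DPS.
        import numpy as np

        def rbf_policy(theta, inputs, n_rbf=3):
            """theta packs centers, radii and weights; inputs are normalized
            observations (e.g., storage, day of year, recent inflow).
            Returns a release decision in [0, 1]."""
            n_in = len(inputs)
            c = theta[: n_rbf * n_in].reshape(n_rbf, n_in)                   # centers
            b = theta[n_rbf * n_in: 2 * n_rbf * n_in].reshape(n_rbf, n_in)   # radii
            w = theta[2 * n_rbf * n_in:]                                     # weights
            phi = np.exp(-np.sum(((inputs - c) / (np.abs(b) + 1e-6)) ** 2, axis=1))
            return float(np.clip(w @ phi, 0.0, 1.0))

        rng = np.random.default_rng(4)
        theta = rng.uniform(-1, 1, size=3 * 3 * 2 + 3)   # 3 RBFs, 3 inputs
        print(rbf_policy(theta, inputs=np.array([0.6, 0.2, 0.8])))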

  3. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    Science.gov (United States)

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  4. Development of Resource Sharing System Components for AliEn Grid Infrastructure

    CERN Document Server

    Harutyunyan, Artem

    2010-01-01

    The problem of resource provision, sharing, accounting and use represents a principal issue in contemporary scientific cyberinfrastructures. For example, collaborations in physics, astrophysics, Earth science, biology and medicine need to store huge amounts of data (of the order of several petabytes) as well as to conduct highly intensive computations. The appropriate computing and storage capacities cannot be ensured by one (even very large) research center. The modern approach to the solution of this problem suggests exploitation of the computational and data storage facilities of the centers participating in collaborations. The most advanced implementation of this approach is based on Grid technologies, which enable effective work by the members of collaborations regardless of their geographical location. Currently there are several tens of Grid infrastructures deployed all over the world. The Grid infrastructures of the CERN Large Hadron Collider experiments - ALICE, ATLAS, CMS, and LHCb - which are exploi...

  5. Incentive Mechanism Model Design for Sharing of Information Resources in Rural Areas

    OpenAIRE

    Gao, Xirong; Shan, Lingling

    2013-01-01

    In order to solve issues concerning the cross-unit sharing of information resources in rural areas, we analyze the incentive problem of sharing information resources in rural areas using incentive theory, and establish a corresponding incentive mechanism model (divided into a positive incentive model and a negative incentive model; only when the two models reinforce each other and are used at the same time are they effective). Based on this, we put forward the institu...

  6. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  7. Being Sticker Rich: Numerical Context Influences Children's Sharing Behavior.

    Directory of Open Access Journals (Sweden)

    Tasha Posid

    Full Text Available Young children spontaneously share resources with anonymous recipients, but little is known about the specific circumstances that promote or hinder these prosocial tendencies. Children (ages 3-11) received a small (12) or large (30) number of stickers, and were then given the opportunity to share their windfall with either one or multiple anonymous recipients (Dictator Game). Whether a child chose to share or not varied as a function of age, but was uninfluenced by numerical context. Moreover, children's giving was consistent with a proportion-based account, such that children typically donated a similar proportion (but different absolute number) of the resources given to them, regardless of whether they originally received a small or large windfall. The proportion of resources donated, however, did vary based on the number of recipients with whom they were allowed to share, such that on average, children shared more when there were more recipients available, particularly when they had more resources, suggesting that they take others into consideration when making prosocial decisions. Finally, results indicated that a child's gender also predicted sharing behavior, with males generally sharing more resources than females. Together, these findings suggest that the numerical contexts under which children are asked to share, as well as the quantity of resources that they have to share, may interact to promote (or hinder) altruistic behaviors throughout childhood.

  8. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architecture are proposed in this paper to accelerate the similarity matrix construction procedure and the affinity propagation algorithm. The shared-memory architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
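
    A small-scale analogue of that pipeline, assuming scikit-learn's AffinityPropagation and a local process pool in place of the paper's cluster-scale architecture: the similarity matrix is built in parallel, then clustered with the precomputed affinity.

        # Hedged sketch on synthetic data, not the paper's MPI-style implementation.
        import numpy as np
        from multiprocessing import Pool
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(5)
        X = np.vstack([rng.normal(loc, 0.3, size=(30, 4)) for loc in (0.0, 3.0, 6.0)])

        def row_similarity(i):
            """Negative squared Euclidean distance from sample i to all samples."""
            return -np.sum((X - X[i]) ** 2, axis=1)

        if __name__ == "__main__":
            with Pool(4) as pool:                    # parallel matrix construction
                S = np.array(pool.map(row_similarity, range(len(X))))
            labels = AffinityPropagation(affinity="precomputed",
                                         random_state=0).fit(S).labels_
            print("clusters found:", len(set(labels)))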

  9. TensorFlow: A system for large-scale machine learning

    OpenAIRE

    Abadi, Martín; Barham, Paul; Chen, Jianmin; Chen, Zhifeng; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Irving, Geoffrey; Isard, Michael; Kudlur, Manjunath; Levenberg, Josh; Monga, Rajat; Moore, Sherry; Murray, Derek G.

    2016-01-01

    TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments. TensorFlow uses dataflow graphs to represent computation, shared state, and the operations that mutate that state. It maps the nodes of a dataflow graph across many machines in a cluster, and within a machine across multiple computational devices, including multicore CPUs, general-purpose GPUs, and custom designed ASICs known as Tensor Processing Units (TPUs). This architecture gives flexib...
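
    A minimal example of the dataflow idea: tf.function traces the Python computation into a graph whose nodes TensorFlow can place across devices, with tf.Variable holding the mutable shared state; the numbers are arbitrary.

        # Tiny illustration of graph tracing and mutable state in TensorFlow.
        import tensorflow as tf

        w = tf.Variable([[1.0], [2.0]])      # mutable shared state

        @tf.function                          # traced into a dataflow graph
        def predict(x):
            return tf.matmul(x, w)

        x = tf.constant([[3.0, 4.0]])
        print(predict(x).numpy())             # [[11.]]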

  10. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States as an Internet-centered, standard and open approach to shared network services. With the rapid development of higher education in China, the educational resources that colleges and universities can provide fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared services, has therefore become an important means of sharing digital education resources in current higher education. Based on the Cloud computing environment, this paper analyzes the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. Drawing on the sharing characteristics of Cloud computing - massive storage, efficient operation and low cost - the authors explore and study the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the designed sharing model is put into practical application.

  11. The essential nature of sharing in science.

    Science.gov (United States)

    Fischer, Beth A; Zigmond, Michael J

    2010-12-01

    Advances in science are the combined result of the efforts of a great many scientists, and in many cases, their willingness to share the products of their research. These products include data sets, both small and large, and unique research resources not commercially available, such as cell lines and software programs. The sharing of these resources enhances both the scope and the depth of research, while making more efficient use of time and money. However, sharing is not without costs, many of which are borne by the individual who develops the research resource. Sharing, for example, reduces the uniqueness of the resources available to a scientist, potentially influencing the originator's perceived productivity and ultimately his or her competitiveness for jobs, promotions, and grants. Nevertheless, for most researchers-particularly those using public funds-sharing is no longer optional but must be considered an obligation to science, the funding agency, and ultimately society at large. Most funding agencies, journals, and professional societies now require a researcher who has published work involving a unique resource to make that resource available to other investigators. Changes could be implemented to mitigate some of the costs. The creator of the resource could explore the possibility of collaborating with those who request it. In addition, institutions that employ and fund researchers could change their policies and practices to make sharing a more attractive and viable option. For example, when evaluating an individual's productivity, institutions could provide credit for the impact a researcher has had on their field through the provision of their unique resources to other investigators, regardless of whether that impact is reflected in the researcher's list of publications. In addition, increased funding for the development and maintenance of user-friendly public repositories for data and research resources would also help to reduce barriers to sharing

  12. An investigation into the practices of resource sharing among ...

    African Journals Online (AJOL)

    The study investigated the practice of resource sharing among Academic Libraries in Federal Universities in the South-South Geo-Political zone of Nigeria. The survey research design was employed for the study. The population for the study consists of the federal universities in the zone, except the Federal University of ...

  13. Knowledge Sharing Strategies for Large Complex Building Projects.

    Directory of Open Access Journals (Sweden)

    Esra Bektas

    2013-06-01

    The construction industry is a project-based sector with a myriad of actors such as architects, construction companies, consultants, producers of building materials (Anumba et al., 2005). The interaction between the project partners is often quite limited, which leads to insufficient knowledge sharing during the project and knowledge being unavailable for reuse (Fruchter et al., 2002). The result can be a considerable amount of extra work, delays and cost overruns. Design outcomes that are supposed to function as boundary objects across different disciplines can lead to misinterpretation of requirements, project content and objectives. In this research, knowledge is seen as resulting from social interactions; knowledge resides in communities and it is generated through social relationships (Wenger 1998; Olsson et al. 2008). Knowledge is often tacit, intangible and context-dependent and it is articulated in the changing responsibilities, roles, attitudes and values that are present in the work environment (Bresnen et al., 2003). In a project environment, knowledge enables individuals to solve problems, take decisions, and apply these decisions to actions. In order to achieve a shared understanding and minimize the misunderstanding and misinterpretations among project actors, it is necessary to share knowledge (Fong 2003). Sharing knowledge is particularly crucial in large complex building projects (LCBPs) in order to accelerate the building process, improve architectural quality and prevent mistakes or undesirable results. However, knowledge sharing is often hampered through professional or organizational boundaries or contractual concerns. When knowledge is seen as an organizational asset, there is little willingness among project organizations to share their knowledge. Individual people may recognize the need to promote knowledge sharing throughout the project, but typically there is no deliberate strategy agreed by all project partners to address

  14. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    Science.gov (United States)

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708
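
    The defuzzification step mentioned above can be illustrated with a small sketch. The triangular fuzzy form and the particular expected-value formula below are assumptions for illustration (with index 0.5 the formula reduces to the usual credibilistic expected value); they are not copied from the paper:

    ```python
    def expected_value(a, b, c, lam=0.5):
        """Optimistic-pessimistic expected value of a triangular fuzzy number
        (a, b, c); lam weights the upper part of the support."""
        return (1.0 - lam) * (a + b) / 2.0 + lam * (b + c) / 2.0

    # e.g. an activity duration of "about 10 days, between 8 and 14":
    for lam in (0.0, 0.5, 1.0):
        print(lam, expected_value(8.0, 10.0, 14.0, lam))  # 9.0, 10.5, 12.0
    ```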

  16. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  17. Assessment of climate change impacts on rainfall using large scale ...

    Indian Academy of Sciences (India)

    Many of the applied techniques in water resources management can be directly or indirectly influenced by ... is based on large scale climate signals data around the world. In order ... predictand relationships are often very complex ... constraints to solve the optimization problem ... social, and environmental sustainability.

  18. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    International Nuclear Information System (INIS)

    Schroeder, William J.

    2011-01-01

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally-intensive problem

  20. Cost-effectiveness Assessment of 5G Systems with Cooperative Radio Resource Sharing

    Directory of Open Access Journals (Sweden)

    V. Nikolikj

    2015-11-01

    By use of techno-economic analysis of heterogeneous hierarchical cell structures and spectral efficiencies of the forthcoming advanced radio access technologies, this paper proposes various cost-efficient capacity enlargement strategies evaluated through the level of the production cost per transferred data unit and achievable profit margins. For the purpose of maximizing the aggregate performance (capacity or profit), we also assess the cooperative manners of radio resource sharing between mobile network operators, especially in the cases of capacity over-provisioning, when we also determine the principles to provide guaranteed data rates to a particular number of users. The results show that, for heavily loaded office environments, the future 5G pico base stations could be a preferable deployment solution. Also, we confirm that the radio resource management method with dynamic resource allocation can significantly improve the capacity of two comparably loaded operators which share the resources and aim to increase their cost effectiveness.
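
    As a back-of-the-envelope illustration of "production cost per transferred data unit" (the formula and all numbers below are invented for illustration, not the paper's cost model):

    ```python
    def cost_per_gb(capex, lifetime_years, opex_per_year, gb_per_year):
        """Straight-line annualized CAPEX plus OPEX, divided by carried traffic."""
        return (capex / lifetime_years + opex_per_year) / gb_per_year

    cost = cost_per_gb(capex=120_000, lifetime_years=8,
                       opex_per_year=15_000, gb_per_year=2_000_000)
    price = 0.05                         # assumed retail price per GB
    margin = (price - cost) / price
    print(f"cost/GB = {cost:.4f}, profit margin = {margin:.1%}")
    ```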

  1. Modern ICT Tools: Online Electronic Resources Sharing Using Web ...

    African Journals Online (AJOL)

    Modern ICT Tools: Online Electronic Resources Sharing Using Web 2.0 and Its Implications For Library And Information Practice In Nigeria. ... The PDF file you selected should load here if your Web browser has a PDF reader plug-in installed (for example, a recent version of Adobe Acrobat Reader). If you would like more ...

  2. Large Scale Computing and Storage Requirements for High Energy Physics

    International Nuclear Information System (INIS)

    Gerber, Richard A.; Wasserman, Harvey

    2010-01-01

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes

  3. New generation pharmacogenomic tools: a SNP linkage disequilibrium Map, validated SNP assay resource, and high-throughput instrumentation system for large-scale genetic studies.

    Science.gov (United States)

    De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A

    2002-06-01

    Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources, and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating these assays, and obtaining population frequency of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry--a new generation of the TaqMan probe-based, 5' nuclease assay--and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.

  4. Irrigania – a web-based game about sharing water resources

    Directory of Open Access Journals (Sweden)

    J. Seibert

    2012-08-01

    For teaching about collaboration and conflicts with regard to shared water resources, various types of games offer valuable opportunities. Single-player computer games often give much power to the player and ignore the fact that the best for some group might be difficult to achieve in reality if the individuals have their own interests. Here we present a new game called Irrigania, which aims at representing water conflicts among several actors in a simplified way. While simple in its rules, this game illustrates several game-theoretical situations typical for water-related conflicts. The game has been implemented as a web-based computer game, which allows easy application in classes. First classroom applications of the game indicated that, despite the simple rules, interesting patterns can evolve when playing the game in a class. These patterns can be used to discuss game theoretical considerations related to water resource sharing.

  5. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent
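
    The kind of pre-modelling screening described above can be sketched in a few lines; the basin data and the two consistency checks (runoff coefficient above one, losses exceeding the potential-evaporation limit) are illustrative, not the paper's datasets:

    ```python
    basins = [
        # name, precipitation P, discharge Q, potential evaporation PET (mm/yr)
        ("A", 800.0, 900.0, 600.0),  # Q > P: e.g. snow undercatch in P
        ("B", 700.0, 100.0, 450.0),  # P - Q > PET: water balance cannot close
        ("C", 900.0, 400.0, 600.0),  # consistent
    ]

    for name, p, q, pet in basins:
        if q / p > 1.0:
            print(f"basin {name}: runoff coefficient {q / p:.2f} > 1")
        elif p - q > pet:
            print(f"basin {name}: losses {p - q:.0f} mm exceed PET {pet:.0f} mm")
        else:
            print(f"basin {name}: consistent")
    ```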

  6. Supporting Shared Resource Usage for a Diverse User Community: the OSG Experience and Lessons Learned

    International Nuclear Information System (INIS)

    Garzoglio, Gabriele; Levshina, Tanya; Sehgal, Chander; Slyz, Marko; Rynge, Mats

    2012-01-01

    The Open Science Grid (OSG) supports a diverse community of new and existing users in adopting and making effective use of the Distributed High Throughput Computing (DHTC) model. The LHC user community has deep local support within the experiments. For other smaller communities and individual users the OSG provides consulting and technical services through the User Support area. We describe these sometimes successful and sometimes not so successful experiences and analyze lessons learned that are helping us improve our services. The services offered include forums to enable shared learning and mutual support, tutorials and documentation for new technology, and troubleshooting of problematic or systemic failure modes. For new communities and users, we bootstrap their use of the distributed high throughput computing technologies and resources available on the OSG by following a phased approach. We first adapt the application and run a small production campaign on a subset of “friendly” sites. Only then do we move the user to run full production campaigns across the many remote sites on the OSG, adding to the community resources up to hundreds of thousands of CPU hours per day. This scaling up generates new challenges – like no determinism in the time to job completion, and diverse errors due to the heterogeneity of the configurations and environments – so some attention is needed to get good results. We cover recent experiences with image simulation for the Large Synoptic Survey Telescope (LSST), small-file large volume data movement for the Dark Energy Survey (DES), civil engineering simulation with the Network for Earthquake Engineering Simulation (NEES), and accelerator modeling with the Electron Ion Collider group at BNL. We will categorize and analyze the use cases and describe how our processes are evolving based on lessons learned.

  7. Extending a HSF-enabled open-source real-time operating system with resource sharing

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Bril, R.J.; Lukkien, J.J.; Behnam, M.; Petters, S.M.; Zijlstra, P.

    2010-01-01

    Hierarchical scheduling frameworks (HSFs) provide means for composing complex real-time systems from well-defined, independently analyzed subsystems. To support resource sharing within two-level, fixed priority scheduled HSFs, two synchronization protocols based on the stack resource policy (SRP)
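
    For orientation, the core SRP rule can be sketched as follows; the task set is invented and, by convention here, a numerically smaller value means a higher preemption level:

    ```python
    uses = {            # resource -> preemption levels of the tasks that lock it
        "R1": [1, 3],
        "R2": [2, 3],
    }
    # Each resource's ceiling is the highest (numerically smallest) level among
    # its users; a job may preempt only if its level is higher than the current
    # system ceiling, which bounds blocking to one outermost critical section.
    ceiling = {r: min(levels) for r, levels in uses.items()}

    def may_preempt(level, locked):
        system_ceiling = min((ceiling[r] for r in locked), default=float("inf"))
        return level < system_ceiling

    print(may_preempt(1, ["R2"]))  # True: level 1 beats system ceiling 2
    print(may_preempt(2, ["R1"]))  # False: blocked by ceiling 1
    ```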

  8. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction; whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.
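
    For reference, the "source Froude number" regime mentioned above is often expressed through the standard dimensionless heat release rate (a textbook definition, not taken from this review):

    ```latex
    \[
      Q^{*} = \frac{\dot{Q}}{\rho_{\infty}\, c_{p}\, T_{\infty}\, \sqrt{g\,D}\; D^{2}}
    \]
    % \dot{Q}: heat release rate; D: pool diameter; \rho_\infty, c_p, T_\infty:
    % ambient density, specific heat and temperature. Large pool fires sit at
    % small Q^* (low source Froude number).
    ```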

  9. Resource Planning for SPARQL Query Execution on Data Sharing Platforms

    DEFF Research Database (Denmark)

    Hagedorn, Stefan; Hose, Katja; Sattler, Kai-Uwe

    2014-01-01

    To increase performance, data sharing platforms often make use of clusters of nodes where certain tasks can be executed in parallel. Resource planning and especially deciding how many processors should be chosen to exploit parallel processing is complex in such a setup as increasing the number...
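
    To make the processor-count question concrete, here is a generic Amdahl-style illustration (the parallel fraction, runtimes and node cost are invented; the paper's actual planning model is not reproduced here):

    ```python
    def runtime_minutes(t_serial, parallel_fraction, n_nodes):
        """Amdahl's law: only the parallel fraction of the work speeds up."""
        return t_serial * ((1 - parallel_fraction) + parallel_fraction / n_nodes)

    t1, p, node_cost_per_hour = 100.0, 0.9, 0.50
    for n in (1, 2, 4, 8, 16):
        t = runtime_minutes(t1, p, n)
        cost = n * node_cost_per_hour * t / 60
        print(f"{n:2d} nodes: {t:5.1f} min, cost {cost:.2f}")
    ```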

  10. Landscapes for Energy and Wildlife: Conservation Prioritization for Golden Eagles across Large Spatial Scales.

    Directory of Open Access Journals (Sweden)

    Jason D Tack

    Proactive conservation planning for species requires the identification of important spatial attributes across ecologically relevant scales in a model-based framework. However, it is often difficult to develop predictive models, as the explanatory data required for model development across regional management scales is rarely available. Golden eagles are a large-ranging predator of conservation concern in the United States that may be negatively affected by wind energy development. Thus, identifying landscapes least likely to pose conflict between eagles and wind development via shared space prior to development will be critical for conserving populations in the face of imposing development. We used publicly available data on golden eagle nests to generate predictive models of golden eagle nesting sites in Wyoming, USA, using a suite of environmental and anthropogenic variables. By overlaying predictive models of golden eagle nesting habitat with wind energy resource maps, we highlight areas of potential conflict among eagle nesting habitat and wind development. However, our results suggest that wind potential and the relative probability of golden eagle nesting are not necessarily spatially correlated. Indeed, the majority of our sample frame includes areas with disparate predictions between suitable nesting habitat and potential for developing wind energy resources. Map predictions cannot replace on-the-ground monitoring for potential risk of wind turbines on wildlife populations, though they provide industry and managers a useful framework to first assess potential development.

  11. Coarse-Grain Bandwidth Estimation Scheme for Large-Scale Network

    Science.gov (United States)

    Cheung, Kar-Ming; Jennings, Esther H.; Sergui, John S.

    2013-01-01

    A large-scale network that supports a large number of users can have an aggregate data rate of hundreds of Mbps at any time. High-fidelity simulation of a large-scale network might be too complicated and memory-intensive for typical commercial-off-the-shelf (COTS) tools. Unlike a large commercial wide-area-network (WAN) that shares diverse network resources among diverse users and has a complex topology that requires a routing mechanism and flow control, the ground communication links of a space network operate under the assumption of a guaranteed dedicated bandwidth allocation between specific sparse endpoints in a star-like topology. This work solved the network design problem of estimating the bandwidths of a ground network architecture option that offers different service classes to meet the latency requirements of different user data types. In this work, a top-down analysis and simulation approach was created to size the bandwidths of a store-and-forward network for a given network topology, a mission traffic scenario, and a set of data types with different latency requirements. These techniques were used to estimate the WAN bandwidths of the ground links for different architecture options of the proposed Integrated Space Communication and Navigation (SCaN) Network. A new analytical approach, called the "leveling scheme," was developed to model the store-and-forward mechanism of the network data flow. The term "leveling" refers to the spreading of data across a longer time horizon without violating the corresponding latency requirement of the data type. Two versions of the leveling scheme were developed: 1. A straightforward version that simply spreads the data of each data type across the time horizon and doesn't take into account the interactions among data types within a pass, or between data types across overlapping passes at a network node, and is inherently sub-optimal. 2. A two-state Markov leveling scheme that takes into account the second order behavior of
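
    The "straightforward" leveling version can be sketched directly: each data type's volume per pass is spread over the longest window its latency requirement allows, and the link is sized to the sum of the resulting rates (the data types and numbers below are invented):

    ```python
    data_types = [
        # name, volume per pass (Mb), latency requirement (s)
        ("telemetry",    600.0,   60.0),
        ("science",   90_000.0, 3600.0),
        ("voice",         48.0,    1.0),
    ]

    # Rate needed so each type's volume drains within its latency window.
    rates = {name: volume / latency for name, volume, latency in data_types}
    print(f"required link bandwidth ~ {sum(rates.values()):.1f} Mbps")
    for name, rate in rates.items():
        print(f"  {name}: {rate:.1f} Mbps")
    ```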

  12. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/).
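
    Envision itself is not reproduced here, but the underlying idea (supervised, stochastic gradient boosting over variant features) can be sketched with scikit-learn; the features and labels below are synthetic placeholders:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 8))   # stand-ins for conservation, structure, ...
    y = 0.7 * X[:, 0] - 0.2 * X[:, 3] + rng.normal(scale=0.1, size=1000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor(subsample=0.8, random_state=0)  # stochastic
    model.fit(X_tr, y_tr)
    print("held-out R^2:", round(model.score(X_te, y_te), 3))
    ```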

  13. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  14. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.

    Science.gov (United States)

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics.

  15. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  16. Climate change and large-scale land acquisitions in Africa: Quantifying the future impact on acquired water resources

    Science.gov (United States)

    Chiarelli, Davide Danilo; Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo

    2016-08-01

    Pressure on agricultural land has markedly increased since the start of the century, driven by demographic growth, changes in diet, increasing biofuel demand, and globalization. To better ensure access to adequate land and water resources, many investors and countries began leasing large areas of agricultural land in the global South, a phenomenon often termed "large-scale land acquisition" (LSLA). To date, this global land rush has resulted in the appropriation of 41 million hectares and about 490 km³ of freshwater resources, affecting rural livelihoods and local environments. It remains unclear to what extent land and water acquisitions contribute to the emergence of water-stress conditions in acquired areas, and how these demands for water may be impacted by climate change. Here we analyze 18 African countries - 20 Mha (or 80%) of LSLA for the continent - and estimate that under the present climate 210 km³ per year of water would be appropriated if all acquired areas were actively under production. We also find that consumptive use of irrigation water is disproportionately contributed by water-intensive biofuel crops. Using the IPCC A1B scenario, we find only small changes in green (-1.6%) and blue (+2.0%) water demand in targeted areas. With a 3 °C temperature increase, crop yields are expected to decrease by up to 20% with a consequent increase in the water footprint. When the effect of increasing atmospheric CO2 concentrations is accounted for, crop yields increase by as much as 40% with a decrease in the water footprint of up to 29%. The relative importance of CO2 fertilization and warming will therefore determine water appropriations and changes in the water footprint under climate change scenarios.

  17. Shared random access memory resource for multiprocessor real-time systems

    International Nuclear Information System (INIS)

    Dimmler, D.G.; Hardy, W.H. II

    1977-01-01

    A shared random-access memory resource is described which is used within real-time data acquisition and control systems with multiprocessor and multibus organizations. Hardware and software aspects are discussed in a specific example where interconnections are done via a UNIBUS. The general applicability of the approach is also discussed

  18. Estimating the electricity prices, generation costs and CO_2 emissions of large scale wind energy exports from Ireland to Great Britain

    International Nuclear Information System (INIS)

    Cleary, Brendan; Duffy, Aidan; Bach, Bjarne; Vitina, Aisma; O’Connor, Alan; Conlon, Michael

    2016-01-01

    The share of wind generation in the Irish and British electricity markets is set to increase by 2020 due to renewable energy (RE) targets. The United Kingdom (UK) and Ireland have set ambitious targets which require 30% and 40%, respectively, of electricity demand to come from RE, mainly wind, by 2020. Ireland has sufficient indigenous onshore wind energy resources to exceed the RE target, while the UK faces uncertainty in achieving its target. A possible solution for the UK is to import RE directly from large scale onshore and offshore wind energy projects in Ireland; this possibility has recently been explored by both governments but is currently on hold. Thus, the aim of this paper is to estimate the effects of large scale wind energy in the Irish and British electricity markets in terms of wholesale system marginal prices, total generation costs and CO_2 emissions. The results indicate that when the large scale Irish-based wind energy projects are connected directly to the UK there is a decrease of 0.6% and 2% in the Irish and British wholesale system marginal prices under the UK National Grid slow progression scenario, respectively. - Highlights: • Modelling the Irish and British electricity markets. • Investigating the impacts of large scale wind energy within the markets. • Results indicate a reduction in wholesale system marginal prices in both markets. • Decrease in total generation costs and CO_2 emissions in both markets.

  19. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    Science.gov (United States)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment

  20. The Politics of Scale, Position, and Place in the Governance of Water Resources in the Mekong Region

    Directory of Open Access Journals (Sweden)

    Louis Lebel

    2005-12-01

    The appropriate scales for science, management, and decision making cannot be unambiguously derived from physical characteristics of water resources. Scales are a joint product of social and biophysical processes. The politics-of-scale metaphor has been helpful in drawing attention to the ways in which scale choices are constrained overtly by politics, and more subtly by choices of technologies, institutional designs, and measurements. In doing so, however, the scale metaphor has been stretched to cover a lot of different spatial relationships. In this paper, we argue that there are benefits to understanding - and actions to distinguish - issues of scale from those of place and position. We illustrate our arguments with examples from the governance of water resources in the Mekong region, where key scientific information is often limited to a few sources. Acknowledging how actors' interests fit along various spatial, temporal, jurisdictional, and other social scales helps make the case for innovative and more inclusive means for bringing multi-level interests to a common forum. Deliberation can provide a check on the extent of shared understanding and key uncertainties.

  1. Mathematical Analysis of Vehicle Delivery Scale of Bike-Sharing Rental Nodes

    Science.gov (United States)

    Zhai, Y.; Liu, J.; Liu, L.

    2018-04-01

    Aiming at the lack of scientific and reasonable judgment of vehicle delivery scale and the insufficient optimization of scheduling decisions, and based on features of bike-sharing usage, this paper analyses the applicability of a discrete-time, discrete-state Markov chain and proves its properties to be irreducible, aperiodic and positive recurrent. Based on the above analysis, the paper reaches the conclusion that the limit-state (steady-state) probability of the bike-sharing Markov chain exists and is independent of the initial probability distribution. The paper then analyses the difficulty of estimating the transition probability matrix parameters and of solving the linear equation group in the traditional solving algorithm of the bike-sharing Markov chain. In order to improve feasibility, this paper proposes a "virtual two-node vehicle scale solution" algorithm which treats all the nodes besides the node to be solved as a single virtual node, and gives the transition probability matrix, the steady-state linear equation group and the computational methods for the steady-state scale, steady-state arrival time and scheduling decision of the node to be solved. Finally, the paper evaluates the rationality and accuracy of the steady-state probability of the proposed algorithm by comparing it with the traditional algorithm. By solving the steady-state scale of the nodes one by one, the proposed algorithm is shown to have strong feasibility because it lowers the level of computational difficulty and reduces the number of statistics, which will help bike-sharing companies to optimize the scale and scheduling of nodes.
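
    The steady-state computation the abstract relies on is standard and easy to sketch (the 3x3 transition matrix is invented): solve pi P = pi together with the probabilities summing to one, which for an irreducible, aperiodic, positive-recurrent chain has a unique solution independent of the starting state:

    ```python
    import numpy as np

    P = np.array([[0.6, 0.3, 0.1],   # row-stochastic transition matrix
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5]])

    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])  # (P^T - I) pi = 0, sum pi = 1
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("steady-state distribution:", pi.round(4))
    ```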

  3. Young children consider merit when sharing resources with others.

    Science.gov (United States)

    Kanngiesser, Patricia; Warneken, Felix

    2012-01-01

    Merit is a key principle of fairness: rewards should be distributed according to how much someone contributed to a task. Previous research suggests that children have an early ability to take merit into account in third-party situations but that merit-based sharing in first-party contexts does not emerge until school-age. Here we provide evidence that three- and five-year-old children already use merit to share resources with others, even when sharing is costly for the child. In Study 1, a child and a puppet-partner collected coins that were later exchanged for rewards. We varied the work-contribution of both partners by manipulating how many coins each partner collected. Children kept fewer stickers in trials in which they had contributed less than in trials in which they had contributed more than the partner, showing that they took merit into account. Few children, however, gave away more than half of the stickers when the partner had worked more. Study 2 confirmed that children related their own work-contribution to their partner's, rather than simply focusing on their own contribution. Taken together, these studies show that merit-based sharing is apparent in young children; however it remains constrained by a self-serving bias.

  5. Large-scale straw supplies to existing coal-fired power stations

    International Nuclear Information System (INIS)

    Gylling, M.; Parsby, M.; Thellesen, H.Z.; Keller, P.

    1992-08-01

    It is considered that large-scale supply of straw to power stations and decentralized cogeneration plants could open up new economic systems and methods of organizing straw supply in Denmark. This thesis is elucidated and the constraints involved are pointed out. The aim is to describe to what extent large-scale straw supply is attractive with regard to monetary savings and available resources. Analyses of models, systems and techniques described in a foregoing project are carried out. It is reckoned that the annual total amount of surplus straw in Denmark is 3.6 million tons. At present, non-agricultural use of straw is limited to district heating plants with an annual consumption of 2-12 thousand tons. A prerequisite for a significant increase in the use of straw is an annual consumption by power and cogeneration plants of more than 100,000 tons. All aspects of straw management are examined in detail, also in relation to two actual Danish coal-fired plants. The reliability of straw supply is considered. It is concluded that very significant resources of straw are available in Denmark but a number of constraints remain. Price competitiveness must be considered in relation to other fuels. It is suggested that the use of corn harvests with whole stems attached (handled as large bales or in the same way as sliced straw alone) as fuel would result in significant monetary savings, especially in transport and storage. Equal status for whole-harvested corn with other forms of biomass fuels, with corresponding changes in taxes and subsidies, could possibly reduce constraints on large-scale straw fuel supply. (AB) (13 refs.)

  7. A resource of large-scale molecular markers for monitoring Agropyron cristatum chromatin introgression in wheat background based on transcriptome sequences.

    Science.gov (United States)

    Zhang, Jinpeng; Liu, Weihua; Lu, Yuqing; Liu, Qunxing; Yang, Xinming; Li, Xiuquan; Li, Lihui

    2017-09-20

    Agropyron cristatum is a wild grass of the tribe Triticeae and serves as a gene donor for wheat improvement. However, very few markers can be used to monitor A. cristatum chromatin introgressions in wheat. Here, we report a resource of large-scale molecular markers for tracking alien introgressions in wheat based on transcriptome sequences. By aligning A. cristatum unigenes with the Chinese Spring reference genome sequences, we designed 9602 A. cristatum expressed sequence tag-sequence-tagged site (EST-STS) markers for PCR amplification and experimental screening. As a result, 6063 polymorphic EST-STS markers were specific for the A. cristatum P genome in a single recipient wheat background. A total of 4956 randomly selected polymorphic EST-STS markers were further tested in eight wheat variety backgrounds, and 3070 markers displaying stable and polymorphic amplification were validated. These markers covered more than 98% of the A. cristatum genome, and the marker distribution density was approximately 1.28 cM. An application case of all EST-STS markers was validated on the A. cristatum 6P chromosome. These markers were successfully applied in the tracking of alien A. cristatum chromatin. Altogether, this study provides a universal method of large-scale molecular marker development to monitor wild relative chromatin in wheat.

  8. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows that, so far, the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  9. Digital Resource Sharing and Library Consortia in Italy

    Directory of Open Access Journals (Sweden)

    Tommaso Giordano

    2017-09-01

    Interlibrary cooperation in Italy is a fairly recent and not very widespread practice. Attention to the topic was aroused in the eighties with the Italian library network project. More recently, under the impetus toward technological innovation, there has been renewed (and more pragmatic) interest in cooperation in all library sectors. Sharing electronic resources is the theme of greatest interest today in university libraries, where various initiatives are aimed at setting up consortia to purchase licenses and run digital products. A number of projects in hand are described, and emerging trends analyzed.

  10. Evolutionary Hierarchical Multi-Criteria Metaheuristics for Scheduling in Large-Scale Grid Systems

    CERN Document Server

    Kołodziej, Joanna

    2012-01-01

    One of the most challenging issues in modelling today's large-scale computational systems is to effectively manage highly parametrised distributed environments such as computational grids, clouds, ad hoc networks and P2P networks. Next-generation computational grids must provide a wide range of services and high performance computing infrastructures. Various types of information and data processed in the large-scale dynamic grid environment may be incomplete, imprecise, and fragmented, which complicates the specification of proper evaluation criteria and which affects both the availability of resources and the final collective decisions of users. The complexity of grid architectures and grid management may also contribute towards higher energy consumption. All of these issues necessitate the development of intelligent resource management techniques, which are capable of capturing all of this complexity and optimising meaningful metrics for a wide range of grid applications.   This book covers hot topics in t...

  11. 3D large-scale calculations using the method of characteristics

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    An overview of the computational requirements and the numerical developments made in order to be able to solve 3D large-scale problems using the characteristics method will be presented. To accelerate the MCI solver, efficient acceleration techniques were implemented and parallelization was performed. However, for the very large problems, the size of the tracking file used to store the tracks can still become prohibitive and exceed the capacity of the machine. The new 3D characteristics solver MCG will now be introduced. This methodology is dedicated to solve very large 3D problems (a part or a whole core) without spatial homogenization. In order to eliminate the input/output problems occurring when solving these large problems, we define a new computing scheme that requires more CPU resources than the usual one, based on sweeps over large tracking files. The huge capacity of storage needed in some problems and the related I/O queries needed by the characteristics solver are replaced by on-the-fly recalculation of tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be efficiently used. (author)
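
    The computing scheme described above trades storage and I/O for CPU work. The following sketch contrasts the two schemes; function names are illustrative, not the MCG solver's actual API:

    ```python
    # Sketch of the storage-versus-CPU trade-off described above. The
    # function names are illustrative, not the MCG solver's actual API.

    def solve_io_bound(read_track, n_tracks, update_flux, n_iter):
        # Classic scheme: every iteration sweeps a huge tracking file.
        for _ in range(n_iter):
            for t in range(n_tracks):
                update_flux(read_track(t))       # dominated by disk I/O

    def solve_recompute(trace_track, n_tracks, update_flux, n_iter):
        # MCG-style scheme: tracks are retraced from the geometry at each
        # iteration, removing the I/O bottleneck at the cost of CPU time.
        for _ in range(n_iter):
            for t in range(n_tracks):
                update_flux(trace_track(t))      # CPU-bound, parallelizable

    # Dummy stand-ins so the sketch runs end to end:
    flux = []
    solve_recompute(lambda t: t * 0.1, n_tracks=4,
                    update_flux=flux.append, n_iter=2)
    print(flux)
    ```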

  12. Virtual partitioning for robust resource sharing: computational techniques for heterogeneous traffic

    NARCIS (Netherlands)

    Borst, S.C.; Mitra, D.

    1998-01-01

    We consider virtual partitioning (VP), which is a scheme for sharing a resource among several traffic classes in an efficient, fair, and robust manner. In the preliminary design stage, each traffic class is allocated a nominal capacity, which is based on expected offered traffic and required quality
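
    The nominal-capacity idea behind virtual partitioning can be illustrated with a toy admission rule: a class consuming more than its nominal allocation is admitted only conservatively. All parameter values below are invented for the example:

    ```python
    # Illustrative admission rule in the spirit of virtual partitioning:
    # a class using more than its nominal capacity is admitted only if
    # extra headroom remains. Parameter values are invented.

    def admit(cls_usage, nominal, total_used, capacity, demand, headroom=5):
        free = capacity - total_used
        if cls_usage + demand <= nominal:
            return free >= demand              # within nominal: aggressive
        return free >= demand + headroom       # above nominal: conservative

    # Class at 40 of a nominal 50 units, system 93/100 used:
    print(admit(40, 50, 93, 100, demand=5))    # True (within nominal)
    print(admit(55, 50, 93, 100, demand=5))    # False (needs headroom)
    ```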

  13. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high-performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., the formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom-designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  14. Temporal isolation in an HSF-enabled real-time kernel in the presence of shared resources

    NARCIS (Netherlands)

    Heuvel, van den M.M.H.P.; Bril, R.J.; Lukkien, J.J.; Parmer, G.; Gleixner, T.

    2011-01-01

    Hierarchical scheduling frameworks (HSFs) have been extensively investigated as a paradigm for facilitating temporal isolation between components that need to be integrated on a single shared processor. To support resource sharing within two-level, fixed priority scheduled HSFs, two synchronization

  15. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  16. Sharing network resources

    CERN Document Server

    Parekh, Abhay

    2014-01-01

    Resource Allocation lies at the heart of network control. In the early days of the Internet the scarcest resource was bandwidth, but as the network has evolved to become an essential utility in the lives of billions, the nature of the resource allocation problem has changed. This book attempts to describe the facets of resource allocation that are most relevant to modern networks. It is targeted at graduate students and researchers who have an introductory background in networking and who desire to internalize core concepts before designing new protocols and applications. We start from the fun

  17. Working Memory Span Development: A Time-Based Resource-Sharing Model Account

    Science.gov (United States)

    Barrouillet, Pierre; Gavens, Nathalie; Vergauwe, Evie; Gaillard, Vinciane; Camos, Valerie

    2009-01-01

    The time-based resource-sharing model (P. Barrouillet, S. Bernardin, & V. Camos, 2004) assumes that during complex working memory span tasks, attention is frequently and surreptitiously switched from processing to reactivate decaying memory traces before their complete loss. Three experiments involving children from 5 to 14 years of age…

  18. Secret Sharing Schemes with a large number of players from Toric Varieties

    DEFF Research Database (Denmark)

    Hansen, Johan P.

    A general theory for constructing linear secret sharing schemes over a finite field $\mathbb{F}_q$ from toric varieties is introduced. The number of players can be as large as $(q-1)^r-1$ for $r\geq 1$. We present general methods for obtaining the reconstruction and privacy thresholds, as well as conditions for multiplication on the associated secret sharing schemes. In particular, we apply the method to certain toric surfaces. The main results are ideal linear secret sharing schemes where the number of players can be as large as $(q-1)^2-1$. We determine bounds for the reconstruction and privacy thresholds...
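
    As a worked instance of the quoted player bound (our arithmetic, not taken from the paper):

    ```latex
    % Worked instance of the player bound quoted above (our arithmetic):
    % a scheme from an r-dimensional toric variety over F_q admits up to
    % (q-1)^r - 1 players. For a toric surface (r = 2) over F_32:
    \[
      N_{\max} = (q-1)^2 - 1 = (32-1)^2 - 1 = 961 - 1 = 960 .
    \]
    ```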

  19. Resource sharing of online teaching materials: The LON-CAPA project

    Science.gov (United States)

    Bauer, Wolfgang

    2004-03-01

    The use of information technology resources in conventional lecture-based courses, in distance-learning offerings, and in hybrid courses is increasing. But this may put an additional burden on faculty, who are now asked to deliver this new content. Additionally, it may require the installation of commercial courseware systems, putting colleges and universities into new financial licensing dependencies. To address exactly these two problems, the LON-CAPA system was invented to provide an open-source courseware system, released under the GNU General Public License, that allows for sharing of educational resources across institutional and disciplinary boundaries. This presentation will focus on both aspects of the system: the courseware capabilities that allow for customized environments for individual students, and the educational resource library that enables teachers to take full advantage of the work of their colleagues. Research results on learning effectiveness, resource and system usage patterns, and customization for different learning styles will be shown. Institutional perceptions of and responses to open-source courseware systems will be discussed.

  20. CSCW Challenges in Large-Scale Technical Projects - a case study

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1992-01-01

    This paper investigates CSCW aspects of large-scale technical projects based on a case study of a specific Danish engineering company and uncovers challenges to CSCW applications in this setting. The company is responsible for management and supervision of one of the world's largest tunnel projects. The initial qualitative analysis identified a number of bottlenecks in daily work where support for cooperation is needed. Examples of bottlenecks are: sharing materials, issuing tasks, and keeping track of task status. Grounded in the analysis, cooperative design workshops based on scenarios of future work...

  1. Water limited agriculture in Africa: Climate change sensitivity of large scale land investments

    Science.gov (United States)

    Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.

    2015-12-01

    The past few decades have seen unprecedented changes in the global agricultural system, with a dramatic increase in the rates of food production fueled by an escalating demand for food calories as a result of demographic growth, dietary changes, and - more recently - new bioenergy policies. Food prices have become consistently higher and increasingly volatile, with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, to date roughly 38 million hectares have been acquired worldwide by large-scale investors, 16 million of which are in Africa. More than 85% of large-scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but also by the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large-scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate the irrigation water requirements of various crops planted on the acquired land as an indicator of the pressure land investors are likely to place on ("blue") water resources of target regions in Africa, and evaluate the sensitivity to climate change scenarios.
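
    The abstract does not state how irrigation water requirements are computed; a standard FAO-56-style estimate, assumed here purely for illustration, is:

    ```latex
    % Standard FAO-56-style estimate (assumed here for illustration; the
    % authors' exact formulation is not given in the abstract):
    \[
      ET_c = K_c \, ET_0, \qquad
      IWR = \max\!\bigl(0,\; ET_c - P_{\mathrm{eff}}\bigr),
    \]
    % where $ET_0$ is the reference evapotranspiration, $K_c$ the crop
    % coefficient, and $P_{\mathrm{eff}}$ the effective precipitation
    % over the growing season.
    ```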

  2. A Critical Study of Effect of Web-Based Software Tools in Finding and Sharing Digital Resources--A Literature Review

    Science.gov (United States)

    Baig, Muntajeeb Ali

    2010-01-01

    The purpose of this paper is to review the effect of web-based software tools for finding and sharing digital resources. A positive correlation between learning and studying through online tools has been found in recent research. In the traditional classroom, resource searching is limited to the library, and resource sharing is limited to the…

  3. 'You should at least ask'. The expectations, hopes and fears of rare disease patients on large-scale data and biomaterial sharing for genomics research.

    Science.gov (United States)

    McCormack, Pauline; Kole, Anna; Gainotti, Sabina; Mascalzoni, Deborah; Molster, Caron; Lochmüller, Hanns; Woods, Simon

    2016-10-01

    Within the myriad articles about participants' opinions of genomics research, the views of a distinct group - people with a rare disease (RD) - are unknown. It is important to understand if their opinions differ from the general public by dint of having a rare disease and vulnerabilities inherent in this. Here we document RD patients' attitudes to participation in genomics research, particularly around large-scale, international data and biosample sharing. This work is unique in exploring the views of people with a range of rare disorders from many different countries. The authors work within an international, multidisciplinary consortium, RD-Connect, which has developed an integrated platform connecting databases, registries, biobanks and clinical bioinformatics for RD research. Focus groups were conducted with 52 RD patients from 16 countries. Using a scenario-based approach, participants were encouraged to raise topics relevant to their own experiences, rather than these being determined by the researcher. Issues include wide data sharing, and consent for new uses of historic samples and for children. Focus group members are positively disposed towards research and towards allowing data and biosamples to be shared internationally. Expressions of trust and attitudes to risk are often affected by the nature of the RD which they have experience of, as well as regulatory and cultural practices in their home country. Participants are concerned about data security and misuse. There is an acute recognition of the vulnerability inherent in having a RD and the possibility that open knowledge of this could lead to discrimination.

  4. Analysis of the Economic Impact of Large-Scale Deployment of Biomass Resources for Energy and Materials in the Netherlands. Appendix 2. Macro-economic Scenarios

    International Nuclear Information System (INIS)

    Banse, M.

    2009-03-01

    The Bio-based Raw Materials Platform (known as PGG), which is part of the Energy Transition programme in the Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to study the macro-economic impact of large-scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including techno-economic projections of fossil and bio-based conversion technologies and a top-down study including macro-economic modelling of (global) trade of biomass and fossil resources. The results of the top-down study (part 2) including macro-economic modelling of (global) trade of biomass and fossil resources, are presented in this report

  5. Water footprints as an indicator for the equitable utilization of shared water resources. (Case study: Egypt and Ethiopia shared water resources in Nile Basin)

    Science.gov (United States)

    Sallam, Osama M.

    2014-12-01

    The question of "equity." is a vague and relative term in any event, criteria for equity are particularly difficult to determine in water conflicts, where international water law is ambiguous and often contradictory, and no mechanism exists to enforce principles which are agreed-upon. The aim of this study is using the water footprints as a concept to be an indicator or a measuring tool for the Equitable Utilization of shared water resources. Herein Egypt and Ethiopia water resources conflicts in Nile River Basin were selected as a case study. To achieve this study; water footprints, international virtual water flows and water footprint of national consumption of Egypt and Ethiopia has been analyzed. In this study, some indictors of equitable utilization has been gained for example; Egypt water footprint per capita is 1385 CM/yr/cap while in Ethiopia is 1167 CM/yr/cap, Egypt water footprint related to the national consumption is 95.15 BCM/yr, while in Ethiopia is 77.63 BCM/yr, and the external water footprints of Egypt is 28.5%, while in Ethiopia is 2.3% of the national consumption water footprint. The most important conclusion of this study is; natural, social, environmental and economical aspects should be taken into account when considering the water footprints as an effective measurable tool to assess the equable utilization of shared water resources, moreover the water footprints should be calculated using a real data and there is a necessity to establishing a global water footprints benchmarks for commodities as a reference.

  6. Resource Distribution Approaches in Spectrum Sharing Systems

    Directory of Open Access Journals (Sweden)

    Friedrich K. Jondral

    2008-05-01

    It is increasingly difficult to satisfy growing demands for spectrum with the conventional policy of fixed spectrum allocation. To overcome this problem, flexible/dynamic spectrum sharing methods that can significantly improve spectrum utilization have gained increasing interest recently. This paper presents two dynamic spectrum sharing approaches, a centralized and a decentralized one. The centralized approach is based on hierarchical trading. Each level of the hierarchy is composed of "markets" that are associated with a certain spatial area and trading occurrence frequency, where area size and trading occurrence frequency depend on the hierarchy level. The decentralized approach is based on game theory. There, it is assumed that the operators are averse to unequal payoffs and act unselfishly, enabling a stable and sustainable community. Numerical results show that, in the observed scenario, both proposals significantly outperform the reference case of fixed resource allocation in terms of utilized bandwidth. Whereas negotiation costs for spectrum brokerage arise in the centralized approach, non-negligible amounts of spectrum are lost in the decentralized approach due to collisions. Thus, a hybrid of the centralized and decentralized approaches that exploits the benefits of both is also considered.

  7. Methods for assessing the socioeconomic impacts of large-scale resource developments: implications for nuclear repository siting

    International Nuclear Information System (INIS)

    Murdock, S.H.; Leistritz, F.L.

    1983-03-01

    This report provides an overview of the major methods presently available for assessing the socioeconomic impacts of large-scale resource developments, and discusses the implications and applications of such methods for nuclear-waste-repository siting. The report: (1) summarizes the conceptual approaches underlying, and methodological alternatives for, the conduct of impact assessments in each substantive area, and enumerates the advantages and disadvantages of each alternative; (2) describes factors related to the impact-assessment process, impact events, and the characteristics of rural areas that affect the magnitude and distribution of impacts and the assessment of impacts in each area; (3) provides a detailed review of the methodologies actually used in impact assessment for each area, describes advantages and problems encountered in the use of each method, and identifies the frequency of use and the general level of acceptance of each technique; and (4) summarizes the implications of each area of projection for the repository-siting process and the applicability of the methods for each area to the special and standard features of repositories, and makes general recommendations concerning specific methods and procedures that should be incorporated in assessments for siting areas.

  8. Large scale electronic structure calculations in the study of the condensed phase

    NARCIS (Netherlands)

    van Dam, H.J.J.; Guest, M.F.; Sherwood, P.; Thomas, J.M.H.; van Lenthe, J.H.; van Lingen, J.N.J.; Bailey, C.L.; Bush, I.J.

    2006-01-01

    We consider the role that large-scale electronic structure computations can now play in the modelling of the condensed phase. To structure our analysis, we consider four distinct ways in which today's scientific targets can be re-scoped to take advantage of advances in computing resources: 1. time to

  9. Utilizing Social Media Networks as a Communication Medium for a Homogeneous Librarian Community to Optimize Inter-University Collection Resource Sharing

    Directory of Open Access Journals (Sweden)

    Haryanto Haryanto

    2016-07-01

    The purpose of this analysis is to create communication channels among librarians at different universities so that they can support each other in providing collections. In the world of libraries, collection limitations are faced by almost all libraries, so efforts such as collection sharing (resource sharing) are necessary, and this requires a communication medium that connects these libraries; social media such as Facebook can serve this role. With social media it is possible to create communities of similar or homogeneous interest that can communicate quickly to share collections. To use social media effectively for resource sharing among university libraries, a few elements are needed: commonality among communities of the same department or homogeneous interest, control by a main admin, a resource sharing agreement, an admin list for each library, document delivery services, and an MoU.

  10. H2@Scale Resource and Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-05-04

    The 'H2@Scale' concept is based on the potential for wide-scale utilization of hydrogen as an energy intermediate where the hydrogen is produced from low cost energy resources and it is used in both the transportation and industrial sectors. H2@Scale has the potential to address grid resiliency, energy security, and cross-sectoral emissions reductions. This presentation summarizes the status of an ongoing analysis effort to quantify the benefits of H2@Scale. It includes initial results regarding market potential, resource potential, and impacts of when electrolytic hydrogen is produced with renewable electricity to meet the potential market demands. It also proposes additional analysis efforts to better quantify each of the factors.

  11. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting have an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither has the competence assigned to the political system to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organization form, and (3) the theoretical conception of large-scale research and policy consulting. (orig.)

  12. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocesses in combination with model fluid studies, and to provide suitable computational tools to overcome data restrictions. The focus is on a specific, well-documented case for a 30-m³ bioreactor. Areas for further research from an industrial perspective are also indicated.

  13. Coexistence and conflict: IWRM and large-scale water infrastructure development in Piura, Peru

    Directory of Open Access Journals (Sweden)

    Megan Mills-Novoa

    2017-06-01

    Despite the emphasis of Integrated Water Resources Management (IWRM) on 'soft' demand-side management, large-scale water infrastructure is increasingly being constructed in basins managed under an IWRM framework. While there has been substantial research on IWRM, few scholars have unpacked how IWRM and large-scale water infrastructure development coexist and conflict. Piura, Peru is an important site for understanding how IWRM and capital-intensive, concrete-heavy water infrastructure development articulate in practice. After 70 years of proposals and planning, the Regional Government of Piura began construction of the mega-irrigation project, Proyecto Especial de Irrigación e Hidroeléctrico del Alto Piura (PEIHAP), in 2013. PEIHAP, which will irrigate an additional 19,000 hectares (ha), is being realised in the wake of major reforms in the Chira-Piura River Basin, a pilot basin for the IWRM-inspired 2009 Water Resources Law. We first map the historical trajectory of PEIHAP as it mirrors the shifting political priorities of the Peruvian state. We then draw on interviews with the newly formed River Basin Council, regional government, PEIHAP, and civil society actors to understand why and how these differing water management paradigms coexist. We find that while the 2009 Water Resources Law labels large-scale irrigation infrastructure as an 'exceptional measure', this development continues to eclipse IWRM provisions of the new law. This uneasy coexistence reflects the parallel desires of the state to imbue water policy reform with international credibility via IWRM while also furthering economic development goals via large-scale water infrastructure. While the participatory mechanisms and expertise of IWRM-inspired river basin councils have not been brought to bear on the approval and construction of PEIHAP, these institutions will play a crucial role in managing the myriad resource and social conflicts that are likely to result.

  14. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates impact computations for radioactive and hazardous contaminants across the major exposure routes: air, surface water, ground water, and overland-flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  15. Aggregated Representation of Distribution Networks for Large-Scale Transmission Network Simulations

    DEFF Research Database (Denmark)

    Göksu, Ömer; Altin, Müfit; Sørensen, Poul Ejnar

    2014-01-01

    As a common practice in large-scale transmission network analysis, distribution networks have been represented as aggregated loads. However, with the increasing share of distributed generation, especially wind and solar power, in distribution networks, it has become necessary to include the distributed generation within those analyses. In this paper a practical methodology to obtain the aggregated behaviour of the distributed generation is proposed. The methodology, which is based on the use of the IEC standard wind turbine models, is applied to a benchmark distribution network via simulations.

  16. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (currently > 6000 active users, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open source (facilitating community contributions), modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this we tested mechanisms using the iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted the use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  17. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

    profit for investors for renting their transmission capacity, and cheaper electricity for end users. We propose a hybrid method based on heuristic and deterministic methods to attain new transmission line additions and increase transmission capacity. Renewable energy resources (RES) have zero operating cost, which makes them very attractive for generation companies and market participants. In addition, RES have zero carbon emissions, which helps relieve concerns about the environmental impact of electric generation resources' carbon emissions. RES include wind, solar, hydro, biomass, and geothermal. By 2030, the expectation is that more than 30% of electricity in the U.S. will come from RES. One major contributor to RES generation will be wind energy resources (WES). Furthermore, WES will be an important component of the future generation portfolio. However, the nature of WES is that it experiences high intermittency and volatility. Because of the great expectation of high WES penetration and the nature of such resources, researchers focus on studying the effects of such resources on electric grid operation and its adequacy from different aspects. Additionally, current market operations of electric grids add another complication to consider while integrating RES (specifically WES). Mandates by market rules and long-term analysis of renewable penetration in large-scale electric grids have also been the focus of researchers in recent years. We advocate a method for studying the effects of high wind-resource penetration on large-scale electric grid operations. A PMU (phasor measurement unit) is a global positioning system (GPS)-based device which provides immediate and precise measurements of voltage angle in a high-voltage transmission system. PMUs can update the status of a transmission line and related measurements (e.g., voltage magnitude and voltage phase angle) more frequently. Every second, a PMU can provide 30 samples of measurements, compared to traditional systems (e.g., supervisory control and

  18. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  19. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
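
    One common way to realize such a decomposition is price coordination between the aggregator and the local units; the dual-decomposition sketch below is illustrative, not the paper's algorithm:

    ```python
    # Illustrative price-coordination (dual decomposition) loop: the
    # aggregator iterates a price signal, each unit solves a small local
    # problem, and the price is updated until supply matches the target.
    # All cost coefficients and bounds are invented for the example.

    def local_response(price, a, b, lo, hi):
        # Unit minimizes a*u^2 + b*u - price*u  =>  u = (price - b) / (2a),
        # clipped to the unit's feasible range [lo, hi].
        return min(hi, max(lo, (price - b) / (2 * a)))

    def balance(units, target, price=0.0, step=0.05, iters=500):
        for _ in range(iters):
            total = sum(local_response(price, *u) for u in units)
            price += step * (target - total)   # subgradient price update
        return price, total

    units = [(0.5, 1.0, 0.0, 10.0), (0.8, 0.5, 0.0, 8.0)]  # (a, b, lo, hi)
    print(balance(units, target=12.0))
    ```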

  20. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. Such planet-scale data brings serious challenges to storage and computing technologies. Cloud computing is one way to crack this nut, because it simultaneously enables storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications will facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  1. SharePoint governance

    OpenAIRE

    Ali, Mudassar

    2013-01-01

    Master's thesis in Information and Communication Technology (IKT590), 2013 – University of Agder, Grimstad. SharePoint is a web-based business collaboration platform from Microsoft which is very robust and dynamic in nature. The platform has been on the market for more than a decade and has been adopted by a large number of organisations in the world. The platform has become larger in scale, richer in features and is improving consistently with every new version. However, SharePoint ...

  2. Discussion on the nuclear information resources co-constructing and sharing under network information

    International Nuclear Information System (INIS)

    Wu Yang

    2010-01-01

    During the tenth five-year plan, along with the digitization of information and the development of information transmission networks, the co-construction and sharing of China's nuclear industry information is facing new development opportunities and challenges. This paper is based on an analysis of the status and characteristics of nuclear libraries, combined with the development of nuclear information resources over the past 20 years. Considering the characteristics of information sharing and services in the network environment, the problems in the current co-construction and sharing of nuclear information, and the needs of future nuclear research, development and production, the paper forecasts trends in nuclear information work and gives some countermeasures to strengthen the co-construction and sharing of nuclear information. (author)

  3. Optimal Sequential Resource Sharing and Exchange in Multi-Agent Systems

    OpenAIRE

    Xiao, Yuanzhang

    2014-01-01

    Central to the design of many engineering systems and social networks is to solve the underlying resource sharing and exchange problems, in which multiple decentralized agents make sequential decisions over time to optimize some long-term performance metrics. It is challenging for the decentralized agents to make optimal sequential decisions because of the complicated coupling among the agents and across time. In this dissertation, we mainly focus on three important classes of multi-agent seq...

  4. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  5. Projecting water resources changes in potential large-scale agricultural investment areas of the Kafue River Basin in Zambia

    Science.gov (United States)

    Kim, Y.; Trainor, A. M.; Baker, T. J.

    2017-12-01

    Climate change impacts regional water availability through the spatial and temporal redistribution of available water resources. This study focuses on understanding possible response of water resources to climate change in regions where potentials for large-scale agricultural investments are planned in the upper and middle Kafue River Basin in Zambia. We used historical and projected precipitation and temperature to assess changes in water yield, using the Soil and Water Assessment Tool (SWAT) hydrological model. Some of the Coupled Model Intercomparison Project Phase 5 (CMIP5) climate model outputs for the Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios project a temperature warming range from 1.8 - 5.7 °C over the region from 2020 to 2095. Precipitation projection patterns vary monthly but tend toward drier dry seasons with a slight increase in precipitation during the rainy season as compared to the historical time series. The best five calibrated parameter sets generated for the historical record (1965 - 2005) were applied for two future periods, 2020 - 2060 and 2055 - 2095, to project water yield change. Simulations projected that the 90th percentile water yield would be exceeded across most of the study area by up to 800% under the medium-low (RCP4.5) CO2 emission scenario, whereas the high (RCP8.5) CO2 emission scenario resulted in a more spatially varied pattern mixed with increasing (up to 500%) and decreasing (up to -54%) trends. The 10th percentile water yield indicated spatially varied pattern across the basin, increasing by as much as 500% though decreasing in some areas by 66%, with the greatest decreases during the dry season under RCP8.5. Overall, available water resources in the study area are projected to trend toward increased floods (i.e. water yields far exceeding 90th percentile) as well as increasing drought (i.e. water yield far below 10th percentile) vulnerability. Because surface water is a primary source for agriculture

  6. Computational challenges of large-scale, long-time, first-principles molecular dynamics

    International Nuclear Information System (INIS)

    Kent, P R C

    2008-01-01

    Plane wave density functional calculations have traditionally been able to use the largest available supercomputing resources. We analyze the scalability of modern projector-augmented wave implementations to identify the challenges in performing molecular dynamics calculations of large systems containing many thousands of electrons. Benchmark calculations on the Cray XT4 demonstrate that global linear-algebra operations are the primary reason for limited parallel scalability. Plane-wave related operations can be made sufficiently scalable. Improving parallel linear-algebra performance is an essential step to reaching longer timescales in future large-scale molecular dynamics calculations
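
    A back-of-envelope Amdahl estimate shows why a global, poorly scaling linear-algebra fraction caps overall speedup; the 10% serial fraction below is invented for illustration:

    ```python
    # Back-of-envelope Amdahl estimate of the effect described above: a
    # small, poorly scaling linear-algebra fraction caps overall speedup.
    # The 10% serial fraction is invented for illustration.

    def amdahl_speedup(serial_fraction, n_cores):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

    for cores in (64, 1024, 16384):
        print(cores, round(amdahl_speedup(0.10, cores), 1))
    # Even with 16384 cores, speedup is capped near 1/0.10 = 10x.
    ```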

  7. When the globe is your classroom: teaching and learning about large-scale environmental change online

    Science.gov (United States)

    Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.

    2005-12-01

    Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.

  8. Large Scale Beam-beam Simulations for the CERN LHC using Distributed Computing

    CERN Document Server

    Herr, Werner; McIntosh, E; Schmidt, F

    2006-01-01

    We report on a large-scale simulation of beam-beam effects for the CERN Large Hadron Collider (LHC). The stability of particles which experience head-on and long-range beam-beam effects was investigated for different optical configurations and machine imperfections. Covering the interesting parameter space required computing resources not available at CERN. The necessary resources were available through the LHC@home project, based on the BOINC platform. At present, this project makes more than 60000 hosts available for distributed computing. We discuss our experience using this system during a simulation campaign of more than six months and describe the tools and procedures necessary to ensure consistent results. The results from this extended study are presented and future plans are discussed.

  9. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  10. A Technical Mode for Sharing and Utilizing Open Educational Resources in Chinese Universities

    Directory of Open Access Journals (Sweden)

    Juan Yang

    2011-09-01

    Open educational resources supply the potential to help equalize access to worldwide knowledge and education, but by themselves they do not bring about effective learning or education. How to make effective use of the resources is still a big challenge. In this study, a technical mode is proposed to collect open educational resources from different sources on the Internet into a campus-network-based resource management system. The system facilitates free and easy access to the resources for instructors and students in universities and integrates the resources into learning and teaching. The technical issues regarding the design of the resource management system are examined, including the structure and functions of the system, metadata standard compatibility and scalability, metadata file format, and resource utilization assessment. Furthermore, the resource collecting, storage and utilization modes are also discussed so as to lay a technical basis for extensive and efficient sharing and utilization of OER in Chinese universities.
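
    A minimal, hypothetical metadata record of the kind such a system might store (the field set loosely follows Dublin Core and is not the schema proposed in the paper):

    ```python
    # Hypothetical minimal metadata record for a harvested OER item;
    # illustrative only, not the schema used by the system described above.

    oer_record = {
        "identifier": "oer:example:0001",
        "title": "Introduction to Heat Transfer (Lecture 1)",
        "creator": "Example University OpenCourseWare",
        "source_url": "https://example.edu/ocw/heat-transfer/lecture1",
        "format": "video/mp4",
        "language": "en",
        "subject": ["engineering", "heat transfer"],
        "usage_count": 0,          # supports resource utilization assessment
    }

    def record_usage(record):
        """Increment the counter used for utilization assessment."""
        record["usage_count"] += 1
    ```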

  11. Large-Scale Ocean Circulation-Cloud Interactions Reduce the Pace of Transient Climate Change

    Science.gov (United States)

    Trossman, D. S.; Palter, J. B.; Merlis, T. M.; Huang, Y.; Xia, Y.

    2016-01-01

    Changes to the large scale oceanic circulation are thought to slow the pace of transient climate change due, in part, to their influence on radiative feedbacks. Here we evaluate the interactions between CO2-forced perturbations to the large-scale ocean circulation and the radiative cloud feedback in a climate model. Both the change of the ocean circulation and the radiative cloud feedback strongly influence the magnitude and spatial pattern of surface and ocean warming. Changes in the ocean circulation reduce the amount of transient global warming caused by the radiative cloud feedback by helping to maintain low cloud coverage in the face of global warming. The radiative cloud feedback is key in affecting atmospheric meridional heat transport changes and is the dominant radiative feedback mechanism that responds to ocean circulation change. Uncertainty in the simulated ocean circulation changes due to CO2 forcing may contribute a large share of the spread in the radiative cloud feedback among climate models.

  12. Coupling Agent-Based and Groundwater Modeling to Explore Demand Management Strategies for Shared Resources

    Science.gov (United States)

    Al-Amin, S.

    2015-12-01

    Municipal water demands in growing population centers in the arid southwest US are typically met through increased groundwater withdrawals. Hydro-climatic uncertainties attributed to climate change and land use conversions may also alter demands and impact the replenishment of groundwater supply. Groundwater aquifers are not necessarily confined within municipal and management boundaries, and multiple diverse agencies may manage a shared resource in a decentralized approach, based on individual concerns and resources. The interactions among water managers, consumers, and the environment influence the performance of local management strategies and regional groundwater resources. This research couples an agent-based modeling (ABM) framework and a groundwater model to analyze the effects of different management approaches on shared groundwater resources. The ABM captures the dynamic interactions between household-level consumers and policy makers to simulate water demands under climate change and population growth uncertainties. The groundwater model is used to analyze the relative effects of management approaches on reducing demands and replenishing groundwater resources. The framework is applied for municipalities located in the Verde River Basin, Arizona that withdraw groundwater from the Verde Formation-Basin Fill-Carbonate aquifer system. Insights gained through this simulation study can be used to guide groundwater policy-making under changing hydro-climatic scenarios for a long-term planning horizon.
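
    The coupling can be sketched as a loop in which household agents set demands, a policy agent reacts to aquifer drawdown, and a crude groundwater balance closes the loop; all quantities below are invented for illustration:

    ```python
    # Minimal sketch of the coupled ABM-groundwater loop described above.
    # All quantities (m3/yr) are invented for illustration; the actual
    # framework uses a full groundwater model, not this mass balance.

    def simulate(households, storage, recharge, years, base_demand=150.0):
        restriction = 1.0                   # demand multiplier set by policy
        for _ in range(years):
            demand = households * base_demand * restriction
            storage += recharge - demand    # crude aquifer mass balance
            # Policy agent: impose a 20% restriction when storage runs low.
            restriction = 0.8 if storage < 2.0 * recharge else 1.0
        return storage

    print(simulate(households=1000, storage=1.0e6, recharge=1.2e5, years=40))
    ```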

  13. Principles of cooperation across systems: from human sharing to multicellularity and cancer.

    Science.gov (United States)

    Aktipis, Athena

    2016-01-01

    From cells to societies, several general principles arise again and again that facilitate cooperation and suppress conflict. In this study, I describe three general principles of cooperation and how they operate across systems including human sharing, cooperation in animal and insect societies and the massively large-scale cooperation that occurs in our multicellular bodies. The first principle is that of Walk Away: that cooperation is enhanced when individuals can leave uncooperative partners. The second principle is that resource sharing is often based on the need of the recipient (i.e., need-based transfers) rather than on strict account-keeping. And the last principle is that effective scaling up of cooperation requires increasingly sophisticated and costly cheater suppression mechanisms. By comparing how these principles operate across systems, we can better understand the constraints on cooperation. This can facilitate the discovery of novel ways to enhance cooperation and suppress cheating in its many forms, from social exploitation to cancer.

  14. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors which determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for analysing the fault rate in LSI and VLSI circuits.

  15. National Measures on Access to Genetic Resources and Benefit Sharing – The Case of the Philippines

    Directory of Open Access Journals (Sweden)

    Aphrodite Smagadi

    2005-06-01

    The objective of the Convention on Biological Diversity stipulated at the United Nations Conference on Environment and Development (Rio de Janeiro, 1992) was not merely to promote the conservation and sustainable use of biological resources, but to ensure the fair and equitable sharing of benefits arising from their utilisation. The Convention stresses the sovereignty that signatory states exert over the biological wealth within their jurisdiction and calls on them to enact national legislation that will contribute to fleshing out the provisions on access to genetic resources and benefit sharing. The Philippines was the first country to enact such legislation and has thus accrued a decade of experience in this field. The first and much-analysed access and benefit sharing instrument enacted by the Government of the Philippines was Executive Order 247 of 1995. However, due to problems experienced during the implementation of the Order, draft guidelines based on the 2001 Implementing Rules to the Wildlife Act have been drafted and are expected to correct the failures of the previous law. This article takes the example of the Philippines to assess the extent to which laws regulating access to and benefit sharing of biological resources can be effective in any country.

  16. Blockchain-Empowered Fair Computational Resource Sharing System in the D2D Network

    Directory of Open Access Journals (Sweden)

    Zhen Hong

    2017-11-01

    Device-to-device (D2D) communication is becoming an increasingly important technology in future networks with the climbing demand for local services. For instance, resource sharing in the D2D network features ubiquitous availability, flexibility, low latency and low cost. However, these features also bring along challenges when building a satisfactory resource sharing system in the D2D network. Specifically, user mobility is one of the top concerns in designing a cooperative D2D computational resource sharing system, since mutual communication may not be stably available due to user mobility. A previous endeavour has demonstrated how connectivity can be incorporated into cooperative task scheduling among users in the D2D network to effectively lower average task execution time. There are doubts, however, about whether this type of task scheduling scheme, though effective, ensures fairness among users. In other words, it can be unfair to users who contribute many computational resources while receiving little when in need. In this paper, we propose a novel blockchain-based credit system that can be incorporated into the connectivity-aware task scheduling scheme to enforce fairness among users in the D2D network. Users' computational task cooperation is recorded on the public blockchain ledger in the system as transactions, and each user's credit balance is easily accessible from the ledger. A supernode at the base station is responsible for scheduling cooperative computational tasks based on user mobility and user credit balance. We investigated the performance of the credit system, and simulation results showed that with a minor sacrifice of average task execution time, the level of fairness obtains a major enhancement.
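
    The credit mechanism can be sketched as a hash-chained ledger of task-cooperation transactions from which balances are replayed; this toy sketch is illustrative, not the paper's protocol:

    ```python
    # Toy sketch of the credit idea: completed D2D task cooperation is
    # appended to a hash-chained ledger, and a user's credit balance is
    # derived by replaying transactions. Illustrative only.

    import hashlib, json

    class CreditLedger:
        def __init__(self):
            self.chain = []                  # list of (hash, transaction)

        def add_task(self, helper, requester, credits):
            tx = {"helper": helper, "requester": requester, "credits": credits}
            prev = self.chain[-1][0] if self.chain else "genesis"
            digest = hashlib.sha256((prev + json.dumps(tx, sort_keys=True))
                                    .encode()).hexdigest()
            self.chain.append((digest, tx))

        def balance(self, user):
            return sum(tx["credits"] if tx["helper"] == user else
                       -tx["credits"] if tx["requester"] == user else 0
                       for _, tx in self.chain)

    ledger = CreditLedger()
    ledger.add_task("alice", "bob", 3)       # alice executed bob's task
    print(ledger.balance("alice"), ledger.balance("bob"))   # 3 -3
    ```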

  17. Reducing the market impact of large shares of intermittent energy in Denmark

    DEFF Research Database (Denmark)

    Jacobsen, Henrik; Zvingilaite, Erika

    2010-01-01

    The increasing prevalence of renewable and intermittent energy sources in the electricity system is creating new challenges for the interaction of the system. In Denmark, high renewable shares have been achieved without great difficulty, mainly due to the flexibility of the nearby Nordic hydro-power dominated system. Further increases in the share of renewable energy sources require that additional options are considered to facilitate integration with the lowest possible cost. With large shares of intermittent energy, the impact can be observed on wholesale prices, giving both lower prices and higher... and the attractiveness of additional interconnection capacity. This paper also analyses options for increasing the flexibility of heat generation involving large and decentralized CHP plants and heat generation based on electricity. The incentives that the market provides for shifting demand and using electricity...

  18. Bio-based economy in the Netherlands. Macro-economic outline of a large-scale introduction of green resources in the Dutch energy supply

    International Nuclear Information System (INIS)

    Van der Hoeven, D.

    2009-03-01

    The Bio-based Raw Materials Platform (PGG), part of the Energy Transition in The Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to conduct research on the macro-economic impact of large-scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including techno-economic projections of fossil and bio-based conversion technologies, and a top-down study including macro-economic modelling of (global) trade of biomass and fossil resources. The results of the top-down and bottom-up modelling work are reported separately. This is the public version of these studies.

  19. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    Science.gov (United States)

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  20. Key Recovery Using Noised Secret Sharing with Discounts over Large Clouds

    OpenAIRE

    JAJODIA , Sushil; Litwin , Witold; Schwarz , Thomas

    2013-01-01

    The encryption key loss problem is the Achilles' heel of cryptography. Key escrow helps, but favors disclosures. Schemes for recoverable encryption keys through noised secret sharing alleviate the dilemma. The key owner escrows a specifically encrypted backup. The recovery needs a large cloud. Cloud cost and the money trail should make illegal attempts rare. We now propose noised secret sharing schemes supporting discounts. A recovery request with a discount code lowers the recovery complexity, easily by orde...
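
    As background, the sketch below implements plain (k, n) Shamir secret sharing over a prime field, the classical primitive that noised secret-sharing schemes build on; the noising and discount mechanics of the paper are not reproduced here.

        # Minimal (k, n) Shamir secret sharing over a prime field.
        # Background primitive only; not the paper's noised/discount scheme.
        import random

        P = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte key

        def split(secret: int, k: int, n: int):
            """Create n shares; any k of them reconstruct the secret."""
            coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
            return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                    for x in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation of the polynomial at x = 0."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                # pow(den, P - 2, P) is the modular inverse (P is prime).
                secret = (secret + yi * num * pow(den, P - 2, P)) % P
            return secret

        shares = split(123456789, k=3, n=5)
        assert reconstruct(shares[:3]) == 123456789  # any 3 shares suffice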

  1. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
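
    For readers unfamiliar with the method, the following minimal Python sketch computes a DFA scaling exponent for a numeric profile derived from a DNA sequence. The window sizes and the 0/1 GC mapping are illustrative choices, not the authors' exact pipeline.

        # Minimal Detrended Fluctuation Analysis (DFA) on a GC-content profile.
        import numpy as np

        def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
            """Return the DFA scaling exponent alpha of series x."""
            y = np.cumsum(x - np.mean(x))            # integrated profile
            flucts = []
            for s in scales:
                n_win = len(y) // s
                f2 = []
                for w in range(n_win):
                    seg = y[w * s:(w + 1) * s]
                    t = np.arange(s)
                    # Remove the local linear trend in each window.
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    f2.append(np.mean((seg - trend) ** 2))
                flucts.append(np.sqrt(np.mean(f2)))
            # Slope of log F(s) vs log s gives the exponent.
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        # Example: a toy DNA string mapped to a 0/1 GC profile (1 = G or C).
        seq = "GCGCATATGCGCGCATAT" * 200
        gc = np.array([1.0 if b in "GC" else 0.0 for b in seq])
        print(round(dfa_exponent(gc), 2))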

  2. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing such a system shows that the only reasonable way to gain strong management control is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  3. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    Science.gov (United States)

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.

  4. Biofuel Development and Large-Scale Land Deals in Sub-Saharan Africa

    OpenAIRE

    Giorgia Giovannetti; Elisa Ticci

    2013-01-01

    Africa's biofuel potential has increasingly attracted foreign investors' attention over the last ten years. We estimate the determinants of foreign investors' land demand for biofuel production in SSA, using Poisson specifications of the gravity model. Our estimates suggest that land availability, abundance of water resources and weak land governance are significant determinants of large-scale land acquisitions for biofuel production. This in turn suggests that this type of investment is mainl...

  5. Strategic innovation between PhD and DNP programs: Collaboration, collegiality, and shared resources.

    Science.gov (United States)

    Edwards, Joellen; Rayman, Kathleen; Diffenderfer, Sandra; Stidham, April

    2016-01-01

    At least 111 schools and colleges of nursing across the nation provide both PhD and DNP programs (AACN, 2014a). Collaboration between nurses with doctoral preparation as researchers (PhD) and practitioners (DNP) has been recommended as essential to further the profession; that collaboration can begin during the educational process. The purpose of this paper is to describe the development and implementation of successful DNP and PhD program collaboration, and to share the results of that collaboration in an educational setting. Faculty set strategic goals to maximize the effectiveness and efficiency of both the new DNP and existing PhD programs. The goals were to promote collaboration and complementarity between the programs through careful capstone and dissertation differentiation, complementary residency activities, joint courses and inter-professional experiences; to promote collegiality in a blended on-line learning environment through shared orientation and intensive on-campus sessions; and to maximize resources in program delivery through a supportive organizational structure, equal access to technology support, and shared faculty responsibilities as appropriate to terminal degrees. Successes such as student and faculty accomplishments, and challenges such as managing class size and workload, are described. Collaboration, collegiality and the sharing of resources have strengthened and enriched both programs and contributed to the success of students and faculty. These innovative program strategies can provide a solid foundation for DNP and PhD collaboration. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  7. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager's workload is heavy, and much time must be spent on the management and maintenance of such systems. The nodes of a large-scale cluster easily fall into disorder; with thousands of nodes housed in big rooms, administrators can easily confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms for large-scale cluster systems and proposes how to realize their automatic management. (authors)

  8. A Large-Scale Initiative Inviting Patients to Share Personal Fitness Tracker Data with Their Providers: Initial Results.

    Directory of Open Access Journals (Sweden)

    Joshua M Pevnick

    Full Text Available Personal fitness trackers (PFTs) have substantial potential to improve healthcare. To quantify and characterize early adopters who shared their PFT data with providers, we used bivariate statistics and logistic regression to compare patients who shared any PFT data vs. patients who did not. A patient portal was used to invite 79,953 registered portal users to share their data. Of 66,105 users included in our analysis, 499 (0.8%) uploaded data during an initial 37-day study period. Bivariate and regression analysis showed that early adopters were more likely than non-adopters to be younger, male, white, health system employees, and to have higher BMIs. Neither comorbidities nor utilization predicted adoption. Our results demonstrate that patients had little intrinsic desire to share PFT data with their providers, and suggest that patients most at risk for poor health outcomes are least likely to share PFT data. Marketing, incentives, and/or cultural change may be needed to induce such data-sharing.

  9. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  10. Children use partial resource sharing as a cue to friendship.

    Science.gov (United States)

    Liberman, Zoe; Shaw, Alex

    2017-07-01

    Resource sharing is an important aspect of human society, and how resources are distributed can provide people with crucial information about social structure. Indeed, a recent partiality account of resource distribution suggested that people may use unequal partial resource distributions to make inferences about a distributor's social affiliations. To empirically test this suggestion derived from the theoretical argument of the partiality account, we presented 4- to 9-year-old children with distributors who gave out resources unequally using either a partial procedure (intentionally choosing which recipient would get more) or an impartial procedure (rolling a die to determine which recipient would get more) and asked children to make judgments about whom the distributor was better friends with. At each age tested, children expected a distributor who gave partially to be better friends with the favored recipient (Studies 1-3). Interestingly, younger children (4- to 6-year-olds) inferred friendship between the distributor and the favored recipient even in cases where the distributor used an impartial procedure, whereas older children (7- to 9-year-olds) did not infer friendship based on impartial distributions (Study 1). These studies demonstrate that children use third-party resource distributions to make important predictions about the social world and add to our knowledge about the developmental trajectory of understanding the importance of partiality in addition to inequity when making social inferences. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. The Human Salivary Microbiome Is Shaped by Shared Environment Rather than Genetics: Evidence from a Large Family of Closely Related Individuals.

    Science.gov (United States)

    Shaw, Liam; Ribeiro, Andre L R; Levine, Adam P; Pontikos, Nikolas; Balloux, Francois; Segal, Anthony W; Roberts, Adam P; Smith, Andrew M

    2017-09-12

    The human microbiome is affected by multiple factors, including the environment and host genetics. In this study, we analyzed the salivary microbiomes of an extended family of Ashkenazi Jewish individuals living in several cities and investigated associations with both shared household and host genetic similarities. We found that environmental effects dominated over genetic effects. While there was weak evidence of geographical structuring at the level of cities, we observed a large and significant effect of shared household on microbiome composition, supporting the role of the immediate shared environment in dictating the presence or absence of taxa. This effect was also seen when including adults who had grown up in the same household but moved out prior to the time of sampling, suggesting that the establishment of the salivary microbiome earlier in life may affect its long-term composition. We found weak associations between host genetic relatedness and microbiome dissimilarity when using family pedigrees as proxies for genetic similarity. However, this association disappeared when using more-accurate measures of kinship based on genome-wide genetic markers, indicating that the environment rather than host genetics is the dominant factor affecting the composition of the salivary microbiome in closely related individuals. Our results support the concept that there is a consistent core microbiome conserved across global scales but that small-scale effects due to a shared living environment significantly affect microbial community composition. IMPORTANCE Previous research shows that the salivary microbiomes of relatives are more similar than those of nonrelatives, but it remains difficult to distinguish the effects of relatedness and shared household environment. Furthermore, pedigree measures may not accurately measure host genetic similarity. In this study, we include genetic relatedness based on genome-wide single nucleotide polymorphisms (SNPs) (rather than

  12. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  13. Development and psychometric evaluation of the Core Nurse Resource Scale.

    Science.gov (United States)

    Simpson, Michelle R

    2010-11-01

    To examine the factor structure, internal consistency reliability and concurrent-related validity of the Core Nurse Resource Scale. A cross-sectional survey study design was used to obtain a sample of 149 nurses and nursing staff [Registered Nurses (RNs), Licensed Practical Nurses (LPNs) and Certified Nursing Assistants (CNAs)] working in long-term care facilities. Exploratory factor analysis, Cronbach's alpha and bivariate correlations were used to evaluate validity and reliability. Exploratory factor analysis yielded a scale with 18 items on three factors, accounting for 52% of the variance in scores. Internal consistency reliability for the composite and Core Nurse Resource Scale factors ranged from 0.79 to 0.91. The Core Nurse Resource Scale composite scale and subscales correlated positively with a measure of work engagement (r=0.247-0.572). The initial psychometric evaluation of the Core Nurse Resource Scale demonstrates that it is a sound measure. Further validity and reliability assessment will be needed among nurses and other nursing staff working in other practice settings. The intent of the Core Nurse Resource Scale is to evaluate the presence of physical, psychological and social resources of the nursing work environment, to identify workplaces at risk for disengaged (low work engagement) nursing staff and to provide useful diagnostic information to healthcare administrators interested in interventions to improve the nursing work environment. © 2010 The Author. Journal compilation © 2010 Blackwell Publishing Ltd.

  14. HydroShare: A Platform for Collaborative Data and Model Sharing in Hydrology

    Science.gov (United States)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.

    2017-12-01

    HydroShare is an online collaboration system for sharing hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources", which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare, highlighting its use as a virtual environment supporting education and research. HydroShare has components that support: (1) resource storage, (2) resource exploration, and (3) web apps for actions on resources. The HydroShare data discovery, sharing and publishing functions, as well as HydroShare web apps, provide the capability to analyze data and execute models completely in the cloud (servers remote from the user), overcoming desktop platform limitations. The HydroShare GIS app provides a basic capability to visualize spatial data. The HydroShare JupyterHub Notebook app provides flexible and documentable execution of Python code snippets for analysis and modeling, so that results can be shared among HydroShare users and groups to support research collaboration and education. We will discuss how these developments can be used to support different types of educational efforts in Hydrology, where being completely web based is of value in an educational setting, as students can all have access to the same functionality regardless of their computer.
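
    As a small illustration of the web services API mentioned above, the sketch below queries the public resource listing over HTTP. The endpoint path and response field names follow HydroShare's documented /hsapi/ interface, but they should be treated as assumptions that may vary between versions.

        # Minimal sketch: list public HydroShare resources via the REST API.
        import requests

        BASE = "https://www.hydroshare.org/hsapi"  # assumed endpoint root

        def list_public_resources(page=1):
            """Fetch one page of the public resource listing as JSON."""
            r = requests.get(f"{BASE}/resource/", params={"page": page}, timeout=30)
            r.raise_for_status()
            return r.json()

        if __name__ == "__main__":
            data = list_public_resources()
            # Field names assumed from the documented response schema.
            for res in data.get("results", [])[:5]:
                print(res.get("resource_id"), "-", res.get("resource_title"))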

  15. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  16. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating never-imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
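
    As a flavour of the Earth Engine Python API used in such pipelines, the sketch below filters a Landsat 8 collection over a region and reduces one band to a regional mean. The collection ID, band, and example region are illustrative assumptions, and the SSEBop model itself is not reproduced here.

        # Minimal Earth Engine sketch: filter Landsat scenes and reduce a band.
        # Assumes prior authentication (ee.Authenticate()) has been completed.
        import ee

        ee.Initialize()

        region = ee.Geometry.Rectangle([-113.0, 36.0, -112.0, 37.0])  # example box
        landsat = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                   .filterDate("2015-01-01", "2015-12-31")
                   .filterBounds(region))

        print("Scenes found:", landsat.size().getInfo())

        # Annual mean of the surface-temperature band over the region, at 30 m.
        mean_img = landsat.select("ST_B10").mean()
        stats = mean_img.reduceRegion(reducer=ee.Reducer.mean(),
                                      geometry=region, scale=30, maxPixels=1e9)
        print(stats.getInfo())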

  17. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large-scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large-scale land acquisitions. We

  18. Thermal System Analysis and Optimization of Large-Scale Compressed Air Energy Storage (CAES)

    Directory of Open Access Journals (Sweden)

    Zhongguang Fu

    2015-08-01

    Full Text Available As an important solution to issues regarding peak load and renewable energy resources on grids, large-scale compressed air energy storage (CAES) power generation technology has recently become a popular research topic in the area of large-scale industrial energy storage. At present, the combination of high-expansion-ratio turbines with advanced gas turbine technology is an important breakthrough in energy storage technology. In this study, a new gas turbine power generation system is coupled with current CAES technology. Moreover, the thermodynamic cycle is optimized by calculating the parameters of the thermodynamic system. Results show that the thermal efficiency of the new system increases by at least 5% over that of the existing system.
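
    As a worked example of the kind of parameter calculation behind such a cycle analysis, the sketch below computes ideal-gas isentropic compressor work per kilogram of air; all numbers are illustrative assumptions, not the paper's plant parameters.

        # Ideal-gas isentropic compressor work per kg of air (illustrative values).
        cp = 1005.0      # J/(kg K), specific heat of air at constant pressure
        gamma = 1.4      # heat capacity ratio of air
        T_in = 298.0     # K, compressor inlet temperature
        ratio = 10.0     # compressor pressure ratio
        eta_c = 0.85     # assumed isentropic efficiency

        # Isentropic outlet temperature: T_out = T_in * ratio^((gamma-1)/gamma)
        T_out_ideal = T_in * ratio ** ((gamma - 1.0) / gamma)
        w_ideal = cp * (T_out_ideal - T_in)     # J/kg, ideal specific work
        w_actual = w_ideal / eta_c              # J/kg, accounting for losses
        print(f"ideal: {w_ideal/1e3:.0f} kJ/kg, actual: {w_actual/1e3:.0f} kJ/kg")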

  19. Using social-ecological systems theory to evaluate large-scale comanagement efforts: a case study of the Inuvialuit Settlement Region

    Directory of Open Access Journals (Sweden)

    William Tyson

    2017-03-01

    Full Text Available Comanagement efforts are increasingly tasked with overseeing natural resource governance at a large scale. I examine comanagement of subsistence harvesting in the Inuvialuit Settlement Region (ISR) of the western Canadian Arctic, using a social-ecological systems framework. In doing so, this study joins a growing list of research that reviews design principles commonly found in successful small-scale commons management and applies them to a large resource area. This research uses the management of beluga (Delphinapterus leucas) and barren-ground caribou (Rangifer tarandus groenlandicus) as case studies in understanding the management framework of the Inuvialuit Settlement Region, as each species is important in Inuvialuit culture and is actively managed and monitored. Comanagement bodies in the study area display many of the institutional design principles that are characteristic of successful social-ecological systems. Particularly notable are the presence of well-organized nested enterprises and a strong incorporation of local knowledge and monitoring. This supports the application of institutional design principles in large-scale analyses of resource management. However, due to the network of policy and management outside the ISR that influences each species, this research suggests that in cases of wide-ranging resource bases, these types of analyses may be better suited to evaluating broad management networks rather than discrete governing regions.

  20. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests proved the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
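
    As an illustration of requesting cloud resources programmatically (the abstract mentions OpenStack-managed virtual machines), the sketch below boots a worker VM with the openstacksdk library. The cloud name, image, flavor, and network IDs are placeholders, not the institute's actual configuration.

        # Minimal sketch: boot one on-demand worker VM against an OpenStack cloud.
        import openstack

        # Credentials are read from a clouds.yaml entry; "nemo" is a placeholder.
        conn = openstack.connect(cloud="nemo")

        def boot_worker(name: str):
            """Boot one worker VM and wait until it is ACTIVE."""
            server = conn.compute.create_server(
                name=name,
                image_id="IMAGE_UUID",            # placeholder
                flavor_id="FLAVOR_UUID",          # placeholder
                networks=[{"uuid": "NET_UUID"}],  # placeholder
            )
            return conn.compute.wait_for_server(server)

        vm = boot_worker("cms-worker-001")
        print(vm.status)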

  1. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low speed-side of the mixing layer, and a reduced activity on the high speed-side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  2. Interspecific interference competition at the resource patch scale: do large herbivores spatially avoid elephants while accessing water?

    Science.gov (United States)

    Ferry, Nicolas; Dray, Stéphane; Fritz, Hervé; Valeix, Marion

    2016-11-01

    Animals may anticipate and try to avoid, at some cost, physical encounters with other competitors. This may ultimately impact their foraging distribution and intake rates. Such cryptic interference competition is difficult to measure in the field, and extremely little is known at the interspecific level. We tested the hypothesis that smaller species avoid larger ones because of potential costs of interference competition and hence expected them to segregate from larger competitors at the scale of a resource patch. We assessed fine-scale spatial segregation patterns between three African herbivore species (zebra Equus quagga, kudu Tragelaphus strepsiceros and giraffe Giraffa camelopardalis) and a megaherbivore, the African elephant Loxodonta africana, at the scale of water resource patches in the semi-arid ecosystem of Hwange National Park, Zimbabwe. Nine waterholes were monitored every two weeks during the dry season of a drought year, and observational scans of the spatial distribution of all herbivores were performed every 15 min. We developed a methodological approach to analyse such fine-scale spatial data. Elephants increasingly used waterholes as the dry season progressed, as did the probability of co-occurrence and agonistic interaction with elephants for the three study species. All three species segregated from elephants at the beginning of the dry season, suggesting a spatial avoidance of elephants and the existence of costs of being close to them. However, contrary to our expectations, herbivores did not segregate from elephants the rest of the dry season but tended to increasingly aggregate with elephants as the dry season progressed. We discuss these surprising results and the existence of a trade-off between avoidance of interspecific interference competition and other potential factors such as access to quality water, which may have relative associated costs that change with the time of the year. © 2016 The Authors. Journal of Animal Ecology

  3. Complex dewetting scenarios of ultrathin silicon films for large-scale nanoarchitectures.

    Science.gov (United States)

    Naffouti, Meher; Backofen, Rainer; Salvalaglio, Marco; Bottein, Thomas; Lodari, Mario; Voigt, Axel; David, Thomas; Benkouider, Abdelmalek; Fraj, Ibtissem; Favre, Luc; Ronda, Antoine; Berbezier, Isabelle; Grosso, David; Abbarchi, Marco; Bollani, Monica

    2017-11-01

    Dewetting is a ubiquitous phenomenon in nature; many different thin films of organic and inorganic substances (such as liquids, polymers, metals, and semiconductors) share this shape instability driven by surface tension and mass transport. Via templated solid-state dewetting, we frame complex nanoarchitectures of monocrystalline silicon on insulator with unprecedented precision and reproducibility over large scales. Phase-field simulations reveal the dominant role of surface diffusion as a driving force for dewetting and provide a predictive tool to further engineer this hybrid top-down/bottom-up self-assembly method. Our results demonstrate that patches of thin monocrystalline films of metals and semiconductors share the same dewetting dynamics. We also prove the potential of our method by fabricating nanotransfer molding of metal oxide xerogels on silicon and glass substrates. This method allows the novel possibility of transferring these Si-based patterns on different materials, which do not usually undergo dewetting, offering great potential also for microfluidic or sensing applications.

  4. Scalable shared-memory multiprocessing

    CERN Document Server

    Lenoski, Daniel E

    1995-01-01

    Dr. Lenoski and Dr. Weber have experience with leading-edge research and practical issues involved in implementing large-scale parallel systems. They were key contributors to the architecture and design of the DASH multiprocessor. Currently, they are involved with commercializing scalable shared-memory technology.

  5. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Facing similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector ...

  6. Natural Resource Management at Four Social Scales: Psychological Type Matters

    Science.gov (United States)

    Allison, Helen; Hobbs, Richard

    2010-03-01

    Understanding organisation at different social scales is crucial to learning how social processes play a role in sustainable natural resource management. Research has neglected the potential role that individual personality plays in decision making in natural resource management. In the past two decades natural resource management across rural Australia has increasingly come under the direct influence of voluntary participatory groups, such as Catchment Management Authorities. The greater complexity of relationships among all stakeholders is a serious management challenge when attempting to align their differing aspirations and values at four social institutional scales—local, regional, state and national. This is an exploratory study on the psychological composition of groups of stakeholders at the four social scales in natural resource management in Australia. This article uses the theory of temperaments and the Myers-Briggs Type Indicator (MBTI®) to investigate the distribution of personality types. The distribution of personality types in decision-making roles in natural resource management was markedly different from the Australian Archive sample. Trends in personality were found across social scales with Stabilizer temperament more common at the local scale and Theorist temperament more common at the national scale. Greater similarity was found at the state and national scales. Two temperaments comprised between 76 and 90% of participants at the local and regional scales, the common temperament type was Stabilizer. The dissimilarity was Improviser (40%) at the local scale and Theorist (29%) at the regional scale. Implications for increasing participation and bridging the gap between community and government are discussed.

  7. In Whom Do We Trust - Sharing Security Events

    NARCIS (Netherlands)

    Steinberger, Jessica; Kuhnert, Benjamin; Sperotto, Anna; Baier, Harald; Pras, Aiko

    2016-01-01

    Security event sharing is deemed of critical importance to counteract large-scale attacks at Internet service provider (ISP) networks as these attacks have become larger, more sophisticated and frequent. On the one hand, security event sharing is regarded to speed up organization's mitigation and

  8. Ocean warming expands habitat of a rich natural resource and benefits a national economy

    DEFF Research Database (Denmark)

    Jansen, Teunis; Post, Søren Lorenzen; Kristiansen, Trond

    2016-01-01

    Geographic redistribution of living natural resources changes access and thereby harvesting opportunities between countries. Internationally shared fish resources can be sensitive to shifts in the marine environment and this may have great impact on the economies of countries and regions that rely...... northwest in the Atlantic. This change in migration pattern was followed by a rapid development of a large-scale fishery of substantial importance for the national economy of Greenland (23% of Greenland's export value of all goods in 2014). A pelagic trawl survey was conducted in mid-summer 2014...

  9. Resource allocation for two source-destination pairs sharing a single relay with a buffer

    KAUST Repository

    Zafar, Ammar; Shaqfeh, Mohammad; Alouini, Mohamed-Slim; Alnuweiri, Hussein M.

    2014-01-01

    In this paper, we obtain the optimal resource allocation scheme in order to maximize the achievable rate region in a dual-hop system that consists of two independent source-destination pairs sharing a single half-duplex relay. The relay decodes

  10. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently accounts for the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS Experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  11. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  12. A Model Supported Interactive Virtual Environment for Natural Resource Sharing in Environmental Education

    Science.gov (United States)

    Barbalios, N.; Ioannidou, I.; Tzionas, P.; Paraskeuopoulos, S.

    2013-01-01

    This paper introduces a realistic 3D model supported virtual environment for environmental education, that highlights the importance of water resource sharing by focusing on the tragedy of the commons dilemma. The proposed virtual environment entails simulations that are controlled by a multi-agent simulation model of a real ecosystem consisting…

  13. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  14. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and

  15. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
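
    The following minimal Python sketch mimics the analysis described above on synthetic 1-D data: a signal is split into a large-scale (low-pass) component and a small-scale residual, and the large-scale fluctuation is correlated with a local measure of small-scale activity. The filter widths and test signal are illustrative assumptions, not the paper's DNS fields.

        # Scale decomposition and large/small-scale correlation on synthetic data.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 4096
        u = np.cumsum(rng.standard_normal(n))        # synthetic velocity-like signal
        u -= u.mean()

        def moving_average(x, w):
            """Simple low-pass filter by a moving average of width w."""
            return np.convolve(x, np.ones(w) / w, mode="same")

        u_large = moving_average(u, 129)             # large-scale component
        u_small = u - u_large                        # small-scale residual
        activity = moving_average(u_small ** 2, 129) # local small-scale activity

        r = np.corrcoef(u_large, activity)[0, 1]
        print(f"correlation coefficient: {r:+.2f}")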

  16. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  17. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  18. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility Observatory (EIAO) project has developed an Observatory for performing large-scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget of web pages has been reached or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation, and the evaluation results are stored in a Resource Description Framework (RDF) database that is later loaded... challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements....

  19. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in the smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performance of the energy systems and to balance power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables...
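
    As a toy illustration of the economic balancing idea behind EMPC, the sketch below solves a one-step least-cost dispatch as a linear program with SciPy. The unit costs, capacities, and demand are illustrative assumptions, not the thesis's controller.

        # One-step economic dispatch: meet demand at minimum cost (toy example).
        from scipy.optimize import linprog

        costs = [20.0, 50.0, 5.0]      # EUR/MWh for three units (wind is cheap)
        caps  = [100.0, 80.0, 60.0]    # MW capacity of each unit
        demand = 150.0                 # MW that must be met this step

        res = linprog(c=costs,
                      A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],
                      bounds=[(0, cap) for cap in caps])
        print(res.x, res.fun)          # per-unit dispatch and total cost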

  20. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  1. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  2. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  3. The Impact of Resource Scarcity on Bonding and Bridging Social Capital: the Case of Fishers' Information-Sharing Networks in Loreto, BCS, Mexico

    Directory of Open Access Journals (Sweden)

    Saudiel Ramirez-Sanchez

    2009-06-01

    Full Text Available Fishers often rely on their social capital to cope with resource fluctuations by sharing information on the abundance and location of fish. Drawing on research in seven coastal fishing communities in Loreto, Baja California Sur, Mexico, we examine the effect of resource scarcity on the bonding, bridging, and linking social-capital patterns of fishers' information-sharing networks. We found that: (1) fishers' information sharing is activated in response to varying ecological conditions; (2) resource scarcity is an ambiguous indicator of the extent to which fishers share information on the abundance and location of fish within and between communities; (3) information sharing is based on trust and occurs through kinship, friendship, and acquaintance social relations; (4) friendship ties play a key and flexible role in fishers' social networks within and between communities; (5) overall, the composition of fishers' social networks follows a friendship>kinship>acquaintance order of importance; and (6) the function of social ties, internal conflict, and settlement histories moderate the effects of resource scarcity on fishers' social capital. We conclude by arguing that the livelihoods of fishers from Loreto have adaptive capacity for dealing with fish fluctuations but little or no proactive resilience to address resource-management issues.

  4. The shared and unique values of optical, fluorescence, thermal and microwave satellite data for estimating large-scale crop yields

    Science.gov (United States)

    Large-scale crop monitoring and yield estimation are important for both scientific research and practical applications. Satellite remote sensing provides an effective means for regional and global cropland monitoring, particularly in data-sparse regions that lack reliable ground observations and rep...

  5. Effects of microcosm scaling and food resources on growth and survival of larval Culex pipiens

    Directory of Open Access Journals (Sweden)

    Paradise Christopher J

    2001-08-01

    Background: We used a simple experimental design to test for the effects of microcosm scaling on the growth and survival of the mosquito, Culex pipiens. Microcosm and mesocosm studies are commonly used in ecology, and there is often an assumption that scaling doesn't affect experimental outcomes. The assumption is implicit in the design; choice of mesocosms may be arbitrary or based on convenience or cost. We tested the hypothesis that scale would influence larvae due to depth and surface area effects. Larvae were predicted to perform poorly in microcosms that were both deep and had small openings, due to buildup of waste products, less exchange with the environment, and increased competition. To determine if the choice of scale affected responses to other factors, we independently varied leaf litter quantity, whose effects on mosquitoes are well known. Results: We found adverse effects of both a lower wall surface area and lower horizontal surface area, but microcosm scale interacted with resources such that C. pipiens is affected by habitat size only when food resources are scarce. At low resource levels mosquitoes were fewer, but larger, in microcosms with smaller horizontal surface area and greater depth than in microcosms with greater horizontal surface area and shallower depth. Microcosms with more vertical surface area/volume often produced larger mosquitoes; more food may have been available since mosquitoes browse on walls and other substrates for food. Conclusions: The interaction between habitat size and food abundance is consequential to aquatic animals, and choice of scale in experiments may affect results. Varying surface area and depth causes the scale effect, with small horizontal surface area and large depth decreasing matter exchange with the surrounding environment. In addition, fewer resources lead to less leaf surface area, and the effects of varying surface area will be greater under conditions of limiting resources.

  6. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run I relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently takes the biggest part of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run II and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  7. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  8. How multiagency partnerships can successfully address large-scale pollution problems: a Hawaii case study.

    Science.gov (United States)

    Donohue, Mary J

    2003-06-01

    Oceanic circulation patterns deposit significant amounts of marine pollution, including derelict fishing gear from North Pacific Ocean fisheries, in the Hawaiian Archipelago [Mar. Pollut. Bull. 42(12) (2001) 1301]. Management responsibility for these islands and their associated natural resources is shared by several government authorities. Non-governmental organizations (NGOs) and private industry also have interests in the archipelago. Since the marine debris problem in this region is too large for any single agency to manage, a multiagency marine debris working group (group) was established in 1998 to improve marine debris mitigation in Hawaii. To date, 16 federal, state, and local agencies, working with industry and NGOs, have removed 195 tons of derelict fishing gear from the Northwestern Hawaiian Islands. This review details the evolution of the partnership, notes its challenges and rewards, and advocates its continued use as an effective resource management tool.

  9. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production would imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and their lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  10. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable
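
    To make the probabilistic benefit estimate concrete, here is a minimal sketch of an hourly accounting of energy and avoided-emission benefits. All numbers (capacity-factor distribution, demand model, marginal cost, emission and damage rates) are illustrative assumptions, not values from the thesis, and the simple hourly loop stands in for its full stochastic production-cost modeling.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N_HOURS = 8760

    # Illustrative inputs -- none of these values come from the thesis.
    wind_cf = rng.beta(2, 5, N_HOURS)                    # hourly wind capacity factor
    hours = np.arange(N_HOURS)
    demand = (2000 + 300 * np.sin(2 * np.pi * hours / 24)
              + 100 * rng.normal(size=N_HOURS))          # MW, toy load curve
    wind_cap = 300.0                                     # MW installed offshore wind
    marginal_cost = 45.0                                 # $/MWh of displaced generation
    emission_rate = 0.4                                  # tCO2/MWh displaced
    damage_cost = 30.0                                   # $/tCO2 of avoided damages

    wind_mwh = wind_cf * wind_cap                        # hourly wind energy (MW ~ MWh)
    displaced = np.minimum(wind_mwh, np.maximum(demand, 0.0))  # wind only displaces load

    energy_benefit = displaced.sum() * marginal_cost
    env_benefit = displaced.sum() * emission_rate * damage_cost
    print(f"energy benefit:        ${energy_benefit / 1e6:.1f} M/yr")
    print(f"environmental benefit: ${env_benefit / 1e6:.1f} M/yr")
    ```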

  11. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    Science.gov (United States)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity and, in particular, quality have come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European bloc countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability and calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.

  12. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  13. Development of shared decision-making resources to help inform difficult healthcare decisions: An example focused on dysvascular partial foot and transtibial amputations.

    Science.gov (United States)

    Quigley, Matthew; Dillon, Michael P; Fatone, Stefania

    2018-02-01

    Shared decision making is a consultative process designed to encourage patient participation in decision making by providing accurate information about the treatment options and supporting deliberation with the clinicians about treatment options. The process can be supported by resources such as decision aids and discussion guides designed to inform and facilitate often difficult conversations. As this process increases in use, there is opportunity to raise awareness of shared decision making and the international standards used to guide the development of quality resources for use in areas of prosthetic/orthotic care. The aim is to describe the process used to develop shared decision-making resources, using an illustrative example focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Development process: the International Patient Decision Aid Standards were used to guide the development of the decision aid and discussion guide focused on decisions about the level of dysvascular partial foot amputation or transtibial amputation. Examples from these shared decision-making resources help illuminate the stages of development, including scoping and design, research synthesis, iterative development of a prototype, and preliminary testing with patients and clinicians not involved in the development process. Lessons learnt through the process, such as using the International Patient Decision Aid Standards checklist and development guidelines, may help inform others wanting to develop similar shared decision-making resources, given the applicability of shared decision making to many areas of prosthetic-/orthotic-related practice. Clinical relevance: Shared decision making is a process designed to guide conversations that help patients make an informed decision about their healthcare. Raising awareness of shared decision making and the international standards for development of high-quality decision aids and discussion guides is important.

  14. Developing eThread Pipeline Using SAGA-Pilot Abstraction for Large-Scale Structural Bioinformatics

    Directory of Open Access Journals (Sweden)

    Anjani Ragothaman

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information could uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
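
    As an illustration of the pilot idea (acquire resources once, then late-bind many independent tasks to them), the sketch below uses only Python's standard library; it mimics the scheduling pattern and does not use the actual SAGA-Pilot API. The task function and file names are hypothetical.

    ```python
    from concurrent.futures import ProcessPoolExecutor, as_completed

    def thread_sequence(seq_id):
        """Stand-in for one compute-intensive eThread task (hypothetical)."""
        # ... run the meta-threading tool on one protein sequence ...
        return seq_id, "model.pdb"

    def main():
        sequences = [f"seq_{i:04d}" for i in range(100)]   # toy workload
        # The 'pilot' is a pre-acquired pool of workers; tasks are bound to
        # workers as they free up, absorbing variation in task runtimes.
        with ProcessPoolExecutor(max_workers=8) as pilot:
            futures = [pilot.submit(thread_sequence, s) for s in sequences]
            for fut in as_completed(futures):
                seq_id, model = fut.result()
                print(seq_id, "->", model)

    if __name__ == "__main__":
        main()
    ```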

  15. Local and Transboundary Sharing of Water Resources: Legal and Equity Issues

    International Nuclear Information System (INIS)

    Mumma, A

    2001-01-01

    The article reviewed the law on water in local and transboundary contexts. The aim was to highlight the mechanisms for facilitating equity in the allocation and sharing of the resource. It has been demonstrated that the relevant local and transboundary laws are in need of further urgent development in order to achieve their objectives. The objective of greatest importance in the 21st century is that of ensuring that water conservation is fostered and promoted. Efforts to meet the increasing demand for water have, on the whole, focused on attempts to increase supply to water users. In an era of increasing water scarcity, the management of demand and the development of legal and other mechanisms to ensure efficient utilisation of the available water resources will become the central issue of the day. Equity in allocation will take as its central premise the conservation of the limited resource. The law will therefore need to develop increasingly in the direction of fostering a conservation ethic.

  16. Digital Scholarship and Resource Sharing Among Astronomy Libraries: A Case Study of RRI Library

    Science.gov (United States)

    Benegal, V.

    2012-08-01

    Prior to developing consortia, astronomy libraries in India were in an embryonic stage, with meager resources and dwindling budgets. It was extremely difficult for them to respond to the needs of their users, and librarians at the various Indian astronomy institutes were forced to look at alternative strategies. A case study of the Raman Research Institute in Bangalore examines how the library implemented resource sharing with other institutes in India and how it was thereby able to provide efficient service to the astronomy community.

  17. Decoding Synteny Blocks and Large-Scale Duplications in Mammalian and Plant Genomes

    Science.gov (United States)

    Peng, Qian; Alekseyev, Max A.; Tesler, Glenn; Pevzner, Pavel A.

    The existing synteny block reconstruction algorithms use anchors (e.g., orthologous genes) shared over all genomes to construct the synteny blocks for multiple genomes. This approach, while efficient for a few genomes, cannot be scaled to address the need to construct synteny blocks in many mammalian genomes that are currently being sequenced. The problem is that the number of anchors shared among all genomes quickly decreases with the increase in the number of genomes. Another problem is that many genomes (plant genomes in particular) had extensive duplications, which makes decoding of genomic architecture and rearrangement analysis in plants difficult. The existing synteny block generation algorithms in plants do not address the issue of generating non-overlapping synteny blocks suitable for analyzing rearrangements and evolution history of duplications. We present a new algorithm based on the A-Bruijn graph framework that overcomes these difficulties and provides a unified approach to synteny block reconstruction for multiple genomes, and for genomes with large duplications.

  18. Scale interaction in a mixing layer. The role of the large-scale gradients

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  19. Current Situation, Some Cases and Implications of the Legislation on Access and Benefit-sharing of Biological Genetic Resources in Australia

    Directory of Open Access Journals (Sweden)

    LI Yi-ding

    2017-01-01

    Australia, located in Oceania, is one of the countries with the most abundant biodiversity in the world, and it is a signatory to the Convention on Biological Diversity, the International Treaty on Plant Genetic Resources for Food and Agriculture, and the Convention on International Trade in Endangered Species. The country enacted the Environment Protection and Biodiversity Conservation Act (EPBC, 1999) and the Environment Protection and Biodiversity Conservation Regulations (2002). Queensland and the Northern Territory passed the Biodiscovery Act in 2004 and the Biological Resources Act in 2006, respectively. This paper first focuses on the current situation and characteristics of Commonwealth and local legislation on access and benefit-sharing of biological resources in Australia, and then collects and analyzes typical access and benefit-sharing cases from this country that could offer some experience to China in this field. The paper concludes that China should enact specific legislation on access and benefit-sharing of biological genetic resources, modelled on the Environment Protection and Biodiversity Conservation Act (EPBC, 1999), and establish rules of procedure for access and benefit-sharing, modelled on the Environment Protection and Biodiversity Conservation Regulations (2002), Queensland's Biodiscovery Act (2004) and the Northern Territory's Biological Resources Act (2006).

  20. Trends in the use of flow-through shares

    International Nuclear Information System (INIS)

    Jennings, R. G.

    1998-01-01

    Flow-through share financing is considered the most cost-effective equity-based financing option for non-tax-paying exploration companies, a form of financing that has helped a very large number of resource-based companies start, stay alive and grow in a very competitive financial marketplace. This paper provides a brief historical review of the flow-through share concept, outlines developments in recent legislation, changes in the Income Tax Act, and trends in financial structures, and reviews flow-through shares from the tax perspectives of the investor and the issuer.

  1. The scale concept and sustainable development: implications on the energetics and water resources; O conceito de escala e o desenvolvimento sustentavel: implicacoes sobre os recursos energeticos e hidricos

    Energy Technology Data Exchange (ETDEWEB)

    Demanboro, Antonio Carlos; Mariotoni, Carlos Alberto [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Civil]. E-mail: cam@fec.unicamp.br

    1999-07-01

    The relationships between demographic growth and water and energy resources are examined. The planet's scale and carrying capacity are discussed, starting from the concepts of maximum and optimum sustainability, both anthropocentric and biocentric. Two scenarios, termed 'sustainable agriculture' and 'sharing-water', are elaborated from the available resources of water and fertile land, energy consumption, and population trends. (author)

  2. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but its cost, in particular, limits its use. As computer models have grown in size, for example in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  3. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large-scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large-scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as no real LPV projects have been processed so far, these findings have to be regarded as preliminary. The fast growing number of very large-scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space - and cost - against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large-scale infrastructure such as noise barriers, and ground-mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km2. In terms of energy harvest, PV plants will under Danish conditions exhibit an overall efficiency of about 10% in conversion of the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark, at 33-35 TWh, is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
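
    The quoted ~300 km2 figure can be checked with back-of-envelope arithmetic, assuming a Danish horizontal irradiation of roughly 1000 kWh/m2 per year (our assumption, not a number from the report):

    ```python
    # Back-of-envelope check of the ~300 km2 land-area figure.
    annual_demand_twh = 33.0        # lower end of the quoted 33-35 TWh
    irradiation = 1000.0            # kWh/m2/yr, assumed Danish horizontal irradiation
    overall_efficiency = 0.10       # light-to-electricity conversion, as quoted

    yield_per_m2 = irradiation * overall_efficiency        # kWh/m2/yr -> 100
    area_km2 = annual_demand_twh * 1e9 / yield_per_m2 / 1e6
    print(f"required ground area: {area_km2:.0f} km2")     # ~330 km2, consistent
    ```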

  4. H2@Scale Resource and Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, Mark

    2017-07-12

    This presentation overviews progress to date on the H2@Scale resource and market analysis work. The work finds, for example, that hydrogen demand of 60 MMT/yr is possible when transportation and industry are considered; resources are available to meet that demand; using renewable resources would reduce emissions and fossil use by over 15%; further impacts are possible when considering synergistic benefits; additional analysis is underway to improve understanding of potential markets and synergistic impacts; and further analysis will be necessary to estimate impacts due to spatial characteristics, feedback effects in the economy, and inertia characteristics.

  5. Resource Provisioning in Large-Scale Self-Organizing Distributed Systems

    Science.gov (United States)

    2012-06-01

    organizations. Due to scale, competition, and advertising revenues, services such as email, social networking, office document processing, file storage and...

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  8. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN's commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures held in Lund, Sweden, on 13-14 October. Participants at the energy management for large-scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  9. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Owens, B C (Senior Member of Technical Staff, Analytical Structural Dynamics, Sandia National Laboratories, P O Box 5800, Albuquerque, NM 87185, United States); Griffith, D T (Principal Member of Technical Staff, Wind Energy Technologies, Sandia National Laboratories, P O Box 5800, Albuquerque, NM 87185, United States)

    2014-01-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs

  10. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    Science.gov (United States)

    Owens, B. C.; Griffith, D. T.

    2014-06-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs.

  11. Education for All Revisited: On Concepts of Sharing in the Open Educational Resources (OER Movement

    Directory of Open Access Journals (Sweden)

    Theo Hug

    2014-11-01

    Relationships between the private and public sphere in education have been discussed repeatedly and in various ways. However, the role of media and media dynamics is widely underestimated in this context. It is only recently, since the digital turn, that the focus of the debates has changed. In the past few years, manifold initiatives have aimed at opening up education on various levels using digital communications technologies and Creative Commons licenses. Additionally, massive open online courses (MOOCs) have been developed. Today, OER (Open Educational Resources) is used widely as an umbrella term for free content creation initiatives: OER Commons (http://www.oercommons.org/), Open Courseware (OCW), OER repositories, OCW search facilities, university OCW initiatives, and related activities. Shared resource sites such as Connexions (http://cnx.org), WikiEducator (http://wikieducator.org), and Curriki (www.curriki.org) have an increasing number of visitors and contributors. On the one hand, the motif of ‘education for all’ is once again appearing in related debates and practices. On the other hand, notions of sharing play a crucial role in open content and open education strategies. The purpose of this paper is threefold: it starts with an outline of selected understandings of sharing in educational contexts; it then addresses their relevance for OER development by examining contrasting and relational conceptual dimensions; lastly, it aims to sketch different forms of sharing related to media forms.

  12. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  13. Sustainability in health care by allocating resources effectively (SHARE) 4: exploring opportunities and methods for consumer engagement in resource allocation in a local healthcare setting.

    Science.gov (United States)

    Harris, Claire; Ko, Henry; Waller, Cara; Sloss, Pamela; Williams, Pamela

    2017-05-05

    This is the fourth in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. Healthcare decision-makers have sought to improve the effectiveness and efficiency of services through removal or restriction of practices that are unsafe or of little benefit, often referred to as 'disinvestment'. A systematic, integrated, evidence-based program for disinvestment was being established within a large Australian health service network. Consumer engagement was acknowledged as integral to this process. This paper reports the process of developing a model to integrate consumer views and preferences into an organisation-wide approach to resource allocation. A literature search was conducted and interviews and workshops were undertaken with health service consumers and staff. Findings were drafted into a model for consumer engagement in resource allocation which was workshopped and refined. Although consumer engagement is increasingly becoming a requirement of publicly-funded health services and documented in standards and policies, participation in organisational decision-making is not widespread. Several consistent messages for consumer engagement in this context emerged from the literature and consumer responses. Opportunities, settings and activities for consumer engagement through communication, consultation and participation were identified within the resource allocation process. Sources of information regarding consumer values and perspectives in publications and locally-collected data, and methods to use them in health service decision-making, were identified. A model bringing these elements together was developed. The proposed model presents potential opportunities and activities for consumer engagement in the context of resource allocation.

  14. The Political Economy of Cross-Scale Networks in Resource Co-Management

    Directory of Open Access Journals (Sweden)

    W. Neil Adger

    2005-12-01

    We investigate linkages between stakeholders in resource management that occur at different spatial and institutional levels and identify the winners and losers in such interactions. So-called cross-scale interactions emerge because of the benefits to individual stakeholder groups in undertaking them or the high costs of not undertaking them. Hence there are uneven gains from cross-scale interactions that are themselves an integral part of social-ecological system governance. The political economy framework outlined here suggests that the determinants of the emergence of cross-scale interactions are the exercise of relative power between stakeholders and their costs of accessing and creating linkages. Cross-scale interactions by powerful stakeholders have the potential to undermine trust in resource management arrangements. If government regulators, for example, mobilize information and resources from cross-level interactions to reinforce their authority, this often disempowers other stakeholders such as resource users. Offsetting such impacts, some cross-scale interactions can be empowering for local level user groups in creating social and political capital. These issues are illustrated with observations on resource management in a marine protected area in Tobago in the Caribbean. The case study demonstrates that the structure of the cross-scale interplay, in terms of relative winners and losers, determines its contribution to the resilience of social-ecological systems.

  15. Large-scale runoff generation - parsimonious parameterisation using high-resolution topography

    Science.gov (United States)

    Gong, L.; Halldin, S.; Xu, C.-Y.

    2011-08-01

    World water resources have primarily been analysed by global-scale hydrological models in the last decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and the spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models have generally proven to perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. The recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically-based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts, intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so that baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of the topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics. The TRG algorithm is driven by the
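
    A schematic reading of that storage-distribution step is sketched below: topographic-index values inside a grid cell are binned into classes, each class receives a storage capacity scaled by a single parameter, and the saturated area fraction follows from how much of the cell's capacity a given storage fills. The functional form, parameter value and synthetic index distribution are our assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def storage_distribution(ti, m=0.01, n_classes=20):
        """Per-class area fraction and storage capacity (mm) from topographic-index values."""
        edges = np.linspace(ti.min(), ti.max(), n_classes + 1)
        frac = np.histogram(ti, bins=edges)[0] / ti.size     # area fraction per class
        mid = 0.5 * (edges[:-1] + edges[1:])
        # Assumed form: higher index -> wetter -> smaller local capacity, so
        # capacity grows with distance below the largest index, scaled by m.
        capacity = m * (ti.max() - mid) * 1000.0             # mm
        return frac, capacity

    def saturated_fraction(storage_mm, frac, capacity):
        """Fraction of the cell whose local capacity is filled at a given storage."""
        return frac[capacity <= storage_mm].sum()

    ti = np.random.default_rng(1).gamma(shape=4.0, scale=1.5, size=10_000)  # synthetic cell
    frac, cap = storage_distribution(ti)
    for s in (50.0, 100.0, 150.0):
        print(f"storage {s:5.0f} mm -> saturated fraction {saturated_fraction(s, frac, cap):.3f}")
    ```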

  16. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large-scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints, and their potential to explain the large-scale cosmic anomalies.

  17. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  18. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply...... was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...... integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  19. Analysis of the Economic Impact of Large-Scale Deployment of Biomass Resources for Energy and Materials in the Netherlands. Macro-economics biobased synthesis report

    International Nuclear Information System (INIS)

    Hoefnagels, R.; Dornburg, V.; Faaij, A.; Banse, M.

    2009-03-01

    The Bio-based Raw Materials Platform (PGG), part of the Energy Transition in The Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to conduct research on the macro-economic impact of large-scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including techno-economic projections of fossil and bio-based conversion technologies, and a top-down study including macro-economic modelling of (global) trade in biomass and fossil resources. The results of the top-down and bottom-up modelling work are reported separately. The results of the synthesis of the modelling work are presented in this report.

  20. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual surveying and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide and regular areas at large scale, but they do not work well in small areas due to the high cost of manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large-scale topographic maps based on the new Chinese vehicle-borne LIDAR system. It studied how to use this system to map large-scale topographic maps: after field data capture, maps can be produced in the office from the LIDAR data (point cloud) using software we programmed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps, which is highly efficient and accurate compared to traditional methods.
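
    As a minimal illustration of one office-processing step implied here, the sketch below grids classified ground returns (x, y, z) into a raster of mean elevations, a common precursor to contour and topographic-map generation. It is a generic NumPy illustration, not the authors' software.

    ```python
    import numpy as np

    def grid_points(points, cell=0.5):
        """Mean ground elevation per raster cell; points is an (N, 3) array of x, y, z."""
        xy = points[:, :2]
        origin = xy.min(axis=0)
        idx = np.floor((xy - origin) / cell).astype(int)     # cell index per point
        shape = tuple(idx.max(axis=0) + 1)
        dem_sum = np.zeros(shape)
        dem_cnt = np.zeros(shape)
        np.add.at(dem_sum, (idx[:, 0], idx[:, 1]), points[:, 2])
        np.add.at(dem_cnt, (idx[:, 0], idx[:, 1]), 1)
        with np.errstate(invalid="ignore"):
            return dem_sum / dem_cnt                         # NaN where a cell is empty

    pts = np.random.default_rng(0).uniform([0, 0, 10], [20, 20, 12], size=(5000, 3))
    dem = grid_points(pts, cell=1.0)
    print(dem.shape, round(float(np.nanmean(dem)), 2))       # 20x20 grid, mean ~11 m
    ```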

  1. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of existing mother tubes manufactured with small-scale cans, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the process of manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  2. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    Science.gov (United States)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication are facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban
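
    As one concrete example of the automated quality control such a workflow includes, the sketch below flags out-of-range and fast-changing observations. The thresholds and flag codes are invented for the demonstration; they are not iUTAH's actual scheme.

    ```python
    import numpy as np

    def qc_flags(values, lo, hi, max_step):
        """Integer flag per observation: 0 ok, 1 out of sensor range, 2 suspect spike."""
        values = np.asarray(values, dtype=float)
        flags = np.zeros(values.size, dtype=int)
        flags[(values < lo) | (values > hi)] = 1                  # range check
        step = np.abs(np.diff(values, prepend=values[0]))
        flags[(flags == 0) & (step > max_step)] = 2               # rate-of-change check
        return flags

    water_temp = [12.1, 12.2, 35.0, 12.3, 12.4, -5.0, 12.5]      # degrees C, toy series
    print(qc_flags(water_temp, lo=0.0, hi=30.0, max_step=5.0))
    # -> [0 0 1 2 0 1 2]: points next to bad values also trip the step test for review
    ```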

  3. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  4. From Planetary Boundaries to national fair shares of the global safe operating space - How can the scales be bridged?

    Science.gov (United States)

    Häyhä, Tiina; Cornell, Sarah; Lucas, Paul; van Vuuren, Detlef; Hoff, Holger

    2016-04-01

    The planetary boundaries framework proposes precautionary quantitative global limits to the anthropogenic perturbation of crucial Earth system processes. In this way, it marks out a planetary 'safe operating space' for human activities. However, decisions regarding resource use and emissions are mostly made at much smaller scales, mostly by (sub-)national and regional governments, businesses, and other local actors. To operationalize the planetary boundaries, they need to be translated into and aligned with targets that are relevant at these smaller scales. In this paper, we develop a framework that addresses the three dimensions of bridging across scales (biophysical, socio-economic and ethical) to provide a consistent, universally applicable approach for translating the planetary boundaries into national-level, context-specific and fair shares of the safe operating space. We discuss our findings in the context of previous studies and their implications for future analyses and policymaking. In this way, we help link the planetary boundaries framework to widely-applied operational and policy concepts for more robust strong-sustainability decision-making.
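
    As a toy illustration of the translation step, the snippet below downscales a global budget by an equal-per-capita key, one of several possible sharing principles within the ethical dimension discussed here; all numbers are invented for the example.

    ```python
    GLOBAL_BUDGET_GT = 250.0    # hypothetical remaining global CO2 budget, GtCO2
    WORLD_POP_M = 7800.0        # world population in millions (assumed)

    def fair_share(national_pop_m, budget=GLOBAL_BUDGET_GT, world_pop_m=WORLD_POP_M):
        """National share of a global budget under an equal-per-capita key."""
        return budget * national_pop_m / world_pop_m

    # e.g. a country of 17.5 million people
    print(f"equal-per-capita share: {fair_share(17.5):.2f} GtCO2")   # ~0.56 GtCO2
    ```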

  5. Scaling up HIV viral load - lessons from the large-scale implementation of HIV early infant diagnosis and CD4 testing.

    Science.gov (United States)

    Peter, Trevor; Zeh, Clement; Katz, Zachary; Elbireer, Ali; Alemayehu, Bereket; Vojnov, Lara; Costa, Alex; Doi, Naoko; Jani, Ilesh

    2017-11-01

    The scale-up of effective HIV viral load (VL) testing is an urgent public health priority. Implementation of testing is supported by the availability of accurate, nucleic acid based laboratory and point-of-care (POC) VL technologies and strong WHO guidance recommending routine testing to identify treatment failure. However, test implementation faces challenges related to the developing health systems in many low-resource countries. The purpose of this commentary is to review the challenges and solutions from the large-scale implementation of other diagnostic tests, namely nucleic-acid based early infant HIV diagnosis (EID) and CD4 testing, and to identify key lessons to inform the scale-up of VL. Experience with EID and CD4 testing provides many key lessons to inform VL implementation and may enable more effective and rapid scale-up. The primary lessons from earlier implementation efforts are to strengthen linkage to clinical care after testing, and to improve the efficiency of testing. Opportunities to improve linkage include data systems to support the follow-up of patients through the cascade of care and test delivery, rapid sample referral networks, and POC tests. Opportunities to increase testing efficiency include improvements to procurement and supply chain practices, well-connected tiered laboratory networks with rational deployment of test capacity across different levels of health services, routine resource mapping and mobilization to ensure adequate resources for testing programs, and improved operational and quality management of testing services. If applied to VL testing programs, these approaches could help improve the impact of VL on ART failure management and patient outcomes, reduce overall costs, help ensure sustainable access to reduced pricing for test commodities, and improve supportive health systems through efficient and more rigorous quality assurance. These lessons draw from traditional laboratory practices as well as fields

  6. Toward shared decision making: using the OPTION scale to analyze resident-patient consultations in family medicine

    NARCIS (Netherlands)

    Pellerin, M.A.; Elwyn, G.; Rousseau, M.; Stacey, D.; Robitaille, H.; Legare, F.

    2011-01-01

    PURPOSE: Do residents in family medicine practice share decision making with patients during consultations? This study used a validated scale to score family medicine residents' shared decision-making (SDM) skills in primary care consultations and to determine whether residents' demographic

  7. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  8. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  9. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.

  10. Sharing data is a shared responsibility: Commentary on: "The essential nature of sharing in science".

    Science.gov (United States)

    Giffels, Joe

    2010-12-01

    Research data should be made readily available. A robust data-sharing plan, led by the principal investigator of the research project, requires considerable administrative and operational resources. Because external support for data sharing is minimal, principal investigators should consider engaging existing institutional information experts, such as librarians and information systems personnel, to participate in data-sharing efforts.

  11. Performance Evaluation of Hadoop-based Large-scale Network Traffic Analysis Cluster

    Directory of Open Access Journals (Sweden)

    Tao Ran

    2016-01-01

    Full Text Available As Hadoop has gained popularity in the big data era, it is widely used in various fields. The self-designed and self-developed large-scale network traffic analysis cluster works well on Hadoop, with off-line applications running on it to analyze massive network traffic data. To evaluate the performance of the analysis cluster scientifically and reasonably, we propose a performance evaluation system. Firstly, we take the execution times of three benchmark applications as the performance benchmark and pick 40 metrics of customized statistical resource data. Then we identify the relationship between the resource data and the execution times by a statistical modeling approach composed of principal component analysis and multiple linear regression. After training models on historical data, we can predict the execution times from current resource data. Finally, we evaluate the performance of the analysis cluster by the validated prediction of execution times. Experimental results show that the execution times predicted by the trained models are within an acceptable error range, and the evaluation results are accurate and reliable.
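
    A minimal sketch of the modeling pipeline described above (principal component analysis feeding multiple linear regression), assuming scikit-learn is available; the synthetic data, the five-component choice, and the train/test split are illustrative stand-ins, not the authors' setup.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 40))      # 200 runs x 40 resource metrics
        y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)  # execution times

        # PCA compresses the 40 correlated metrics into a few components;
        # multiple linear regression then maps components to execution time.
        model = make_pipeline(PCA(n_components=5), LinearRegression())
        model.fit(X[:150], y[:150])         # train on historical runs
        pred = model.predict(X[150:])       # predict execution times for new runs
        print("mean absolute error:", np.abs(pred - y[150:]).mean())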

  12. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Full Text Available Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
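
    For concreteness, a toy illustration of the design idea, assuming the simplest matrixed scheme in which an item pool is divided round-robin into disjoint test forms; the item names and counts are hypothetical.

        def matrix_sample(items, num_forms):
            """Divide an item pool into disjoint test forms (round-robin)."""
            forms = [[] for _ in range(num_forms)]
            for i, item in enumerate(items):
                forms[i % num_forms].append(item)
            return forms

        pool = [f"item_{i:02d}" for i in range(60)]
        forms = matrix_sample(pool, num_forms=4)   # each student sees 15 of 60 items
        for k, form in enumerate(forms):
            print(f"form {k}: {len(form)} items")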

  13. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.
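
    To make the multi-domain trust issue concrete, a small illustrative sketch (not the paper's design): trust between units in different PKI domains can be checked by searching for a chain of cross-certifications between their certificate authorities. The CA names and the graph are hypothetical.

        from collections import deque

        # hypothetical cross-certification graph between healthcare-domain CAs
        cross_certs = {
            "hospital_ca": ["regional_ca"],
            "clinic_ca": ["regional_ca"],
            "regional_ca": ["national_ca"],
            "national_ca": [],
        }

        def trust_path(src, dst):
            """Return a CA chain from src to dst, or None if no trust path exists."""
            queue, seen = deque([[src]]), {src}
            while queue:
                path = queue.popleft()
                if path[-1] == dst:
                    return path
                for nxt in cross_certs.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        print(trust_path("hospital_ca", "national_ca"))
        # -> ['hospital_ca', 'regional_ca', 'national_ca']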

  14. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam, while consuming electricity and high-temperature process heat. When coupled to an advanced high temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
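
    As a hedged back-of-the-envelope check on the quoted laboratory figures, assuming the 5500 L/hr refers to hydrogen volume at standard conditions (22.4 L/mol) and taking hydrogen's lower heating value as 120 MJ/kg:

        \[
        \dot{n}_{\mathrm{H_2}} \approx \frac{5500\ \mathrm{L/hr}}{22.4\ \mathrm{L/mol}} \approx 246\ \mathrm{mol/hr}
        \quad\Rightarrow\quad
        \dot{m}_{\mathrm{H_2}} \approx 246 \times 2.016\ \mathrm{g/hr} \approx 0.50\ \mathrm{kg/hr},
        \]
        \[
        \dot{E}_{\mathrm{chem}} \approx 0.50\ \mathrm{kg/hr} \times 120\ \mathrm{MJ/kg} \approx 60\ \mathrm{MJ/hr} \approx 17\ \mathrm{kW}.
        \]

    Under these assumptions the chemical energy output is comparable to the quoted 15 kW scale, consistent with an electrolysis process whose electrical demand is supplemented by high-temperature heat.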

  15. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  16. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  17. Task-Sharing Approaches to Improve Mental Health Care in Rural and Other Low-Resource Settings: A Systematic Review.

    Science.gov (United States)

    Hoeft, Theresa J; Fortney, John C; Patel, Vikram; Unützer, Jürgen

    2018-12-01

    Rural areas persistently face a shortage of mental health specialists. Task shifting, or task sharing, is an approach in global mental health that may help address unmet mental health needs in rural and other low-resource areas. This review focuses on task-shifting approaches and highlights future directions for research in this area. Systematic review on task sharing of mental health care in rural areas of high-income countries included: (1) PubMed, (2) gray literature for innovations not yet published in peer-reviewed journals, and (3) outreach to experts for additional articles. We included English language articles published before August 31, 2013, on interventions sharing mental health care tasks across a team in rural settings. We excluded literature: (1) from low- and middle-income countries, (2) involving direct transfer of care to another provider, and (3) describing clinical guidelines and shared decision-making tools. The review identified approaches to task sharing focused mainly on community health workers and primary care providers. Technology was identified as a way to leverage mental health specialists to support care across settings both within primary care and out in the community. The review also highlighted how provider education, supervision, and partnerships with local communities can support task sharing. Challenges, such as confidentiality, are often not addressed in the literature. Approaches to task sharing may improve reach and effectiveness of mental health care in rural and other low-resource settings, though important questions remain. We recommend promising research directions to address these questions. © 2017 National Rural Health Association.

  18. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Science.gov (United States)

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
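
    A minimal sketch of the nowcasting idea behind such an index, assuming ordinary least squares on monthly series; the synthetic search-volume data and variable names stand in for Google Trends exports and are not the authors' C3I construction.

        import numpy as np

        rng = np.random.default_rng(1)
        months, n_terms = 120, 8
        search = rng.normal(size=(months, n_terms))   # normalized search volumes
        confidence = search @ rng.normal(size=n_terms) + rng.normal(scale=0.5, size=months)

        # Fit weights on the first 100 months, then nowcast the remainder.
        X = np.column_stack([np.ones(months), search])
        w, *_ = np.linalg.lstsq(X[:100], confidence[:100], rcond=None)
        nowcast = X[100:] @ w
        corr = np.corrcoef(nowcast, confidence[100:])[0, 1]
        print(f"out-of-sample correlation: {corr:.2f}")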

  19. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Directory of Open Access Journals (Sweden)

    Xianlei Dong

    Full Text Available Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  20. Large-scale effects of migration and conflict in pre-agricultural groups: Insights from a dynamic model.

    Directory of Open Access Journals (Sweden)

    Francesco Gargano

    Full Text Available The debate on the causes of conflict in human societies has deep roots. In particular, the extent of conflict in hunter-gatherer groups remains unclear. Some authors suggest that large-scale violence only arose with the spreading of agriculture and the building of complex societies. To shed light on this issue, we developed a model based on operatorial techniques simulating population-resource dynamics within a two-dimensional lattice, with humans and natural resources interacting in each cell of the lattice. The model outcomes under different conditions were compared with recently available demographic data for prehistoric South America. Only under conditions that include migration among cells and conflict was the model able to consistently reproduce the empirical data at a continental scale. We argue that the interplay between resource competition, migration, and conflict drove the population dynamics of South America after the colonization phase and before the introduction of agriculture. The relation between population and resources indeed emerged as a key factor leading to migration and conflict once the carrying capacity of the environment has been reached.
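
    A minimal sketch of a lattice population-resource model in the spirit described above (not the authors' operatorial formulation, run on a torus for simplicity, and with the conflict term omitted for brevity): each cell carries a population and a resource stock, resources regrow and are harvested, and a fraction of the population migrates toward resource-rich neighbours. All parameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        N = 20
        pop = rng.uniform(0.0, 1.0, size=(N, N))
        res = np.full((N, N), 5.0)

        for step in range(200):
            pop += 0.1 * pop * res / (1.0 + res) - 0.02 * pop   # births minus deaths
            res += 0.2 * (5.0 - res) - 0.05 * pop               # regrowth minus harvest
            pop, res = np.clip(pop, 0.0, None), np.clip(res, 0.0, None)
            # 5% of each cell's population migrates, split toward richer neighbours
            shifts = [(a, s) for a in (0, 1) for s in (-1, 1)]
            attract = np.stack([np.roll(res, -s, axis=a) for a, s in shifts])
            attract /= attract.sum(axis=0) + 1e-9
            moved = 0.05 * pop
            pop -= moved
            for (a, s), w in zip(shifts, attract):
                pop += np.roll(moved * w, s, axis=a)            # deposit into that neighbour

        print(f"total population after 200 steps: {pop.sum():.1f}")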

  1. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities.

    Science.gov (United States)

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlights a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among brain networks

  2. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities

    Directory of Open Access Journals (Sweden)

    Valerio Santangelo

    2018-02-01

    Full Text Available Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlights a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among

  3. Data sharing by scientists: Practices and perceptions

    Science.gov (United States)

    Tenopir, C.; Allard, S.; Douglass, K.; Aydinoglu, A.U.; Wu, L.; Read, E.; Manoff, M.; Frame, M.

    2011-01-01

    Background: Scientific research in the 21st century is more data intensive and collaborative than in the past. It is important to study the data practices of researchers - data accessibility, discovery, re-use, preservation and, particularly, data sharing. Data sharing is a valuable part of the scientific method allowing for verification of results and extending research from prior results. Methodology/Principal Findings: A total of 1329 scientists participated in this survey exploring current data sharing practices and perceptions of the barriers and enablers of data sharing. Scientists do not make their data electronically available to others for various reasons, including insufficient time and lack of funding. Most respondents are satisfied with their current processes for the initial and short-term parts of the data or research lifecycle (collecting their research data; searching for, describing or cataloging, analyzing, and short-term storage of their data) but are not satisfied with long-term data preservation. Many organizations do not provide support to their researchers for data management both in the short- and long-term. If certain conditions are met (such as formal citation and sharing reprints) respondents agree they are willing to share their data. There are also significant differences and approaches in data management practices based on primary funding agency, subject discipline, age, work focus, and world region. Conclusions/Significance: Barriers to effective data sharing and preservation are deeply rooted in the practices and culture of the research process as well as the researchers themselves. New mandates for data management plans from NSF and other federal agencies and world-wide attention to the need to share and preserve data could lead to changes. Large scale programs, such as the NSF-sponsored DataNET (including projects like DataONE) will both bring attention and resources to the issue and make it easier for scientists to apply sound

  4. Environmental aspects of large-scale wind-power systems in the UK

    Science.gov (United States)

    Robson, A.

    1984-11-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the UK are discussed. Noise, television interference, hazards to bird life, and visual effects are considered. Areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first UK machines to be introduced in a safe and environmentally acceptable manner. Research to establish siting criteria more clearly and to significantly increase the potential wind-energy resource is mentioned. Studies of the comparative risk of energy systems are shown to be overly pessimistic for UK wind turbines.

  5. Reducing the market impact of large shares of intermittent energy in Denmark

    International Nuclear Information System (INIS)

    Klinge Jacobsen, Henrik; Zvingilaite, Erika

    2010-01-01

    The increasing prevalence of renewable and intermittent energy sources in the electricity system is creating new challenges for the interaction of the system. In Denmark, high renewable shares have been achieved without great difficulty, mainly due to the flexibility of the nearby Nordic hydro-power dominated system. Further increases in the share of renewable energy sources require that additional options are considered to facilitate integration with the lowest possible cost. With large shares of intermittent energy, the impact can be observed on wholesale prices, giving both lower prices and higher volatility. A lack of wind that causes high prices is rarely seen because long periods without wind are uncommon. Therefore we focus on the low price effects and the increased value of flexible demand options. On the supply side, there is an increase in the value of other flexible generation technologies and the attractiveness of additional interconnection capacity. This paper also analyses options for increasing the flexibility of heat generation involving large and decentralized CHP plants and heat generation based on electricity. The incentives that the market provides for shifting demand and using electricity for heat production are discussed based on the variability of prices observed from 2006 to 2008.

  6. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Chase Qishi [New Jersey Inst. of Technology, Newark, NJ (United States); Univ. of Memphis, TN (United States); Zhu, Michelle Mengxia [Southern Illinois Univ., Carbondale, IL (United States)

    2016-06-06

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific
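
    A generic sketch of the core abstraction such platforms manage, dependency-ordered execution of a workflow DAG; this is illustrative standard-library Python, not SWAMP's actual interface, and the task names are hypothetical.

        from graphlib import TopologicalSorter

        workflow = {                      # task -> set of prerequisite tasks
            "stage_data": set(),
            "simulate": {"stage_data"},
            "analyze": {"simulate"},
            "visualize": {"analyze"},
            "archive": {"simulate", "analyze"},
        }

        def run(task):
            print(f"running {task}")      # placeholder for remote job submission

        for task in TopologicalSorter(workflow).static_order():
            run(task)                     # runs only after all prerequisites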

  7. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  8. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  9. Large-scale ground motion simulation using GPGPU

    Science.gov (United States)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations using the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of different simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumptions of the source models for future earthquakes. To overcome the problem of restricted computational resources, we introduced the use of GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation that has traditionally been conducted by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the function for GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (parameter generation tool) and postprocessor tools (filter tool, visualization tool, and so on). The computational model is decomposed in two horizontal directions and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First we performed a strong-scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs, respectively. Next, we examined a weak-scaling test where the model sizes (number of grid points) are increased in proportion to the degree of parallelism (number of GPUs). The result showed almost perfect linearity up to the simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number
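
    Reading the quoted figures back as scaling efficiencies (a derived calculation from the numbers in the abstract, not the authors' code):

        strong = {4: 3.2, 16: 7.3}          # GPUs -> speedup over a single GPU
        for n, s in strong.items():
            print(f"strong scaling on {n} GPUs: efficiency = {s / n:.0%}")

        # Weak scaling: 22e9 grid points on 1024 GPUs at 79.7 TFlops; near-perfect
        # linearity means per-GPU throughput stays roughly constant as the model grows.
        print(f"per-GPU throughput: {79.7e12 / 1024 / 1e9:.1f} GFlops")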

  10. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber

  11. A Hybrid Testbed for Performance Evaluation of Large-Scale Datacenter Networks

    DEFF Research Database (Denmark)

    Pilimon, Artur; Ruepp, Sarah Renée

    2018-01-01

    Datacenters (DC) as well as their network interconnects are growing in scale and complexity. They are constantly being challenged in terms of energy and resource utilization efficiency, scalability, availability, reliability and performance requirements. Therefore, these resource-intensive environments must be properly tested and analyzed in order to make timely upgrades and transformations. However, a limited number of academic institutions and Research and Development companies have access to production-scale DC Network (DCN) testing facilities, and resource-limited studies can produce misleading or inaccurate results. To address this problem, we introduce an alternative solution, which forms a solid base for a more realistic and comprehensive performance evaluation of different aspects of DCNs. It is based on the System-in-the-loop (SITL) concept, where real commercial DCN equipment

  12. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction

  13. Waters Without Borders: Scarcity and the Future of State Interactions over Shared Water Resources

    Science.gov (United States)

    2010-04-01

    earth’s water is fresh water, stored in rivers, lakes, reservoirs, glaciers, permanent snow, groundwater aquifers, and the atmosphere. This ... freshwater resources between and within countries. There is significant media attention given to intra-state water sharing issues. One ... intrusion into coastal ground freshwater sources, among other effects. Consequently, water scarcity brought about by climate change could drive

  14. Decentralized Opportunistic Spectrum Resources Access Model and Algorithm toward Cooperative Ad-Hoc Networks

    Science.gov (United States)

    Liu, Ming; Xu, Yang; Mohammed, Abdul-Wahid

    2016-01-01

    Limited communication resources have gradually become a critical factor in the efficiency of decentralized large-scale multi-agent coordination as systems scale up and tasks become more complex. In current research, due to the agent's limited communication and observational capability, an agent in a decentralized setting can only choose a part of the channels to access, but cannot perceive or share global information. Each agent's cooperative decision is based on a partial observation of the system state, and as such, uncertainty in the communication network is unavoidable. In this situation, it is a major challenge to work out cooperative decision-making under uncertainty with only a partial observation of the environment. In this paper, we propose a decentralized approach that allows agents to cooperatively search and independently choose channels. The key to our design is to build an up-to-date observation of each agent's view so that a local decision model is achievable in large-scale team coordination. We simplify the Dec-POMDP model of the problem, and each agent can jointly work out its communication policy in order to improve its local decision utilities for the choice of communication resources. Finally, we discuss an implicit resource-competition game and show that there exists an approximate resource-access tradeoff balance between agents. Based on this discovery, the tradeoff between real-time decision-making and the efficiency of cooperation using these channels can be well improved. PMID:26727504
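
    An illustrative decentralized channel-selection loop, assuming a simple epsilon-greedy rule on locally observed payoffs rather than the paper's Dec-POMDP policy computation: each agent probes only its own chosen channel, shares no global state, and collisions waste the slot, capturing the implicit competition over channels.

        import random

        NUM_CHANNELS, NUM_AGENTS, ROUNDS = 6, 4, 500
        quality = [random.random() for _ in range(NUM_CHANNELS)]       # unknown to agents
        estimates = [[0.0] * NUM_CHANNELS for _ in range(NUM_AGENTS)]
        counts = [[0] * NUM_CHANNELS for _ in range(NUM_AGENTS)]

        for t in range(ROUNDS):
            picks = []
            for a in range(NUM_AGENTS):
                if random.random() < 0.1:                              # explore
                    picks.append(random.randrange(NUM_CHANNELS))
                else:                                                  # exploit local estimate
                    picks.append(max(range(NUM_CHANNELS), key=lambda c: estimates[a][c]))
            for a, c in enumerate(picks):
                reward = quality[c] if picks.count(c) == 1 else 0.0    # collision -> no payoff
                counts[a][c] += 1
                estimates[a][c] += (reward - estimates[a][c]) / counts[a][c]

        for a in range(NUM_AGENTS):
            print(f"agent {a} settles on channel "
                  f"{max(range(NUM_CHANNELS), key=lambda c: estimates[a][c])}")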

  15. Decentralized Opportunistic Spectrum Resources Access Model and Algorithm toward Cooperative Ad-Hoc Networks.

    Directory of Open Access Journals (Sweden)

    Ming Liu

    Full Text Available Limited communication resources have gradually become a critical factor in the efficiency of decentralized large-scale multi-agent coordination as systems scale up and tasks become more complex. In current research, due to the agent's limited communication and observational capability, an agent in a decentralized setting can only choose a part of the channels to access, but cannot perceive or share global information. Each agent's cooperative decision is based on a partial observation of the system state, and as such, uncertainty in the communication network is unavoidable. In this situation, it is a major challenge to work out cooperative decision-making under uncertainty with only a partial observation of the environment. In this paper, we propose a decentralized approach that allows agents to cooperatively search and independently choose channels. The key to our design is to build an up-to-date observation of each agent's view so that a local decision model is achievable in large-scale team coordination. We simplify the Dec-POMDP model of the problem, and each agent can jointly work out its communication policy in order to improve its local decision utilities for the choice of communication resources. Finally, we discuss an implicit resource-competition game and show that there exists an approximate resource-access tradeoff balance between agents. Based on this discovery, the tradeoff between real-time decision-making and the efficiency of cooperation using these channels can be well improved.

  16. Distributed weighted least-squares estimation with fast convergence for large-scale systems.

    Science.gov (United States)

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performances of the proposed methods. PMID:25641976
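
    For reference, the centralized estimate that such distributed iterations converge to can be written in standard weighted least-squares notation (the symbols are assumed here, not taken from the paper): with stacked measurements z, regression matrix H, and weight matrix W (typically the inverse of the measurement noise covariance),

        \[
        \hat{\theta}_{\mathrm{WLS}}
        \;=\; \arg\min_{\theta}\, (z - H\theta)^{\top} W \,(z - H\theta)
        \;=\; \left(H^{\top} W H\right)^{-1} H^{\top} W z .
        \]

    Each sub-system then recovers the block of this estimate corresponding to its own parameters through neighborhood communication rather than by forming the full product centrally.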

  17. Distributed weighted least-squares estimation with fast convergence for large-scale systems.

    Science.gov (United States)

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performances of the proposed methods.

  18. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  19. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Surviving abstract fragments: "... arises, to reduce the spread in the LSGT 50% gap value. The worst charges, such as those with the highest or lowest densities, the largest re-pressed ..."

  20. Sharing common pool resources at the border of protected areas in the Romanian Carpathians

    Directory of Open Access Journals (Sweden)

    ANA-IRINA DINCA

    2014-10-01

    Full Text Available Common pool resources are a very topical subject, approached by both scientists and practitioners preoccupied nowadays with gradually increasing environmental problems. Protected areas in Romania, and especially those of national and natural park type (IUCN II and V) in the Romanian Carpathians, represent areas of particular interest in light of common pool resources theory, which imposes conservation laws on areas facing increased pressure from the human communities around them. The important socio-economic and ownership changes that Romania has undergone in recent decades transformed the previous unique state ownership into multiple-stakeholder ownership. At the same time, vulnerable human communities located in fragile mountain areas and depending to a high extent on natural resources face increased stress when exploiting natural resources at the border of protected areas. Consequently, sharing the common pool of resources in the buffer zone of protected areas in the Romanian Carpathians represents a very current and important topic, treated in the present study.

  1. Regional scale groundwater resource assessment in the Australian outback - Geophysics is the only way.

    Science.gov (United States)

    Munday, T. J.; Davis, A. C.; Gilfedder, M.; Annetts, D.

    2015-12-01

    Resource development, whether in agriculture, mining and/or energy, is set to have significant consequences for the groundwater resources of Australia in the short to medium term. These industry sectors are of significant economic value to the country and consequently their support remains a priority for State and Federal Governments alike. The scale of potential developments, facilitated in large part by government programs like the West Australian (WA) Government's "Water for Food" program and the South Australian Government's PACE program, will result in an increase in infrastructure requirements, including access to water resources and Aboriginal lands to support these developments. However, the increased demand for water, particularly groundwater, is likely to be compromised by the limited information we have about these resources. This is particularly so for remote parts of the country which are targeted as primary development areas. There is a recognised need to expand this knowledge so that water availability is not a limiting factor to development. Governments of all persuasions have therefore adopted geophysical technologies, particularly airborne electromagnetics (AEM), as a basis for extending the hydrogeological knowledge of data-poor areas. In WA, the State Government has employed regional-scale AEM surveys as a basis for defining groundwater resources to support mining and regional agricultural developments while aiming to safeguard regional population centres and environmental assets. A similar approach is being employed in South Australia. These surveys are being used to underpin conceptual hydrogeological frameworks, define basin-scale hydrogeological models, delimit the extent of saltwater intrusion in coastal areas, and determine the groundwater resource potential of remote alluvial systems aimed at supporting new, irrigation-based agricultural developments in arid parts of the Australian outback. In the absence of conventional

  2. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  3. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Full Text Available Abstract Background Large-scale protein structure alignment, an indispensable tool to structural bioinformatics, poses a tremendous challenge on computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign could take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues from protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massive parallel computing power of GPUs.

  4. The genetic etiology of Tourette Syndrome: Large-scale collaborative efforts on the precipice of discovery

    Directory of Open Access Journals (Sweden)

    Marianthi Georgitsi

    2016-08-01

    Full Text Available Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology, with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive; however, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us to the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder.

  5. The Genetic Etiology of Tourette Syndrome: Large-Scale Collaborative Efforts on the Precipice of Discovery

    Science.gov (United States)

    Georgitsi, Marianthi; Willsey, A. Jeremy; Mathews, Carol A.; State, Matthew; Scharf, Jeremiah M.; Paschou, Peristera

    2016-01-01

    Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive. However, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans, and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios, and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder. PMID:27536211

  6. An analysis of factors affecting participation behavior of limited resource farmers in agricultural cost-share programs in Alabama

    Science.gov (United States)

    Okwudili Onianwa; Gerald Wheelock; Buddhi Gyawali; Jianbang Gan; Mark Dubois; John Schelhas

    2004-01-01

    This study examines factors that affect the participation behavior of limited resource farmers in agricultural cost-share programs in Alabama. The data were generated from a survey administered to a sample of limited resource farm operators. A binary logit model was employed to analyze the data. Results indicate that college education, age, gross sales, ratio of owned...

  7. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of a scale of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  8. Framework for Shared Drinking Water Risk Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Thomas Stephen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tidwell, Vincent C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peplinski, William John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mitchell, Roger [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Binning, David [AEM Corp., Herndon, VA (United States); Meszaros, Jenny [AEM Corp., Herndon, VA (United States)

    2017-01-01

    Central to protecting our nation's critical infrastructure is the development of methodologies for prioritizing action and supporting resource allocation decisions associated with risk-reduction initiatives. Toward this need, a web-based risk assessment framework that promotes the anonymous sharing of results among water utilities is demonstrated. Anonymous sharing of results offers a number of potential advantages, such as assistance in recognizing and correcting bias, identification of 'unknown unknowns', self-assessment and benchmarking for the local utility, treatment of shared assets and/or threats across multiple utilities, and prioritization of actions beyond the scale of a single utility. The constructed framework was demonstrated for three water utilities. Demonstration results were then compared to risk assessment results developed using a different risk assessment application by a different set of analysts.
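
    A minimal sketch of the anonymous benchmarking element, assuming utilities share only scalar risk scores: a utility can place itself against the anonymized pool without learning which score belongs to whom. The scores are illustrative.

        def percentile_rank(own_score, shared_scores):
            """Fraction of anonymously shared scores at or below our own."""
            return sum(s <= own_score for s in shared_scores) / len(shared_scores)

        pool = [0.42, 0.55, 0.61, 0.38, 0.70, 0.47]   # anonymized peer risk scores
        print(f"percentile rank of our score: {percentile_rank(0.61, pool):.0%}")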

  9. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  10. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized for capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  11. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  12. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    Science.gov (United States)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query-based visual explorations of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS Server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving GIS interoperability. The development of Internet-based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux/Python-based scripting, etc.). Information is compiled into and stored within object-oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications, including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of the hydrological data from several sources
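
    Services published through ArcGIS Server, as described above, are typically reachable via its REST endpoint. The sketch below shows the generic layer-query pattern; the service URL, layer id, and field names are hypothetical placeholders, not NebHydro's actual endpoints.

    ```python
    # Generic ArcGIS Server REST layer query (hypothetical service URL and fields).
    import requests

    BASE = "https://example-server/arcgis/rest/services/NebHydro/MapServer/0/query"
    params = {
        "where": "SITE_TYPE = 'stream_gauge'",   # attribute filter (hypothetical field)
        "outFields": "SITE_ID,NDVI,ET_MM",       # hypothetical output columns
        "returnGeometry": "true",
        "f": "json",                             # ArcGIS REST JSON response format
    }
    resp = requests.get(BASE, params=params, timeout=30)
    resp.raise_for_status()
    for feature in resp.json().get("features", []):
        print(feature["attributes"], feature.get("geometry"))
    ```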

  14. Evaluation of offshore wind resources by scale of development

    DEFF Research Database (Denmark)

    Möller, Bernd; Hong, Lixuan; Lonsing, Reinhard

    2012-01-01

    -economic model operating in a geographical information systems (GIS) environment, which describes resources, costs and area constraints in a spatially explicit way, the relation between project size, location, costs and ownership is analysed. Two scenarios are presented, which describe a state......Offshore wind energy has developed rapidly in terms of turbine and project size, and currently undergoes a significant up-scaling to turbines and parks at greater distance to shore and deeper waters. Expectations to the positive effect of economies of scale on power production costs, however, have...... can be explained by deeper water, higher distance to shore, bottlenecks in supply or higher raw material costs. The present paper addresses the scale of offshore wind parks for Denmark and invites to reconsider the technological and institutional choices made. Based on a continuous resource...

  15. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. For individual sites, we investigate the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large-scale).
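
    The dimensionality-reduction and clustering steps can be sketched with standard tooling. The snippet below uses plain (unsupervised) kernel PCA as a stand-in for the supervised variant the authors mention, and a random array in place of the reanalysis moisture-flux fields.

    ```python
    # Sketch: reduce pre-flood moisture-flux fields, then cluster flood events.
    # X is assumed to be (n_events, n_gridcells); real inputs are reanalysis fields.
    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 500))            # placeholder for moisture-flux fields

    kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
    Z = kpca.fit_transform(X)                  # low-dimensional event coordinates

    clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
    # Each cluster can then be inspected for local-convective vs large-scale signatures.
    print(np.bincount(clusters))
    ```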

  16. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  17. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61 x 0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61 x 0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  18. Collaborative Work without Large, Shared Displays: Looking for “the Big Picture” on a Small Screen?

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2017-01-01

    Large, shared displays – such as electronic whiteboards – have proven successful in supporting actors in forming and maintaining an overview of tightly coupled collaborative activities. However, in many developing countries the technology of choice is mobile phones, which have neither a large nor...... a shared screen. It therefore appears relevant to ask: How may mobile devices with small screens support, or fail to support, actors in forming and maintaining an overview of their collaborative activities?...

  19. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  20. Large-scale simulations of plastic neural networks on neuromorphic hardware

    Directory of Open Access Journals (Sweden)

    James Courtney Knight

    2016-04-01

    Full Text Available SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning, since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 20,000 neurons and 51,200,000 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, to match the run-time of our SpiNNaker simulation, the supercomputer uses considerably more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models.
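
    The per-synapse state variables mentioned above can be made concrete with a toy, time-stepped version of the BCPNN trace cascade (the paper's contribution is an event-based reformulation, not shown here). Time constants and spike probabilities below are arbitrary; the weight and bias expressions w = log(p_ij/(p_i p_j)) and b = log(p_j) are the standard BCPNN forms.

    ```python
    # Toy time-stepped BCPNN traces for a single synapse (arbitrary constants).
    import numpy as np

    dt, tau_z, tau_p, eps = 1.0, 10.0, 1000.0, 1e-4   # ms; illustrative values
    z_i = z_j = p_i = p_j = p_ij = eps

    rng = np.random.default_rng(2)
    for t in range(10000):
        s_i = rng.random() < 0.02         # pre-synaptic spike (placeholder train)
        s_j = rng.random() < 0.02         # post-synaptic spike
        z_i += dt * (s_i - z_i) / tau_z   # fast spike traces
        z_j += dt * (s_j - z_j) / tau_z
        p_i += dt * (z_i - p_i) / tau_p   # slow probability estimates
        p_j += dt * (z_j - p_j) / tau_p
        p_ij += dt * (z_i * z_j - p_ij) / tau_p

    w = np.log(p_ij / (p_i * p_j))        # BCPNN synaptic weight
    b = np.log(p_j)                       # post-synaptic bias
    print(f"w={w:.3f}, b={b:.3f}")
    ```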

  1. All things weird and scary: Nanotechnology, theology and cultural resources

    DEFF Research Database (Denmark)

    Davies, Sarah Rachael; Kearnes, Matthew B.; Macnaghten, Phil M.

    2009-01-01

    's reflections on the ethics of nanotechnologies to focus on the talk of one group of participants, from a UK church. While we identify key themes which are common across all participants, including nanotechnology as a threat to the human, the importance of individual autonomy, and distrust of the large......-scale drivers behind the technology, we argue that the church-going group have a specific set of cultural resources with which to articulate responses to these. Using a language of spirituality and relationality these participants are able to express shared notions of what nanotechnology threatens (and promises...

  2. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  3. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  4. Large-scale runoff generation – parsimonious parameterisation using high-resolution topography

    Directory of Open Access Journals (Sweden)

    L. Gong

    2011-08-01

    Full Text Available World water resources have primarily been analysed by global-scale hydrological models in the last decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and the spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models have generally proven to perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. The recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of the topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics. The TRG algorithm
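
    The storage-distribution step lends itself to a short numerical illustration: bin a high-resolution topographic index into classes, assign each class a storage capacity, and scale the cell's maximum storage by the index range through a single parameter. The sketch below is a schematic reading of the description above, not the published algorithm; all numbers are synthetic, and the choice that wetter (high-index) classes get smaller capacities is an assumption consistent with TOPMODEL logic.

    ```python
    # Schematic TRG-style storage distribution within one large-scale grid cell.
    import numpy as np

    rng = np.random.default_rng(3)
    ti = rng.gamma(4.0, 2.0, 10000)          # high-resolution topographic index
    c = 0.05                                 # the single scaling parameter (assumed)

    s_max = c * (ti.max() - ti.min())        # max storage scales with index range
    edges = np.quantile(ti, np.linspace(0.0, 1.0, 11))
    bins = np.clip(np.digitize(ti, edges) - 1, 0, 9)
    mean_ti = np.array([ti[bins == k].mean() for k in range(10)])

    # Wetter (high-index) classes saturate first, i.e. get the smaller capacities.
    capacity = s_max * (mean_ti.max() - mean_ti) / (mean_ti.max() - mean_ti.min())

    water_in = 0.4 * s_max                   # uniform water input to the cell
    runoff = np.maximum(water_in - capacity[bins], 0.0).mean()
    print(f"s_max={s_max:.3f}, mean saturation-excess runoff={runoff:.3f}")
    ```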

  5. Resource-sharing between internal maintenance and external selection modulates attentional capture by working memory content

    Directory of Open Access Journals (Sweden)

    Anastasia eKiyonaga

    2014-08-01

    Full Text Available It is unclear why and under what circumstances working memory (WM) and attention interact. Here, we apply the logic of the time-based resource-sharing (TBRS) model of WM (e.g., Barrouillet, Bernardin, & Camos, 2004) to explore the mixed findings of a separate, but related, literature that studies the guidance of visual attention by WM contents. Specifically, we hypothesize that the linkage between WM representations and visual attention is governed by a time-shared cognitive resource that alternately refreshes internal (WM) and selects external (visual attention) information. If this were the case, WM content should guide visual attention (involuntarily), but only when there is time for it to be refreshed in an internal focus of attention. To provide an initial test for this hypothesis, we examined whether the amount of unoccupied time during a WM delay could impact the magnitude of attentional capture by WM contents. Participants were presented with a series of visual search trials while they maintained a WM cue for a delayed-recognition test. WM cues could coincide with the search target, a distracter, or neither. We varied both the number of searches to be performed and the amount of available time to perform them. Slowing of visual search by a WM-matching distracter—and facilitation by a matching target—were curtailed when the delay was filled with fast-paced (refreshing-preventing) search trials, as was subsequent memory probe accuracy. WM content may, therefore, only capture visual attention when it can be refreshed, suggesting that internal (WM) and external attention demands reciprocally impact one another because they share a limited resource. The TBRS rationale can thus be applied in a novel context to explain why WM contents capture attention, and under what conditions that effect should be observed.

  6. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  7. A Case Study of the Global Group for Sharing Knowledge and Efforts in Human Resources within the Nuclear Industry

    International Nuclear Information System (INIS)

    Thomas, C.

    2016-01-01

    Full text: One of the main conclusions from the IAEA’s HRD Conference in 2014 was that people and organisations in the global nuclear industry could cooperate more in sharing information and efforts. This was an inspiring conclusion, and there seemed an especially great opportunity for such sharing of information and efforts related to the attraction, recruitment, development and retention of people within the nuclear workforce. With founding members from the IAEA, WNA, WANO, EDF and OPG, amongst others, the global working group for Human Resource matters aimed at “Building and Sustaining a Competent Nuclear Workforce” was established. This global working group is free to join and is open to anyone concerned with building and sustaining a competent nuclear workforce. The objectives of the group are to share useful information, find others with similar objectives to cooperate with, ask questions, share opinions and, crucially, to avoid unnecessary duplication of efforts. The group already has 160 members from more than 15 countries and is currently hosted as a group on the LinkedIn website. The vision for the group is that it will become an invaluable resource for people across the world in the nuclear industry for sharing information and efforts. (author)

  8. [Location information acquisition and sharing application design in national census of Chinese medicine resources].

    Science.gov (United States)

    Zhang, Xiao-Bo; Li, Meng; Wang, Hui; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The literature contains much information on the distribution of Chinese herbal medicine, but, limited by the technical methods available at the time, the origins and distributions of Chinese herbal medicine described in ancient literature are only rough. Establishing baseline information on the types and distribution of Chinese medicine resources in each region is one of the main objectives of the national census of Chinese medicine resources. Following the national census technical specifications and pilot work experience, census teams equipped with "3S" technology, computer network technology, digital camera technology and other modern methods can effectively collect the location information of traditional Chinese medicine resources. Detailed and specific location information, covering regional differences and similarities in resource endowment, biological characteristics and spatial distribution, provides technical and data support for evaluating the accuracy and objectivity of the census data. With the support of spatial information technology, location-based statistical summaries and sharing of multi-source census data can be realized. Spatial integration, aggregation and management of massive traditional Chinese medicine resource data and related basic data can support the mining of scientific rules in traditional Chinese medicine resources at the overall level and fully reveal their scientific connotation. Copyright© by the Chinese Pharmaceutical Association.

  9. Cross-scale phenological data integration to benefit resource management and monitoring

    Science.gov (United States)

    Richardson, Andrew D.; Weltzin, Jake F.; Morisette, Jeffrey T.

    2017-01-01

    Climate change is presenting new challenges for natural resource managers charged with maintaining sustainable ecosystems and landscapes. Phenology, a branch of science dealing with seasonal natural phenomena (bird migration or plant flowering in response to weather changes, for example), bridges the gap between the biosphere and the climate system. Phenological processes operate across scales that span orders of magnitude—from leaf to globe and from days to seasons—making phenology ideally suited to multiscale, multiplatform data integration and delivery of information at spatial and temporal scales suitable to inform resource management decisions. This report summarizes a workshop held in June 2016 to investigate the opportunities and challenges facing multi-scale, multi-platform integration of phenological data to support natural resource management decision-making.

  10. Metastrategies in large-scale bargaining settings

    NARCIS (Netherlands)

    Hennes, D.; Jong, S. de; Tuyls, K.; Gal, Y.

    2015-01-01

    This article presents novel methods for representing and analyzing a special class of multiagent bargaining settings that feature multiple players, large action spaces, and a relationship among players' goals, tasks, and resources. We show how to reduce these interactions to a set of bilateral

  11. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  12. Genuine multipartite entanglement of symmetric Gaussian states: Strong monogamy, unitary localization, scaling behavior, and molecular sharing structure

    Science.gov (United States)

    Adesso, Gerardo; Illuminati, Fabrizio

    2008-10-01

    We investigate the structural aspects of genuine multipartite entanglement in Gaussian states of continuous variable systems. Generalizing the results of Adesso and Illuminati [Phys. Rev. Lett. 99, 150501 (2007)], we analyze whether the entanglement shared by blocks of modes distributes according to a strong monogamy law. This property, once established, allows us to quantify the genuine N-partite entanglement not encoded into 2,…,K,…,(N-1)-partite quantum correlations. Strong monogamy is numerically verified, and the explicit expression of the measure of residual genuine multipartite entanglement is analytically derived, by a recursive formula, for a subclass of Gaussian states. These are fully symmetric (permutation-invariant) states that are multipartitioned into blocks, each consisting of an arbitrarily assigned number of modes. We compute the genuine multipartite entanglement shared by the blocks of modes and investigate its scaling properties with the number and size of the blocks, the total number of modes, the global mixedness of the state, and the squeezed resources needed for state engineering. To achieve the exact computation of the block entanglement, we introduce and prove a general result of symplectic analysis: Correlations among K blocks in N-mode multisymmetric and multipartite Gaussian states, which are locally invariant under permutation of modes within each block, can be transformed by a local (with respect to the partition) unitary operation into correlations shared by K single modes, one per block, in effective nonsymmetric states where N-K modes are completely uncorrelated. Due to this theorem, the above results, such as the derivation of the explicit expression for the residual multipartite entanglement, its nonnegativity, and its scaling properties, extend to the subclass of non-symmetric Gaussian states that are obtained by the unitary localization of the multipartite entanglement of symmetric states. These findings provide strong
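
    Schematically, the strong-monogamy decomposition described here takes the following form (notation heavily simplified; the precise Gaussian entanglement measure and the recursive definition of the residual terms are in the cited work):

    ```latex
    % Schematic strong-monogamy decomposition (simplified notation)
    E\bigl(S_1 \,\big|\, S_2 \cdots S_N\bigr) \;\ge\;
      \sum_{j=2}^{N} E\bigl(S_1 \,\big|\, S_j\bigr)
      \;+\; \sum_{2 \le j < k \le N} E^{\mathrm{res}}\bigl(S_1 \,\big|\, S_j \,\big|\, S_k\bigr)
      \;+\; \cdots
    ```

    The residual genuine N-partite entanglement is then the (nonnegative) difference between the left-hand side and the sum of all lower-order contributions.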

  13. SOLID-DER. Reaching large-scale integration of Distributed Energy Resources in the enlarged European electricity market

    International Nuclear Information System (INIS)

    Van Oostvoorn, F.; Ten Donkelaar, M.

    2007-05-01

    The integration of DER (distributed energy resources) in the European electricity networks has become a key issue for energy producers, network operators, policy makers and the R and D community. In some countries it has already created a number of challenges for the stability of the electricity supply system, thereby creating new barriers for further expansion of the share of DER in supply. On the other hand, in many Member States there is still a lack of awareness and understanding of the possible benefits and role of DER in the electricity system, while environmental goals and security of supply issues increasingly call for solutions that DER could provide in the future. The project SOLID-DER, a Coordination Action, will assess the barriers to further integration of DER and overcome both the lack of awareness of the benefits of DER solutions and the fragmentation of EU R and D results by consolidating all European DER research activities and reporting on its common findings. In particular, awareness of DER solutions and benefits will be raised in the new Member States, thereby addressing their specific issues and barriers and incorporating them in the existing EU DER R and D community. The SOLID-DER Coordination Action will run from November 2005 to October 2008

  14. Processing structure in language and music: a case for shared reliance on cognitive control.

    Science.gov (United States)

    Slevc, L Robert; Okada, Brooke M

    2015-06-01

    The relationship between structural processing in music and language has received increasing interest in the past several years, spurred by the influential Shared Syntactic Integration Resource Hypothesis (SSIRH; Patel, Nature Neuroscience, 6, 674-681, 2003). According to this resource-sharing framework, music and language rely on separable syntactic representations but recruit shared cognitive resources to integrate these representations into evolving structures. The SSIRH is supported by findings of interactions between structural manipulations in music and language. However, other recent evidence suggests that such interactions also can arise with nonstructural manipulations, and some recent neuroimaging studies report largely nonoverlapping neural regions involved in processing musical and linguistic structure. These conflicting results raise the question of exactly what shared (and distinct) resources underlie musical and linguistic structural processing. This paper suggests that one shared resource is prefrontal cortical mechanisms of cognitive control, which are recruited to detect and resolve conflict that occurs when expectations are violated and interpretations must be revised. By this account, musical processing involves not just the incremental processing and integration of musical elements as they occur, but also the incremental generation of musical predictions and expectations, which must sometimes be overridden and revised in light of evolving musical input.

  15. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state in which Heaps' law still holds while strict Zipf's law has disappeared. These findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results for pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
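
    For reference, the two laws can be written in their conventional forms (α and λ are generic exponents, not values estimated in this article); the asymptotic relation λ = 1/α for α > 1 is commonly reported in the scaling literature:

    ```latex
    % Zipf's law: frequency of the r-th most frequent element
    f(r) \;\propto\; r^{-\alpha}
    % Heaps' law: number of distinct elements after t observations
    N(t) \;\propto\; t^{\lambda}, \qquad 0 < \lambda \le 1
    ```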

  16. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  17. Shared Communications: Volume 1. A Summary and Literature Review

    Energy Technology Data Exchange (ETDEWEB)

    Franzese, O

    2004-09-22

    This paper provides a review of examples from the literature of shared communication resources and of agencies and/or organizations that share communication resources. The primary emphasis is on rural, intelligent transportation system (ITS) communications involving transit. Citations are not limited, however, to rural activities, to ITS implementation, or even to transit. In addition, the term "communication" is broadly applied to include all information resources. Literature references to issues that contribute to both successful and failed efforts at sharing communication resources are reviewed. The findings of this literature review indicate that: (1) the most frequently shared communication resources are information/data resources; (2) telecommunications infrastructure and technologies are the next most frequently shared resources; (3) when resources are successfully shared, all parties benefit; (4) a few unsuccessful attempts at sharing resources have been recorded, along with lessons learned; (5) impediments to sharing include security issues, concerns over system availability and reliability, service quality and performance, and institutional barriers; (6) advantages of sharing include financial benefits to agencies from using shared resources and benefits to the public in terms of congestion mitigation, information transfer (e.g., traveler information systems), mobility (e.g., welfare-to-work paratransit), and safety (e.g., speed of incident response, incident avoidance); (7) technology-based solutions exist to address technology-based concerns; and (8) institutional issues can be addressed through leadership, enhanced knowledge and skills, open communication, responsiveness, and attractive pricing structures.

  18. Environmental aspects of large-scale wind-power systems in the UK

    Energy Technology Data Exchange (ETDEWEB)

    Robson, A

    1983-12-01

    Environmental issues relating to the introduction of large, MW-scale wind turbines at land-based sites in the U.K. are discussed. Areas of interest include noise, television interference, hazards to bird life and visual effects. A number of areas of uncertainty are identified, but enough is known from experience elsewhere in the world to enable the first U.K. machines to be introduced in a safe and environmentally acceptable manner. Research currently under way will serve to establish siting criteria more clearly, and could significantly increase the potential wind-energy resource. Certain studies of the comparative risk of energy systems are shown to be overly pessimistic for U.K. wind turbines.

  19. Frequency Resource Sharing and Allocation Scheme Based on Coalition Formation Game in Hybrid D2D-Cellular Network

    Directory of Open Access Journals (Sweden)

    Qing Ou

    2015-01-01

    Full Text Available A distributed cooperation scheme for frequency resource sharing is proposed to improve the quality of service (QoS) in device-to-device (D2D) communications underlaying cellular networks. Specifically, we formulate the resource allocation problem as a coalition formation game with transferable utility, in which all users have the incentive to cooperate with some others and form a competitive group to maximize the probability of obtaining their favorite spectrum resources. Taking the cost of coalition formation into account, such as the path loss for data sharing, we prove that the core of the proposed game is empty, which shows the impossibility of a grand coalition. Hence, we propose a distributed merge-and-split based coalition formation algorithm built on a newly defined Max-Coalition order to effectively solve the coalition game. Compared with exhaustive search, our algorithm has much lower computational complexity. In addition, we prove the stability and convergence of the proposed algorithm using the concept of a defection function. Finally, the simulation results show that the proposed scheme achieves suboptimal performance in terms of network sum rate compared with the centralized optimal resource allocation scheme obtained via exhaustive search.
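
    Merge-and-split can be sketched generically: repeatedly merge coalitions when the merged value strictly improves on the parts, then test splits, until no operation applies. The coalition value function below is a hypothetical toy (cooperation gain minus coordination cost), and the simple sum-utility comparison stands in for the paper's Max-Coalition order.

    ```python
    # Generic merge-and-split coalition formation (toy value function; sum-utility
    # comparison as a stand-in for the paper's Max-Coalition order).
    from itertools import combinations

    def value(coalition):
        # Hypothetical coalition value: cooperation gain minus coordination cost.
        s = len(coalition)
        return s**2 - 0.25 * s**3

    def merge_and_split(players):
        partition = [frozenset([p]) for p in players]
        changed = True
        while changed:
            changed = False
            # Merge: join two coalitions if the merged value strictly improves.
            for a, b in combinations(list(partition), 2):
                if value(a | b) > value(a) + value(b):
                    partition.remove(a)
                    partition.remove(b)
                    partition.append(a | b)
                    changed = True
                    break
            if changed:
                continue
            # Split: break one coalition in two if the parts do strictly better.
            done = False
            for c in partition:
                for k in range(1, len(c)):
                    for sub in combinations(c, k):
                        a, b = frozenset(sub), c - frozenset(sub)
                        if value(a) + value(b) > value(c):
                            partition.remove(c)
                            partition.extend([a, b])
                            changed = done = True
                            break
                    if done:
                        break
                if done:
                    break
        return partition

    print(merge_and_split(range(6)))   # with this toy value: stable pairs form
    ```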

  20. Scale Dependence of Female Ungulate Reproductive Success in Relation to Nutritional Condition, Resource Selection and Multi-Predator Avoidance.

    Directory of Open Access Journals (Sweden)

    Jared F Duquette

    Full Text Available Female ungulate reproductive success is dependent on the survival of their young, and affected by maternal resource selection, predator avoidance, and nutritional condition. However, potential hierarchical effects of these factors on reproductive success are largely unknown, especially in multi-predator landscapes. We expanded on previous research of neonatal white-tailed deer (Odocoileus virginianus) daily survival within home ranges to assess whether resource use, integrated risk from 4 mammalian predators, maternal nutrition, winter severity, hiding cover, or interactions among these variables best explained landscape-scale variation in daily or seasonal survival during the post-partum period. We hypothesized that reproductive success would be limited more by predation risk at coarser spatiotemporal scales, but by habitat use at finer scales. An additive model of daily non-ideal resource use and maternal nutrition explained the most (69%) variation in survival, though 65% of this variation was related to maternal nutrition. Strong support for maternal nutrition across spatiotemporal scales did not fully support our hypothesis, but suggested reproductive success was related to dam behaviors directed at increasing nutritional condition. These behaviors were especially important following severe winters, when dams produced smaller fawns with lower probability of survival. To increase nutritional condition and decrease wolf (Canis lupus) predation risk, dams appeared to place fawns in isolated deciduous forest patches near roads. However, this resource selection represented non-ideal resources for fawns, which had greater predation risk that led to additive mortalities beyond those related to resources alone. Although the reproductive strategy of dams resulted in greater predation of fawns by alternative predators, it likely improved the life-long reproductive success of dams, as many were late-aged (>10 years old) and could have produced multiple litters

  1. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Science.gov (United States)

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  2. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    Directory of Open Access Journals (Sweden)

    Xinhua He

    2014-01-01

    Full Text Available This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.

  3. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    Science.gov (United States)

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in large-scale affected area of disasters. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered in several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but are coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources in different vehicle routes, and emergency rescue services queue in multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
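
    The genetic-algorithm solution step can be sketched on a toy version of the location-allocation problem these records describe: a chromosome assigns each rescue-demand location to a depot, and fitness is total travel time plus a simple M/M/1-style queuing delay. All data and parameters below are synthetic placeholders; the paper's queuing-network objective is richer.

    ```python
    # Toy GA for assigning demand locations to depots (synthetic stand-in for the
    # paper's minimal queuing response time model).
    import numpy as np

    rng = np.random.default_rng(4)
    n_dem, n_dep = 20, 4
    travel = rng.uniform(1.0, 10.0, (n_dem, n_dep))  # travel times (synthetic)
    rate = rng.uniform(0.5, 1.5, n_dem)              # rescue-demand rates
    mu = 8.0                                         # service capacity per depot

    def response_time(assign):
        # Mean travel time plus an M/M/1-style queuing delay at each depot.
        load = np.bincount(assign, weights=rate, minlength=n_dep)
        if np.any(load >= mu):                       # overloaded queue: infeasible
            return np.inf
        wait = 1.0 / (mu - load)
        return float(np.sum(travel[np.arange(n_dem), assign] + wait[assign]))

    pop = rng.integers(0, n_dep, (60, n_dem))        # chromosome: depot per demand
    for _ in range(200):
        fitness = np.array([response_time(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[:20]]      # truncation selection
        children = [parents[0].copy()]               # elitism: keep the best
        while len(children) < len(pop):
            a, b = parents[rng.integers(0, 20, 2)]
            cut = rng.integers(1, n_dem)             # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mask = rng.random(n_dem) < 0.05          # per-gene mutation
            child[mask] = rng.integers(0, n_dep, mask.sum())
            children.append(child)
        pop = np.array(children)

    best = min(pop, key=response_time)
    print(response_time(best))
    ```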

  4. Operational Changes in a Shared Resource Laboratory with the Use of a Product Lifecycle Management Approach: A Case Study.

    Science.gov (United States)

    Hexley, Philip; Smith, Victoria; Wall, Samantha

    2016-04-01

    Shared Resource Laboratories (SRLs) provide investigators access to necessary scientific and resource expertise to leverage complex technologies fully for advancing high-quality biomedical research in a cost-effective manner. At the University of Nebraska Medical Center, the Flow Cytometry Research Facility (FCRF) offered access to exceptional technology, but its methods of operation were outdated and unsustainable. While technology had advanced and the institute had expanded, the operations at the facility had remained unchanged for 35 yr. To rectify this, at the end of 2013, we took a product lifecycle management approach to effect large operational changes and align the services offered with the SRL goal of education, as well as to provide service to researchers. These disruptive operational changes took over 10 mo to complete and allowed for independent end-user acquisition of flow cytometry data. The results have been monitored for the past 12 mo. The operational changes have had a positive impact on the quality of research, increased investigator-facility interaction, reduced the stress of facility staff, and increased overall use of the resources. This product lifecycle management approach to facility operations allowed us to conceive of, design, implement, and effectively monitor the changes at the FCRF. This approach should be considered by SRL management when faced with the need for operationally disruptive measures.

  5. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    International Nuclear Information System (INIS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-01-01

    Modern long-baseline neutrino experiments, like the NOvA experiment at Fermilab, require large-scale, compute-intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large-scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources. (paper)

  6. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  7. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced in cases of elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradient changes through material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, an analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  8. Current Resource Imagery Projects

    Data.gov (United States)

    Farm Service Agency, Department of Agriculture — Map showing coverage of current Resource imagery projects. High resolution/large scale Resource imagery is typically acquired for the U.S. Forest Service and other...

  9. Size-density scaling in protists and the links between consumer-resource interaction parameters.

    Science.gov (United States)

    DeLong, John P; Vasseur, David A

    2012-11-01

    Recent work indicates that the interaction between body-size-dependent demographic processes can generate macroecological patterns such as the scaling of population density with body size. In this study, we evaluate this possibility for grazing protists and also test whether demographic parameters in these models are correlated after controlling for body size. We compiled data on the body-size dependence of consumer-resource interactions and population density for heterotrophic protists grazing algae in laboratory studies. We then used nested dynamic models to predict both the height and slope of the scaling relationship between population density and body size for these protists. We also controlled for consumer size and assessed links between model parameters. Finally, we used the models and the parameter estimates to assess the individual- and population-level dependence of resource use on body size and prey-size selection. The predicted size-density scaling for all models closely matched the observed scaling, and the simplest model was sufficient to predict the pattern. Variation around the mean size-density scaling relationship may be generated by variation in prey productivity and area of capture, but residuals are relatively insensitive to variation in prey-size selection. After controlling for body size, many consumer-resource interaction parameters were correlated, and a positive correlation between residual prey-size selection and conversion efficiency neutralizes the apparent fitness advantage of taking large prey. Our results indicate that widespread community-level patterns can be explained with simple population models that apply consistently across a range of sizes. They also indicate that the parameter space governing the dynamics and the steady states in these systems is structured such that some parts of the parameter space are unlikely to represent real systems. Finally, predator-prey size ratios represent a kind of conundrum, because they are
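
    The logic of deriving size-density scaling from a dynamic model can be seen in the simplest consumer-resource case. This is a generic textbook sketch, not the authors' nested models; β is a placeholder allometric exponent:

    ```latex
    % Lotka-Volterra consumer-resource model and its steady state
    \frac{dR}{dt} = rR - aRC, \qquad \frac{dC}{dt} = e\,aRC - mC
    \;\;\Longrightarrow\;\;
    C^{*} = \frac{r}{a}, \qquad R^{*} = \frac{m}{e\,a}
    % with an allometric attack rate, consumer density scales with body mass M:
    a \propto M^{\beta} \;\Rightarrow\; C^{*} \propto M^{-\beta}
    ```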

  10. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  11. Distributed Monitoring and Resource Management for Large Cloud Environments

    OpenAIRE

    Wuhib, Fetahi Zebenigus

    2010-01-01

    Over the last decade, the number, size and complexity of large-scale networked systems has been growing fast, and this trend is expected to accelerate. The best known example of a large-scale networked system is probably the Internet, while large datacenters for cloud services are the most recent ones. In such environments, a key challenge is to develop scalable and adaptive technologies for management functions. This thesis addresses the challenge by engineering several protocols  for distri...

  12. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small and large scale model tests show no clear evidence of scale effects for overtopping above a threshold value. In the large scale model, no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  13. Fair Shares and Sharing Fairly: A Survey of Public Views on Open Science, Informed Consent and Participatory Research in Biobanking.

    Directory of Open Access Journals (Sweden)

    Yann Joly

    Full Text Available Biobanks are important resources which enable large-scale genomic research with human samples and data, raising significant ethical concerns about how participants' information is managed and shared. Three previous studies of the Canadian public's opinion about these topics have been conducted. Building on those results, an online survey representing the first study of public perceptions about biobanking spanning all Canadian provinces was conducted. Specifically, this study examined qualitative views about biobank objectives, governance structure, control and ownership of samples and data, benefit sharing, consent practices and data sharing norms, as well as additional questions and ethical concerns expressed by the public.Over half the respondents preferred to give a one-time general consent for the future sharing of their samples among researchers. Most expressed willingness for their data to be shared with the international scientific community rather than used by one or more Canadian institutions. Whereas more respondents indicated a preference for one-time general consent than any other model of consent, they constituted less than half of the total responses, revealing a lack of consensus among survey respondents regarding this question. Respondents identified biobank objectives, governance structure and accountability as the most important information to provide participants. Respondents' concerns about biobanking generally centred around the control and ownership of biological samples and data, especially with respect to potential misuse by insurers, the government and other third parties. Although almost half the respondents suggested that these should be managed by the researchers' institutions, results indicate that the public is interested in being well-informed about these projects and suggest the importance of increased involvement from participants. In conclusion, the study discusses the viability of several proposed models for

  14. Fair Shares and Sharing Fairly: A Survey of Public Views on Open Science, Informed Consent and Participatory Research in Biobanking.

    Science.gov (United States)

    Joly, Yann; Dalpé, Gratien; So, Derek; Birko, Stanislav

    2015-01-01

    Biobanks are important resources which enable large-scale genomic research with human samples and data, raising significant ethical concerns about how participants' information is managed and shared. Three previous studies of the Canadian public's opinion about these topics have been conducted. Building on those results, an online survey representing the first study of public perceptions about biobanking spanning all Canadian provinces was conducted. Specifically, this study examined qualitative views about biobank objectives, governance structure, control and ownership of samples and data, benefit sharing, consent practices and data sharing norms, as well as additional questions and ethical concerns expressed by the public. Over half the respondents preferred to give a one-time general consent for the future sharing of their samples among researchers. Most expressed willingness for their data to be shared with the international scientific community rather than used by one or more Canadian institutions. Whereas more respondents indicated a preference for one-time general consent than any other model of consent, they constituted less than half of the total responses, revealing a lack of consensus among survey respondents regarding this question. Respondents identified biobank objectives, governance structure and accountability as the most important information to provide participants. Respondents' concerns about biobanking generally centred around the control and ownership of biological samples and data, especially with respect to potential misuse by insurers, the government and other third parties. Although almost half the respondents suggested that these should be managed by the researchers' institutions, results indicate that the public is interested in being well-informed about these projects and suggest the importance of increased involvement from participants. In conclusion, the study discusses the viability of several proposed models for informed consent

  15. Challenges and opportunities in coding the commons: problems, procedures, and potential solutions in large-N comparative case studies

    Directory of Open Access Journals (Sweden)

    Elicia Ratajczyk

    2016-09-01

    Full Text Available On-going efforts to understand the dynamics of coupled social-ecological (or, more broadly, coupled infrastructure) systems and common pool resources have led to the generation of numerous datasets based on a large number of case studies. These data have facilitated the identification of important factors and fundamental principles which increase our understanding of such complex systems. However, the data at our disposal are often not easily comparable, have limited scope and scale, and are based on disparate underlying frameworks inhibiting synthesis, meta-analysis, and the validation of findings. Research efforts are further hampered when case inclusion criteria, variable definitions, coding schema, and inter-coder reliability testing are not made explicit in the presentation of research and shared among the research community. This paper first outlines challenges experienced by researchers engaged in a large-scale coding project; then highlights valuable lessons learned; and finally discusses opportunities for further research on comparative case study analysis focusing on social-ecological systems and common pool resources.

  16. To share or not to share? Business aspects of network sharing for Mobile Network Operators

    NARCIS (Netherlands)

    Berkers, F.T.H.M.; Hendrix, G.; Chatzicharistou, I.; Haas, T. de; Hamera, D.

    2010-01-01

    Radio spectrum and network infrastructure are two essential resources for mobile service delivery, which are both costly and increasingly scarce. In this paper we consider drivers and barriers of network sharing, which is seen as a potential solution for scarcity in these resources. We considered a

  17. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  18. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the "Large-scale Structure of the Universe" symposium are considered at a popular level. Described are the cell structure of galaxy distribution in the Universe and principles of mathematical modelling of galaxy distribution. Images of cell structures obtained after computer reprocessing are given. Three hypotheses - vortical, entropic and adiabatic - suggesting various processes of galaxy and galaxy-cluster origin are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of disturbances at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the inter-galaxy medium is recognized to be a notable contribution to the development of theoretical and observational cosmology.

  19. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter recording to a laptop. Results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  20. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles, covering the processes of consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, and the data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  1. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
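
    To make the central data structure concrete: each screen pixel keeps a depth-sorted list of the pathline fragments that project onto it, and later filtering and colour-coding work on those lists instead of the raw flow field. The sketch below (Python, with invented field names) illustrates the idea only; it is not the thesis code, which builds these lists on the GPU.

      from collections import defaultdict

      class PathlineSegment:
          """One rasterized pathline fragment covering a pixel (illustrative fields)."""
          def __init__(self, line_id, depth, attribute):
              self.line_id = line_id      # which pathline the fragment belongs to
              self.depth = depth          # view-space depth, fixes compositing order
              self.attribute = attribute  # scalar used for filtering and colour-coding

      def build_per_pixel_lists(fragments):
          """Group fragments ((x, y), segment) into one list per pixel, sorted front to back."""
          pixel_lists = defaultdict(list)  # (x, y) -> [PathlineSegment, ...]
          for (x, y), seg in fragments:
              pixel_lists[(x, y)].append(seg)
          for key in pixel_lists:
              pixel_lists[key].sort(key=lambda s: s.depth)
          return pixel_lists

      def shade_pixel(segments, keep, colormap):
          """Deferred shading: re-filter and re-colour without re-reading the flow data."""
          for seg in segments:             # front-to-back: first surviving fragment wins,
              if keep(seg):                # mimicking early-ray termination
                  return colormap(seg.attribute)
          return (0, 0, 0)                 # background

    Because shade_pixel only touches the per-pixel lists, new filters and colour maps cost a re-shade rather than a new pass over the original dataset, which is the point of the explorable-images approach.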

  2. How do the multiple large-scale climate oscillations trigger extreme precipitation?

    Science.gov (United States)

    Shi, Pengfei; Yang, Tao; Xu, Chong-Yu; Yong, Bin; Shao, Quanxi; Li, Zhenya; Wang, Xiaoyan; Zhou, Xudong; Li, Shu

    2017-10-01

    Identifying the links between variations in large-scale climate patterns and precipitation is of tremendous assistance in characterizing surplus or deficit of precipitation, which is especially important for evaluation of local water resources and ecosystems in semi-humid and semi-arid regions. Restricted by current limited knowledge of underlying mechanisms, statistical correlation methods are often used rather than physically based models to characterize the connections. Nevertheless, available correlation methods are generally unable to reveal the interactions among a wide range of climate oscillations and associated effects on precipitation, especially on extreme precipitation. In this work, a probabilistic analysis approach by means of a state-of-the-art Copula-based joint probability distribution is developed to characterize the aggregated behaviors for large-scale climate patterns and their connections to precipitation. This method is employed to identify the complex connections between climate patterns (Atlantic Multidecadal Oscillation (AMO), El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO)) and seasonal precipitation over a typical semi-humid and semi-arid region, the Haihe River Basin in China. Results show that the interactions among multiple climate oscillations are non-uniform in most seasons and phases. Certain joint extreme phases can significantly trigger extreme precipitation (flood and drought) owing to the amplification effect among climate oscillations.
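
    The abstract does not specify the copula family, so the sketch below uses a Gaussian copula purely for illustration: two climate indices are rank-transformed to pseudo-observations, the copula is fitted on normal scores, and the probability of a joint extreme phase is estimated by Monte Carlo. All data and parameter choices here are synthetic stand-ins, not the paper's.

      import numpy as np
      from scipy.stats import norm, rankdata

      def to_uniform(x):
          """Empirical probability integral transform (pseudo-observations)."""
          return rankdata(x) / (len(x) + 1.0)

      def fit_gaussian_copula(u, v):
          """The correlation of the normal scores parameterizes a Gaussian copula."""
          z = np.column_stack([norm.ppf(u), norm.ppf(v)])
          return np.corrcoef(z.T)[0, 1]

      def joint_exceedance(rho, p=0.9, n=200_000, seed=0):
          """Monte-Carlo estimate of P(U > p, V > p): both indices in an extreme phase."""
          rng = np.random.default_rng(seed)
          z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
          u = norm.cdf(z)
          return float(np.mean((u[:, 0] > p) & (u[:, 1] > p)))

      # Demo with synthetic stand-ins for two indices (e.g. AMO and PDO):
      rng = np.random.default_rng(1)
      amo = rng.normal(size=600)
      pdo = 0.4 * amo + rng.normal(size=600)      # dependence built in for the demo
      rho = fit_gaussian_copula(to_uniform(amo), to_uniform(pdo))
      print(joint_exceedance(rho))                # compare with (1 - p)**2 under independence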

  3. The development of the Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS): a large-scale data sharing initiative.

    Science.gov (United States)

    Lutomski, Jennifer E; Baars, Maria A E; Schalk, Bianca W M; Boter, Han; Buurman, Bianca M; den Elzen, Wendy P J; Jansen, Aaltje P D; Kempen, Gertrudis I J M; Steunenberg, Bas; Steyerberg, Ewout W; Olde Rikkert, Marcel G M; Melis, René J F

    2013-01-01

    In 2008, the Ministry of Health, Welfare and Sport commissioned the National Care for the Elderly Programme. While numerous research projects in older persons' health care were to be conducted under this national agenda, the Programme further advocated the development of The Older Persons and Informal Caregivers Survey Minimum DataSet (TOPICS-MDS) which would be integrated into all funded research protocols. In this context, we describe the TOPICS-MDS data sharing initiative (www.topics-mds.eu). A working group drafted the TOPICS-MDS prototype, which was subsequently approved by a multidisciplinary panel. Using instruments validated for older populations, information was collected on demographics, morbidity, quality of life, functional limitations, mental health, social functioning and health service utilisation. For informal caregivers, information was collected on demographics, hours of informal care and quality of life (including subjective care-related burden). Between 2010 and 2013, a total of 41 research projects contributed data to TOPICS-MDS, resulting in preliminary data available for 32,310 older persons and 3,940 informal caregivers. The majority of studies sampled were from primary care settings and inclusion criteria differed across studies. TOPICS-MDS is a public data repository which contains essential data to better understand health challenges experienced by older persons and informal caregivers. Such findings are relevant for countries where increasing health-related expenditure has necessitated the evaluation of contemporary health care delivery. Although open sharing of data can be difficult to achieve in practice, proactively addressing issues of data protection, conflicting data analysis requests and funding limitations during the TOPICS-MDS developmental phase has fostered a data sharing culture. To date, TOPICS-MDS has been successfully incorporated into 41 research projects, thus supporting the feasibility of constructing a large (>30,000 observations

  4. Adapting to large-scale changes in Advanced Placement Biology, Chemistry, and Physics: the impact of online teacher communities

    Science.gov (United States)

    Frumin, Kim; Dede, Chris; Fischer, Christian; Foster, Brandon; Lawrenz, Frances; Eisenkraft, Arthur; Fishman, Barry; Jurist Levy, Abigail; McCoy, Ayana

    2018-03-01

    Over the past decade, the field of teacher professional learning has coalesced around core characteristics of high quality professional development experiences (e.g. Borko, Jacobs, & Koellner, 2010. Contemporary approaches to teacher professional development. In P. L. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia of education (Vol. 7, pp. 548-556). Oxford: Elsevier; Darling-Hammond, Hyler, & Gardner, 2017. Effective teacher professional development. Palo Alto, CA: Learning Policy Institute). Many countries have found these advances of great interest because of a desire to build teacher capacity in science education and across the full curriculum. This paper continues this progress by examining the role and impact of an online professional development community within the top-down, large-scale curriculum and assessment revision of Advanced Placement (AP) Biology, Chemistry, and Physics. This paper is part of a five-year, longitudinal, U.S. National Science Foundation-funded project to study the relative effectiveness of various types of professional development in enabling teachers to adapt to the revised AP course goals and exams. Of the many forms of professional development our research has examined, preliminary analyses indicated that participation in the College Board's online AP Teacher Community (APTC) - where teachers can discuss teaching strategies, share resources, and connect with each other - had a positive, direct, and statistically significant association with teacher self-reported shifts in practice and with gains in student AP scores (Fishman et al., 2014). This study explored how usage of the online APTC might be useful to teachers and examined a more robust estimate of these effects. Findings from the experience of AP teachers may be valuable in supporting other large-scale curriculum changes, such as the U.S. Next Generation Science Standards or Common Core Standards, as well as parallel curricular shifts in other countries.

  5. RegPrecise 3.0--a resource for genome-scale exploration of transcriptional regulation in bacteria.

    Science.gov (United States)

    Novichkov, Pavel S; Kazakov, Alexey E; Ravcheev, Dmitry A; Leyn, Semen A; Kovaleva, Galina Y; Sutormin, Roman A; Kazanov, Marat D; Riehl, William; Arkin, Adam P; Dubchak, Inna; Rodionov, Dmitry A

    2013-11-01

    bacterial genomes. Analytical capabilities include exploration of: regulon content, structure and function; TF binding site motifs; conservation and variations in genome-wide regulatory networks across all taxonomic groups of Bacteria. RegPrecise 3.0 was selected as a core resource on transcriptional regulation of the Department of Energy Systems Biology Knowledgebase, an emerging software and data environment designed to enable researchers to collaboratively generate, test and share new hypotheses about gene and protein functions, perform large-scale analyses, and model interactions in microbes, plants, and their communities.

  6. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
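
    For orientation, the contrast the abstract draws can be written out as follows; the harmonic-mean form of the effective coefficient is the standard homogenization result for ecological diffusion, and the notation (motility μ, patch Ω) is ours rather than the paper's:

      % Fickian diffusion: flux follows gradients of density and habitat
      u_t = \nabla \cdot \big( \mu(x) \, \nabla u \big)

      % Ecological diffusion: the motility \mu(x) sits inside the Laplacian
      u_t = \Delta \big( \mu(x) \, u \big)

      % Homogenization over a habitat patch \Omega gives, at the large scale,
      c_t = \bar{\mu} \, \Delta c, \qquad
      \bar{\mu} = \left( \frac{1}{|\Omega|} \int_{\Omega} \frac{dx}{\mu(x)} \right)^{-1},
      \qquad u \approx \frac{c}{\mu(x)} \ \text{(up to normalization)}

    The harmonic mean is dominated by the slowest habitat, which is why small patches of low motility can throttle movement at the 10-100 km scale.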

  7. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  8. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  9. The HydroShare Collaborative Repository for the Hydrology Community

    Science.gov (United States)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.

    2017-12-01

    HydroShare is an online collaboration system for sharing of hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources" which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare highlighting our approach to making this system easy to use and serving the needs of the hydrology community represented by the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI). Metadata for uploaded files is harvested automatically or captured using easy to use web user interfaces. Users are encouraged to add or create resources in HydroShare early in the data life cycle. To encourage this we allow users to share and collaborate on HydroShare resources privately among individual users or groups, entering metadata while doing the work. HydroShare also provides enhanced functionality for users through web apps that provide tools and computational capability for actions on resources. HydroShare's architecture broadly is comprised of: (1) resource storage, (2) resource exploration website, and (3) web apps for actions on resources. System components are loosely coupled and interact through APIs, which enhances robustness, as components can be upgraded and advanced relatively independently. The full power of this paradigm is the
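
    As a flavour of the web-services access mentioned above, the sketch below queries a resource-listing endpoint. The endpoint path and JSON field names are assumptions made for illustration, not taken from this abstract; check the current HydroShare API documentation before relying on them.

      import requests

      BASE = "https://www.hydroshare.org/hsapi"   # assumed public API root

      def list_resources(keyword, count=10):
          """List public resources matching a keyword (endpoint and fields assumed)."""
          resp = requests.get(f"{BASE}/resource/",
                              params={"subject": keyword, "count": count},
                              timeout=30)
          resp.raise_for_status()
          for item in resp.json().get("results", []):
              # 'resource_id' / 'resource_title' are assumed field names
              print(item.get("resource_id"), "-", item.get("resource_title"))

      if __name__ == "__main__":
          list_resources("streamflow")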

  10. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960s and early 1970s an interest in testing and operating RF cavities at 1.8 K motivated the development and construction of four large (300 W) 1.8 K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved niobium-titanium superconductors has once again created interest in large-scale 1.8 K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W 1.8 K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0 K. Magnetic refrigerators of 10 W capacity or higher at 1.8 K are now being developed. The state of the art of large-scale refrigeration in the range under 4 K will be reviewed. 28 refs., 4 figs., 7 tabs

  11. Country-scale phosphorus balancing as a base for resources conservation

    NARCIS (Netherlands)

    Seyhan, D.

    2009-01-01

    In order to effectively conserve the non-renewable resource phosphorus (P), flows and stocks of P must be known at national, regional and global scales. P is a key non-renewable resource because its use as fertilizer cannot be substituted, posing a constraint on the global food production in the

  12. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision that outperforms previous results by a large margin and is competitive with the supervised deformable part model (DPM) 5.0 baseline on both data sets.
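
    A rough software sketch of the two stages named above: latent category learning via latent semantic analysis, followed by discrimination-based category selection. The toy region-by-visual-word matrix and the scoring rule are simplified stand-ins for the paper's method, not a reproduction of it.

      import numpy as np
      from sklearn.decomposition import TruncatedSVD

      def latent_categories(region_word_counts, n_categories=5, seed=0):
          """LSA step: factor a region-by-visual-word count matrix into latent categories."""
          svd = TruncatedSVD(n_components=n_categories, random_state=seed)
          return svd.fit_transform(region_word_counts)   # soft category strength per region

      def select_object_category(region_topics, image_labels, region_image):
          """Pick the latent category whose per-image strength best separates
          positive from negative images (image-level labels only)."""
          image_labels = np.asarray(image_labels)
          per_img = np.zeros((len(image_labels), region_topics.shape[1]))
          for i, img in enumerate(region_image):         # max-pool strength per image
              per_img[img] = np.maximum(per_img[img], region_topics[i])
          pos, neg = per_img[image_labels == 1], per_img[image_labels == 0]
          scores = pos.mean(axis=0) - neg.mean(axis=0)   # crude discrimination score
          return int(np.argmax(scores))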

  13. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  14. Event management for large scale event-driven digital hardware spiking neural networks.

    Science.gov (United States)

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
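
    The structured heap queue itself is a pipelined hardware design; a software analogue that already shows the logarithmic event-management behaviour the abstract describes is a binary heap keyed on spike delivery time, as in the sketch below. The toy neuron model and field names are illustrative assumptions, not the paper's architecture.

      import heapq

      def run_events(initial_spikes, synapses, t_end):
          """Event-driven SNN loop: always deliver the earliest pending spike.

          initial_spikes: iterable of (time, neuron); synapses maps a neuron to
          a list of (target, weight, delay). Threshold-and-reset firing is a toy.
          """
          queue = list(initial_spikes)
          heapq.heapify(queue)            # push/pop in O(log n) events, the scaling
          potential = {}                  # the structured heap queue achieves in hardware
          while queue:
              t, src = heapq.heappop(queue)
              if t > t_end:
                  break
              for tgt, w, d in synapses.get(src, []):
                  potential[tgt] = potential.get(tgt, 0.0) + w
                  if potential[tgt] >= 1.0:           # toy threshold-and-reset neuron
                      potential[tgt] = 0.0
                      heapq.heappush(queue, (t + d, tgt))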

  15. Adaptation of a pattern-scaling approach for assessment of local (village/valley) scale water resources and related vulnerabilities in the Upper Indus Basin

    Science.gov (United States)

    Forsythe, Nathan; Kilsby, Chris G.; Fowler, Hayley J.; Archer, David R.

    2010-05-01

    The water resources of the Upper Indus Basin (UIB) are of the utmost importance to the economic wellbeing of Pakistan. The irrigated agriculture made possible by Indus river runoff underpins the food security for Pakistan's nearly 200 million people. Contributions from hydropower account for more than one fifth of peak installed electrical generating capacity in a country where widespread, prolonged load-shedding handicaps business activity and industrial development. Pakistan's further socio-economic development thus depends largely on optimisation of its precious water resources. Confident, accurate seasonal predictions of water resource availability coupled with sound understanding of interannual variability are urgently needed by development planners and infrastructure managers at all levels. This study focuses on the challenge of providing meaningful quantitative information at the village/valley scale in the upper reaches of the UIB. Proceeding by progressive reductions in scale, the typology of the observed UIB hydrological regimes -- glacial, nival and pluvial -- is examined with special emphasis on interannual variability for individual seasons. Variations in discharge (runoff) are compared to observations of climate parameters (temperature, precipitation) and available spatial data (elevation, snow cover and snow-water-equivalent). The first scale presented is composed of the large-scale, long-record gauged UIB tributary basins. The Pakistan Water and Power Development Authority (WAPDA) has maintained these stations for several decades in order to monitor seasonal flows and accumulate data for design of further infrastructure. Data from basins defined by five gauging stations on the Indus, Hunza, Gilgit and Astore rivers are examined. The second scale presented is a set of smaller gauged headwater catchments with short records. These gauges were installed by WAPDA and its partners amongst the international development agencies to assess potential

  16. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log run-time metrics of data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
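
    Of the improvements listed, splitting large sequence files for downstream load balance is the easiest to make concrete. The sketch below is an illustrative FASTQ splitter (four lines per read), not Rainbow's actual code.

      def split_fastq(path, reads_per_chunk=1_000_000):
          """Split a FASTQ file (4 lines per read) into fixed-size chunks."""
          chunk, n_reads, part = [], 0, 0
          with open(path) as fh:
              for i, line in enumerate(fh):
                  chunk.append(line)
                  if (i + 1) % 4 == 0:            # a complete read accumulated
                      n_reads += 1
                      if n_reads == reads_per_chunk:
                          _flush(path, part, chunk)
                          chunk, n_reads, part = [], 0, part + 1
          if chunk:
              _flush(path, part, chunk)           # remainder

      def _flush(path, part, lines):
          with open(f"{path}.part{part:04d}", "w") as out:
              out.writelines(lines)

    Equal-sized chunks keep the per-instance alignment time roughly uniform, which is what makes the downstream cloud workers well balanced.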

  17. Shared Communications: Volume 2. In-Depth Systems Research

    Energy Technology Data Exchange (ETDEWEB)

    Truett, LF

    2004-09-22

    This report is the second of two documents that examine the literature for actual examples of organizations and agencies that share communications resources. While the primary emphasis is on rural, intelligent transportation system (ITS) communications involving transit, examples will not be limited to rural activities, nor to ITS implementation, nor even to transit. In addition, the term "communication" will be broadly applied to include all information resources. The first document of this series, "Shared Communications: Volume I. A Summary and Literature Review", defines the meaning of the term "shared communication resources" and provides many examples of agencies that share resources. This document, "Shared Communications: Volume II. In-Depth Systems Research", reviews attributes that contributed to successful applications of the shared-communication-resources concept. A few examples of each type of communication sharing are provided. Based on the issues and best-practice real-world examples, recommendations for potential usage and recommended approaches for field operational tests are provided.

  18. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described in detail the large-scale communication architecture of the IoT. In fact, the non-uniformity of technologies between IPv6 and access points has left large-scale communication architectures without broad guiding principles. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  19. Multi-granularity Bandwidth Allocation for Large-Scale WDM/TDM PON

    Science.gov (United States)

    Gao, Ziyue; Gan, Chaoqin; Ni, Cuiping; Shi, Qiongling

    2017-12-01

    WDM (wavelength-division multiplexing)/TDM (time-division multiplexing) PON (passive optical network) is viewed as a promising solution for delivering multiple services and applications, such as high-definition video, video conferencing and data traffic. Considering real-time transmission, QoS (quality of service) requirements and the differentiated services model, a multi-granularity dynamic bandwidth allocation (DBA) in both the wavelength and time domains for large-scale hybrid WDM/TDM PON is proposed in this paper. The proposed scheme achieves load balance by using bandwidth prediction. Based on this prediction, wavelength assignment can be realized fairly and effectively to satisfy the different demands of various classes. In particular, the allocation of residual bandwidth further augments the DBA and makes full use of bandwidth resources in the network. To further improve network performance, two schemes, named extending the cycle of one free wavelength (ECoFW) and large bandwidth shrinkage (LBS), are proposed, which can prevent transmission interruption when a user employs more than one wavelength. The simulation results show the effectiveness of the proposed scheme.
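
    A minimal sketch of the prediction-plus-residual idea: per-ONU demand is forecast (here with an exponentially weighted moving average, our assumption), grants are capped at a guaranteed share, and leftover capacity is redistributed in proportion to unmet demand. This illustrates the general DBA pattern, not the paper's algorithm.

      def ewma_forecast(history, alpha=0.5):
          """Per-ONU bandwidth prediction from recent requests (assumed EWMA)."""
          est = history[0]
          for x in history[1:]:
              est = alpha * x + (1 - alpha) * est
          return est

      def allocate(requests, capacity, guaranteed):
          """Grant min(request, guaranteed share), then redistribute the residual.

          Guaranteed shares are assumed to sum to at most the link capacity.
          """
          grants = {onu: min(req, guaranteed[onu]) for onu, req in requests.items()}
          residual = capacity - sum(grants.values())
          unmet = {o: requests[o] - grants[o] for o in requests if requests[o] > grants[o]}
          total = sum(unmet.values())
          if total > 0 and residual > 0:
              for onu, deficit in unmet.items():  # proportional residual allocation
                  grants[onu] += residual * deficit / total
          return grants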

  20. Optimal Capacity Allocation of Large-Scale Wind-PV-Battery Units

    Directory of Open Access Journals (Sweden)

    Kehe Wu

    2014-01-01

    Full Text Available An optimal capacity allocation of large-scale wind-photovoltaic-battery (wind-PV-battery) units was proposed. First, an output power model was established according to meteorological conditions. Then, a wind-PV-battery unit was connected to the power grid as a power-generation unit with a rated capacity under a fixed coordinated operation strategy. Second, the utilization rate of renewable energy sources and maximum wind-PV complementation was considered and the objective function of full-life-cycle net present cost (NPC) was calculated through a hybrid iteration/adaptive hybrid genetic algorithm (HIAGA). The optimal capacity ratio among wind generator, PV array, and battery device was also calculated simultaneously. A simulation was conducted based on the wind-PV-battery unit in Zhangbei, China. Results showed that a wind-PV-battery unit could effectively minimize the NPC of power-generation units under a stable grid-connected operation. Finally, the sensitivity analysis of the wind-PV-battery unit demonstrated that the optimization result was closely related to potential wind-solar resources and government support. Regions with rich wind resources and a reasonable government energy policy could improve the economic efficiency of their power-generation units.
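
    The optimization target can be made concrete with a toy model: net present cost as capital expenditure plus discounted operation and maintenance, minimized over candidate capacities. The cost figures, discount rate and the random search standing in for HIAGA below are all placeholders, not the paper's data or algorithm.

      import random

      def npc(capacity, unit_cost, om_frac=0.02, rate=0.06, years=20):
          """Toy full-life-cycle net present cost: capex plus discounted O&M."""
          capex = sum(capacity[k] * unit_cost[k] for k in capacity)
          om = om_frac * capex
          return capex + sum(om / (1.0 + rate) ** y for y in range(1, years + 1))

      def search(unit_cost, feasible, trials=10_000, seed=0):
          """Random-search stand-in for the paper's hybrid genetic algorithm (HIAGA)."""
          rng = random.Random(seed)
          best, best_cost = None, float("inf")
          for _ in range(trials):
              cand = {k: rng.uniform(0.0, 100.0) for k in ("wind", "pv", "battery")}
              if not feasible(cand):       # e.g. rated-capacity and complementation limits
                  continue
              cost = npc(cand, unit_cost)
              if cost < best_cost:
                  best, best_cost = cand, cost
          return best, best_cost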

  1. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  2. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given, and the results obtained so far in this domain are presented. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a fluctuation linear channel with automatic scale switching is described and the results of the tests are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is assessed. The results obtained on a test system built according to this method are given and a preliminary plan for further research is defined [fr]

  3. Modeling and Validating Time, Buffering, and Utilization of a Large-Scale, Real-Time Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    Data acquisition systems for large-scale high-energy physics experiments have to handle hundreds of gigabytes per second of data, and are typically realized as specialized data centers that connect a very large number of front-end electronics devices to an event detection and storage system. The design of such systems is often based on many assumptions, small-scale experiments and a substantial amount of over-provisioning. In this work, we introduce a discrete event-based simulation tool that models the data flow of the current ATLAS data acquisition system, with the main goal to be accurate with regard to the main operational characteristics. We measure buffer occupancy counting the number of elements in buffers, resource utilization measuring output bandwidth and counting the number of active processing units, and their time evolution by comparing data over many consecutive and small periods of time. We perform studies on the error of simulation when comparing the results to a large amount of real-world ope...

  4. Modeling and Validating Time, Buffering, and Utilization of a Large-Scale, Real-Time Data Acquisition System

    CERN Document Server

    AUTHOR|(SzGeCERN)756497; The ATLAS collaboration; Garcia Garcia, Pedro Javier; Vandelli, Wainer; Froening, Holger

    2017-01-01

    Data acquisition systems for large-scale high-energy physics experiments have to handle hundreds of gigabytes per second of data, and are typically implemented as specialized data centers that connect a very large number of front-end electronics devices to an event detection and storage system. The design of such systems is often based on many assumptions, small-scale experiments and a substantial amount of over-provisioning. In this paper, we introduce a discrete event-based simulation tool that models the dataflow of the current ATLAS data acquisition system, with the main goal to be accurate with regard to the main operational characteristics. We measure buffer occupancy counting the number of elements in buffers; resource utilization measuring output bandwidth and counting the number of active processing units, and their time evolution by comparing data over many consecutive and small periods of time. We perform studies on the error in simulation when comparing the results to a large amount of real-world ...

  5. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, and open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets.

  6. Alternatives to electricity for transmission and annual-scale firming - Storage for diverse, stranded, renewable energy resources: hydrogen and ammonia

    Energy Technology Data Exchange (ETDEWEB)

    Leighty, William

    2010-09-15

    The world's richest renewable energy resources 'of large geographic extent and high intensity' are stranded: far from end-users, with inadequate or nonexistent gathering and transmission systems to deliver the energy. Output of most renewables varies greatly, at time scales from seconds to seasons: energy capture assets operate at low capacity factor (CF); energy delivery is not 'firm'. New electric transmission systems, or fractions thereof, dedicated to renewables suffer the same low CF: substantial stranded capital assets, increasing the cost of delivered renewable-source energy. Electricity storage cannot affordably firm large renewables at annual scale. Gaseous hydrogen and anhydrous ammonia fuels can, making them attractive alternatives.

  7. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach-profiling lidar. A detailed discussion of the original LSTF features and capabilities can be... The purpose of these upgrades was to increase

  8. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed; the model outperforms existing human mobility models under scenarios of large geographical scales.
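
    A compact simulation of the two-ingredient model: with probability rho * S^(-gamma) the walker explores a new location via a heavy-tailed jump, otherwise it returns to a previously visited location in proportion to visit frequency, and each walker's exploration tendency rho is drawn from a truncated Gaussian, per the abstract. All parameter values below are illustrative, not the paper's fitted values.

      import numpy as np

      def simulate_walker(steps, gamma=0.6, rho_mu=0.6, rho_sigma=0.1, seed=0):
          rng = np.random.default_rng(seed)
          # Gaussian-distributed exploration tendency, clipped to stay a valid probability
          rho = float(np.clip(rng.normal(rho_mu, rho_sigma), 0.05, 0.95))
          visits = {(0.0, 0.0): 1}
          pos = (0.0, 0.0)
          for _ in range(steps):
              s = len(visits)                     # number of distinct visited locations
              if rng.random() < rho * s ** (-gamma):
                  # exploration: Lévy-flight-like jump with a power-law step length
                  step = rng.pareto(1.5) + 1.0
                  angle = rng.uniform(0.0, 2.0 * np.pi)
                  pos = (pos[0] + step * np.cos(angle), pos[1] + step * np.sin(angle))
              else:
                  # preferential return: revisit in proportion to past visit counts
                  locs, counts = zip(*visits.items())
                  probs = np.asarray(counts, float) / sum(counts)
                  pos = locs[rng.choice(len(locs), p=probs)]
              visits[pos] = visits.get(pos, 0) + 1
          return visits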

  9. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  10. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  11. Shared control on lunar spacecraft teleoperation rendezvous operations with large time delay

    Science.gov (United States)

    Ya-kun, Zhang; Hai-yang, Li; Rui-xue, Huang; Jiang-hui, Liu

    2017-08-01

    Teleoperation could be used in space on-orbit servicing missions, such as object deorbiting, spacecraft approach, and back-up for automatic rendezvous and docking systems. Teleoperation rendezvous and docking in lunar orbit may encounter bottlenecks due to the inherent time delay in the communication link and the limited measurement accuracy of sensors. Moreover, human intervention is unsuitable in view of the partial communication coverage problem. To solve these problems, a shared control strategy for teleoperation rendezvous and docking is detailed. This paper discusses the allocation of control authority in lunar orbital maneuvers involving two spacecraft during the final rendezvous and docking phase. A predictive display model based on the relative dynamic equations is established to overcome the influence of the large time delay in the communication link. We discuss, and attempt to demonstrate via consistent ground-based simulations, the relative merits of a fully autonomous control mode (i.e., onboard computer-based), fully manual control (i.e., human-driven at the ground station) and a shared control mode. The simulation experiments were conducted on a nine-degrees-of-freedom teleoperation rendezvous and docking simulation platform. Simulation results indicated that the shared control method can overcome the influence of time delay effects. In addition, the docking success probability of the shared control method was enhanced compared with the automatic and manual modes.
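
    The abstract does not give the relative dynamic equations; for a near-circular target orbit, a predictive display of this kind would typically propagate the standard Clohessy-Wiltshire (Hill) linearized relative-motion equations ahead by the round-trip delay. They are shown here as an assumed stand-in, with n the target's mean motion, x radial, y along-track, z cross-track, and f the thrust acceleration:

      \ddot{x} - 2n\dot{y} - 3n^{2}x = f_x, \qquad
      \ddot{y} + 2n\dot{x} = f_y, \qquad
      \ddot{z} + n^{2}z = f_z

    Integrating these equations over the current communication delay lets the display show the estimated present relative state, so the operator commands against the prediction rather than against stale telemetry.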

  12. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  13. Scaling cost-sharing to wages: how employers can reduce health spending and provide greater economic security.

    Science.gov (United States)

    Robertson, Christopher T

    2014-01-01

    In the employer-sponsored insurance market that covers most Americans, many workers are "underinsured." The evidence shows onerous out-of-pocket payments causing them to forgo needed care, miss work, and fall into bankruptcies and foreclosures. Nonetheless, many higher-paid workers are "overinsured": the evidence shows that in this domain, surplus insurance stimulates spending and price inflation without improving health. Employers can solve these problems together by scaling cost-sharing to wages. This reform would make insurance better protect against risk and guarantee access to care, while maintaining or even reducing insurance premiums. Yet, there are legal obstacles to scaled cost-sharing. The group-based nature of employer health insurance, reinforced by federal law, makes it difficult for scaling to be achieved through individual choices. The Affordable Care Act's (ACA) "essential coverage" mandate also caps cost-sharing even for wealthy workers who need no such cap. Additionally, there is a tax distortion in favor of highly paid workers purchasing healthcare through insurance rather than out-of-pocket. These problems are all surmountable. In particular, the ACA has expanded the applicability of an unenforced employee-benefits rule that prohibits "discrimination" in favor of highly compensated workers. A novel analysis shows that this statute gives the Internal Revenue Service the authority to require scaling and to thereby eliminate the current inequities and inefficiencies caused by the tax distortion. The promise is smarter insurance for over 150 million Americans.
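
    To make the proposal concrete, a minimal illustration of wage-scaled cost-sharing: the out-of-pocket maximum becomes a fixed share of wages rather than a flat dollar cap. The 5% share and the cap figure below are invented numbers for illustration only, not values from the article.

      def out_of_pocket_max(annual_wage, share=0.05, statutory_cap=9_000.0):
          """Wage-scaled OOP maximum: a fixed share of wages, bounded by a statutory cap."""
          return min(share * annual_wage, statutory_cap)

      # A $30,000 earner faces a $1,500 maximum; a $200,000 earner hits the $9,000 cap.
      for wage in (30_000, 100_000, 200_000):
          print(wage, out_of_pocket_max(wage))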

  14. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    Science.gov (United States)

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using PubMed, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of the 152 trials, 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical …

  15. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-04-01

    Full Text Available Abstract Background Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. Methods We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using PubMed, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. Results We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of the 152 trials, 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed.

  16. Capturing subregional variability in regional-scale climate change vulnerability assessments of natural resources

    Science.gov (United States)

    Polly C. Buotte; David L. Peterson; Kevin S. McKelvey; Jeffrey A. Hicke

    2016-01-01

    Natural resource vulnerability to climate change can depend on the climatology and ecological conditions at a particular site. Here we present a conceptual framework for incorporating spatial variability in natural resource vulnerability to climate change in a regional-scale assessment. The framework was implemented in the first regional-scale vulnerability...

  17. A Carrier Bag Story of (waste) food, hens and the sharing economy

    DEFF Research Database (Denmark)

    Fjalland, Emmy Laura Perez

    2018-01-01

    … futures by showing the collaborative, compassionate, responsible qualities of the sharing economy of the exchange of waste food. With help from The Carrier Bag Theory – an alternative, feminist narrative – and the mobilities paradigm, this article shows the transformative gestures of ethical …, ecologies and different waste, recycling and/or upcycling systems. Within these disposal systems, valuable resources are being lost. Based on empirical work from a Danish project called Sharing City and a local small-scale organic farm (named Hegnsholt), this article elaborates upon how particular waste …, this article ought to inspire us to rethink how to share this planet with earth-others.

  18. Energy Decomposition Analysis Based on Absolutely Localized Molecular Orbitals for Large-Scale Density Functional Theory Calculations in Drug Design.

    Science.gov (United States)

    Phipps, M J S; Fox, T; Tautermann, C S; Skylaris, C-K

    2016-07-12

    We report the development and implementation of an energy decomposition analysis (EDA) scheme in the ONETEP linear-scaling electronic structure package. Our approach is hybrid as it combines the localized molecular orbital EDA (Su, P.; Li, H. J. Chem. Phys., 2009, 131, 014102) and the absolutely localized molecular orbital EDA (Khaliullin, R. Z.; et al. J. Phys. Chem. A, 2007, 111, 8753-8765) to partition the intermolecular interaction energy into chemically distinct components (electrostatic, exchange, correlation, Pauli repulsion, polarization, and charge transfer). Limitations shared in EDA approaches such as the issue of basis set dependence in polarization and charge transfer are discussed, and a remedy to this problem is proposed that exploits the strictly localized property of the ONETEP orbitals. Our method is validated on a range of complexes with interactions relevant to drug design. We demonstrate the capabilities for large-scale calculations with our approach on complexes of thrombin with an inhibitor comprised of up to 4975 atoms. Given the capability of ONETEP for large-scale calculations, such as on entire proteins, we expect that our EDA scheme can be applied in a large range of biomolecular problems, especially in the context of drug design.
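
    Schematically, the partitioning described above can be written as a sum of chemically distinct terms. This is a generic grouping consistent with the components listed; the precise definitions follow the LMO-EDA and ALMO-EDA papers cited:

        \Delta E_{\mathrm{int}} = \Delta E_{\mathrm{elec}} + \Delta E_{\mathrm{exch}}
            + \Delta E_{\mathrm{corr}} + \Delta E_{\mathrm{Pauli}}
            + \Delta E_{\mathrm{pol}} + \Delta E_{\mathrm{CT}}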

  19. Large sharing networks and unusual injection practices explain the rapid rise in HIV among IDUs in Sargodha, Pakistan

    Directory of Open Access Journals (Sweden)

    Qureshi Salman U

    2009-06-01

    Full Text Available Abstract Background Of the nearly 100,000 street-based IDUs in Pakistan, 20% have HIV. We investigated the recent rise in HIV prevalence from 12 to 52% among IDUs in Sargodha despite >70% coverage with syringe exchanges. Methods We interviewed approximately 150 IDUs and 30 outreach workers in focus group discussions. Results We found six rural and 28 urban injecting locations. Urban locations have about 20–30 people at any time and about 100 daily; rural locations have twice as many (national average: 4–15). About half of the IDUs started injecting within the past 2 years and are not proficient at injecting themselves. They use street injectors, who have 15–16 clients daily. Heroin is almost exclusively the drug used. Most inject 5–7 times daily. Nearly all injectors claim to use fresh syringes. However, they load, inject and share using a locally developed method called "scale". Most Pakistani IDUs prefer to double-pump the drug in the syringe, which allows blood to mix with the drug in the syringe. The injector injects 3 ml and keeps 2 ml (the "scale") as his injection fee. The injector usually pools all the leftover scale (now with some blood mixed with the drug), either for his own use or to sell. Most IDUs backload the scale they buy into their own fresh syringes. Discussion Use of an unprecedented method of injecting drugs that largely bypasses fresh syringes, the larger size of sharing networks, higher injection frequency and near-universal use of street injectors likely explain the rapid rise in HIV prevalence among IDUs in Sargodha despite high-level provision of fresh syringes. This had been missed by us and by the national surveillance, which is quantitative. We have addressed this by hiring injectors as peer outreach workers and by increasing syringe supply. Our findings highlight the importance of both qualitative research and operations research to enrich the quality of HIV prevention programs.

  20. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6,400 ha were installed in the Poznan province; the average system covered 95 ha. By 1989 there were 98 systems covering more than 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares, between 1986 and 1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs; electricity prices also tripled. In practice, the operation of large-scale irrigation encountered barriers of all kinds: limitations of system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A site inspection of the area documented the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  1. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    … structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method.

  2. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    Science.gov (United States)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large-scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large-scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large-scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to about ±70°, latitudes that are more accessible than the poles for landing.

  3. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  4. Antimicrobial residues in animal waste and water resources proximal to large-scale swine and poultry feeding operations

    Science.gov (United States)

    Campagnolo, E.R.; Johnson, K.R.; Karpati, A.; Rubin, C.S.; Kolpin, D.W.; Meyer, M.T.; Esteban, J. Emilio; Currier, R.W.; Smith, K.; Thu, K.M.; McGeehin, M.

    2002-01-01

    Expansion and intensification of large-scale animal feeding operations (AFOs) in the United States has resulted in concern about environmental contamination and its potential public health impacts. The objective of this investigation was to obtain background data on a broad profile of antimicrobial residues in animal wastes and surface water and groundwater proximal to large-scale swine and poultry operations. The samples were measured for antimicrobial compounds using both radioimmunoassay and liquid chromatography/electrospray ionization-mass spectrometry (LC/ESI-MS) techniques. Multiple classes of antimicrobial compounds (commonly at concentrations of >100 μg/l) were detected in swine waste storage lagoons. In addition, multiple classes of antimicrobial compounds were detected in surface and groundwater samples collected proximal to the swine and poultry farms. This information indicates that animal waste used as fertilizer for crops may serve as a source of antimicrobial residues for the environment. Further research is required to determine if the levels of antimicrobials detected in this study are of consequence to human and/or environmental ecosystems. A comparison of the radioimmunoassay and LC/ESI-MS analytical methods documented that radioimmunoassay techniques were only appropriate for measuring residues in animal waste samples likely to contain high levels of antimicrobials. More sensitive LC/ESI-MS techniques are required in environmental samples, where low levels of antimicrobial residues are more likely.

  5. Antimicrobial residues in animal waste and water resources proximal to large-scale swine and poultry feeding operations.

    Science.gov (United States)

    Campagnolo, Enzo R; Johnson, Kammy R; Karpati, Adam; Rubin, Carol S; Kolpin, Dana W; Meyer, Michael T; Esteban, J Emilio; Currier, Russell W; Smith, Kathleen; Thu, Kendall M; McGeehin, Michael

    2002-11-01

    Expansion and intensification of large-scale animal feeding operations (AFOs) in the United States has resulted in concern about environmental contamination and its potential public health impacts. The objective of this investigation was to obtain background data on a broad profile of antimicrobial residues in animal wastes and surface water and groundwater proximal to large-scale swine and poultry operations. The samples were measured for antimicrobial compounds using both radioimmunoassay and liquid chromatography/electrospray ionization-mass spectrometry (LC/ESI-MS) techniques. Multiple classes of antimicrobial compounds (commonly at concentrations of > 100 microg/l) were detected in swine waste storage lagoons. In addition, multiple classes of antimicrobial compounds were detected in surface and groundwater samples collected proximal to the swine and poultry farms. This information indicates that animal waste used as fertilizer for crops may serve as a source of antimicrobial residues for the environment. Further research is required to determine if the levels of antimicrobials detected in this study are of consequence to human and/or environmental ecosystems. A comparison of the radioimmunoassay and LC/ESI-MS analytical methods documented that radioimmunoassay techniques were only appropriate for measuring residues in animal waste samples likely to contain high levels of antimicrobials. More sensitive LC/ESI-MS techniques are required in environmental samples, where low levels of antimicrobial residues are more likely.

  6. A Wireless Power Sharing Control Strategy for Hybrid Energy Storage Systems in DC Microgrids

    DEFF Research Database (Denmark)

    Yang, Jie; Jin, Xinmin; Wu, Xuezhi

    2017-01-01

    In order to compensate for power fluctuations on multiple time scales resulting from distributed energy resources and loads, hybrid energy storage systems are employed as the buffer unit in DC microgrids. In this paper, a wireless hierarchical control strategy is proposed to realize power sharing between …
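
    The conventional droop characteristic that communication-less ("wireless") sharing schemes build on can be sketched as follows; the bus voltage and virtual resistances are illustrative values, not parameters from the paper:

        def droop_voltage(v_nom, r_virtual, i_out):
            """DC droop law: terminal voltage sags with output current, so
            parallel converters share load inversely to their virtual resistances."""
            return v_nom - r_virtual * i_out

        # Two converters on a common DC bus (illustrative numbers). In steady
        # state the terminal voltages are equal, so i1 * r1 == i2 * r2 and the
        # unit with half the virtual resistance supplies twice the current.
        v_nom, r1, r2, i_load = 400.0, 0.5, 1.0, 30.0
        i1 = i_load * r2 / (r1 + r2)   # 20 A
        i2 = i_load * r1 / (r1 + r2)   # 10 A
        print(droop_voltage(v_nom, r1, i1), droop_voltage(v_nom, r2, i2))  # 390.0 390.0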

  7. Life as an emergent phenomenon: studies from a large-scale boid simulation and web data

    Science.gov (United States)

    Ikegami, Takashi; Mototake, Yoh-ichi; Kobori, Shintaro; Oka, Mizuki; Hashimoto, Yasuhiro

    2017-11-01

    A large group with a special structure can become the mother of emergence. We discuss this hypothesis in relation to large-scale boid simulations and web data. In the boid swarm simulations, the nucleation, organization and collapse dynamics were found to be more diverse in larger flocks than in smaller flocks. In the second analysis, large web data, consisting of shared photos with descriptive tags, tended to group together users with similar tendencies, allowing the network to develop a core-periphery structure. We show that the generation rate of novel tags and their usage frequencies are high in the higher-order cliques. In this case, novelty is not considered to arise randomly; rather, it is generated as a result of a large and structured network. We contextualize these results in terms of adjacent possible theory and as a new way to understand collective intelligence. We argue that excessive information and material flow can become a source of innovation. This article is part of the themed issue 'Reconceptualizing the origins of life'.
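
    For readers unfamiliar with boids: each agent steers by three local rules (cohesion, alignment, separation). The sketch below is the textbook Reynolds model with illustrative parameters, not the authors' large-scale implementation:

        import numpy as np

        def boid_step(pos, vel, dt=0.1, r=2.0, w_coh=0.01, w_ali=0.05, w_sep=0.05):
            """One synchronous update of the classic three-rule boid model.
            pos, vel: (N, 2) arrays; all weights and radii are illustrative."""
            new_vel = vel.copy()
            for i in range(len(pos)):
                d = np.linalg.norm(pos - pos[i], axis=1)
                nbrs = (d > 0) & (d < r)         # neighbourhood for cohesion/alignment
                close = (d > 0) & (d < r / 2)    # tighter zone for separation
                if nbrs.any():
                    new_vel[i] += w_coh * (pos[nbrs].mean(axis=0) - pos[i])   # cohesion
                    new_vel[i] += w_ali * (vel[nbrs].mean(axis=0) - vel[i])   # alignment
                if close.any():
                    new_vel[i] += w_sep * (pos[i] - pos[close].mean(axis=0))  # separation
            return pos + dt * new_vel, new_vel

        rng = np.random.default_rng(0)
        pos, vel = rng.uniform(0, 10, (200, 2)), rng.normal(0, 1, (200, 2))
        for _ in range(100):
            pos, vel = boid_step(pos, vel)   # flocks nucleate, merge, and collapse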

  8. Participatory Modeling Processes to Build Community Knowledge Using Shared Model and Data Resources and in a Transboundary Pacific Northwest Watershed (Nooksack River Basin, Washington, USA)

    Science.gov (United States)

    Bandaragoda, C.; Dumas, M.

    2014-12-01

    As with many western US watersheds, the Nooksack River Basin faces strong pressures associated with climate variability and change, rapid population growth, and deep-rooted water law. This transboundary basin includes contributing areas in British Columbia, Canada, and has a long history of joint data collection, model development, and facilitated communication between governmental (federal, tribal, state, local), environmental, timber, agricultural, and recreational user groups. However, each entity in the watershed responds to unique data coordination, information sharing, and adaptive management regimes and thresholds, further increasing the complexity of watershed management. Over the past four years, participatory methods were used to compile and review scientific data and models, including fish habitat (endangered salmonid species), channel hydraulics, climate data, agricultural, municipal and industrial water use, and integrated watershed scale distributed hydrologic models from over 15 years of projects (from jointly funded to independent shared work by individual companies, agencies, and universities). A specific outcome of the work includes participatory design of a collective problem statement used for guidance on future investment of shared resources and development of a data-generation process where modeling results are communicated in a three-tiers for 1) public/decision-making, 2) technical, and 3) research audiences. We establish features for successful participation using tools that are iteratively developed, tested for usability through incremental knowledge building, and designed to provide rigor in modeling. A general outcome of the work is ongoing support by tribal, state, and local governments, as well as the agricultural community, to continue the generation of shared watershed data using models in a dynamic legal and regulatory setting, where two federally recognized tribes have requested federal court resolution of federal treaty rights

  9. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    Science.gov (United States)

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
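
    For orientation, a bare-bones workflow in plain Luigi (the system SciLuigi extends) is sketched below; the task and file names are hypothetical. SciLuigi's flow-based-programming twist is that dependencies are wired separately through named in-ports and out-ports rather than hard-coded in requires() as here:

        import luigi

        class ExtractFeatures(luigi.Task):
            dataset = luigi.Parameter()

            def output(self):
                return luigi.LocalTarget(f"{self.dataset}.features.csv")

            def run(self):
                with self.output().open("w") as f:
                    f.write("mw,logp\n180.2,1.4\n")  # stand-in for real feature extraction

        class TrainModel(luigi.Task):
            dataset = luigi.Parameter()

            def requires(self):
                return ExtractFeatures(dataset=self.dataset)  # dependency baked into the task

            def output(self):
                return luigi.LocalTarget(f"{self.dataset}.model.txt")

            def run(self):
                with self.input().open() as fin, self.output().open("w") as fout:
                    fout.write("model trained on:\n" + fin.read())

        if __name__ == "__main__":
            luigi.build([TrainModel(dataset="qsar_demo")], local_scheduler=True)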

  10. Community response to large-scale federal projects: the case of the MX

    International Nuclear Information System (INIS)

    Albrecht, S.L.

    1983-01-01

    An analysis of community response to large-scale defense projects, such as the proposals to site MX missiles in Utah and Nevada, is one way to identify those factors likely to be important in determining community response to nuclear waste repository siting. This chapter gives a brief overview of the MX system's characteristics and the potential impacts it would have had on the rural areas, describes the patterns of community mobilization that occurred in Utah and Nevada, and suggests where this response may parallel community concerns about a repository siting. Three lessons from the MX experience are that local residents, asked to assume a disproportionate share of the negative impacts, should be involved in the siting process; that local residents should be treated as equals; and that compensation should be offered when local residents suffer for the sake of political expediency.

  11. Resource-Use Efficiency in Rice Production Under Small Scale ...

    African Journals Online (AJOL)

    acer

    specific objectives of the study were to determine resource use efficiency, describe ... economic level. ... this key variable with a view to stepping ... focused on small-scale irrigation systems for ... farmers were assumed to be operating under.

  12. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  13. Simulation modelling of central order processing system under resource sharing strategy in demand-driven garment supply chains

    Science.gov (United States)

    Ma, K.; Thomassey, S.; Zeng, X.

    2017-10-01

    In this paper we propose a central order processing system under a resource-sharing strategy for demand-driven garment supply chains, intended to increase supply chain performance, and we examine the system using simulation. Simulation results show that the new collaborative model with the proposed system achieves significant improvement in various performance indicators.
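
    A discrete-event sketch of a shared central order-processing resource, written with SimPy; the arrival and service rates are invented and this is not the authors' model:

        import random
        import simpy

        def order(env, name, processor):
            """An order queues for the shared central processing capacity."""
            arrive = env.now
            with processor.request() as req:
                yield req                                       # wait for free capacity
                waited = env.now - arrive
                yield env.timeout(random.expovariate(1 / 4.0))  # service, mean 4 time units
            print(f"{name}: queued {waited:.1f}")

        def arrivals(env, processor, n_orders=20):
            for i in range(n_orders):
                env.process(order(env, f"order-{i:02d}", processor))
                yield env.timeout(random.expovariate(1 / 3.0))  # inter-arrival, mean 3

        random.seed(42)
        env = simpy.Environment()
        central = simpy.Resource(env, capacity=2)  # capacity pooled across the chain
        env.process(arrivals(env, central))
        env.run()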

  14. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  15. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  16. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be helpless in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island to enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level …

  17. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  18. GoFFish: A Sub-Graph Centric Framework for Large-Scale Graph Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Wickramaarachchi, Charith; Nagarkar, Soonil; Ravi, Santosh; Raghavendra, Cauligi; Prasanna, Viktor

    2014-08-25

    Large-scale graph processing is a major research area for Big Data exploration. Vertex-centric programming models like Pregel are gaining traction due to their simple abstraction that naturally allows for scalable execution on distributed systems. However, there are limitations to this approach that cause vertex-centric algorithms to under-perform due to a poor compute-to-communication ratio and slow convergence of iterative supersteps. In this paper we introduce GoFFish, a scalable sub-graph centric framework co-designed with a distributed persistent graph storage for large-scale graph analytics on commodity clusters. We introduce a sub-graph centric programming abstraction that combines the scalability of a vertex-centric approach with the flexibility of shared-memory sub-graph computation. We map Connected Components, SSSP and PageRank algorithms to this model to illustrate its flexibility. Further, we empirically analyze GoFFish using several real-world graphs and demonstrate its significant performance improvement, orders of magnitude in some cases, compared to Apache Giraph, the leading open source vertex-centric implementation.
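
    The contrast between the two models can be caricatured in a few lines: a vertex-centric connected-components algorithm exchanges labels every superstep, whereas a sub-graph centric one first converges locally within each partition and only then communicates. The toy code below illustrates the idea with a hypothetical partitioning; it is not GoFFish's API:

        def cc_subgraph_centric(partitions, cut_edges):
            """Connected components, sub-graph centric style: converge labels
            inside each partition with no communication, then reconcile labels
            across cut edges, and repeat until stable.
            partitions: adjacency dicts; cut_edges: (u, v) pairs across partitions."""
            label = {v: v for part in partitions for v in part}
            changed = True
            while changed:
                changed = False
                for part in partitions:            # local phase: shared-memory only
                    local = True
                    while local:
                        local = False
                        for u, nbrs in part.items():
                            m = min([label[u]] + [label[v] for v in nbrs])
                            if m < label[u]:
                                label[u], local, changed = m, True, True
                for u, v in cut_edges:             # communication phase
                    m = min(label[u], label[v])
                    if label[u] != m or label[v] != m:
                        label[u] = label[v] = m
                        changed = True
            return label

        p0 = {0: [1], 1: [0, 2], 2: [1]}           # partition 0
        p1 = {3: [4], 4: [3]}                      # partition 1
        print(cc_subgraph_centric([p0, p1], cut_edges=[(2, 3)]))  # all labels -> 0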

  19. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that the financially feasible integration of large-scale heat pumps (HP) with existing combined heat and power (CHP) plants, in support of the overall economic cost-effectiveness and flexibility of the Danish energy system, is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using a ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gases as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in the Nordic Power Exchange (Nord Pool) towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit at half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a possibly financially feasible alternative to CHP operation, rather than a supplement to it. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HP with lower than half the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit's flue gases for both concurrent (CHP unit and HP unit) and independent operation (HP unit only). For independent operation, the recovered heat must be stored.
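
    The sensitivity to heat-source temperature can be made concrete with a back-of-envelope comparison; all prices and COPs below are assumptions for illustration, not the paper's figures:

        def heat_cost(elec_price_per_mwh, cop):
            """Marginal cost of one MWh of heat from a heat pump."""
            return elec_price_per_mwh / cop

        spot = 60.0  # EUR/MWh electricity, assumed
        for source, cop in [("ground source (colder source)", 3.0),
                            ("condensed flue gas (warmer source)", 3.6)]:
            print(f"{source}: {heat_cost(spot, cop):.1f} EUR/MWh heat")

        # A ~20% higher COP from the warmer source yields ~17% cheaper heat,
        # the same order as the cost swings reported above.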

  20. Large-scale information flow in conscious and unconscious states: an ECoG study in monkeys.

    Directory of Open Access Journals (Sweden)

    Toru Yanagawa

    Full Text Available Consciousness is an emergent property of the complex brain network. In order to understand how consciousness is constructed, neural interactions within this network must be elucidated. Previous studies have shown that specific neural interactions between the thalamus and frontoparietal cortices, between frontal and parietal cortices, and between parietal and temporal cortices are correlated with levels of consciousness. However, due to technical limitations, the network underlying consciousness has not been investigated in terms of large-scale interactions with high temporal and spectral resolution. In this study, we recorded neural activity with dense electrocorticogram (ECoG) arrays and used spectral Granger causality to generate a more comprehensive network relating to consciousness in monkeys. We found that neural interactions were significantly different between conscious and unconscious states in all combinations of cortical region pairs. Furthermore, the difference in neural interactions between conscious and unconscious states could be represented in 4 frequency-specific large-scale networks with unique interaction patterns: 2 networks were related to consciousness and showed peaks in the alpha and beta bands, while the other 2 were related to unconsciousness and showed peaks in the theta and gamma bands. Moreover, networks in the unconscious state were shared among 3 different unconscious conditions, induced either by ketamine and medetomidine, by propofol, or by sleep. Our results provide a novel picture in which the difference between conscious and unconscious states is characterized by a switch in frequency-specific modes of large-scale communication across the entire cortex, rather than the cessation of interactions between specific cortical regions.

  1. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    The 336,000 km² Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and …

  2. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  3. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC Marc/Mentat and its fatigue strain is analyzed; the fatigue load cases of the flange are simulated with the Bladed software; the flange fatigue load spectrum is obtained with the rain-flow counting method; and finally, the fatigue analysis of the top flange is carried out with the fatigue analysis software MSC Fatigue and the Palmgren–Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis for large-scale wind turbine generators and have practical engineering value.
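
    The damage bookkeeping named above (rain-flow counted cycles fed into the Palmgren-Miner summation) can be sketched as follows; the S-N curve constants and cycle counts are placeholder values:

        def miner_damage(cycles, sn_C=1.0e12, sn_m=3.0):
            """Palmgren-Miner linear cumulative damage.
            cycles: (stress_range_MPa, count) pairs from rain-flow counting.
            S-N curve N_allowed = C / S**m; the constants are illustrative."""
            damage = 0.0
            for s_range, n in cycles:
                n_allowed = sn_C / s_range**sn_m
                damage += n / n_allowed
            return damage  # failure is predicted when damage >= 1

        # Hypothetical rain-flow output for one block of the flange load spectrum.
        block = [(40.0, 2.0e5), (80.0, 1.0e4), (120.0, 5.0e2)]
        d = miner_damage(block)
        print(f"damage per block: {d:.3e}, blocks to failure: {1 / d:.0f}")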

  4. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.

  5. Successful large-scale hatchery culture of sandfish (Holothuria scabra) using micro-algae concentrates as a larval food source

    Directory of Open Access Journals (Sweden)

    Thane A. Militz

    2018-02-01

    Full Text Available This paper reports methodology for large-scale hatchery culture of sandfish, Holothuria scabra, in the absence of live, cultured micro-algae. We demonstrate how commercially available micro-algae concentrates can be incorporated into hatchery protocols as the sole larval food source to completely replace live, cultured micro-algae. Micro-algae concentrates supported hatchery production of sandfish comparable to that achieved with the live, cultured micro-algae traditionally used in large-scale hatchery culture. The hatchery protocol presented allowed a single technician to achieve production of more than 18,800 juvenile sandfish at 40 days post-fertilisation in a low-resource hatchery in Papua New Guinea. Growth of auricularia larvae fed micro-algae concentrates was represented by the equation length (μm) = 307.8 × ln(day) + 209.2 (R² = 0.93), while survival over the entire 40-day hatchery cycle was described by the equation survival = 2 × day^(−1.06) (R² = 0.74). These results show that micro-algae concentrates have great potential for simplifying hatchery culture of sea cucumbers by reducing the infrastructural and technical resources required for live micro-algae culture. The hatchery methodology described in this study is likely to be applicable to low-resource hatcheries throughout the Indo-Pacific and could support regional expansion of sandfish hatchery production.

  6. An amodal shared resource model of language-mediated visual attention

    Directory of Open Access Journals (Sweden)

    Alastair Charles Smith

    2013-08-01

    Full Text Available Language-mediated visual attention describes the interaction of two fundamental components of the human cognitive system, language and vision. Within this paper we present an amodal shared resource model of language-mediated visual attention that offers a description of the information and processes involved in this complex multimodal behaviour and a potential explanation for how this ability is acquired. We demonstrate that the model is not only sufficient to account for the experimental effects of Visual World Paradigm studies but also that these effects are emergent properties of the architecture of the model itself, rather than requiring separate information-processing channels or modular processing systems. The model provides an explicit description of the connection between the modality-specific input from language and vision and the distribution of eye gaze in language-mediated visual attention. The paper concludes by discussing future applications for the model, specifically its potential for investigating the factors driving observed individual differences in language-mediated eye gaze.

  7. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  8. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory, was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly

  9. Study of large shareholders’ behavior after non-tradable shares reform: A perspective of related party transactions

    Directory of Open Access Journals (Sweden)

    Hongbo Zhang

    2013-09-01

    Full Text Available Purpose: This paper explores the behavioral choices of large shareholders in related party transactions (RPTs) that occur between large shareholders and listed companies, using share data from 2007 to 2010. Design/methodology/approach: Based on the classical research paradigm (LLSV), we analyze controlling shareholders' propping and tunneling behaviors to determine, in theory, their impacts on medium and small shareholders. Findings: We obtain the following findings. After the capital market entered the era of full circulation, the relationship between the controlling shareholders' ownership ratio and related party transactions presents an inverted "U"-shaped curve, indicating a typical "grab-synergy" effect. Different measures should be taken toward transactions between large shareholders and listed companies according to the property nature of the large shareholders: state-owned shareholders choose to realize their private benefits by means of RPTs, while non-state-owned shareholders conduct RPTs with an expectation of reducing costs. Practical implications: Since Guo Shuqing, the Chairman of the China Securities Regulatory Commission, took office, he has taken many measures to harshly curb related party transactions. Under these circumstances, it is just the right time to conduct research on large shareholders' behavior, which has important significance in both theory and practice. Originality/value: Considering China's special national conditions, this paper adds comprehensive factors to the study of large shareholders' behavior, including the ratio of shares held by the indirect controller, the probability that thievish behaviors will be discovered, and strict punishment regulations. The discussion in this paper helps to bring into focus a highly topical issue within the context of large shareholders' behavior after the Non-tradable Shares Reform.

  10. Factors influencing women's perceptions of shared decision making during labor and delivery: Results from a large-scale cohort study of first childbirth.

    Science.gov (United States)

    Attanasio, Laura B; Kozhimannil, Katy B; Kjerulff, Kristen H

    2018-06-01

    To examine correlates of shared decision making during labor and delivery. Data were from a cohort of women who gave birth to their first baby in Pennsylvania, 2009-2011 (N = 3006). We used logistic regression models to examine the association between labor induction and mode of delivery in relation to women's perceptions of shared decision making, and to investigate race/ethnicity and SES as potential moderators. Women who were Black and who did not have a college degree or private insurance were less likely to report high shared decision making, as were women who underwent labor induction, instrumental vaginal delivery, or cesarean delivery. Models with interaction terms showed that the reduction in odds of shared decision making associated with cesarean delivery was greater for Black women than for White women. Women in marginalized social groups were less likely to report shared decision making during birth, and Black women who delivered by cesarean had particularly low odds of shared decision making. Strategies designed to improve the quality of patient-provider communication, information sharing, and shared decision making must be attentive to the needs of vulnerable groups to ensure that such interventions reduce rather than widen disparities.

  11. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how …

  12. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted to a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^−3 power law, and its flux is directed to small scales. For moderate stratification, there is no VSHF, and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^−11/5 form at large scales to a steeper approximate k^−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^−1.6 power law. For all stratification strengths, the total energy always flows from large to small scales, and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  13. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^-3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^-3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^-11/5 form at large scales, to a steeper approximate k^-3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^-1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^-2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^-1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  14. Economic and hydrogeologic disparities govern the vulnerability of shared groundwater to strategic overdraft

    Science.gov (United States)

    Mullen, C.; Muller, M. F.

    2017-12-01

    Groundwater resources are depleting globally at an alarming rate. When the resource is shared, exploitation by individual users affects groundwater levels and increases pumping costs to all users. This incentivizes individual users to strategically over-pump, an effect that is challenging to keep in check because the underground nature of the resource often precludes regulations from being effectively implemented. As a result, shared groundwater resources are prone to tragedies of the commons that exacerbate their rapid depletion. However, we showed in a recent study that the vulnerability of aquifer systems to strategic overuse is strongly affected by local economic and physical characteristics, which suggests that not all shared aquifers are subject to tragedies of the commons. Building on these findings, we develop a vulnerability index based on coupled game theoretical and groundwater flow models. We show that vulnerability to strategic overdraft is driven by four intuitively interpretable dimensionless parameters that describe economic and hydrogeologic disparities between the agents exploiting the aquifer. This suggests a scale-independent relation between the vulnerability of groundwater systems to common-pool overdraft and their economic and physical characteristics. We investigate this relation for a sample of existing aquifer systems and explore implications for enforceable groundwater agreements that would effectively mitigate strategic overdraft.
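
    A minimal sketch of the strategic-overdraft mechanism: two symmetric users pump from a shared aquifer whose pumping cost rises with total withdrawal, and myopic best responses settle at a Nash equilibrium that over-pumps relative to the cooperative optimum. The quadratic profit model and all parameters are invented for illustration; the paper couples a game-theoretic model to an actual groundwater flow model.

    ```python
    # Toy common-pool pumping game (assumed model, not the authors'):
    # Profit_i = p*q_i - (c0 + beta*(q_i + q_j))*q_i, all parameters hypothetical.
    p, c0, beta = 10.0, 2.0, 0.5

    def best_response(q_other):
        # First-order condition of Profit_i in q_i, clipped at zero.
        return max(0.0, (p - c0 - beta * q_other) / (2 * beta))

    # Iterate myopic best responses until they settle at the Nash equilibrium.
    q1 = q2 = 0.0
    for _ in range(100):
        q1, q2 = best_response(q2), best_response(q1)

    Q_nash = q1 + q2
    Q_coop = (p - c0) / (2 * beta)   # total pumping maximizing joint profit
    print(f"Nash total pumping:        {Q_nash:.3f}")
    print(f"Cooperative total pumping: {Q_coop:.3f}")
    print(f"Overdraft ratio:           {Q_nash / Q_coop:.3f}")
    ```

    The Nash-to-cooperative pumping ratio (4/3 in this symmetric toy case) plays the role of a crude vulnerability index; the paper's index additionally captures the economic and hydrogeologic asymmetries between users.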

  15. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 × 10^4, and the radius ratio η = ri/ro is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Ro_t = -0.0909 to Ro_t = 0.3 are simulated. First, the LES of TC is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e., LES with a large Smagorinsky constant, is performed and shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
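
    For reference, the static Smagorinsky closure named above computes a local eddy viscosity ν_t = (c_s Δ)² |S| from the resolved strain rate. The sketch below evaluates it on a synthetic 2D velocity slice with NumPy finite differences; it is illustrative only and unrelated to the paper's actual solver or grids.

    ```python
    # Sketch of the static Smagorinsky closure: nu_t = (Cs * Delta)^2 * |S|.
    import numpy as np

    Cs, N, L = 0.1, 64, 2 * np.pi          # Smagorinsky constant from the text
    dx = L / N
    x = np.linspace(0, L, N, endpoint=False)
    X, Y = np.meshgrid(x, x, indexing="ij")
    u = np.sin(X) * np.cos(Y)              # synthetic, divergence-free test field
    v = -np.cos(X) * np.sin(Y)

    # Strain rate S_ij = 0.5*(du_i/dx_j + du_j/dx_i); np.gradient uses central
    # differences in the interior and one-sided differences at the edges.
    dudx, dudy = np.gradient(u, dx, axis=0), np.gradient(u, dx, axis=1)
    dvdx, dvdy = np.gradient(v, dx, axis=0), np.gradient(v, dx, axis=1)
    S11, S22, S12 = dudx, dvdy, 0.5 * (dudy + dvdx)

    # |S| = sqrt(2 S_ij S_ij); the filter width Delta is the grid spacing here.
    S_mag = np.sqrt(2 * (S11**2 + S22**2 + 2 * S12**2))
    nu_t = (Cs * dx) ** 2 * S_mag
    print("max eddy viscosity:", nu_t.max())
    ```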

  16. Monitoring strategies and scale appropriate hydrologic and biogeochemical modelling for natural resource management

    DEFF Research Database (Denmark)

    Bende-Michl, Ulrike; Volk, Martin; Harmel, Daren

    2011-01-01

    This short communication presents recommendations for developing scale-appropriate monitoring and modelling strategies to assist decision making in natural resource management (NRM). The ideas presented here were discussed in the session (S5) ‘Monitoring strategies and scale...... and communication between researcher and model developer on the one side, and natural resource managers and the model users on the other side to increase knowledge in: 1) the limitations and uncertainties of current monitoring and modelling strategies, 2) scale-dependent linkages between monitoring and modelling...

  17. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g^-1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  18. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using GPUs and the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms written in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were solved on the GPU to test the performance of the package. Comparing results between the single-CPU solver and the GPU solver shows that the GPU version runs up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
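
    A CPU-side NumPy sketch of the semi-implicit Fourier update for the Allen-Cahn equation conveys the core of the method: the stiff linear gradient term is folded into an implicit denominator in Fourier space while the nonlinear bulk term stays explicit. Parameters and initial condition are invented; the package itself runs this loop through CUDA FFTs.

    ```python
    # Semi-implicit Fourier scheme for d(phi)/dt = -M*(phi^3 - phi - kappa*lap(phi)).
    import numpy as np

    N, dt, M, kappa, steps = 128, 0.1, 1.0, 0.5, 200
    rng = np.random.default_rng(0)
    phi = 0.01 * (2 * rng.random((N, N)) - 1)    # small random initial noise

    k = 2 * np.pi * np.fft.fftfreq(N)            # wavenumbers, unit box spacing
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2

    for _ in range(steps):
        # Nonlinear bulk term evaluated explicitly in real space...
        nonlinear = phi**3 - phi
        # ...stiff gradient term treated implicitly in Fourier space.
        phi_hat = (np.fft.fft2(phi) - dt * M * np.fft.fft2(nonlinear)) \
                  / (1 + dt * M * kappa * k2)
        phi = np.fft.ifft2(phi_hat).real

    print("phase fraction:", (phi > 0).mean())   # domains coarsen toward +/-1
    ```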

  19. Allometric Scaling and Resource Limitations Model of Total Aboveground Biomass in Forest Stands: Site-scale Test of Model

    Science.gov (United States)

    CHOI, S.; Shi, Y.; Ni, X.; Simard, M.; Myneni, R. B.

    2013-12-01

    Sparseness of in-situ observations has precluded spatially explicit and accurate mapping of forest biomass. The need for large-scale maps has motivated various approaches that link forest biomass to geospatial predictors such as climate, forest type, soil property, and topography. Despite improved modeling techniques (e.g., machine learning and spatial statistics), a common limitation is that the biophysical mechanisms governing tree growth are neglected in these black-box type models. The absence of a priori knowledge may lead to false interpretation of modeled results or unexplainable shifts in outputs due to inconsistent training samples or study sites. Here, we present a gray-box approach combining known biophysical processes and geospatial predictors through parametric optimizations (inversion of reference measures). Total aboveground biomass in forest stands is estimated by incorporating the Forest Inventory and Analysis (FIA) and Parameter-elevation Regressions on Independent Slopes Model (PRISM). Two main premises of this research are: (a) the Allometric Scaling and Resource Limitations (ASRL) theory can provide a relationship between tree geometry and local resource availability constrained by environmental conditions; and (b) the zeroth order theory (size-frequency distribution) can expand individual tree allometry into total aboveground biomass at the forest stand level. In addition to the FIA estimates, two reference maps from the National Biomass and Carbon Dataset (NBCD) and U.S. Forest Service (USFS) were produced to evaluate the model. This research focuses on a site-scale test of the biomass model to explore the robustness of predictors, and to potentially improve models using additional geospatial predictors such as climatic variables, vegetation indices, soil properties, and lidar-/radar-derived altimetry products (or existing forest canopy height maps). As results, the optimized ASRL estimates satisfactorily
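
    The two premises can be made concrete with a toy computation: fit an individual-tree allometric power law B = a·D^b in log space, then expand it over a size-frequency distribution to a stand total. Data, coefficients and the Weibull diameter distribution below are all synthetic stand-ins; the actual ASRL inversion constrains the allometry with local resource limits rather than fitting it freely.

    ```python
    # Hedged sketch of the allometric idea behind ASRL-type models.
    import numpy as np

    rng = np.random.default_rng(1)
    D = rng.uniform(5, 60, 200)                    # stem diameters, cm (synthetic)
    B = 0.1 * D**2.4 * rng.lognormal(0, 0.2, 200)  # "observed" biomass, kg

    # Power-law fit by least squares in log space: slope = exponent b.
    b, log_a = np.polyfit(np.log(D), np.log(B), 1)
    print(f"fitted exponent b = {b:.2f}, prefactor a = {np.exp(log_a):.3f}")

    # Zeroth-order flavour: sum allometric biomass over a sampled
    # size-frequency (diameter) distribution to get a stand-level total.
    stand_D = rng.weibull(2.0, 1000) * 30
    total = np.sum(np.exp(log_a) * stand_D**b)
    print(f"stand total aboveground biomass ~ {total / 1000:.1f} t")
    ```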

  20. Postoperative Neurosurgical Infection Rates After Shared-Resource Intraoperative Magnetic Resonance Imaging: A Single-Center Experience with 195 Cases.

    Science.gov (United States)

    Dinevski, Nikolaj; Sarnthein, Johannes; Vasella, Flavio; Fierstra, Jorn; Pangalu, Athina; Holzmann, David; Regli, Luca; Bozinov, Oliver

    2017-07-01

    To determine the rate of surgical-site infections (SSI) in neurosurgical procedures involving a shared-resource intraoperative magnetic resonance imaging (ioMRI) scanner at a single institution derived from a prospective clinical quality management database. All consecutive neurosurgical procedures that were performed with a high-field, 2-room ioMRI between April 2013 and June 2016 were included (N = 195; 109 craniotomies and 86 endoscopic transsphenoidal procedures). The incidence of SSIs within 3 months after surgery was assessed for both operative groups (craniotomies vs. transsphenoidal approach). Of the 109 craniotomies, 6 patients developed an SSI (5.5%, 95% confidence interval [CI] 1.2-9.8%), including 1 superficial SSI, 2 cases of bone flap osteitis, 1 intracranial abscess, and 2 cases of meningitis/ventriculitis. Wound revision surgery due to infection was necessary in 4 patients (4%). Of the 86 transsphenoidal skull base surgeries, 6 patients (7.0%, 95% CI 1.5-12.4%) developed an infection, including 2 non-central nervous system intranasal SSIs (3%) and 4 cases of meningitis (5%). Logistic regression analysis revealed that the likelihood of infection significantly decreased with the number of operations in the new operational setting (odds ratio 0.982, 95% CI 0.969-0.995, P = 0.008). The use of a shared-resource ioMRI in neurosurgery did not demonstrate increased rates of infection compared with the current available literature. The likelihood of infection decreased with the accumulating number of operations, underlining the importance of surgical staff training after the introduction of a shared-resource ioMRI. Copyright © 2017 Elsevier Inc. All rights reserved.
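
    The reported learning effect (odds ratio 0.982 per additional operation) compounds multiplicatively, which a two-line computation makes tangible. Only the relative change in odds is shown, since the abstract does not give baseline odds.

    ```python
    # Back-of-envelope reading of the per-case learning effect (OR 0.982).
    or_per_case = 0.982
    for n in (10, 50, 100, 195):
        print(f"after {n:3d} cases, infection odds multiplied by {or_per_case**n:.2f}")
    ```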

  1. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    OpenAIRE

    Wang, Lei; Wang, Qing

    2017-01-01

    In order to increase the efficiency and security of file sharing in the next-generation networks, this paper proposes a large scale file sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all the intermediate nodes need to perform secure network coding operation before forwarding the received data. This process continues until all the mobile devices in the netw...

  2. Sustainability in Health care by Allocating Resources Effectively (SHARE) 10: operationalising disinvestment in a conceptual framework for resource allocation.

    Science.gov (United States)

    Harris, Claire; Green, Sally; Elshaug, Adam G

    2017-09-08

    This is the tenth in a series of papers reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. After more than a decade of research, there is little published evidence of active and successful disinvestment. The paucity of frameworks, methods and tools is reported to be a factor in the lack of success. However there are clear and consistent messages in the literature that can be used to inform development of a framework for operationalising disinvestment. This paper, along with the conceptual review of disinvestment in Paper 9 of this series, aims to integrate the findings of the SHARE Program with the existing disinvestment literature to address the lack of information regarding systematic organisation-wide approaches to disinvestment at the local health service level. A framework for disinvestment in a local healthcare setting is proposed. Definitions for essential terms and key concepts underpinning the framework have been made explicit to address the lack of consistent terminology. Given the negative connotations of the word 'disinvestment' and the problems inherent in considering disinvestment in isolation, the basis for the proposed framework is 'resource allocation' to address the spectrum of decision-making from investment to disinvestment. The focus is positive: optimising healthcare, improving health outcomes, using resources effectively. The framework is based on three components: a program for decision-making, projects to implement decisions and evaluate outcomes, and research to understand and improve the program and project activities. The program consists of principles for decision-making and settings that provide opportunities to introduce systematic prompts and triggers to initiate disinvestment. The projects follow the steps in the disinvestment process. Potential methods and tools are presented, however the framework does not stipulate project design or conduct; allowing

  3. The U.S. Shale Oil and Gas Resource - a Multi-Scale Analysis of Productivity

    Science.gov (United States)

    O'sullivan, F.

    2014-12-01

    Over the past decade, the large-scale production of natural gas, and more recently oil, from U.S. shale formations has had a transformative impact on the energy industry. The emergence of shale oil and gas as recoverable resources has altered perceptions regarding both the future abundance and cost of hydrocarbons, and has shifted the balance of global energy geopolitics. However, despite the excitement, shale is a resource in its nascency, and many challenges surrounding its exploitation remain. One of the most significant of these is the dramatic variation in resource productivity across multiple length scales, which is a feature of all of today's shale plays. This paper will describe the results of work that has looked to characterize the spatial and temporal variations in the productivity of the contemporary shale resource. Analysis will be presented that shows there is a strong stochastic element to observed shale well productivity in all the major plays. It will be shown that the nature of this stochasticity is consistent regardless of the specific play being considered. A characterization of this stochasticity will be proposed. As a parallel to the discussion of productivity, the paper will also address the issue of "learning" in shale development. It will be shown that "creaming" trends are observable and that although "absolute" well productivity levels have increased, "specific" productivity levels (i.e. considering well and stimulation size) have actually fallen markedly in many plays. The paper will also show that among individual operators' well ensembles, normalized well-to-well performance distributions are almost identical, and have remained consistent year-to-year. This result suggests little if any systematic learning regarding the effective management of well-to-well performance variability has taken place. The paper will conclude with an articulation of how the productivity characteristics of the shale resource are impacting on the resources

  4. Lightweight electric-powered vehicles. Which financial incentives after the large-scale field tests at Mendrisio?

    International Nuclear Information System (INIS)

    Keller, M.; Frick, R.; Hammer, S.

    1999-08-01

    How should lightweight electric-powered vehicles be promoted once the large-scale fleet test being conducted at Mendrisio (southern Switzerland) is completed in 2001, and are there reasons to question the current approach? The demand for electric vehicles, particularly in the automobile category, has remained persistently low. As it turned out, any appreciable improvement of this situation is almost impossible, even with substantial financial incentives. However, the unsatisfactory sales figures have little to do with the nature of the fleet test itself or with the specific conditions at Mendrisio; the problem is rather structural. For (battery-operated) electric cars the main problem at present is the lack of an expanding market which could become self-supporting with only a few additional incentives. Various strategies have been evaluated. Two alternatives were considered in particular: a strategy to promote electric vehicles explicitly ('EL-strategy'), and a strategy to promote efficient road vehicles in general, which would have to meet specific energy and environmental-efficiency criteria ('EF-strategy'). The EL-strategies make the following dilemma clear: if the aim is to raise the share of these vehicles to 5% of all cars on the road (or even 8%) in the mid term, then substantial interventions in the relevant vehicle markets would be required, either with penalties for conventional cars, a large-scale funding scheme, or interventions at the supply level. The study suggests a differentiated strategy with two components: (i) 'institutionalised' promotion with the aim of a substantial increase in the share of 'efficient' vehicles (independently of the propulsion technology), and (ii) the continuation of pilot and demonstration projects for the promotion of different types of innovative technologies. (author)

  5. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  6. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the reviewers but could not immediately (1996) be accepted for funding. In November 1997 the EU Commission granted 1.5 million ECU at short notice, allowing an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, which had been applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and which mainly serves technology transfer. (orig.)

  7. A shared resource between declarative memory and motor memory.

    Science.gov (United States)

    Keisler, Aysha; Shadmehr, Reza

    2010-11-03

    The neural systems that support motor adaptation in humans are thought to be distinct from those that support the declarative system. Yet, during motor adaptation changes in motor commands are supported by a fast adaptive process that has important properties (rapid learning, fast decay) that are usually associated with the declarative system. The fast process can be contrasted to a slow adaptive process that also supports motor memory, but learns gradually and shows resistance to forgetting. Here we show that after people stop performing a motor task, the fast motor memory can be disrupted by a task that engages declarative memory, but the slow motor memory is immune from this interference. Furthermore, we find that the fast/declarative component plays a major role in the consolidation of the slow motor memory. Because of the competitive nature of declarative and nondeclarative memory during consolidation, impairment of the fast/declarative component leads to improvements in the slow/nondeclarative component. Therefore, the fast process that supports formation of motor memory is not only neurally distinct from the slow process, but it shares critical resources with the declarative memory system.

  8. A shared resource between declarative memory and motor memory

    Science.gov (United States)

    Keisler, Aysha; Shadmehr, Reza

    2010-01-01

    The neural systems that support motor adaptation in humans are thought to be distinct from those that support the declarative system. Yet, during motor adaptation changes in motor commands are supported by a fast adaptive process that has important properties (rapid learning, fast decay) that are usually associated with the declarative system. The fast process can be contrasted to a slow adaptive process that also supports motor memory, but learns gradually and shows resistance to forgetting. Here we show that after people stop performing a motor task, the fast motor memory can be disrupted by a task that engages declarative memory, but the slow motor memory is immune from this interference. Furthermore, we find that the fast/declarative component plays a major role in the consolidation of the slow motor memory. Because of the competitive nature of declarative and non-declarative memory during consolidation, impairment of the fast/declarative component leads to improvements in the slow/non-declarative component. Therefore, the fast process that supports formation of motor memory is not only neurally distinct from the slow process, but it shares critical resources with the declarative memory system. PMID:21048140

  9. Collaborative filtering to improve navigation of large radiology knowledge resources.

    Science.gov (United States)

    Kahn, Charles E

    2005-06-01

    Collaborative filtering is a knowledge-discovery technique that can help guide readers to items of potential interest based on the experience of prior users. This study sought to determine the impact of collaborative filtering on navigation of a large, Web-based radiology knowledge resource. Collaborative filtering was applied to a collection of 1,168 radiology hypertext documents available via the Internet. An item-based collaborative filtering algorithm identified each document's six most closely related documents based on 248,304 page views in an 18-day period. Documents were amended to include links to their related documents, and use was analyzed over the next 5 days. The mean number of documents viewed per visit increased from 1.57 to 1.74 (P < …). Collaborative filtering can increase a radiology information resource's utilization and can improve its usefulness and ease of navigation. The technique holds promise for improving navigation of large Internet-based radiology knowledge resources.
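
    The item-based scheme described here (six most closely related documents derived from page views) reduces to a similarity computation over a document-by-session matrix. The sketch below shows the pattern with cosine similarity on synthetic 0/1 view data; the study's actual algorithm and weighting are not specified in the abstract.

    ```python
    # Minimal item-based collaborative filtering on synthetic page-view data.
    import numpy as np

    rng = np.random.default_rng(42)
    n_docs, n_sessions = 50, 500
    views = (rng.random((n_docs, n_sessions)) < 0.05).astype(float)  # 0/1 views

    norms = np.linalg.norm(views, axis=1, keepdims=True)
    norms[norms == 0] = 1.0                    # guard never-viewed documents
    unit = views / norms
    sim = unit @ unit.T                        # cosine similarity, docs x docs
    np.fill_diagonal(sim, -np.inf)             # a document is not its own neighbour

    related = np.argsort(-sim, axis=1)[:, :6]  # six most closely related documents
    print("related docs for document 0:", related[0])
    ```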

  10. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they cannot cope with such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
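
    One standard indexing-and-searching step in such pipelines is compressing feature vectors to short binary codes and ranking by Hamming distance. The sketch below uses random-projection hashing on random stand-in descriptors; it illustrates the class of technique the review covers, not any specific method from the paper.

    ```python
    # Random-projection hashing + Hamming ranking on synthetic image features.
    import numpy as np

    rng = np.random.default_rng(7)
    n_images, dim, n_bits = 10_000, 512, 64
    feats = rng.normal(size=(n_images, dim)).astype(np.float32)

    planes = rng.normal(size=(dim, n_bits))          # random hyperplanes
    codes = feats @ planes > 0                       # binary codes, images x bits

    query = feats[123] + 0.1 * rng.normal(size=dim)  # a perturbed database image
    q_code = query @ planes > 0

    hamming = (codes != q_code).sum(axis=1)          # distance to every code
    top5 = np.argsort(hamming)[:5]
    print("nearest by Hamming distance:", top5)      # image 123 should rank near the top
    ```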

  11. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of traditional HPC cluster.
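
    GiGA's central data structure is a de Bruijn graph over read k-mers. A serial toy version shows what is being distributed over Hadoop/Giraph: nodes are (k-1)-mers, edges record observed k-mers, and unambiguous paths spell out contigs. The reads and k below are invented.

    ```python
    # Toy de Bruijn graph construction and contig walk (serial stand-in for GiGA).
    from collections import defaultdict

    reads = ["ATGGCGT", "GGCGTGC", "GTGCAAT"]
    k = 4

    graph = defaultdict(set)          # (k-1)-mer -> set of successor (k-1)-mers
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])

    # Walk unambiguous edges from a seed node to recover a contig.
    node, contig = "ATG", "ATG"
    while len(graph[node]) == 1:
        node = next(iter(graph[node]))
        contig += node[-1]
    print(contig)                     # ATGGCGTGCAAT: the merged overlap of the reads
    ```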

  12. Bi-Level Decentralized Active Power Control for Large-Scale Wind Farm Cluster

    DEFF Research Database (Denmark)

    Huang, Shengli; Wu, Qiuwei; Guo, Yifei

    2018-01-01

    This paper presents a bi-level decentralized active power control (DAPC) for a large-scale wind farm cluster consisting of several wind farms, for better active power dispatch. In the upper level, a distributed active power control scheme based on distributed consensus is designed to achieve...... fair active power sharing among multiple wind farms, which generates the power reference for each wind farm. A distributed estimator is used to estimate the total available power of all wind farms. In the lower level, a centralized control scheme based on Model Predictive Control (MPC) is proposed...... to regulate the active power outputs of all wind turbines (WTs) within a wind farm, which reduces the fatigue loads of WTs while tracking the power reference obtained from the upper-level control. A wind farm cluster with 8 wind farms and a total of 160 WTs was used to test the control performance of the proposed...
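
    The upper-level distributed estimator can be illustrated with a plain consensus iteration: each farm repeatedly averages its value with its neighbours', every local estimate converges to the cluster mean, and N times that mean recovers the total available power without a central collector. The ring topology, gain and megawatt numbers below are invented; the paper's exact estimator and the lower-level MPC are not reproduced.

    ```python
    # Consensus-based estimation of total available power across 8 wind farms.
    import numpy as np

    avail = np.array([45.0, 52.0, 38.0, 60.0, 49.0, 55.0, 41.0, 58.0])  # MW
    N = len(avail)
    # Ring communication graph: each farm talks only to its two neighbours.
    neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

    x = avail.copy()
    eps = 0.3   # consensus gain; must be < 2/lambda_max of the graph Laplacian
                # (0.5 for this ring) for the iteration to converge
    for _ in range(200):
        x = x + eps * np.array([sum(x[j] - x[i] for j in neighbors[i])
                                for i in range(N)])

    print("true total:     ", avail.sum())
    print("estimated total:", N * x[0])  # any farm's estimate works at convergence
    ```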

  13. Equitably sharing benefits from the utilization of natural genetic resources: the Brazilian interpretation of the Convention of Biological Diversity

    NARCIS (Netherlands)

    Pena-Neira, S.; Dieperink, C.; Addink, G.H.

    2002-01-01

    The utilization of natural genetic resources could yield great benefits. The Convention on Biological Diversity introduced a number of rules concerning the sharing of these benefits. However, the interpretation and application (legal implementation) of these rules is a matter of discussion among

  14. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  15. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    Science.gov (United States)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

    Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the configurations for the rotor and platform.

  16. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
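
    The first criterion, low-rank approximation of the kernel matrix through prototypes, follows the familiar Nyström pattern K ≈ K_nm K_mm⁻¹ K_mn. The sketch below demonstrates it with randomly chosen prototypes on synthetic data; PVM's actual prototype selection optimizes the two stated criteria rather than sampling at random.

    ```python
    # Nystrom-style low-rank kernel approximation through m << n prototypes.
    import numpy as np

    def rbf(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(3)
    X = rng.normal(size=(500, 10))            # n = 500 points
    m = 50
    protos = X[rng.choice(len(X), m, replace=False)]   # random prototypes

    K_nm = rbf(X, protos)
    K_mm = rbf(protos, protos)
    # K ~ K_nm K_mm^-1 K_mn, with a small jitter for numerical stability.
    K_approx = K_nm @ np.linalg.solve(K_mm + 1e-8 * np.eye(m), K_nm.T)

    K_full = rbf(X, X)
    err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
    print(f"relative approximation error with {m} prototypes: {err:.3f}")
    ```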

  17. Solving large scale unit dilemma in electricity system by applying commutative law

    Science.gov (United States)

    Legino, Supriadi; Arianto, Rakhmat

    2018-03-01

    The conventional system pools resources into large centralized power plants interconnected as a network, which provides many advantages over isolated systems, including better efficiency and reliability. However, such large plants need huge capital, and growing obstacles hinder the construction of big power plants and their associated transmission lines. By applying the commutative law of algebra, ab = ba for all a, b ∈ ℝ, the problems associated with the conventional system described above can be reduced. The idea of having many small power plants instead of a few large units, namely "Listrik Kerakyatan" (LK), provides social and environmental benefits that can be capitalized under proper assumptions. This study compares the costs and benefits of LK to those of the conventional system, using simulation to show that LK offers an alternative solution to many problems associated with the large system. The commutative law can be used as a simple mathematical model to analyze whether the LK system, as an eco-friendly form of distributed generation, can solve various problems associated with a large-scale conventional system. The simulation results show that LK provides more value if its plants operate fewer than 11 hours per day as peaker or load-following plants to improve the load-curve balance of the power system, and indicate that the investment cost of LK plants should be optimized. This study suggests that the economies-of-scale principle does not apply under every condition, particularly when the share of intangible costs and benefits is relatively high.
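
    The commutative-law framing (total capacity n × s is the same whether built as one large unit or many small ones) can be illustrated with a toy cost comparison. Every coefficient below, including the scale exponent and the transmission surcharge on large plants, is invented purely to show the crossover the paper argues for; it is not the paper's cost model.

    ```python
    # Toy comparison: the same total capacity n*s built centrally or distributed.
    def capital_cost(unit_mw, n_units, k=2.0, alpha=0.8):
        # Economies-of-scale law: cost of one unit ~ k * size^alpha, alpha < 1,
        # which by itself always favours the single large plant.
        return n_units * k * unit_mw**alpha

    def system_cost(unit_mw, n_units, tx_per_mw):
        # Hypothetical surcharge: only a large central plant needs long
        # transmission lines (and carries the related intangible costs).
        tx = tx_per_mw * unit_mw * n_units if unit_mw > 100 else 0.0
        return capital_cost(unit_mw, n_units) + tx

    for tx_per_mw in (0.5, 1.6):
        central = system_cost(1000, 1, tx_per_mw)      # one 1000 MW plant
        distributed = system_cost(1, 1000, tx_per_mw)  # 1000 x 1 MW LK units
        print(f"tx={tx_per_mw}: central {central:.0f} "
              f"vs distributed {distributed:.0f} (arbitrary units)")
    ```

    With a low transmission surcharge the central plant wins on pure economies of scale; once transmission and intangible costs grow, the distributed configuration becomes cheaper, which is the regime the paper highlights.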

  18. Large-scale computation at PSI scientific achievements and future requirements

    International Nuclear Information System (INIS)

    Adelmann, A.; Markushin, V.

    2008-11-01

    ' (SNSP-HPCN) is discussing this complex. Scientific results which are made possible by PSI's engagement at CSCS (named Horizon) are summarised and PSI's future high-performance computing requirements are evaluated. The data collected shows the current situation and a 5 year extrapolation of the users' needs with respect to HPC resources is made. In consequence this report can serve as a basis for future strategic decisions with respect to a non-existing HPC road-map for PSI. PSI's institutional HPC area started hardware-wise approximately in 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, lastly in 2007. The Merlin cluster at PSI is used for small scale parallel jobs, and is the only general purpose computing system at PSI. Several dedicated small scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement of the procurement (including a machine update in 2007) results in a PSI share of 25 % of the available computing resources at CSCS. The (over) usage of available computing resources by PSI scientists is demonstrated. We actually get more computing cycles than we have paid for. The reason is the fair share policy that is implemented on the Horizon machine. This policy allows us to get cycles, with a low priority, even when our bi-monthly share is used. Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of main PSI HPC users: (1) High Performance Computing is a main pillar in many important PSI research areas; (2) there is a lack in the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend to use in the order of 600 processors per average production run; (4) the disk and tape storage growth is dramatic; (5) small HPC clusters located

  19. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    and Networking' (SNSP-HPCN) is discussing this complex. Scientific results which are made possible by PSI's engagement at CSCS (named Horizon) are summarised and PSI's future high-performance computing requirements are evaluated. The data collected shows the current situation and a 5 year extrapolation of the users' needs with respect to HPC resources is made. In consequence this report can serve as a basis for future strategic decisions with respect to a non-existing HPC road-map for PSI. PSI's institutional HPC area started hardware-wise approximately in 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, lastly in 2007. The Merlin cluster at PSI is used for small scale parallel jobs, and is the only general purpose computing system at PSI. Several dedicated small scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement of the procurement (including a machine update in 2007) results in a PSI share of 25 % of the available computing resources at CSCS. The (over) usage of available computing resources by PSI scientists is demonstrated. We actually get more computing cycles than we have paid for. The reason is the fair share policy that is implemented on the Horizon machine. This policy allows us to get cycles, with a low priority, even when our bi-monthly share is used. Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of main PSI HPC users: (1) High Performance Computing is a main pillar in many important PSI research areas; (2) there is a lack in the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend to use in the order of 600 processors per average production run; (4) the disk and tape storage growth

  20. Secure-Network-Coding-Based File Sharing via Device-to-Device Communication

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    Full Text Available In order to increase the efficiency and security of file sharing in next-generation networks, this paper proposes a large-scale file sharing scheme based on secure network coding via device-to-device (D2D) communication. In our scheme, when a user needs to share data with others in the same area, the source node and all the intermediate nodes perform a secure network coding operation before forwarding the received data. This process continues until all the mobile devices in the network successfully recover the original file. The experimental results show that secure network coding is feasible and well suited to such file sharing, and that the sharing efficiency and security outperform the traditional replication-based sharing scheme.
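
    The forwarding step each node performs is random linear network coding: packets are treated as vectors over a finite field, relays emit random linear combinations tagged with their coefficients, and a receiver decodes by Gaussian elimination once the collected coefficients reach full rank. The sketch below works over GF(2) for brevity (practical systems often use GF(2^8)) and omits the scheme's security layer.

    ```python
    # Random linear network coding over GF(2): encode, collect, decode.
    import numpy as np

    rng = np.random.default_rng(5)
    k, plen = 4, 16                              # 4 source packets, 16 bits each
    src = rng.integers(0, 2, size=(k, plen), dtype=np.uint8)

    def gf2_solve(A, B):
        # Gauss-Jordan elimination over GF(2) on the augmented system [A | B];
        # raises if A does not (yet) have full column rank.
        A, B = A.copy(), B.copy()
        row = 0
        for col in range(A.shape[1]):
            piv = next((r for r in range(row, len(A)) if A[r, col]), None)
            if piv is None:
                raise ValueError("rank deficient")
            A[[row, piv]], B[[row, piv]] = A[[piv, row]], B[[piv, row]]
            for r in range(len(A)):
                if r != row and A[r, col]:
                    A[r] ^= A[row]
                    B[r] ^= B[row]
            row += 1
        return B[:A.shape[1]]

    # A receiver keeps collecting random linear combinations (each packet
    # carries its coding vector) until the system becomes decodable.
    received_c, received_p = [], []
    while True:
        c = rng.integers(0, 2, size=k, dtype=np.uint8)   # random coding vector
        received_c.append(c)
        received_p.append(c @ src % 2)                   # coded payload
        try:
            decoded = gf2_solve(np.array(received_c), np.array(received_p))
            break
        except ValueError:
            pass                                         # need more packets

    print("packets needed:", len(received_c))
    print("decoded == source:", np.array_equal(decoded, src))
    ```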