WorldWideScience

Sample records for reservoir cache county

  1. Feasibility Report and Environmental Statement for Water Resources Development, Cache Creek Basin, California

    Science.gov (United States)

    1979-02-01

    classified as Pomo, Lake Miwok, and Patwin. Recent surveys within the Clear Lake-Cache Creek Basin have located 28 archeological sites, some of which...additional 8,400 acre-feet annually to the Lakeport area. Pomo Reservoir on Kelsey Creek, being studied by Lake County, also would supplement M&I water...project on Scotts Creek could provide 9,100 acre-feet annually of irrigation water. Also, as previously discussed, Pomo Reservoir would furnish

  2. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    Science.gov (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.
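    The resource figures above come from multiplying bed area by average thickness and a tonnage conversion factor. A minimal sketch of that arithmetic, assuming the conventional USGS factor of 1,770 short tons per acre-foot for subbituminous coal (the isopach block areas and thicknesses below are invented for illustration, not taken from the Recluse model):

```python
# Hypothetical sketch of isopach-based coal resource calculation.
# 1,770 short tons per acre-foot is the standard USGS conversion for
# subbituminous coal; the block data below are illustrative only.

TONS_PER_ACRE_FOOT = 1770  # subbituminous coal (USGS convention)

def coal_tonnage(area_acres: float, avg_thickness_ft: float) -> float:
    """Resource estimate in short tons for one isopach block."""
    return area_acres * avg_thickness_ft * TONS_PER_ACRE_FOOT

# Sum over isopach intervals: (area in acres, average thickness in feet)
blocks = [(40_000, 25.0), (30_000, 12.0), (20_000, 5.0)]
total = sum(coal_tonnage(a, t) for a, t in blocks)
print(f"{total / 1e6:.0f} million short tons")
```

    Summing such blocks by hand versus by computer is exactly where the 6.7 percent discrepancy reported above can arise, since the two methods contour and partition the thickness data differently.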

  3. Web Caching

    Indian Academy of Sciences (India)

    leveraged through Web caching technology. Specifically, Web caching becomes an ... Web routing can improve the overall performance of the Internet. Web caching is similar to memory system caching - a Web cache stores Web resources in ...

  4. CryptoCache: A Secure Sharable File Cache for Roaming Users

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2000-01-01

    Small mobile computers are now sufficiently powerful to run many applications, but storage capacity remains limited, so working files cannot be cached or stored locally. Even if files can be stored locally, the mobile device is not powerful enough to act as a server in collaborations with other users. Conventional distributed file systems cache everything locally or not at all; there is no possibility to cache files on nearby nodes. In this paper we present the design of a secure cache system called CryptoCache that allows roaming users to cache files on untrusted file hosting servers. The system allows flexible sharing of cached files among unauthenticated users, i.e., unlike most distributed file systems CryptoCache does not require a global authentication framework. Files are encrypted when they are transferred over the network and while stored on untrusted servers. The system uses public key…

  5. Geochemistry of mercury and other constituents in subsurface sediment—Analyses from 2011 and 2012 coring campaigns, Cache Creek Settling Basin, Yolo County, California

    Science.gov (United States)

    Arias, Michelle R.; Alpers, Charles N.; Marvin-DiPasquale, Mark C.; Fuller, Christopher C.; Agee, Jennifer L.; Sneed, Michelle; Morita, Andrew Y.; Salas, Antonia

    2017-10-31

    Cache Creek Settling Basin was constructed in 1937 to trap sediment from Cache Creek before delivery to the Yolo Bypass, a flood conveyance for the Sacramento River system that is tributary to the Sacramento–San Joaquin Delta. Sediment management options being considered by stakeholders in the Cache Creek Settling Basin include sediment excavation; however, that could expose sediments containing elevated mercury concentrations from historical mercury mining in the watershed. In cooperation with the California Department of Water Resources, the U.S. Geological Survey undertook sediment coring campaigns in 2011–12 (1) to describe lateral and vertical distributions of mercury concentrations in deposits of sediment in the Cache Creek Settling Basin and (2) to better constrain estimates of the rate of sediment deposition in the basin. Sediment cores were collected in the Cache Creek Settling Basin, Yolo County, California, during October 2011 at 10 locations and during August 2012 at 5 other locations. Total core depths ranged from approximately 4.6 to 13.7 meters (15 to 45 feet), with penetration to about 9.1 meters (30 feet) at most locations. Unsplit cores were logged for two geophysical parameters (gamma bulk density and magnetic susceptibility); then, selected cores were split lengthwise. One half of each core was then photographed and archived, and the other half was subsampled. Initial subsamples from the cores (20-centimeter composite samples from five predetermined depths in each profile) were analyzed for total mercury, methylmercury, total reduced sulfur, iron speciation, organic content (as the percentage of weight loss on ignition), and grain-size distribution. Detailed follow-up subsampling (3-centimeter intervals) was done at six locations along an east-west transect in the southern part of the Cache Creek Settling Basin and at one location in the northern part of the basin for analyses of total mercury; organic content; and cesium-137, which was…

  6. Caching Patterns and Implementation

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2006-01-01

    Repetitious access to remote resources, usually data, constitutes a bottleneck for many software systems. Caching is a technique that can drastically improve the performance of any database application by avoiding repeated read operations for the same data. This paper addresses caching problems from a pattern perspective. Both caching itself and caching strategies, such as primed and on-demand, are presented as patterns, and a pattern-based flexible caching implementation is proposed. The Caching pattern provides a way to avoid reacquiring expensive resources. The Primed Cache pattern applies when the set of required resources, or at least part of it, can be predicted, while the Demand Cache pattern applies whenever the required resource set cannot be predicted or is infeasible to buffer. The advantages and disadvantages of all the caching patterns presented are also discussed, and the lessons learned are applied in the implementation of the proposed pattern-based flexible caching solution.
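    The primed/demand distinction can be sketched in a few lines. This is an illustrative reading of the two patterns, not code from the paper; the class and parameter names are invented:

```python
# Sketch of the Demand Cache and Primed Cache patterns (names illustrative).
from typing import Any, Callable, Dict, Hashable, Iterable

class DemandCache:
    """Fetch-on-miss cache: a resource enters the cache only when requested."""
    def __init__(self, loader: Callable[[Hashable], Any]):
        self._loader = loader
        self._store: Dict[Hashable, Any] = {}

    def get(self, key: Hashable) -> Any:
        if key not in self._store:          # cache miss: acquire on demand
            self._store[key] = self._loader(key)
        return self._store[key]

class PrimedCache(DemandCache):
    """Primed variant: preload the predictable subset of keys up front."""
    def __init__(self, loader: Callable[[Hashable], Any],
                 predictable_keys: Iterable[Hashable]):
        super().__init__(loader)
        for key in predictable_keys:        # bulk load before first use
            self._store[key] = loader(key)
```

    The design choice is exactly the one the abstract names: prime when the resource set is predictable, fall back to demand loading when it is not.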

  7. CacheCard : Caching static and dynamic content on the NIC

    NARCIS (Netherlands)

    Bos, Herbert; Huang, Kaiming

    2009-01-01

    CacheCard is a NIC-based cache for static and dynamic web content, designed so that it can be implemented on simple devices such as NICs. It requires neither an understanding of the way dynamic data is generated, nor execution of scripts on the cache. By monitoring file system activity and potential

  8. A method cache for Patmos

    DEFF Research Database (Denmark)

    Degasperi, Philipp; Hepp, Stefan; Puffitsch, Wolfgang

    2014-01-01

    For real-time systems we need time-predictable processors. This paper presents a method cache as a time-predictable solution for instruction caching. The method cache caches whole methods (or functions) and simplifies worst-case execution time analysis. We have integrated the method cache in the time-predictable processor Patmos. We evaluate the method cache with a large set of embedded benchmarks. Most benchmarks show a good hit rate for a method cache size in the range between 4 and 16 KB.

  9. A Time-predictable Stack Cache

    DEFF Research Database (Denmark)

    Abbaspour, Sahar; Brandner, Florian; Schoeberl, Martin

    2013-01-01

    Real-time systems need time-predictable architectures to support static worst-case execution time (WCET) analysis. One architectural feature, the data cache, is hard to analyze when different data areas (e.g., heap allocated and stack allocated data) share the same cache. This sharing leads to less precise results of the cache analysis part of the WCET analysis. Splitting the data cache for different data areas enables composable data cache analysis. The WCET analysis tool can analyze the accesses to these different data areas independently. In this paper we present the design and implementation of a cache for stack allocated data. Our port of the LLVM C++ compiler supports the management of the stack cache. The combination of stack cache instructions and the hardware implementation of the stack cache is a further step towards time-predictable architectures.

  10. Assessment of managed aquifer recharge at Sand Hollow Reservoir, Washington County, Utah, updated to conditions through 2014

    Science.gov (United States)

    Marston, Thomas M.; Heilweil, Victor M.

    2016-09-08

    Sand Hollow Reservoir in Washington County, Utah, was completed in March 2002 and is operated primarily for managed aquifer recharge by the Washington County Water Conservancy District. From 2002 through 2014, diversions of about 216,000 acre-feet from the Virgin River to Sand Hollow Reservoir have allowed the reservoir to remain nearly full since 2006. Groundwater levels in monitoring wells near the reservoir rose through 2006 and have fluctuated more recently because of variations in reservoir stage and nearby pumping from production wells. Between 2004 and 2014, about 29,000 acre-feet of groundwater was withdrawn by these wells for municipal supply. In addition, about 31,000 acre-feet of shallow seepage was captured by French drains adjacent to the North and West Dams and used for municipal supply, irrigation, or returned to the reservoir. From 2002 through 2014, about 127,000 acre-feet of water seeped beneath the reservoir to recharge the underlying Navajo Sandstone aquifer. Water quality continued to be monitored at various wells in Sand Hollow during 2013–14 to evaluate the timing and location of reservoir recharge as it moved through the aquifer. Changing geochemical conditions at monitoring wells WD 4 and WD 12 indicate rising groundwater levels and mobilization of vadose-zone salts, which could be a precursor to the arrival of reservoir recharge.

  11. Assessment of managed aquifer recharge from Sand Hollow Reservoir, Washington County, Utah, updated to conditions in 2010

    Science.gov (United States)

    Heilweil, Victor M.; Marston, Thomas M.

    2011-01-01

    Sand Hollow Reservoir in Washington County, Utah, was completed in March 2002 and is operated primarily for managed aquifer recharge by the Washington County Water Conservancy District. From 2002 through 2009, total surface-water diversions of about 154,000 acre-feet to Sand Hollow Reservoir have allowed it to remain nearly full since 2006. Groundwater levels in monitoring wells near the reservoir rose through 2006 and have fluctuated more recently because of variations in reservoir water-level altitude and nearby pumping from production wells. Between 2004 and 2009, a total of about 13,000 acre-feet of groundwater has been withdrawn by these wells for municipal supply. In addition, a total of about 14,000 acre-feet of shallow seepage was captured by French drains adjacent to the North and West Dams and used for municipal supply, irrigation, or returned to the reservoir. From 2002 through 2009, about 86,000 acre-feet of water seeped beneath the reservoir to recharge the underlying Navajo Sandstone aquifer. Water-quality sampling was conducted at various monitoring wells in Sand Hollow to evaluate the timing and location of reservoir recharge moving through the aquifer. Tracers of reservoir recharge include major and minor dissolved inorganic ions, tritium, dissolved organic carbon, chlorofluorocarbons, sulfur hexafluoride, and noble gases. By 2010, this recharge arrived at monitoring wells within about 1,000 feet of the reservoir.

  12. Don't make cache too complex: A simple probability-based cache management scheme for SSDs.

    Directory of Open Access Journals (Sweden)

    Seungjae Baek

    Solid-state drives (SSDs have recently become a common storage component in computer systems, and they are fueled by continued bit cost reductions achieved with smaller feature sizes and multiple-level cell technologies. However, as the flash memory stores more bits per cell, the performance and reliability of the flash memory degrade substantially. To solve this problem, a fast non-volatile memory (NVM-based cache has been employed within SSDs to reduce the long latency required to write data. Absorbing small writes in a fast NVM cache can also reduce the number of flash memory erase operations. To maximize the benefits of an NVM cache, it is important to increase the NVM cache utilization. In this paper, we propose and study ProCache, a simple NVM cache management scheme that makes cache-entrance decisions based on random probability testing. Our scheme is motivated by the observation that frequently written hot data will eventually enter the cache with a high probability, and that infrequently accessed cold data will not enter the cache easily. Owing to its simplicity, ProCache is easy to implement at a substantially smaller cost than similar previously studied techniques. We evaluate ProCache and conclude that it achieves performance comparable to that of a more complex reference counter-based cache-management scheme.
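    The core idea, admission by coin flip, fits in a short sketch. This is a generic probability-based admission cache in the spirit of the abstract, not the paper's actual ProCache design; the capacity, probability, and LRU eviction choice here are all assumptions:

```python
# Sketch of probability-based cache admission: a new block enters the cache
# only if a coin flip with probability `admit_prob` succeeds, so a block
# written k times is admitted with probability 1 - (1 - p)^k. Hot data thus
# enters almost surely over time while cold data rarely occupies space.
import random
from collections import OrderedDict

class ProbabilisticCache:
    def __init__(self, capacity: int, admit_prob: float, seed: int = 0):
        self.capacity = capacity
        self.admit_prob = admit_prob
        self.data = OrderedDict()           # insertion/recency order for LRU
        self.rng = random.Random(seed)

    def write(self, block: int, value: bytes) -> bool:
        """Return True if the write was absorbed by the cache."""
        if block in self.data:              # already cached: update in place
            self.data.move_to_end(block)
            self.data[block] = value
            return True
        if self.rng.random() >= self.admit_prob:
            return False                    # admission test failed: bypass
        if len(self.data) >= self.capacity:
            self.data.popitem(last=False)   # evict least recently used block
        self.data[block] = value
        return True
```

    The appeal is exactly the simplicity claimed above: no per-block reference counters are needed, only one random number per cache miss.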

  13. Assessment of managed aquifer recharge at Sand Hollow Reservoir, Washington County, Utah, updated to conditions in 2012

    Science.gov (United States)

    Marston, Thomas M.; Heilweil, Victor M.

    2013-01-01

    Sand Hollow Reservoir in Washington County, Utah, was completed in March 2002 and is operated primarily for managed aquifer recharge by the Washington County Water Conservancy District. From 2002 through 2011, surface-water diversions of about 199,000 acre-feet to Sand Hollow Reservoir have allowed the reservoir to remain nearly full since 2006. Groundwater levels in monitoring wells near the reservoir rose through 2006 and have fluctuated more recently because of variations in reservoir altitude and nearby pumping from production wells. Between 2004 and 2011, a total of about 19,000 acre-feet of groundwater was withdrawn by these wells for municipal supply. In addition, a total of about 21,000 acre-feet of shallow seepage was captured by French drains adjacent to the North and West Dams and used for municipal supply, irrigation, or returned to the reservoir. From 2002 through 2011, about 106,000 acre-feet of water seeped beneath the reservoir to recharge the underlying Navajo Sandstone aquifer. Water quality was sampled at various monitoring wells in Sand Hollow to evaluate the timing and location of reservoir recharge as it moved through the aquifer. Tracers of reservoir recharge include major and minor dissolved inorganic ions, tritium, dissolved organic carbon, chlorofluorocarbons, sulfur hexafluoride, and noble gases. By 2012, this recharge arrived at four monitoring wells located within about 1,000 feet of the reservoir. Changing geochemical conditions at five other monitoring wells could indicate other processes, such as changing groundwater levels and mobilization of vadose-zone salts, rather than arrival of reservoir recharge.

  14. Maintaining Web Cache Coherency

    Directory of Open Access Journals (Sweden)

    2000-01-01

    Document coherency is a challenging problem for Web caching. Once documents are cached throughout the Internet, it is often difficult to keep them coherent with the origin document without generating new traffic that could burden the international backbone and overload popular servers. Several solutions have been proposed to solve this problem; among them, two categories have been widely discussed: strong document coherency and weak document coherency. The cost and efficiency of the two categories are still a controversial issue: while in some studies strong coherency is far too expensive to be used in the Web context, in other studies it could be maintained at a low cost. The accuracy of these analyses depends very much on how the document updating process is approximated. In this study, we compare some of the coherence methods proposed for Web caching. Among other points, we study the side effects of these methods on Internet traffic. The ultimate goal is to study cache behavior under several conditions, covering some of the factors that play an important role in Web cache performance evaluation, and to quantify their impact on simulation accuracy. The results presented in this study show differences in the outcome of the simulation of a Web cache depending on the workload being used and on the probability distribution used to approximate updates on the cached documents. Each experiment shows two case studies that outline the impact of the considered parameter on the performance of the cache.
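    Weak coherency is commonly realized with a time-to-live (TTL) on each cached copy. The following is a generic TTL sketch of that idea, not a method from the study; the class names and the 60-second default are illustrative:

```python
# Sketch of weak document coherency via TTL: serve the cached copy until its
# TTL lapses, then refetch from the origin. Staleness is bounded by the TTL;
# strong coherency would instead contact the origin on every request.
import time

class TTLEntry:
    def __init__(self, body: str, ttl_seconds: float):
        self.body = body
        self.expires_at = time.monotonic() + ttl_seconds

class WeakCoherencyCache:
    def __init__(self, fetch_from_origin, ttl_seconds: float = 60.0):
        self._fetch = fetch_from_origin     # callable: url -> document body
        self._ttl = ttl_seconds
        self._entries = {}

    def get(self, url: str) -> str:
        entry = self._entries.get(url)
        if entry is not None and time.monotonic() < entry.expires_at:
            return entry.body               # possibly stale, but no traffic
        body = self._fetch(url)             # expired or absent: origin trip
        self._entries[url] = TTLEntry(body, self._ttl)
        return body
```

    The cost/accuracy trade-off debated above reduces to how often this refetch branch fires relative to how often documents actually change at the origin.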

  15. Time-predictable Stack Caching

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar

    …Thus, in systems with hard deadlines the worst-case execution time (WCET) of the real-time software running on them needs to be bounded. Modern architectures use features such as pipelining and caches for improving the average performance. These features, however, make the WCET analysis more… addresses, provides an opportunity to predict and tighten the WCET of accesses to data in caches. In this thesis, we introduce the time-predictable stack cache design and implementation within a time-predictable processor. We introduce several optimizations to our design for tightening the WCET while keeping the time-predictability of the design intact. Moreover, we provide a solution for reducing the cost of context switching in a system using the stack cache. In the design of these caches, we use custom hardware and compiler support for delivering time-predictable stack data accesses. Furthermore…

  16. Cache-Aware and Cache-Oblivious Adaptive Sorting

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Fagerberg, Rolf; Moruz, Gabriel

    2005-01-01

    Two new adaptive sorting algorithms are introduced which perform an optimal number of comparisons with respect to the number of inversions in the input. The first algorithm is based on a new linear-time reduction to (non-adaptive) sorting. The second algorithm is based on a new division protocol for the GenericSort algorithm by Estivill-Castro and Wood. From both algorithms we derive I/O-optimal cache-aware and cache-oblivious adaptive sorting algorithms. These are the first I/O-optimal adaptive sorting algorithms.

  17. Data cache organization for accurate timing analysis

    DEFF Research Database (Denmark)

    Schoeberl, Martin; Huber, Benedikt; Puffitsch, Wolfgang

    2013-01-01

    …it is important to classify memory accesses as either cache hit or cache miss. The addresses of instruction fetches are known statically, so static cache hit/miss classification is possible for the instruction cache. Access to data that is cached in the data cache is harder to predict statically. Several…

  18. Research on Cache Placement in ICN

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-08-01

    Ubiquitous in-network caching is one of the key features of Information Centric Networking; together with its receiver-driven content retrieval paradigm, an Information Centric Network better supports content distribution, multicast, mobility, etc. The cache placement strategy is crucial to improving the utilization of cache space and reducing the occupation of link bandwidth. Most of the literature on caching policies considers overall cost and bandwidth but ignores the limits of node cache capacity. This paper proposes a G-FMPH algorithm which takes into account both constraints on the link bandwidth and on the cache capacity of nodes. Our algorithm aims at minimizing the overall cost of content caching. The simulation results show that our proposed algorithm has better performance.

  19. Cache-Oblivious Mesh Layouts

    International Nuclear Information System (INIS)

    Yoon, S; Lindstrom, P; Pascucci, V; Manocha, D

    2005-01-01

    We present a novel method for computing cache-oblivious layouts of large meshes that improve the performance of interactive visualization and geometric processing algorithms. Given that the mesh is accessed in a reasonably coherent manner, we assume no particular data access patterns or cache parameters of the memory hierarchy involved in the computation. Furthermore, our formulation extends directly to computing layouts of multi-resolution and bounding volume hierarchies of large meshes. We develop a simple and practical cache-oblivious metric for estimating cache misses. Computing a coherent mesh layout is reduced to a combinatorial optimization problem. We designed and implemented an out-of-core multilevel minimization algorithm and tested its performance on unstructured meshes composed of tens to hundreds of millions of triangles. Our layouts can significantly reduce the number of cache misses. We have observed 2-20 times speedups in view-dependent rendering, collision detection, and isocontour extraction without any modification of the algorithms or runtime applications

  20. On the Limits of Cache-Obliviousness

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Fagerberg, Rolf

    2003-01-01

    In this paper, we present lower bounds for permuting and sorting in the cache-oblivious model. We prove that (1) I/O-optimal cache-oblivious comparison-based sorting is not possible without a tall-cache assumption, and (2) there does not exist an I/O-optimal cache-oblivious algorithm for permuting…

  1. Optimizing Maintenance of Constraint-Based Database Caches

    Science.gov (United States)

    Klein, Joachim; Braun, Susanne

    Caching data reduces user-perceived latency and often enhances availability in case of server crashes or network failures. DB caching aims at local processing of declarative queries in a DBMS-managed cache close to the application. Query evaluation must produce the same results as if done at the remote database backend, which implies that all data records needed to process such a query must be present and controlled by the cache, i.e., to achieve “predicate-specific” loading and unloading of such record sets. Hence, cache maintenance must be based on cache constraints such that “predicate completeness” of the caching units currently present can be guaranteed at any point in time. We explore how cache groups can be maintained to provide the data currently needed. Moreover, we design and optimize loading and unloading algorithms for sets of records keeping the caching units complete, before we empirically identify the costs involved in cache maintenance.

  2. Cache-Oblivious Algorithms and Data Structures

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting

    2004-01-01

    Frigo, Leiserson, Prokop, and Ramachandran in 1999 introduced the ideal-cache model as a formal model of computation for developing algorithms in environments with multiple levels of caching, and coined the terminology of cache-oblivious algorithms. Cache-oblivious algorithms are described as standard RAM algorithms with only one memory level, i.e., without any knowledge about memory hierarchies, but are analyzed in the two-level I/O model of Aggarwal and Vitter for an arbitrary memory and block size and an optimal off-line cache replacement strategy. The result is algorithms that automatically apply to multi-level memory hierarchies. This paper gives an overview of the results achieved on cache-oblivious algorithms and data structures since the seminal paper by Frigo et al.
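    A standard illustration of the idea (a textbook example, not taken from this survey) is divide-and-conquer matrix transposition: by recursing on halves, some recursion level always fits whatever cache is present, without the code ever naming a cache size or block size:

```python
# Cache-oblivious matrix transpose sketch: recursively split the longer
# dimension until the subproblem is tiny, then copy directly. No cache
# parameters appear anywhere; the recursion adapts to every cache level.
def co_transpose(src, dst, r0, r1, c0, c1):
    """Transpose src[r0:r1][c0:c1] into dst (dst[j][i] = src[i][j])."""
    if (r1 - r0) * (c1 - c0) <= 16:         # small base case: direct copy
        for i in range(r0, r1):
            for j in range(c0, c1):
                dst[j][i] = src[i][j]
    elif r1 - r0 >= c1 - c0:                # split the longer dimension
        mid = (r0 + r1) // 2
        co_transpose(src, dst, r0, mid, c0, c1)
        co_transpose(src, dst, mid, r1, c0, c1)
    else:
        mid = (c0 + c1) // 2
        co_transpose(src, dst, r0, r1, c0, mid)
        co_transpose(src, dst, r0, r1, mid, c1)
```

    Analyzed in the two-level I/O model, this recursion incurs an optimal number of block transfers for any memory and block size, which is precisely the cache-oblivious guarantee described above.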

  3. Arsenic in freshwater fish in the Chihuahua County water reservoirs (Mexico).

    Science.gov (United States)

    Nevárez, Myrna; Moreno, Myriam Verónica; Sosa, Manuel; Bundschuh, Jochen

    2011-01-01

    Water reservoirs in Chihuahua County, Mexico, are affected by point and non-point geogenic and anthropogenic pollution sources; fish are located at the top of the food chain and are good indicators of ecosystem pollution. The goals of the study were to: (i) determine arsenic concentrations in fish collected from the Chuviscar, Chihuahua, San Marcos, and El Rejon water reservoirs; (ii) assess whether the fish are suitable for human consumption; and (iii) link the arsenic contents in fish with those in sediment and water reported in studies made the same year for these water reservoirs. Sampling was done in summer, fall, and winter. The species with the highest arsenic concentration varied through the sampling periods: channel catfish (Ictalurus punctatus) with 0.22 ± 0.15 mg/kg dw in winter and green sunfish (Lepomis cyanellus) with 2.00 ± 0.15 mg/kg dw in summer in El Rejon water reservoir. A positive correlation of arsenic contents was found through all sampling seasons between fish samples and the samples of sediment and water. The weekly intake of inorganic arsenic, based on the consumption of 0.245 kg of fish muscle per body weight per week, was found to be lower than the acceptable weekly intake of 0.015 mg/kg body weight for inorganic arsenic suggested by FAO/WHO.

  4. Bathymetric maps and water-quality profiles of Table Rock and North Saluda Reservoirs, Greenville County, South Carolina

    Science.gov (United States)

    Clark, Jimmy M.; Journey, Celeste A.; Nagle, Doug D.; Lanier, Timothy H.

    2014-01-01

    Lakes and reservoirs are the water-supply source for many communities. As such, water-resource managers that oversee these water supplies require monitoring of the quantity and quality of the resource. Monitoring information can be used to assess the basic conditions within the reservoir and to establish a reliable estimate of storage capacity. In April and May 2013, a global navigation satellite system receiver and fathometer were used to collect bathymetric data, and an autonomous underwater vehicle was used to collect water-quality and bathymetric data at Table Rock Reservoir and North Saluda Reservoir in Greenville County, South Carolina. These bathymetric data were used to create a bathymetric contour map and stage-area and stage-volume relation tables for each reservoir. Additionally, statistical summaries of the water-quality data were used to provide a general description of water-quality conditions in the reservoirs.

  5. Web cache location

    Directory of Open Access Journals (Sweden)

    Boffey Brian

    2004-01-01

    Stress placed on network infrastructure by the popularity of the World Wide Web may be partially relieved by keeping multiple copies of Web documents at geographically dispersed locations. In particular, the use of proxy caches and replication provides a means of storing information 'nearer to end users'. This paper concentrates on the locational aspects of Web caching, giving both an overview, from an operational research point of view, of existing research and putting forward avenues for possible further research. This area of research is in its infancy, and the emphasis will be on themes and trends rather than on algorithm construction. Finally, Web caching problems are briefly related to referral systems more generally.

  6. Caching web service for TICF project

    International Nuclear Information System (INIS)

    Pais, V.F.; Stancalie, V.

    2008-01-01

    A caching web service was developed to allow caching of any object to a network cache, presented in the form of a web service. This application was used to increase the speed of previously implemented web services and for new ones. Various tests were conducted to determine the impact of using this caching web service in the existing network environment and where it should be placed in order to achieve the greatest increase in performance. Since the cache is presented to applications as a web service, it can also be used for remote access to stored data and data sharing between applications

  7. dCache, agile adoption of storage technology

    Energy Technology Data Exchange (ETDEWEB)

    Millar, A. P. [Hamburg U.; Baranova, T. [Hamburg U.; Behrmann, G. [Unlisted, DK; Bernardt, C. [Hamburg U.; Fuhrmann, P. [Hamburg U.; Litvintsev, D. O. [Fermilab; Mkrtchyan, T. [Hamburg U.; Petersen, A. [Hamburg U.; Rossi, A. [Fermilab; Schwank, K. [Hamburg U.

    2012-01-01

    For over a decade, dCache has been synonymous with large-capacity, fault-tolerant storage using commodity hardware that supports seamless data migration to and from tape. In this paper we provide some recent news of changes within dCache and the community surrounding it. We describe the flexible nature of dCache that allows both externally developed enhancements to dCache facilities and the adoption of new technologies. Finally, we present information about avenues the dCache team is exploring for possible future improvements in dCache.

  8. Test data generation for LRU cache-memory testing

    OpenAIRE

    Evgeni, Kornikhin

    2009-01-01

    System functional testing of microprocessors deals with many assembly programs of given behavior. The paper proposes a new constraint-based algorithm for generating initial cache-memory contents for a given behavior of an assembly program (with cache misses and hits). Although the algorithm works for any type of cache memory, the paper describes it in detail only for the basic types: fully associative caches and direct-mapped caches.
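    Such generation presupposes a golden model of the cache that replays an address sequence and reports hits and misses. A minimal sketch of that reference model for the fully associative LRU case (illustrative only, not the paper's algorithm):

```python
# Reference model of a fully associative LRU cache. A test generator can
# search for initial contents and an address sequence whose replay through
# this model reproduces a target hit/miss pattern.
from collections import OrderedDict

class LRUModel:
    def __init__(self, num_lines: int):
        self.num_lines = num_lines
        self.lines = OrderedDict()           # tag -> None, in LRU order

    def access(self, tag: int) -> bool:
        """Return True on hit, False on miss; update recency either way."""
        if tag in self.lines:
            self.lines.move_to_end(tag)      # hit: mark most recently used
            return True
        if len(self.lines) >= self.num_lines:
            self.lines.popitem(last=False)   # miss: evict LRU tag
        self.lines[tag] = None
        return False
```

    A direct-mapped model is even simpler, since each tag maps to exactly one line and no recency bookkeeping is needed.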

  9. MESI Cache Coherence Simulator for Teaching Purposes

    OpenAIRE

    Gómez Luna, Juan; Herruzo Gómez, Ezequiel; Benavides Benítez, José Ignacio

    2009-01-01

    Nowadays, computational systems (multiprocessors and uniprocessors) need to address the cache coherence problem, and several techniques exist to solve it. The MESI cache coherence protocol is one of them. This paper presents a simulator of the MESI protocol, which is used for teaching cache memory coherence on computer systems with a hierarchical memory system and for explaining the process of cache memory location in multilevel cache memory systems. The paper shows a d…
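    The heart of such a simulator is the per-line state machine. A minimal sketch of the MESI transitions for a snooping bus (M = Modified, E = Exclusive, S = Shared, I = Invalid); the event names and the single-line scope are simplifying assumptions, not the simulator's actual interface:

```python
# Minimal MESI state-transition table for one cache line on a snooping bus.
# PrRd/PrWr are processor-side events; BusRd/BusRdX are snooped bus events.
def mesi_next(state: str, event: str) -> str:
    table = {
        # processor-side events
        ("I", "PrRd_shared"): "S",   # read miss, another cache has the line
        ("I", "PrRd_excl"):   "E",   # read miss, no other copy exists
        ("I", "PrWr"):        "M",   # write miss: read-for-ownership
        ("E", "PrWr"):        "M",   # silent upgrade, no bus traffic
        ("S", "PrWr"):        "M",   # upgrade: invalidates other sharers
        # bus-side (snooped) events
        ("M", "BusRd"):       "S",   # supply dirty data, keep shared copy
        ("E", "BusRd"):       "S",
        ("M", "BusRdX"):      "I",   # another core writes: invalidate
        ("E", "BusRdX"):      "I",
        ("S", "BusRdX"):      "I",
    }
    return table.get((state, event), state)  # hits and no-ops keep the state
```

    The pedagogically useful cases are the ones with no bus traffic, such as the silent E-to-M upgrade, which distinguish MESI from the older MSI protocol.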

  10. Efficient sorting using registers and caches

    DEFF Research Database (Denmark)

    Wickremesinghe, Rajiv; Arge, Lars Allan; Chase, Jeffrey S.

    2002-01-01

    Modern computer systems have increasingly complex memory systems. Common machine models for algorithm analysis do not reflect many of the features of these systems, e.g., large register sets, lockup-free caches, cache hierarchies, associativity, cache line fetching, and streaming behavior. Inadequate models lead to poor algorithmic choices and an incomplete understanding of algorithm behavior on real machines. A key step toward developing better models is to quantify the performance effects of features not reflected in the models. This paper explores the effect of memory system features on sorting performance. We introduce a new cache-conscious sorting algorithm, R-MERGE, which achieves better performance in practice over algorithms that are superior in the theoretical models. R-MERGE is designed to minimize memory stall cycles rather than cache misses by considering features common to many…

  11. Cache-aware network-on-chip for chip multiprocessors

    Science.gov (United States)

    Tatas, Konstantinos; Kyriacou, Costas; Dekoulis, George; Demetriou, Demetris; Avraam, Costas; Christou, Anastasia

    2009-05-01

    This paper presents the hardware prototype of a Network-on-Chip (NoC) for a chip multiprocessor that provides support for cache coherence, cache prefetching and cache-aware thread scheduling. A NoC with support to these cache related mechanisms can assist in improving systems performance by reducing the cache miss ratio. The presented multi-core system employs the Data-Driven Multithreading (DDM) model of execution. In DDM thread scheduling is done according to data availability, thus the system is aware of the threads to be executed in the near future. This characteristic of the DDM model allows for cache aware thread scheduling and cache prefetching. The NoC prototype is a crossbar switch with output buffering that can support a cache-aware 4-node chip multiprocessor. The prototype is built on the Xilinx ML506 board equipped with a Xilinx Virtex-5 FPGA.

  12. Store operations to maintain cache coherence

    Energy Technology Data Exchange (ETDEWEB)

    Evangelinos, Constantinos; Nair, Ravi; Ohmacht, Martin

    2017-08-01

    In one embodiment, a computer-implemented method includes encountering a store operation during a compile-time of a program, where the store operation is applicable to a memory line. It is determined, by a computer processor, that no cache coherence action is necessary for the store operation. A store-without-coherence-action instruction is generated for the store operation, responsive to determining that no cache coherence action is necessary. The store-without-coherence-action instruction specifies that the store operation is to be performed without a cache coherence action, and cache coherence is maintained upon execution of the store-without-coherence-action instruction.

  13. Store operations to maintain cache coherence

    Energy Technology Data Exchange (ETDEWEB)

    Evangelinos, Constantinos; Nair, Ravi; Ohmacht, Martin

    2017-09-12

    In one embodiment, a computer-implemented method includes encountering a store operation during a compile-time of a program, where the store operation is applicable to a memory line. It is determined, by a computer processor, that no cache coherence action is necessary for the store operation. A store-without-coherence-action instruction is generated for the store operation, responsive to determining that no cache coherence action is necessary. The store-without-coherence-action instruction specifies that the store operation is to be performed without a cache coherence action, and cache coherence is maintained upon execution of the store-without-coherence-action instruction.

  14. Assessment of managed aquifer recharge at Sand Hollow Reservoir, Washington County, Utah, updated to conditions through 2007

    Science.gov (United States)

    Heilweil, Victor M.; Ortiz, Gema; Susong, David D.

    2009-01-01

    Sand Hollow Reservoir in Washington County, Utah, was completed in March 2002 and is operated primarily as an aquifer storage and recovery project by the Washington County Water Conservancy District (WCWCD). Since its inception in 2002 through 2007, surface-water diversions of about 126,000 acre-feet to Sand Hollow Reservoir have resulted in a generally rising reservoir stage and surface area. Large volumes of runoff during spring 2005-06 allowed the WCWCD to fill the reservoir to a total storage capacity of more than 50,000 acre-feet, with a corresponding surface area of about 1,300 acres and reservoir stage of about 3,060 feet during 2006. During 2007, reservoir stage generally decreased to about 3,040 feet with a surface-water storage volume of about 30,000 acre-feet. Water temperature in the reservoir shows large seasonal variation and has ranged from about 3 to 30 deg C from 2003 through 2007. Except for anomalously high recharge rates during the first year when the vadose zone beneath the reservoir was becoming saturated, estimated ground-water recharge rates have ranged from 0.01 to 0.09 feet per day. Estimated recharge volumes have ranged from about 200 to 3,500 acre-feet per month from March 2002 through December 2007. Total ground-water recharge during the same period is estimated to have been about 69,000 acre-feet. Estimated evaporation rates have varied from 0.04 to 0.97 feet per month, resulting in evaporation losses of 20 to 1,200 acre-feet per month. Total evaporation from March 2002 through December 2007 is estimated to have been about 25,000 acre-feet. Results of water-quality sampling at monitoring wells indicate that by 2007, managed aquifer recharge had arrived at sites 37 and 36, located 60 and 160 feet from the reservoir, respectively. 
However, different peak arrival dates for specific conductance, chloride, chloride/bromide ratios, dissolved oxygen, and total dissolved-gas pressures at each monitoring well indicate the complicated nature of

  15. Cache management of tape files in mass storage system

    International Nuclear Information System (INIS)

    Cheng Yaodong; Ma Nan; Yu Chuansong; Chen Gang

    2006-01-01

    This paper proposes a group-cooperative caching policy that matches the characteristics of tapes and the requirements of the high energy physics domain. The policy integrates the advantages of traditional local caching and cooperative caching on the basis of a cache model. It divides the cache into independent groups, each made up of cooperating disks on the network. The paper also analyzes the directory management, update algorithm, and cache consistency of the policy. Experiments show that the policy meets the requirements of data processing and mass storage in the high energy physics domain very well. (authors)
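
    The lookup order implied by such a group-cooperative policy can be sketched as: check the local disk cache first, then the other caches in the same group, and only then fall back to tape. All names below are illustrative, not taken from the paper.

```python
# Sketch of a group-cooperative cache lookup: local cache, then group peers, then tape.

def lookup(file_id, local_cache, group_caches):
    """Return (location, data) for file_id, searching local cache, then the group."""
    if file_id in local_cache:
        return "local", local_cache[file_id]
    for peer in group_caches:
        if file_id in peer:
            data = peer[file_id]
            local_cache[file_id] = data  # replicate into the local cache for next time
            return "group", data
    return "tape", None  # miss everywhere: the file must be staged from tape

local = {"run1.dat": b"aa"}
peers = [{"run2.dat": b"bb"}, {"run3.dat": b"cc"}]
print(lookup("run2.dat", local, peers))  # found on a peer and copied locally
```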

  16. Reducing Competitive Cache Misses in Modern Processor Architectures

    OpenAIRE

    Prisagjanec, Milcho; Mitrevski, Pece

    2017-01-01

    The increasing number of threads inside the cores of a multicore processor, and competitive access to the shared cache memory, become the main reasons for an increased number of competitive cache misses and performance decline. Inevitably, the development of modern processor architectures leads to an increased number of cache misses. In this paper, we make an attempt to implement a technique for decreasing the number of competitive cache misses in the first level of cache memory. This tec...

  17. Temperature and Discharge on a Highly Altered Stream in Utah's Cache Valley

    OpenAIRE

    Pappas, Andy

    2013-01-01

    To study the River Continuum Concept (RCC) and the Serial Discontinuity Hypothesis (SDH), I looked at temperature and discharge changes along 52 km of the Little Bear River in Cache Valley, Utah. The Little Bear River is a fourth order stream with one major reservoir, a number of irrigation diversions, and one major tributary, the East Fork of the Little Bear River. Discharge data was collected at six sites on 29 September 2012 and temperature data was collected hourly at eleven sites from 1 ...

  18. Software trace cache

    OpenAIRE

    Ramírez Bellido, Alejandro; Larriba Pey, Josep; Valero Cortés, Mateo

    2005-01-01

    We explore the use of compiler optimizations, which optimize the layout of instructions in memory. The target is to enable the code to make better use of the underlying hardware resources regardless of the specific details of the processor/architecture in order to increase fetch performance. The Software Trace Cache (STC) is a code layout algorithm with a broader target than previous layout optimizations. We target not only an improvement in the instruction cache hit rate, but also an increas...

  19. The dCache scientific storage cloud

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    For over a decade, the dCache team has provided software for handling big data for a diverse community of scientists. The team has also amassed a wealth of operational experience from using this software in production. With this experience, the team have refined dCache with the goal of providing a "scientific cloud": a storage solution that satisfies all requirements of a user community by exposing different facets of dCache with which users interact. Recent development, as part of this "scientific cloud" vision, has introduced a new facet: a sync-and-share service, often referred to as "dropbox-like storage". This work has been strongly focused on local requirements, but will be made available in future releases of dCache allowing others to adopt dCache solutions. In this presentation we will outline the current status of the work: both the successes and limitations, and the direction and time-scale of future work.

  20. A Distributed Cache Update Deployment Strategy in CDN

    Science.gov (United States)

    E, Xinhua; Zhu, Binjie

    2018-04-01

    The CDN management system distributes content objects to the edge of the internet so that users can access them nearby. The cache strategy is an important problem in network content distribution. We design a cache strategy in which content diffuses effectively within the cache group, so more content is stored in the cache, improving the group hit rate.

  1. Search-Order Independent State Caching

    DEFF Research Database (Denmark)

    Evangelista, Sami; Kristensen, Lars Michael

    2010-01-01

    State caching is a memory reduction technique used by model checkers to alleviate the state explosion problem. It has traditionally been coupled with a depth-first search to ensure termination.We propose and experimentally evaluate an extension of the state caching method for general state...

  2. Cache Management of Big Data in Equipment Condition Assessment

    Directory of Open Access Journals (Sweden)

    Ma Yan

    2016-01-01

    Full Text Available A big data platform for equipment condition assessment is built for comprehensive analysis. The platform serves various applications, which can be divided by response time into offline, interactive, and real-time types. For real-time applications, data processing efficiency is critical, and data caching is one of the most effective ways to improve query time. However, big data caching differs from traditional data caching. In this paper we propose a distributed cache management framework of big data for equipment condition assessment. It consists of three parts: the cache structure, the cache replacement algorithm, and the cache placement algorithm, with the cache structure forming the basis of the two algorithms. Based on the framework and algorithms, we exploit the fact that only some valuable data is accessed during a given period of time and place related data on neighboring nodes, which largely reduces network transmission cost. We also validate the performance of our proposed approaches through extensive experiments, demonstrating that the proposed cache replacement algorithm and cache management framework achieve a higher hit rate or lower query time than the LRU and round-robin algorithms.
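
    The LRU policy used as the baseline in this comparison can be sketched in a few lines with an ordered dictionary; this is the standard textbook formulation, not the paper's framework.

```python
# Minimal LRU cache: most recently used entries live at the end of the OrderedDict,
# so eviction removes the front item.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # touching "a" makes "b" the eviction candidate
cache.put("c", 3)  # evicts "b"
print(list(cache.entries))  # ['a', 'c']
```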

  3. WATCHMAN: A Data Warehouse Intelligent Cache Manager

    Science.gov (United States)

    Scheuermann, Peter; Shim, Junho; Vingralek, Radek

    1996-01-01

    Data warehouses store large volumes of data which are used frequently by decision support applications. Such applications involve complex queries. Query performance in such an environment is critical because decision support applications often require interactive query response time. Because data warehouses are updated infrequently, it becomes possible to improve query performance by caching sets retrieved by queries in addition to query execution plans. In this paper we report on the design of an intelligent cache manager for sets retrieved by queries called WATCHMAN, which is particularly well suited for data warehousing environment. Our cache manager employs two novel, complementary algorithms for cache replacement and for cache admission. WATCHMAN aims at minimizing query response time and its cache replacement policy swaps out entire retrieved sets of queries instead of individual pages. The cache replacement and admission algorithms make use of a profit metric, which considers for each retrieved set its average rate of reference, its size, and execution cost of the associated query. We report on a performance evaluation based on the TPC-D and Set Query benchmarks. These experiments show that WATCHMAN achieves a substantial performance improvement in a decision support environment when compared to a traditional LRU replacement algorithm.
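
    A profit-based replacement decision of the kind WATCHMAN makes can be sketched as follows: each cached retrieved set carries its reference rate, size, and the execution cost of its query, and the eviction victim is the set with the lowest profit. The formula used here (rate times cost over size) is our illustration of the ingredients the abstract names, not necessarily the paper's exact definition.

```python
# Sketch of profit-metric cache replacement over cached query result sets.

def profit(entry):
    # High profit: referenced often, expensive to recompute, cheap to keep (small).
    return entry["ref_rate"] * entry["exec_cost"] / entry["size"]

def choose_victim(cached_sets):
    """Pick the cached retrieved set with the lowest profit for eviction."""
    return min(cached_sets, key=profit)

cached = [
    {"query": "Q1", "ref_rate": 0.5, "size": 100, "exec_cost": 40},
    {"query": "Q2", "ref_rate": 0.1, "size": 500, "exec_cost": 10},
    {"query": "Q3", "ref_rate": 0.9, "size": 50,  "exec_cost": 80},
]
print(choose_victim(cached)["query"])  # Q2: rarely referenced, large, cheap to redo
```

Unlike page-level LRU, the unit of replacement here is an entire retrieved set, which matches the abstract's description.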

  4. Static analysis of worst-case stack cache behavior

    DEFF Research Database (Denmark)

    Jordan, Alexander; Brandner, Florian; Schoeberl, Martin

    2013-01-01

    Utilizing a stack cache in a real-time system can aid predictability by avoiding interference that heap memory traffic causes on the data cache. While loads and stores are guaranteed cache hits, explicit operations are responsible for managing the stack cache. The behavior of these operations can......-graph, the worst-case bounds can be efficiently yet precisely determined. Our evaluation using the MiBench benchmark suite shows that only 37% and 21% of potential stack cache operations actually store to and load from memory, respectively. Analysis times are modest, on average running between 0.46s and 1.30s per...

  5. Truth Space Method for Caching Database Queries

    Directory of Open Access Journals (Sweden)

    S. V. Mosin

    2015-01-01

    Full Text Available We propose a new method of client-side data caching for relational databases with a central server and distant clients. Data are loaded into the client cache based on queries executed on the server. Every query has a corresponding DB table – the result of the query execution. These queries have a special form called a "universal relational query", based on three fundamental relational algebra operations: selection, projection, and natural join. Such a form is the closest one to natural language, and the majority of database search queries can be expressed in this way; besides, this form allows us to analyze query correctness by checking the lossless join property. A subsequent query may be executed in a client's local cache if we can determine that the query result is entirely contained in the cache. For this we compare the truth spaces of the logical restrictions in the new user query with those of the queries already executed in the cache. Such a comparison can be performed analytically, without additional database queries. The method can also identify which data are missing from the cache and execute the query on the server only for those data, again using the analytical approach, which distinguishes our paper from existing technologies. We propose four theorems for testing the required conditions: the conditions of the first and third theorems establish whether the required data exist in the cache, while the second and fourth theorems state the conditions under which queries can be executed against the cache alone. The problem of cache data actualization is not discussed in this paper; however, it can be solved by cataloging queries on the server and serving them with triggers in background mode. The article is published in the author's wording.
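
    The core containment idea can be illustrated with a toy version of the truth-space test: a cached query's predicate defines a region, and a new query can be answered from the cache when its region is a subset. Here predicates are simplified to conjunctions of numeric ranges per attribute; this simplification is ours, not the paper's full relational treatment.

```python
# Toy truth-space containment: a new query is answerable from cache iff its
# range on every attribute lies inside the cached query's range. Attributes
# left unconstrained are treated as the full range.

def contained(new_pred, cached_pred):
    """True if the truth space of new_pred is a subset of cached_pred's."""
    for attr, (clo, chi) in cached_pred.items():
        lo, hi = new_pred.get(attr, (float("-inf"), float("inf")))
        if lo < clo or hi > chi:
            return False
    return True

cached = {"age": (18, 65), "salary": (0, 100000)}
print(contained({"age": (30, 40), "salary": (20000, 60000)}, cached))  # True: cache suffices
print(contained({"age": (30, 70), "salary": (20000, 60000)}, cached))  # False: go to the server
```

The analytical comparison happens entirely on the predicates, with no extra database round trip, which is the point the abstract emphasizes.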

  6. Cache memory modelling method and system

    OpenAIRE

    Posadas Cobo, Héctor; Villar Bonet, Eugenio; Díaz Suárez, Luis

    2011-01-01

    The invention relates to a method for modelling a data cache memory of a destination processor, in order to simulate the behaviour of said data cache memory during the execution of a software code on a platform comprising said destination processor. According to the invention, the simulation is performed on a native platform having a processor different from the destination processor comprising the aforementioned data cache memory to be modelled, said modelling being performed by means of the...

  7. Efficient Mobile Client Caching Supporting Transaction Semantics

    Directory of Open Access Journals (Sweden)

    IlYoung Chung

    2000-05-01

    Full Text Available In mobile client-server database systems, caching of frequently accessed data is an important technique that reduces contention on the narrow-bandwidth wireless channel. As the server in mobile environments may not have any information about the state of its clients' caches (a stateless server), using a broadcasting approach to transmit updated data lists to numerous concurrent mobile clients is attractive. In this paper, a caching policy is proposed to maintain cache consistency for mobile computers. The proposed protocol adopts asynchronous (non-periodic) broadcasting as the cache invalidation scheme and supports transaction semantics in mobile environments. With the asynchronous broadcasting approach, the proposed protocol can improve throughput by reducing transaction aborts at low communication cost. We study the performance of the protocol by means of simulation experiments.
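
    The cache-upkeep step of such a broadcast invalidation scheme can be sketched as: the stateless server broadcasts the identifiers of updated items, and each client drops those entries before reusing its cache. Asynchronous delivery and the transaction handling from the paper are omitted here; only the invalidation step is shown.

```python
# Sketch of client-side processing of a broadcast invalidation report.

def apply_invalidation(client_cache, invalidated_ids):
    """Drop every cached entry named in the broadcast invalidation list."""
    for item_id in invalidated_ids:
        client_cache.pop(item_id, None)  # discard the stale copy if present
    return client_cache

cache = {"x": 10, "y": 20, "z": 30}
apply_invalidation(cache, ["y", "w"])  # "w" is not cached and is simply ignored
print(sorted(cache))  # ['x', 'z']
```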

  8. Efficacy of Code Optimization on Cache-based Processors

    Science.gov (United States)

    VanderWijngaart, Rob F.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The current common wisdom in the U.S. is that the powerful, cost-effective supercomputers of tomorrow will be based on commodity (RISC) micro-processors with cache memories. Already, most distributed systems in the world use such hardware as building blocks. This shift away from vector supercomputers and towards cache-based systems has brought about a change in programming paradigm, even when ignoring issues of parallelism. Vector machines require inner-loop independence and regular, non-pathological memory strides (usually this means: non-power-of-two strides) to allow efficient vectorization of array operations. Cache-based systems require spatial and temporal locality of data, so that data once read from main memory and stored in high-speed cache memory is used optimally before being written back to main memory. This means that the most cache-friendly array operations are those that feature zero or unit stride, so that each unit of data read from main memory (a cache line) contains information for the next iteration in the loop. Moreover, loops ought to be 'fat', meaning that as many operations as possible are performed on cache data, provided instruction caches do not overflow and enough registers are available. If unit stride is not possible, for example because of some data dependency, then care must be taken to avoid pathological strides, just as on vector computers. For cache-based systems the issues are more complex, due to the effects of associativity and of non-unit block (cache line) size. But there is more to the story. Most modern micro-processors are superscalar, which means that they can issue several (arithmetic) instructions per clock cycle, provided that there are enough independent instructions in the loop body. This is another argument for providing fat loop bodies. With these restrictions, it appears fairly straightforward to produce code that will run efficiently on any cache-based system.
It can be argued that although some of the important
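
    The abstract's point about unit stride can be illustrated with a 2-D array in row-major order: traversing row by row visits consecutive elements (unit stride, cache friendly), while traversing column by column jumps a full row per step. Python lists only illustrate the access order, not actual memory layout or timings; in a compiled language the row-major loop is the one that benefits from cache-line fetches.

```python
# Same sum computed with two access orders; only the stride pattern differs.

N = 4
a = [[i * N + j for j in range(N)] for i in range(N)]

row_major_sum = 0
for i in range(N):      # unit stride: a[i][0], a[i][1], ... are adjacent in row-major layout
    for j in range(N):
        row_major_sum += a[i][j]

col_major_sum = 0
for j in range(N):      # stride N: consecutive accesses are one full row apart
    for i in range(N):
        col_major_sum += a[i][j]

print(row_major_sum == col_major_sum)  # True: identical result, different locality
```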

  9. Design Space Exploration of Object Caches with Cross-Profiling

    DEFF Research Database (Denmark)

    Schoeberl, Martin; Binder, Walter; Villazon, Alex

    2011-01-01

    . However, before implementing such an object cache, an empirical analysis of different organization forms is needed. We use a cross-profiling technique based on aspect-oriented programming in order to evaluate different object cache organizations with standard Java benchmarks. From the evaluation we......To avoid data cache trashing between heap-allocated data and other data areas, a distinct object cache has been proposed for embedded real-time Java processors. This object cache uses high associativity in order to statically track different object pointers for worst-case execution-time analysis...... conclude that field access exhibits some temporal locality, but almost no spatial locality. Therefore, filling long cache lines on a miss just introduces a high miss penalty without increasing the hit rate enough to make up for the increased miss penalty. For an object cache, it is more efficient to fill...

  10. Archeological Excavations at the Wanapum Cache Site

    International Nuclear Information System (INIS)

    T. E. Marceau

    2000-01-01

    This report was prepared to document the actions taken to locate and excavate an abandoned Wanapum cache located east of the 100-H Reactor area. Evidence (i.e., glass, ceramics, metal, and wood) obtained from shovel and backhoe excavations at the Wanapum cache site indicate that the storage caches were found. The highly fragmented condition of these materials argues that the contents of the caches were collected or destroyed prior to the caches being burned and buried by mechanical equipment. While the fiber nets would have been destroyed by fire, the specialized stone weights would have remained behind. The fact that the site might have been gleaned of desirable artifacts prior to its demolition is consistent with the account by Riddell (1948) for a contemporary village site. Unfortunately, fishing equipment, owned by and used on behalf of the village, that might have returned to productive use has been irretrievably lost

  11. Engineering a Cache-Oblivious Sorting Algorithm

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Fagerberg, Rolf; Vinther, Kristoffer

    2007-01-01

    This paper is an algorithmic engineering study of cache-oblivious sorting. We investigate by empirical methods a number of implementation issues and parameter choices for the cache-oblivious sorting algorithm Lazy Funnelsort, and compare the final algorithm with Quicksort, the established standard...

  12. Corvid re-caching without 'theory of mind': a model.

    Science.gov (United States)

    van der Vaart, Elske; Verbrugge, Rineke; Hemelrijk, Charlotte K

    2012-01-01

    Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  13. Corvid re-caching without 'theory of mind': a model.

    Directory of Open Access Journals (Sweden)

    Elske van der Vaart

    Full Text Available Scrub jays are thought to use many tactics to protect their caches. For instance, they predominantly bury food far away from conspecifics, and if they must cache while being watched, they often re-cache their worms later, once they are in private. Two explanations have been offered for such observations, and they are intensely debated. First, the birds may reason about their competitors' mental states, with a 'theory of mind'; alternatively, they may apply behavioral rules learned in daily life. Although this second hypothesis is cognitively simpler, it does seem to require a different, ad-hoc behavioral rule for every caching and re-caching pattern exhibited by the birds. Our new theory avoids this drawback by explaining a large variety of patterns as side-effects of stress and the resulting memory errors. Inspired by experimental data, we assume that re-caching is not motivated by a deliberate effort to safeguard specific caches from theft, but by a general desire to cache more. This desire is brought on by stress, which is determined by the presence and dominance of onlookers, and by unsuccessful recovery attempts. We study this theory in two experiments similar to those done with real birds with a kind of 'virtual bird', whose behavior depends on a set of basic assumptions about corvid cognition, and a well-established model of human memory. Our results show that the 'virtual bird' acts as the real birds did; its re-caching reflects whether it has been watched, how dominant its onlooker was, and how close to that onlooker it has cached. This happens even though it cannot attribute mental states, and it has only a single behavioral rule assumed to be previously learned. Thus, our simulations indicate that corvid re-caching can be explained without sophisticated social cognition. Given our specific predictions, our theory can easily be tested empirically.

  14. Analysis of preemption costs for the stack cache

    DEFF Research Database (Denmark)

    Naji, Amine; Abbaspour, Sahar; Brandner, Florian

    2018-01-01

    , the analysis of the stack cache was limited to individual tasks, ignoring aspects related to multitasking. A major drawback of the original stack cache design is that, due to its simplicity, it cannot hold the data of multiple tasks at the same time. Consequently, the entire cache content needs to be saved...

  15. OPTIMAL DATA REPLACEMENT TECHNIQUE FOR COOPERATIVE CACHING IN MANET

    Directory of Open Access Journals (Sweden)

    P. Kuppusamy

    2014-09-01

    Full Text Available A cooperative caching approach improves data accessibility and reduces query latency in a Mobile Ad hoc Network (MANET). Maintaining the cache is a challenging issue in a large MANET due to mobility, cache size, and power. Previous research on caching has primarily dealt with the LRU, LFU, and LRU-MIN cache replacement algorithms, which offer low query latency and greater data accessibility in sparse MANETs. This paper proposes a Memetic Algorithm (MA) to locate better replaceable data, based on neighbors' interest and the fitness value of cached data, to make room for newly arrived data. This work also elects an ideal CH using the metaheuristic Ant Colony Optimization search algorithm. The simulation results show that the proposed algorithm reduces latency and control overhead and increases the packet delivery rate compared with the existing approach as the number of nodes and their speed increase.

  16. The Cost of Cache-Oblivious Searching

    DEFF Research Database (Denmark)

    Bender, Michael A.; Brodal, Gerth Stølting; Fagerberg, Rolf

    2003-01-01

    , multilevel memory hierarchies can be modelled. It is shown that as k grows, the search costs of the optimal k-level DAM search structure and of the optimal cache-oblivious search structure rapidly converge. This demonstrates that for a multilevel memory hierarchy, a simple cache-oblivious structure almost......Tight bounds on the cost of cache-oblivious searching are proved. It is shown that no cache-oblivious search structure can guarantee that a search performs fewer than lg e log B N block transfers between any two levels of the memory hierarchy. This lower bound holds even if all of the block sizes...... the random placement of the rst element of the structure in memory. As searching in the Disk Access Model (DAM) can be performed in log B N + 1 block transfers, this result shows a separation between the 2-level DAM and cacheoblivious memory-hierarchy models. By extending the DAM model to k levels...

  17. A Two-Level Cache for Distributed Information Retrieval in Search Engines

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users’ logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.

  18. A two-level cache for distributed information retrieval in search engines.

    Science.gov (United States)

    Zhang, Weizhe; He, Hui; Ye, Jianwei

    2013-01-01

    To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users' logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.
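
    The two-level structure described above can be sketched as a static cache preloaded with the most popular queries from the logs, backed by a dynamic cache for everything else. The use of LRU for the dynamic level here is our illustrative choice, not necessarily the paper's.

```python
# Sketch of a two-level query-result cache: a static level that is never
# evicted plus a dynamic LRU level of fixed capacity.

from collections import OrderedDict

class TwoLevelCache:
    def __init__(self, top_queries, dynamic_capacity):
        self.static = dict(top_queries)  # preloaded popular queries, never evicted
        self.dynamic = OrderedDict()
        self.capacity = dynamic_capacity

    def get(self, query):
        if query in self.static:
            return self.static[query]
        if query in self.dynamic:
            self.dynamic.move_to_end(query)  # refresh recency
            return self.dynamic[query]
        return None

    def put(self, query, results):
        if query in self.static:
            return  # already covered by the static level
        self.dynamic[query] = results
        self.dynamic.move_to_end(query)
        if len(self.dynamic) > self.capacity:
            self.dynamic.popitem(last=False)  # evict the least recently used query

cache = TwoLevelCache({"weather": ["w1"]}, dynamic_capacity=1)
cache.put("news", ["n1"])
cache.put("sports", ["s1"])  # evicts "news" from the dynamic level
print(cache.get("weather"), cache.get("news"))  # ['w1'] None
```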

  19. Optimal Caching in Multicast 5G Networks with Opportunistic Spectrum Access

    KAUST Repository

    Emara, Mostafa

    2018-01-15

    Cache-enabled small base station (SBS) densification is foreseen as a key component of 5G cellular networks. This architecture enables storing popular files at the network edge (i.e., SBS caches), which empowers local communication and alleviates traffic congestions at the core/backhaul network. This paper develops a mathematical framework, based on stochastic geometry, to characterize the hit probability of a cache-enabled multicast 5G network with SBS multi-channel capabilities and opportunistic spectrum access. To this end, we first derive the hit probability by characterizing opportunistic spectrum access success probabilities, service distance distributions, and coverage probabilities. The optimal caching distribution to maximize the hit probability is then computed. The performance and trade-offs of the derived optimal caching distributions are then assessed and compared with two widely employed caching distribution schemes, namely uniform and Zipf caching, through numerical results and extensive simulations. It is shown that Zipf caching is almost optimal only in scenarios with a large number of available channels and large cache sizes.
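
    The gap between popularity-aware and uniform caching can be illustrated by computing the hit probability when file requests follow a Zipf law: caching the most popular files captures the head of the distribution, while uniform caching ignores popularity entirely. This toy calculation only illustrates the comparison; it is not the paper's stochastic-geometry analysis.

```python
# Hit probability of caching the C most popular of N files under Zipf-distributed
# requests, versus caching each file with equal probability C/N.

def zipf_popularity(n_files, alpha=1.0):
    """Rank-ordered request probabilities for a Zipf law with exponent alpha."""
    weights = [1.0 / (r ** alpha) for r in range(1, n_files + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def hit_prob_cache_top(popularity, cache_size):
    # Cache the cache_size most popular files (popularity is rank-ordered).
    return sum(popularity[:cache_size])

def hit_prob_uniform(popularity, cache_size):
    # Each file is cached with equal probability cache_size / n_files.
    frac = cache_size / len(popularity)
    return sum(p * frac for p in popularity)

pop = zipf_popularity(100)
print(round(hit_prob_cache_top(pop, 10), 3))  # well above 0.1: the head carries most mass
print(round(hit_prob_uniform(pop, 10), 3))    # exactly 0.1 regardless of popularity
```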

  20. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    Full Text Available The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language is able to address those issues.

  1. Version pressure feedback mechanisms for speculative versioning caches

    Science.gov (United States)

    Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
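The feedback loop the patent describes can be sketched as a small monitor. The class below is an illustrative reconstruction (names, the pressure metric, and the throttling rule are all assumptions): raw per-access pressure samples are collected, reduced to a statistical measure, and used to decide whether to throttle speculation:

```python
from statistics import mean

class VersionPressureMonitor:
    """Sketch of a version-pressure feedback loop: collect raw samples of how
    full the speculative versioning cache is, reduce them to a statistical
    measure, and signal when the system's operation should be modified."""

    def __init__(self, threshold):
        self.samples = []          # raw version-pressure data
        self.threshold = threshold

    def record(self, versions_in_use, capacity):
        """One raw sample: fraction of cache capacity holding speculative versions."""
        self.samples.append(versions_in_use / capacity)

    def should_throttle(self):
        # Statistical measure: mean pressure over the sampling window.
        return bool(self.samples) and mean(self.samples) > self.threshold

monitor = VersionPressureMonitor(threshold=0.75)
for used in (6, 7, 8):             # cache lines holding speculative versions
    monitor.record(used, capacity=8)
print(monitor.should_throttle())   # True: mean pressure 0.875 exceeds 0.75
```

A real implementation would act on this signal by, e.g., reducing the number of speculative threads, which is one of the "modifications affecting version pressure" the abstract mentions.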

  2. Dynamic web cache publishing for IaaS clouds using Shoal

    International Nuclear Information System (INIS)

    Gable, Ian; Chester, Michael; Berghaus, Frank; Leavett-Brown, Colin; Paterson, Michael; Prior, Robert; Sobie, Randall; Taylor, Ryan; Armstrong, Patrick; Charbonneau, Andre

    2014-01-01

    We have developed a highly scalable application, called Shoal, for tracking and utilizing a distributed set of HTTP web caches. Our application uses the Squid HTTP cache. Squid servers advertise their existence to the Shoal server via AMQP messaging by running the Shoal Agent. The Shoal server provides a simple REST interface that allows clients to determine their closest Squid cache. Our goal is to dynamically instantiate Squid caches on IaaS clouds in response to client demand. Shoal provides the VMs on IaaS clouds with the location of the nearest dynamically instantiated Squid cache.
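Shoal resolves proximity server-side behind its REST interface; the sketch below only illustrates the underlying idea of nearest-cache selection, using great-circle distance over a list of advertised caches (the URLs, coordinates, and dictionary layout are hypothetical, not Shoal's actual data model):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def closest_cache(client, caches):
    """Pick the advertised cache nearest to the client's (lat, lon)."""
    return min(caches,
               key=lambda c: haversine_km(client[0], client[1], c["lat"], c["lon"]))

caches = [
    {"url": "http://squid-a.example.org:3128", "lat": 48.2, "lon": 11.6},
    {"url": "http://squid-b.example.org:3128", "lat": 40.7, "lon": -74.0},
]
print(closest_cache((49.0, 8.4), caches)["url"])  # the nearby European cache
```

In the real system the cache list is kept fresh by Shoal Agents advertising over AMQP, so entries for caches that stop advertising age out of the selection.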

  3. Search-Order Independent State Caching

    DEFF Research Database (Denmark)

    Evangelista, Sami; Kristensen, Lars Michael

    2009-01-01

    State caching is a memory reduction technique used by model checkers to alleviate the state explosion problem. It has traditionally been coupled with a depth-first search to ensure termination. We propose and experimentally evaluate an extension of the state caching method for general state exploring algorithms that are independent of the search order (i.e., search algorithms that partition the state space into closed (visited) states, open (to visit) states and unmet states).
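The closed/open partition the abstract mentions can be sketched with a bounded visited-state cache. The code below is an illustrative toy, not the authors' algorithm: it uses a FIFO open list (so the search order is not depth-first) and evicts a random visited state when the cache is full, trading re-expansion time for memory:

```python
import random

def explore(initial, successors, cache_limit, seed=0):
    """Bounded-memory state-space exploration: visited states live in a cache
    of at most cache_limit entries; on overflow a random cached state is
    dropped, so states may be re-expanded later.  Returns expansion count."""
    rng = random.Random(seed)
    open_states = [initial]        # FIFO open list: search order is not DFS
    cache = set()                  # closed (visited) states, bounded
    expanded = 0
    while open_states:
        s = open_states.pop(0)
        if s in cache:
            continue               # already visited and still remembered
        if len(cache) >= cache_limit:
            cache.remove(rng.choice(sorted(cache)))  # evict a visited state
        cache.add(s)
        expanded += 1
        open_states.extend(successors(s))
        if expanded > 10000:       # guard: eviction can cause re-expansion
            break
    return expanded

# A 10-state acyclic chain: 0 -> 1 -> ... -> 9.
succ = lambda i: [i + 1] if i < 9 else []
print(explore(0, succ, cache_limit=4))  # 10: each state expanded once here
```

On this acyclic chain each state is expanded exactly once even with a tiny cache; on graphs with reconverging paths or cycles, eviction causes re-expansions, which is exactly the time/memory trade-off state caching makes.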

  4. dCache on Steroids - Delegated Storage Solutions

    Science.gov (United States)

    Mkrtchyan, T.; Adeyemi, F.; Ashish, A.; Behrmann, G.; Fuhrmann, P.; Litvintsev, D.; Millar, P.; Rossi, A.; Sahakyan, M.; Starek, J.

    2017-10-01

    For over a decade, dCache.org has delivered a robust software used at more than 80 Universities and research institutes around the world, allowing these sites to provide reliable storage services for the WLCG experiments as well as many other scientific communities. The flexible architecture of dCache allows running it in a wide variety of configurations and platforms - from a SoC based all-in-one Raspberry-Pi up to hundreds of nodes in a multipetabyte installation. Due to lack of managed storage at the time, dCache implemented data placement, replication and data integrity directly. Today, many alternatives are available: S3, GlusterFS, CEPH and others. While such solutions position themselves as scalable storage systems, they cannot be used by many scientific communities out of the box. The absence of community-accepted authentication and authorization mechanisms, the use of product specific protocols and the lack of namespace are some of the reasons that prevent wide-scale adoption of these alternatives. Most of these limitations are already solved by dCache. By delegating low-level storage management functionality to the above-mentioned new systems and providing the missing layer through dCache, we provide a solution which combines the benefits of both worlds - industry standard storage building blocks with the access protocols and authentication required by scientific communities. In this paper, we focus on CEPH, a popular software for clustered storage that supports file, block and object interfaces. CEPH is often used in modern computing centers, for example as a backend to OpenStack services. We will show prototypes of dCache running with a CEPH backend and discuss the benefits and limitations of such an approach. We will also outline the roadmap for supporting ‘delegated storage’ within the dCache releases.

  5. Cooperative Caching in Mobile Ad Hoc Networks Based on Data Utility

    Directory of Open Access Journals (Sweden)

    Narottam Chand

    2007-01-01

    Full Text Available Cooperative caching, which allows sharing and coordination of cached data among clients, is a potential technique to improve the data access performance and availability in mobile ad hoc networks. However, variable data sizes, frequent data updates, limited client resources, insufficient wireless bandwidth and client mobility make cache management a challenge. In this paper, we propose a utility based cache replacement policy, least utility value (LUV), to improve the data availability and reduce the local cache miss ratio. LUV considers several factors that affect cache performance, namely access probability, distance between the requester and data source/cache, coherency and data size. A cooperative cache management strategy, Zone Cooperative (ZC), is developed that employs LUV as its replacement policy. In ZC, one-hop neighbors of a client form a cooperation zone since the cost of communication with them is low both in terms of energy consumption and message exchange. Simulation experiments have been conducted to evaluate the performance of the LUV-based ZC caching strategy. The simulation results show that the LUV replacement policy substantially outperforms the LRU policy.
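A utility-based eviction of this kind can be sketched in a few lines. The weighting below is an illustrative guess, not the paper's exact LUV formula: it favours retaining items that are accessed often, far from their source (costly to re-fetch), still coherent, and small:

```python
def utility(entry):
    """Illustrative LUV-style utility combining the four factors the paper
    names: access probability, distance to the source, coherency, and size.
    The exact weighting in the paper may differ."""
    return (entry["access_prob"] * entry["distance_hops"]
            * entry["coherence"]) / entry["size"]

def evict_least_utility(cache):
    """Remove and return the name of the least-useful cached item."""
    victim = min(cache, key=utility)
    cache.remove(victim)
    return victim["name"]

cache = [
    {"name": "A", "access_prob": 0.5, "distance_hops": 3, "coherence": 1.0, "size": 4},
    {"name": "B", "access_prob": 0.1, "distance_hops": 1, "coherence": 0.5, "size": 8},
    {"name": "C", "access_prob": 0.3, "distance_hops": 2, "coherence": 0.9, "size": 2},
]
print(evict_least_utility(cache))  # B: rarely used, nearby, stale, and large
```

Unlike LRU, which would evict whichever item was touched least recently, this policy keeps a large, distant, popular item over a small, nearby, unpopular one.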

  6. An Adaptive Insertion and Promotion Policy for Partitioned Shared Caches

    Science.gov (United States)

    Mahrom, Norfadila; Liebelt, Michael; Raof, Rafikha Aliana A.; Daud, Shuhaizar; Hafizah Ghazali, Nur

    2018-03-01

    Cache replacement policies in chip multiprocessors (CMP) have been investigated extensively and proven able to enhance shared cache management. However, competition among multiple processors executing different threads that require simultaneous access to a shared memory may cause cache contention and memory coherence problems on the chip. These issues also arise from drawbacks of the commonly used Least Recently Used (LRU) policy employed in multiprocessor systems, under which cache lines reside in the cache longer than required. In image processing analysis, for example of extrapulmonary tuberculosis (TB), an accurate diagnosis of the tissue specimen is required. Therefore, a fast and reliable shared memory management system is needed to execute algorithms that process vast amounts of specimen images. In this paper, the effects of the cache replacement policy in a partitioned shared cache are investigated. The goal is to quantify whether better performance can be achieved by using less complex replacement strategies. This paper proposes a Middle Insertion 2 Positions Promotion (MI2PP) policy to eliminate cache misses that could adversely affect the access patterns and the throughput of the processors in the system. The policy employs a static predefined insertion point, near distance promotion, and the concept of ownership in the eviction policy to effectively reduce cache thrashing and to avoid resource stealing among the processors.
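The insertion/promotion idea can be illustrated with a tiny recency stack. The sketch below is a reconstruction from the policy's name, not the authors' exact definition: new lines enter at the middle of the stack rather than at the MRU position, and a hit promotes a line two positions toward the MRU end, so one-touch lines cannot push hot lines out:

```python
class MI2PPCache:
    """Illustrative middle-insertion, two-position-promotion recency stack.
    Index 0 is the MRU end; the last element is the eviction candidate."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.stack = []

    def access(self, line):
        if line in self.stack:     # hit: promote two positions toward MRU
            i = self.stack.index(line)
            self.stack.insert(max(0, i - 2), self.stack.pop(i))
            return True
        if len(self.stack) >= self.capacity:
            self.stack.pop()       # miss on a full cache: evict the LRU end
        self.stack.insert(len(self.stack) // 2, line)  # middle insertion
        return False

cache = MI2PPCache(4)
for line in ("a", "b", "c", "d"):
    cache.access(line)
cache.access("a")                  # hit: "a" climbs two positions
print(cache.stack)                 # ['b', 'a', 'd', 'c']
```

Compared with LRU insertion at MRU, middle insertion means a newly fetched line must prove itself with hits before it reaches the protected end of the stack, which is the thrashing-resistance property the abstract targets.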

  7. Numerical simulation of groundwater movement and managed aquifer recharge from Sand Hollow Reservoir, Hurricane Bench area, Washington County, Utah

    Science.gov (United States)

    Marston, Thomas M.; Heilweil, Victor M.

    2012-01-01

    The Hurricane Bench area of Washington County, Utah, is a 70 square-mile area extending south from the Virgin River and encompassing Sand Hollow basin. Sand Hollow Reservoir, located on Hurricane Bench, was completed in March 2002 and is operated primarily as a managed aquifer recharge project by the Washington County Water Conservancy District. The reservoir is situated on a thick sequence of the Navajo Sandstone and Kayenta Formation. Total recharge to the underlying Navajo aquifer from the reservoir was about 86,000 acre-feet from 2002 to 2009. Natural recharge as infiltration of precipitation was approximately 2,100 acre-feet per year for the same period. Discharge occurs as seepage to the Virgin River, municipal and irrigation well withdrawals, and seepage to drains at the base of reservoir dams. Within the Hurricane Bench area, unconfined groundwater-flow conditions generally exist throughout the Navajo Sandstone. Navajo Sandstone hydraulic-conductivity values from regional aquifer testing range from 0.8 to 32 feet per day. The large variability in hydraulic conductivity is attributed to bedrock fractures that trend north-northeast across the study area. A numerical groundwater-flow model was developed to simulate groundwater movement in the Hurricane Bench area and to simulate the movement of managed aquifer recharge from Sand Hollow Reservoir through the groundwater system. The model was calibrated to combined steady- and transient-state conditions. The steady-state portion of the simulation was developed and calibrated by using hydrologic data that represented average conditions for 1975. The transient-state portion of the simulation was developed and calibrated by using hydrologic data collected from 1976 to 2009. Areally, the model grid was 98 rows by 76 columns with a variable cell size ranging from about 1.5 to 25 acres. Smaller cells were used to represent the reservoir to accurately simulate the reservoir bathymetry and nearby monitoring wells; larger

  8. A Stack Cache for Real-Time Systems

    DEFF Research Database (Denmark)

    Schoeberl, Martin; Nielsen, Carsten

    2016-01-01

    Real-time systems need time-predictable computing platforms to allowfor static analysis of the worst-case execution time. Caches are important for good performance, but data caches arehard to analyze for the worst-case execution time. Stack allocated data has different properties related...

  9. An evaluation of seepage gains and losses in Indian Creek Reservoir, Ada County, Idaho, April 2010–November 2011

    Science.gov (United States)

    Williams, Marshall L.; Etheridge, Alexandra B.

    2013-01-01

    The U.S. Geological Survey, in cooperation with the Idaho Department of Water Resources, conducted an investigation on Indian Creek Reservoir, a small impoundment in east Ada County, Idaho, to quantify groundwater seepage into and out of the reservoir. Data from the study will assist the Idaho Water Resources Department’s Comprehensive Aquifer Management Planning effort to estimate available water resources in Ada County. Three independent methods were utilized to estimate groundwater seepage: (1) the water-budget method; (2) the seepage-meter method; and (3) the segmented Darcy method. Reservoir seepage was quantified during the periods of April through August 2010 and February through November 2011. With the water-budget method, all measurable sources of inflow to and outflow from the reservoir were quantified, with the exception of groundwater; the water-budget equation was solved for groundwater inflow to or outflow from the reservoir. The seepage-meter method relies on the placement of seepage meters into the bottom sediments of the reservoir for the direct measurement of water flux across the sediment-water interface. The segmented Darcy method utilizes a combination of water-level measurements in the reservoir and in adjacent near-shore wells to calculate water-table gradients between the wells and the reservoir within defined segments of the reservoir shoreline. The Darcy equation was used to calculate groundwater inflow to and outflow from the reservoir. Water-budget results provided continuous, daily estimates of seepage over the full period of data collection, while the seepage-meter and segmented Darcy methods provided instantaneous estimates of seepage. As a result of these and other differences in methodology, comparisons of seepage estimates provided by the three methods are considered semi-quantitative. The results of the water-budget derived estimates of seepage indicate seepage to be seasonally variable in terms of the direction and magnitude
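The segmented Darcy calculation reduces to applying Darcy's law, Q = K·A·(dh/dL), to each shoreline segment and summing. The sketch below uses made-up illustrative numbers (conductivities, areas, and heads are not from the report); a positive head difference from well to reservoir means groundwater inflow, a negative one means seepage loss:

```python
def darcy_flow(K, area, head_well, head_reservoir, distance):
    """Darcy's law, Q = K * A * dh/dL, in consistent units (ft, ft/day).
    Positive Q: groundwater inflow toward the reservoir; negative: loss."""
    gradient = (head_well - head_reservoir) / distance
    return K * area * gradient

# Hypothetical shoreline segments: hydraulic conductivity K (ft/day),
# saturated cross-sectional area (ft^2), heads (ft), well-to-shore distance (ft).
segments = [
    {"K": 2.0, "area": 5000.0, "head_well": 101.5, "head_res": 100.0, "dist": 300.0},
    {"K": 0.8, "area": 7000.0, "head_well": 99.2,  "head_res": 100.0, "dist": 250.0},
]
total = sum(darcy_flow(s["K"], s["area"], s["head_well"], s["head_res"], s["dist"])
            for s in segments)
print(round(total, 2))  # net groundwater exchange in ft^3/day
```

Summing signed segment flows is what lets the method report a net seepage gain or loss for the whole reservoir even when individual shoreline segments flow in opposite directions.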

  10. Advanced Oil Recovery Technologies for Improved Recovery from Slope Basin Clastic Reservoirs, Nash Draw Brushy Canyon Pool, Eddy County, NM

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Mark B.

    1999-02-24

    The Nash Draw Brushy Canyon Pool in Eddy County, New Mexico, is a cost-shared field demonstration project in the US Department of Energy Class III Program. A major goal of the Class III Program is to stimulate the use of advanced technologies to increase ultimate recovery from slope-basin clastic reservoirs. Advanced characterization techniques are being used at the Nash Draw project to develop reservoir management strategies for optimizing oil recovery from this Delaware reservoir. Analysis, interpretation, and integration of recently acquired geologic, geophysical, and engineering data revealed that the initial reservoir characterization was too simplistic to capture the critical features of this complex formation. Contrary to the initial characterization, a new reservoir description evolved that provided sufficient detail regarding the complexity of the Brushy Canyon interval at Nash Draw. This new reservoir description is being used as a risk reduction tool to identify "sweet spots" for a development drilling program as well as to evaluate pressure maintenance strategies. The reservoir characterization, geological modeling, 3-D seismic interpretation, and simulation studies have provided a detailed model of the Brushy Canyon zones. This model was used to predict the success of different reservoir management scenarios and to aid in determining the most favorable combination of targeted drilling, pressure maintenance, well stimulation, and well spacing to improve recovery from this reservoir.

  11. A distributed storage system with dCache

    DEFF Research Database (Denmark)

    Behrmann, Gerd; Fuhrmann, Patrick; Grønager, Michael

    2008-01-01

    The LCG collaboration is encompassed by a number of Tier 1 centers. The Nordic LCG Tier 1, operated by NDGF, is in contrast to many other Tier 1 centers distributed over the Nordic countries. A distributed setup was chosen for both political and technical reasons, but also provides a number of unique challenges. dCache is well known and respected as a powerful distributed storage resource manager, and was chosen for implementing the storage aspects of the Nordic Tier 1. In contrast to classic dCache deployments, we deploy dCache over a WAN with limited bandwidth, high latency, frequent network...

  12. Smart caching based on mobile agent of power WebGIS platform.

    Science.gov (United States)

    Wang, Xiaohui; Wu, Kehe; Chen, Fei

    2013-01-01

    Power information systems are developing in an intensive, platform-based, distributed direction with the expansion of the power grid and improvements in information technology. To meet this trend, power WebGIS was designed and developed. In this paper, we first discuss the architecture and functionality of power WebGIS, and then we study its caching technology in detail, covering the dynamic display cache model, a caching structure based on mobile agents, and the cache data model. We designed experiments with different data capacities to compare the performance of WebGIS with the proposed caching model against traditional WebGIS. The experimental results showed that, in the same hardware environment, the response time of WebGIS both with and without the caching model increased as data capacity grew, but the larger the data, the greater the performance improvement provided by the proposed caching model.

  13. Behavior-aware cache hierarchy optimization for low-power multi-core embedded systems

    Science.gov (United States)

    Zhao, Huatao; Luo, Xiao; Zhu, Chen; Watanabe, Takahiro; Zhu, Tianbo

    2017-07-01

    In modern embedded systems, the increasing number of cores requires efficient cache hierarchies to ensure data throughput, but such cache hierarchies are restricted by their large size and interference accesses, which lead to both performance degradation and wasted energy. In this paper, we propose a behavior-aware cache hierarchy (BACH) which can optimally allocate multi-level cache resources to many cores, greatly improving the efficiency of the cache hierarchy and resulting in low energy consumption. The BACH takes full advantage of the explored application behaviors and runtime cache resource demands as the basis for cache allocation, so that the cache hierarchy can be optimally configured to meet the runtime demand. The BACH was implemented on the GEM5 simulator. The experimental results show that the energy consumption of a three-level cache hierarchy can be reduced by 5.29% up to 27.94% compared with other key approaches, while the performance of the multi-core system is even slightly improved after accounting for hardware overhead.

  14. dCache, agile adoption of storage technology

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    For over a decade, dCache has been synonymous with large-capacity, fault-tolerant storage using commodity hardware that supports seamless data migration to and from tape. Over that time, it has satisfied the requirements of various demanding scientific user communities to store their data, transfer it between sites, and provide fast, site-local access. When the dCache project started, the focus was on managing a relatively small disk cache in front of large tape archives. Over the project's lifetime storage technology has changed. During this period, technology changes have driven down the cost-per-GiB of hard disks. This resulted in a shift towards systems where the majority of data is stored on disk. More recently, the availability of Solid State Disks, while not yet a replacement for magnetic disks, offers an intriguing opportunity for significant performance improvement if they can be used intelligently within an existing system. New technologies provide new opportunities and dCache user communities' computi...

  15. Study of cache performance in distributed environment for data processing

    International Nuclear Information System (INIS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-01-01

    Processing data in distributed environment has found its application in many fields of science (Nuclear and Particle Physics (NPP), astronomy, and biology, to name only a few). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing the knowledge of data provenance as well as data placed in the transfer cache to further expand the availability of sources for files and data-sets. Though a great variety of caching algorithms is known, a study is needed to evaluate which one can deliver the best performance in data access considering realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were done in order to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations were done for caches of different sizes within the interval 0.001–90% of the complete data-set and a low-watermark within 0–90%. Records of data access were taken from several experiments and within different time intervals in order to validate the results. In this paper, we will discuss the different data caching strategies from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, debate and identify the choice for the best algorithm in the context of Physics Data analysis in NPP. While the results of those studies have been implemented in RIFT, they can also be used when setting up cache in any other computational work-flow (Cloud processing for example) or managing data storage with partial replicas of the entire data-set.
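The core of such a simulation is replaying a recorded access trace through a candidate policy and counting hits. The sketch below does this for plain LRU with byte-sized capacity (the trace, sizes, and capacity are made up; real traces would come from experiment access logs, and other policies would slot in the same way):

```python
from collections import OrderedDict

def simulate_lru(trace, cache_bytes):
    """Replay an access trace of (file_id, size_bytes) pairs through an LRU
    cache of cache_bytes capacity and return the hit ratio."""
    cache, used, hits = OrderedDict(), 0, 0
    for name, size in trace:
        if name in cache:
            hits += 1
            cache.move_to_end(name)              # refresh recency on a hit
            continue
        while used + size > cache_bytes and cache:
            _, old_size = cache.popitem(last=False)   # evict least recently used
            used -= old_size
        if size <= cache_bytes:                  # skip files larger than the cache
            cache[name] = size
            used += size
    return hits / len(trace)

trace = [("a", 4), ("b", 4), ("a", 4), ("c", 4), ("b", 4), ("a", 4)]
print(simulate_lru(trace, cache_bytes=12))  # 0.5
```

Running the same trace against several policies and cache sizes, and also tallying hit bytes rather than hit counts, reproduces the hits and hits-per-byte comparison the abstract describes.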

  16. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing directory based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA referred to as single length cycle 2-attractor cellular automata (TACA) has been planted to detect the inconsistencies in cache line states of processors’ private caches. The TACA module captures coherence status of the CMPs’ cache system and memorizes any inconsistent recording of the cache line states during the processors’ reference to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state indicating quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs’ processor pool ensures a better efficiency, in determining the inconsistencies, by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of proposed coherence verification module is much lesser than that of the conventional verification units and is insignificant with respect to the cost involved in CMPs’ cache system.

  17. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it req...

  18. Funnel Heap - A Cache Oblivious Priority Queue

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Fagerberg, Rolf

    2002-01-01

    The cache oblivious model of computation is a two-level memory model with the assumption that the parameters of the model are unknown to the algorithms. A consequence of this assumption is that an algorithm efficient in the cache oblivious model is automatically efficient in a multi-level memory...

  19. Value-Based Caching in Information-Centric Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Fadi M. Al-Turjman

    2017-01-01

    Full Text Available We propose a resilient cache replacement approach based on a Value of sensed Information (VoI) policy. To resolve and fetch content when the origin is not available due to isolated in-network nodes (fragmentation) and harsh operational conditions, we exploit a content caching approach. Our approach depends on four functional parameters in sensory Wireless Body Area Networks (WBANs). These four parameters are: age of data based on periodic request, popularity of on-demand requests, communication interference cost, and the duration for which the sensor node is required to operate in active mode to capture the sensed readings. These parameters are considered together to assign a value to the cached data to retain the most valuable information in the cache for prolonged time periods. The higher the value, the longer the duration for which the data will be retained in the cache. This caching strategy provides significant availability for most valuable and difficult to retrieve data in the WBANs. Extensive simulations are performed to compare the proposed scheme against other significant caching schemes in the literature while varying critical aspects in WBANs (e.g., data popularity, cache size, publisher load, connectivity-degree, and severe probabilities of node failures). These simulation results indicate that the proposed VoI-based approach is a valid tool for the retrieval of cached content in disruptive and challenging scenarios, such as the one experienced in WBANs, since it allows the retrieval of content for a long period even while experiencing severe in-network node failures.
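A value function over the four named parameters can be sketched as a weighted score. The weights, normalization, and parameter ranges below are illustrative assumptions, not the paper's formula; the point is only that the least valuable item, rather than the least recent one, is evicted first:

```python
def voi(age_s, popularity, interference_cost, active_cost,
        weights=(0.25, 0.25, 0.25, 0.25)):
    """Illustrative Value-of-Information score: fresher, more popular data
    that is costly to re-acquire (high interference, long sensor active
    time) scores higher and stays cached longer.  Inputs other than age_s
    are assumed normalized to [0, 1]."""
    freshness = 1.0 / (1.0 + age_s)
    w1, w2, w3, w4 = weights
    return (w1 * freshness + w2 * popularity
            + w3 * interference_cost + w4 * active_cost)

cache = {
    "ecg_feed":   voi(age_s=2,   popularity=0.9, interference_cost=0.8, active_cost=0.7),
    "step_count": voi(age_s=120, popularity=0.2, interference_cost=0.1, active_cost=0.1),
}
victim = min(cache, key=cache.get)   # evict the least valuable item first
print(victim)                        # step_count
```

Under node failures, the highly valued item (fresh, popular, expensive to re-sense) survives in the cache, which is what gives the VoI policy its resilience advantage over recency-based replacement.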

  20. A detailed GPU cache model based on reuse distance theory

    NARCIS (Netherlands)

    Nugteren, C.; Braak, van den G.J.W.; Corporaal, H.; Bal, H.E.

    2014-01-01

    As modern GPUs rely partly on their on-chip memories to counter the imminent off-chip memory wall, the efficient use of their caches has become important for performance and energy. However, optimising cache locality systematically requires insight into and prediction of cache behaviour. On

  1. Adjustable Two-Tier Cache for IPTV Based on Segmented Streaming

    Directory of Open Access Journals (Sweden)

    Kai-Chun Liang

    2012-01-01

    Full Text Available Internet protocol TV (IPTV) is a promising Internet killer application, which integrates video, voice, and data onto a single IP network, and offers viewers an innovative set of choices and control over their TV content. To provide high-quality IPTV services, an effective strategy is based on caching. This work proposes a segment-based two-tier caching approach, which divides each video into multiple segments to be cached. This approach also partitions the cache space into two layers, where the first layer mainly caches to-be-played segments and the second layer saves possibly played segments. As segment access becomes frequent, the proposed approach enlarges the first layer and reduces the second layer, and vice versa. Because requested segments may not be accessed frequently, this work further designs an admission control mechanism to determine whether an incoming segment should be cached or not. The cache architecture takes forward/stop playback into account and may replace unused segments under interrupted playback. Finally, we conduct comprehensive simulation experiments to evaluate the performance of the proposed approach. The results show that our approach can yield a higher hit ratio than previous work under various environmental parameters.
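The two-tier partition with an adjustable boundary can be sketched as follows. This is an illustrative simplification (slot-based tiers, FIFO eviction within a tier, and a single access-rate threshold are all assumptions, not the paper's exact mechanism):

```python
class TwoTierSegmentCache:
    """Sketch of an adjustable two-tier segment cache: tier 1 holds
    to-be-played segments, tier 2 possibly-played ones; frequent segment
    access grows tier 1 at tier 2's expense, and vice versa."""

    def __init__(self, total_slots):
        self.total = total_slots
        self.tier1_slots = total_slots // 2   # initial split of cache space
        self.tier1, self.tier2 = [], []

    def admit(self, segment, to_be_played):
        tier = self.tier1 if to_be_played else self.tier2
        limit = self.tier1_slots if to_be_played else self.total - self.tier1_slots
        if len(tier) >= limit and tier:
            tier.pop(0)                       # evict the oldest segment in that tier
        tier.append(segment)

    def rebalance(self, accesses_per_min, threshold=30):
        """Frequent access -> enlarge tier 1; infrequent -> shrink it (bounded)."""
        if accesses_per_min > threshold:
            self.tier1_slots = min(self.total - 1, self.tier1_slots + 1)
        else:
            self.tier1_slots = max(1, self.tier1_slots - 1)

cache = TwoTierSegmentCache(total_slots=8)
for seg in range(6):
    cache.admit(f"v1-seg{seg}", to_be_played=(seg < 4))
cache.rebalance(accesses_per_min=45)
print(cache.tier1_slots)   # 5: tier 1 grew under frequent access
```

An admission-control step, as in the paper, would sit in front of `admit` and reject segments unlikely to be requested again before they consume a slot.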

  2. Novel dynamic caching for hierarchically distributed video-on-demand systems

    Science.gov (United States)

    Ogo, Kenta; Matsuda, Chikashi; Nishimura, Kazutoshi

    1998-02-01

    It is difficult to simultaneously serve the millions of video streams that will be needed in the age of 'Mega-Media' networks by using only one high-performance server. To distribute the service load, caching servers should be located near users. However, in previously proposed caching mechanisms, the grade of service depends on whether the data is already cached at a caching server. To make the caching servers transparent to the users, the ability to randomly access the large volume of data stored in the central server should be supported, and the operational functions of the provided service should not be narrowly restricted. We propose a mechanism for constructing a video-stream-caching server that is transparent to the users and that will always support all special playback functions for all available programs and all content, with a latency of only 1 or 2 seconds. This mechanism uses a variable-sized-quantum-segment-caching technique derived from an analysis of the historical usage log data generated by a line-on-demand-type service experiment, and is based on the basic techniques used by a time-slot-based multiple-stream video-on-demand server.

  3. Alignment of Memory Transfers of a Time-Predictable Stack Cache

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar; Brandner, Florian

    2014-01-01

    of complex cache states. Instead, only the occupancy level of the cache has to be determined. The memory transfers generated by the standard stack cache are not generally aligned. These unaligned accesses risk introducing complexity into the otherwise simple WCET analysis. In this work, we investigate three...

  4. A trace-driven analysis of name and attribute caching in a distributed system

    Science.gov (United States)

    Shirriff, Ken W.; Ousterhout, John K.

    1992-01-01

    This paper presents the results of simulating file name and attribute caching on client machines in a distributed file system. The simulation used trace data gathered on a network of about 40 workstations. Caching was found to be advantageous: a cache on each client containing just 10 directories had a 91 percent hit rate on name lookups. Entry-based name caches (holding individual directory entries) had poorer performance for several reasons, resulting in a maximum hit rate of about 83 percent. File attribute caching obtained a 90 percent hit rate with a cache on each machine of the attributes for 30 files. The simulations show that maintaining cache consistency between machines is not a significant problem; only 1 in 400 name component lookups required invalidation of a remotely cached entry. Process migration to remote machines had little effect on caching. Caching was less successful in heavily shared and modified directories such as /tmp, but there weren't enough references to /tmp overall to affect the results significantly. We estimate that adding name and attribute caching to the Sprite operating system could reduce server load by 36 percent and the number of network packets by 30 percent.
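The distinction between a directory-based name cache and an entry-based one can be illustrated with a small lookup routine. The sketch below (the server stub and paths are hypothetical) caches whole directories, so every later lookup of a sibling name in the same directory hits without a server round trip:

```python
def name_lookup(path, dir_cache, server_lookup):
    """Directory-based name caching: the cache maps a directory path to the
    set of its entries, so looking up /usr/f2 hits if /usr is cached."""
    directory, _, name = path.rpartition("/")
    entries = dir_cache.get(directory)
    if entries is not None and name in entries:
        return True                                  # hit: no server round trip
    dir_cache[directory] = server_lookup(directory)  # miss: fetch the directory
    return False

server = lambda d: {"f1", "f2", "f3"}    # stand-in for the file server
cache = {}
hits = sum(name_lookup(p, cache, server)
           for p in ("/usr/f1", "/usr/f2", "/usr/f3", "/tmp/f1"))
print(hits)   # 2: the second and third /usr lookups hit the cached directory
```

An entry-based cache would store each (directory, name) pair separately, so the /usr/f2 and /usr/f3 lookups above would miss, which mirrors the lower hit rate the trace study reports for entry-based caches.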

  5. A Novel Cache Invalidation Scheme for Mobile Networks

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In this paper, we propose a strategy for maintaining cache consistency in wireless mobile environments, which adds a validation server (VS) to the GPRS network, utilizes the location information of mobile terminals held in the SGSN at the GPRS backbone, sends invalidation information only to online mobile terminals holding the affected cached data, and reduces the amount of information in asynchronous transmission. This strategy enables a mobile terminal to access cached data with very little computation, little delay, and arbitrary disconnection intervals, and outperforms the synchronous IR and asynchronous state (AS) schemes in overall performance.

  6. A distributed storage system with dCache

    Science.gov (United States)

    Behrmann, G.; Fuhrmann, P.; Grønager, M.; Kleist, J.

    2008-07-01

    The LCG collaboration is encompassed by a number of Tier 1 centers. The Nordic LCG Tier 1, operated by NDGF, is in contrast to many other Tier 1 centers distributed over the Nordic countries. A distributed setup was chosen for both political and technical reasons, but also provides a number of unique challenges. dCache is well known and respected as a powerful distributed storage resource manager, and was chosen for implementing the storage aspects of the Nordic Tier 1. In contrast to classic dCache deployments, we deploy dCache over a WAN with limited bandwidth, high latency, frequent network failures, and spanning many administrative domains. These properties provide unique challenges, covering topics such as security, administration, maintenance, upgradability, reliability, and performance. Our initial focus has been on implementing the GFD.47 OGF recommendation (which introduced the GridFTP 2 protocol) in dCache and the Globus Toolkit. Compared to GridFTP 1, GridFTP 2 allows for more intelligent data flow between clients and storage pools, thus enabling more efficient use of our limited bandwidth.

  7. A distributed storage system with dCache

    International Nuclear Information System (INIS)

    Behrmann, G; Groenager, M; Fuhrmann, P; Kleist, J

    2008-01-01

    The LCG collaboration comprises a number of Tier 1 centers. The Nordic LCG Tier 1, operated by NDGF, is, in contrast to many other Tier 1 centers, distributed over the Nordic countries. A distributed setup was chosen for both political and technical reasons, but it also poses a number of unique challenges. dCache is well known and respected as a powerful distributed storage resource manager, and was chosen for implementing the storage aspects of the Nordic Tier 1. In contrast to classic dCache deployments, we deploy dCache over a WAN with limited bandwidth, high latency, frequent network failures, and spanning many administrative domains. These properties pose unique challenges, covering topics such as security, administration, maintenance, upgradability, reliability, and performance. Our initial focus has been on implementing the GFD.47 OGF recommendation (which introduced the GridFTP 2 protocol) in dCache and the Globus Toolkit. Compared to GridFTP 1, GridFTP 2 allows for more intelligent data flow between clients and storage pools, thus enabling more efficient use of our limited bandwidth.

  8. The People of Bear Hunter Speak: Oral Histories of the Cache Valley Shoshones Regarding the Bear River Massacre

    OpenAIRE

    Crawford, Aaron L.

    2007-01-01

    The Cache Valley Shoshone are the survivors of the Bear River Massacre, where a battle between a group of U.S. volunteer troops from California and a Shoshone village degenerated into the worst Indian massacre in U.S. history, resulting in the deaths of over 200 Shoshones. The massacre occurred due to increasing tensions over land use between the Shoshones and the Mormon settlers. Following the massacre, the Shoshones attempted settling in several different locations in Box Elder County, eventu...

  9. Efficient Context Switching for the Stack Cache: Implementation and Analysis

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar; Brandner, Florian; Naji, Amine

    2015-01-01

    , the analysis of the stack cache was limited to individual tasks, ignoring aspects related to multitasking. A major drawback of the original stack cache design is that, due to its simplicity, it cannot hold the data of multiple tasks at the same time. Consequently, the entire cache content needs to be saved...

  10. Energy Efficient Caching in Backhaul-Aware Cellular Networks with Dynamic Content Popularity

    Directory of Open Access Journals (Sweden)

    Jiequ Ji

    2018-01-01

    Full Text Available Caching popular contents at base stations (BSs) has been regarded as an effective approach to alleviate the backhaul load and to improve the quality of service. To meet the explosive data traffic demand and to save energy consumption, energy efficiency (EE) has become an extremely important performance index for the 5th generation (5G) cellular networks. In general, there are two ways to improve the EE of caching, that is, improving the cache-hit rate and optimizing the cache size. In this work, we investigate the energy-efficient caching problem in backhaul-aware cellular networks, jointly considering these two approaches. Note that most existing works are based on the assumption that the content catalog and popularity are static. However, in practice, content popularity is dynamic. To timely estimate the dynamic content popularity, we propose a method based on the shot noise model (SNM). Then we propose a distributed caching policy to improve the cache-hit rate in such a dynamic environment. Furthermore, we analyze the tradeoff between energy efficiency and cache capacity, for which an optimization problem is formulated. We prove its convexity and derive a closed-form optimal cache capacity for maximizing the EE. Simulation results validate the proposed scheme and show that the EE can be improved with an appropriate choice of cache capacity.

  11. On Optimal Geographical Caching in Heterogeneous Cellular Networks

    NARCIS (Netherlands)

    Serbetci, Berksan; Goseling, Jasper

    2017-01-01

    In this work we investigate optimal geographical caching in heterogeneous cellular networks where different types of base stations (BSs) have different cache capacities. Users request files from a content library according to a known probability distribution. The performance metric is the total hit

  12. Distributed caching mechanism for various MPE software services

    CERN Document Server

    Svec, Andrej

    2017-01-01

    The MPE Software Section provides multiple software services to facilitate the testing and the operation of the CERN Accelerator complex. Continuous growth in the number of users and the amount of processed data result in the requirement of high scalability. Our current priority is to move towards a distributed and properly load balanced set of services based on containers. The aim of this project is to implement the generic caching mechanism applicable to our services and chosen architecture. The project will at first require research about the different aspects of distributed caching (persistence, no gc-caching, cache consistency etc.) and the available technologies followed by the implementation of the chosen solution. In order to validate the correctness and performance of the implementation in the last phase of the project it will be required to implement a monitoring layer and integrate it with the current ELK stack.

  13. Caching Efficiency Enhancement at Wireless Edges with Concerns on User’s Quality of Experience

    Directory of Open Access Journals (Sweden)

    Feng Li

    2018-01-01

    Full Text Available Content caching is a promising approach to enhancing bandwidth utilization and minimizing delivery delay for new-generation Internet applications. The design of content caching is based on the principles that popular contents are cached at appropriate network edges in order to reduce transmission delay and avoid backhaul bottleneck. In this paper, we propose a cooperative caching replacement and efficiency optimization scheme for IP-based wireless networks. Wireless edges are designed to establish a one-hop scope of caching information table for caching replacement in cases when there is not enough cache resource available within its own space. During the course, after receiving the caching request, every caching node should determine the weight of the required contents and provide a response according to the availability of its own caching space. Furthermore, to increase the caching efficiency from a practical perspective, we introduce the concept of quality of user experience (QoE) and try to properly allocate the cache resource of the whole networks to better satisfy user demands. Different caching allocation strategies are devised to be adopted to enhance user QoE in various circumstances. Numerical results are further provided to justify the performance improvement of our proposal from various aspects.

  14. Cache timing attacks on recent microarchitectures

    DEFF Research Database (Denmark)

    Andreou, Alexandres; Bogdanov, Andrey; Tischhauser, Elmar Wolfgang

    2017-01-01

    Cache timing attacks have been known for a long time, however since the rise of cloud computing and shared hardware resources, such attacks found new potentially devastating applications. One prominent example is S$A (presented by Irazoqui et al at S&P 2015) which is a cache timing attack against...... AES or similar algorithms in virtualized environments. This paper applies variants of this cache timing attack to Intel's latest generation of microprocessors. It enables a spy-process to recover cryptographic keys, interacting with the victim processes only over TCP. The threat model is a logically...... separated but CPU co-located attacker with root privileges. We report successful and practically verified applications of this attack against a wide range of microarchitectures, from a two-core Nehalem processor (i5-650) to two-core Haswell (i7-4600M) and four-core Skylake processors (i7-6700). The attack...

  15. Unfavorable Strides in Cache Memory Systems (RNR Technical Report RNR-92-015)

    Directory of Open Access Journals (Sweden)

    David H. Bailey

    1995-01-01

    Full Text Available An important issue in obtaining high performance on a scientific application running on a cache-based computer system is the behavior of the cache when data are accessed at a constant stride. Others who have discussed this issue have noted an odd phenomenon in such situations: A few particular innocent-looking strides result in sharply reduced cache efficiency. In this article, this problem is analyzed, and a simple formula is presented that accurately gives the cache efficiency for various cache parameters and data strides.
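The phenomenon the report analyzes can be reproduced in a few lines: for a direct-mapped or set-associative cache, a constant stride determines how many distinct cache sets the accesses fall into. The sketch below counts those sets; the cache parameters (64-byte lines, 512 sets) are illustrative, not the report's.

```python
def distinct_sets(stride_bytes, n_accesses, line_size=64, num_sets=512):
    """Count how many distinct cache sets a constant-stride access pattern touches."""
    sets = {((i * stride_bytes) // line_size) % num_sets
            for i in range(n_accesses)}
    return len(sets)

# Unit-like strides stream through every set, but a stride equal to the
# span of the cache maps every access to one set, defeating it entirely.
assert distinct_sets(8, 4096) == 512
assert distinct_sets(64 * 512, 4096) == 1
```

This is exactly the "innocent-looking stride" effect: a handful of (typically power-of-two) strides concentrate all accesses onto a few sets, collapsing effective cache capacity.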

  16. Enhancing Leakage Power in CPU Cache Using Inverted Architecture

    OpenAIRE

    Bilal A. Shehada; Ahmed M. Serdah; Aiman Abu Samra

    2013-01-01

    Power consumption is an increasingly pressing problem in modern processor design. Since the on-chip caches usually consume a significant amount of power, power and energy consumption have become among the most important design constraints, and caches are one of the most attractive targets for power reduction. This paper presents an approach to reducing the dynamic power consumption of a CPU cache using an inverted cache architecture. Our approach tries to reduce dynamic write power dissipatio...

  17. Combining instruction prefetching with partial cache locking to improve WCET in real-time systems.

    Directory of Open Access Journals (Sweden)

    Fan Ni

    Full Text Available Caches play an important role in embedded systems to bridge the performance gap between fast processor and slow memory. And prefetching mechanisms are proposed to further improve the cache performance. While in real-time systems, the application of caches complicates the Worst-Case Execution Time (WCET) analysis due to its unpredictable behavior. Modern embedded processors often equip locking mechanism to improve timing predictability of the instruction cache. However, locking the whole cache may degrade the cache performance and increase the WCET of the real-time application. In this paper, we proposed an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed as BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and in turn the worst-case execution time. The estimations on typical real-time applications show that the partial cache locking mechanism shows remarkable WCET improvement over static analysis and full cache locking.

  18. Combining instruction prefetching with partial cache locking to improve WCET in real-time systems.

    Science.gov (United States)

    Ni, Fan; Long, Xiang; Wan, Han; Gao, Xiaopeng

    2013-01-01

    Caches play an important role in embedded systems to bridge the performance gap between fast processor and slow memory. And prefetching mechanisms are proposed to further improve the cache performance. While in real-time systems, the application of caches complicates the Worst-Case Execution Time (WCET) analysis due to its unpredictable behavior. Modern embedded processors often equip locking mechanism to improve timing predictability of the instruction cache. However, locking the whole cache may degrade the cache performance and increase the WCET of the real-time application. In this paper, we proposed an instruction-prefetching combined partial cache locking mechanism, which combines an instruction prefetching mechanism (termed as BBIP) with partial cache locking to improve the WCET estimates of real-time applications. BBIP is an instruction prefetching mechanism we have already proposed to improve the worst-case cache performance and in turn the worst-case execution time. The estimations on typical real-time applications show that the partial cache locking mechanism shows remarkable WCET improvement over static analysis and full cache locking.

  19. Effects of simulated mountain lion caching on decomposition of ungulate carcasses

    Science.gov (United States)

    Bischoff-Mattson, Z.; Mattson, D.

    2009-01-01

    Caching of animal remains is common among carnivorous species of all sizes, yet the effects of caching on larger prey are unstudied. We conducted a summer field experiment designed to test the effects of simulated mountain lion (Puma concolor) caching on mass loss, relative temperature, and odor dissemination of 9 prey-like carcasses. We deployed all but one of the carcasses in pairs, with one of each pair exposed and the other shaded and shallowly buried (cached). Caching substantially reduced wastage during dry and hot (drought) but not wet and cool (monsoon) periods, and it also reduced temperature and discernable odor to some degree during both seasons. These results are consistent with the hypotheses that caching serves to both reduce competition from arthropods and microbes and reduce odds of detection by larger vertebrates such as bears (Ursus spp.), wolves (Canis lupus), or other lions.

  20. Explicit Content Caching at Mobile Edge Networks with Cross-Layer Sensing

    Science.gov (United States)

    Chen, Lingyu; Su, Youxing; Luo, Wenbin; Hong, Xuemin; Shi, Jianghong

    2018-01-01

    The deployment density and computational power of small base stations (BSs) are expected to increase significantly in the next generation mobile communication networks. These BSs form the mobile edge network, which is a pervasive and distributed infrastructure that can empower a variety of edge/fog computing applications. This paper proposes a novel edge-computing application called explicit caching, which stores selective contents at BSs and exposes such contents to local users for interactive browsing and download. We formulate the explicit caching problem as a joint content recommendation, caching, and delivery problem, which aims to maximize the expected user quality-of-experience (QoE) with varying degrees of cross-layer sensing capability. Optimal and effective heuristic algorithms are presented to solve the problem. The theoretical performance bounds of the explicit caching system are derived in simplified scenarios. The impacts of cache storage space, BS backhaul capacity, cross-layer information, and user mobility on the system performance are simulated and discussed in realistic scenarios. Results suggest that, compared with conventional implicit caching schemes, explicit caching can better exploit the mobile edge network infrastructure for personalized content dissemination. PMID:29565313

  1. A Scalable and Highly Configurable Cache-Aware Hybrid Flash Translation Layer

    Directory of Open Access Journals (Sweden)

    Jalil Boukhobza

    2014-03-01

    Full Text Available This paper presents a cache-aware configurable hybrid flash translation layer (FTL), named CACH-FTL. It was designed based on the observation that most state-of-the-art flash-specific cache systems above FTLs flush groups of pages belonging to the same data block. CACH-FTL relies on this characteristic to optimize the placement of flash write operations: large groups of pages are flushed to a block-mapped region, named BMR, whereas small groups are buffered into a page-mapped region, named PMR. Page-group placement is based on a configurable threshold defining the limit under which it is more cost-effective to use page mapping (PMR) and wait to group more pages before flushing to the BMR. CACH-FTL is scalable in terms of mapping table size and flexible in terms of Input/Output (I/O) workload support. CACH-FTL performs very well, as the performance difference with the ideal page-mapped FTL is less than 15% in most cases and has a mean of 4% for the best CACH-FTL configurations, while using at least 78% less memory for mapping-table storage in RAM.
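The threshold-based placement rule described in the abstract can be sketched very compactly (the function and parameter names below are assumptions for illustration, not CACH-FTL's actual interface):

```python
def route_flush(group_size_pages, threshold_pages):
    """Sketch of a CACH-FTL-style placement rule: page groups at or above the
    configurable threshold go to the block-mapped region (BMR); smaller groups
    are buffered in the page-mapped region (PMR) until more pages accumulate."""
    return "BMR" if group_size_pages >= threshold_pages else "PMR"

assert route_flush(64, 16) == "BMR"  # a block's worth of pages: map by block
assert route_flush(3, 16) == "PMR"   # too small: cheaper under page mapping
```

Tuning the threshold trades mapping-table memory (page mapping is fine-grained but large) against write efficiency (block mapping is compact but costly for small, scattered writes).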

  2. Explicit Content Caching at Mobile Edge Networks with Cross-Layer Sensing.

    Science.gov (United States)

    Chen, Lingyu; Su, Youxing; Luo, Wenbin; Hong, Xuemin; Shi, Jianghong

    2018-03-22

    The deployment density and computational power of small base stations (BSs) are expected to increase significantly in the next generation mobile communication networks. These BSs form the mobile edge network, which is a pervasive and distributed infrastructure that can empower a variety of edge/fog computing applications. This paper proposes a novel edge-computing application called explicit caching, which stores selective contents at BSs and exposes such contents to local users for interactive browsing and download. We formulate the explicit caching problem as a joint content recommendation, caching, and delivery problem, which aims to maximize the expected user quality-of-experience (QoE) with varying degrees of cross-layer sensing capability. Optimal and effective heuristic algorithms are presented to solve the problem. The theoretical performance bounds of the explicit caching system are derived in simplified scenarios. The impacts of cache storage space, BS backhaul capacity, cross-layer information, and user mobility on the system performance are simulated and discussed in realistic scenarios. Results suggest that, compared with conventional implicit caching schemes, explicit caching can better exploit the mobile edge network infrastructure for personalized content dissemination.

  3. Randomized Caches Considered Harmful in Hard Real-Time Systems

    Directory of Open Access Journals (Sweden)

    Jan Reineke

    2014-06-01

    Full Text Available We investigate the suitability of caches with randomized placement and replacement in the context of hard real-time systems. Such caches have been claimed to drastically reduce the amount of information required by static worst-case execution time (WCET) analysis, and to be an enabler for measurement-based probabilistic timing analysis. We refute these claims and conclude that with prevailing static and measurement-based analysis techniques caches with deterministic placement and least-recently-used replacement are preferable over randomized ones.
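The argument for deterministic LRU hinges on analyzability: for a given access trace, an LRU set yields one provable hit count, whereas random replacement yields only a distribution. A minimal single-set simulation (illustrative, not the paper's evaluation setup) makes the contrast concrete:

```python
from collections import OrderedDict
import random

def run_trace(trace, ways, policy, rng=None):
    """Simulate one fully-associative cache set; return the hit count."""
    cache = OrderedDict()
    hits = 0
    for block in trace:
        if block in cache:
            hits += 1
            if policy == "lru":
                cache.move_to_end(block)            # refresh recency on a hit
        else:
            if len(cache) >= ways:
                if policy == "lru":
                    cache.popitem(last=False)       # evict least recently used
                else:
                    del cache[rng.choice(list(cache))]  # random replacement
            cache[block] = True
    return hits

trace = [0, 1, 2, 3, 4] * 8            # 5-block cycle over a 4-way set
print(run_trace(trace, 4, "lru"))       # 0: LRU thrashes, but the bound is exact
print(run_trace(trace, 4, "rand", random.Random(0)))  # varies with the seed
```

Static WCET analysis can prove the LRU result exactly (here, the known worst case of a cyclic working set one block larger than the set); for the randomized policy, only a probability distribution over hit counts exists, which is the crux of the paper's conclusion.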

  4. Learning Automata Based Caching for Efficient Data Access in Delay Tolerant Networks

    Directory of Open Access Journals (Sweden)

    Zhenjie Ma

    2018-01-01

    Full Text Available Effective data access is one of the major challenges in Delay Tolerant Networks (DTNs), which are characterized by intermittent network connectivity and unpredictable node mobility. Various data caching schemes have been proposed to improve the performance of data access in DTNs. However, most existing data caching schemes perform poorly due to the lack of global network state information and the changing network topology in DTNs. In this paper, we propose a novel data caching scheme based on cooperative caching in DTNs, aiming at improving the success rate of data access and reducing the data access delay. In the proposed scheme, learning automata are utilized to select a set of caching nodes as the Caching Node Set (CNS) in DTNs. Unlike the existing caching schemes, which fail to address the challenging characteristics of DTNs, our scheme is designed to automatically self-adjust to the changing network topology through well-designed voting and updating processes. The proposed scheme improves the overall performance of data access in DTNs compared with the former caching schemes. The simulations verify the feasibility of our scheme and the improvements in performance.

  5. Re-caching by Western scrub-jays (Aphelocoma californica) cannot be attributed to stress.

    Directory of Open Access Journals (Sweden)

    James M Thom

    Full Text Available Western scrub-jays (Aphelocoma californica) live double lives, storing food for the future while raiding the stores of other birds. One tactic scrub-jays employ to protect stores is "re-caching"-relocating caches out of sight of would-be thieves. Recent computational modelling work suggests that re-caching might be mediated not by complex cognition, but by a combination of memory failure and stress. The "Stress Model" asserts that re-caching is a manifestation of a general drive to cache, rather than a desire to protect existing stores. Here, we present evidence strongly contradicting the central assumption of these models: that stress drives caching, irrespective of social context. In Experiment (i), we replicate the finding that scrub-jays preferentially relocate food they were watched hiding. In Experiment (ii), we find no evidence that stress increases caching. In light of our results, we argue that the Stress Model cannot account for scrub-jay re-caching.

  6. Cache and memory hierarchy design a performance directed approach

    CERN Document Server

    Przybylski, Steven A

    1991-01-01

    An authoritative book for hardware and software designers. Caches are by far the simplest and most effective mechanism for improving computer performance. This innovative book exposes the characteristics of performance-optimal single and multi-level cache hierarchies by approaching the cache design process through the novel perspective of minimizing execution times. It presents useful data on the relative performance of a wide spectrum of machines and offers empirical and analytical evaluations of the underlying phenomena. This book will help computer professionals appreciate the impact of ca

  7. High Performance Analytics with the R3-Cache

    Science.gov (United States)

    Eavis, Todd; Sayeed, Ruhan

    Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. The R3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.

  8. Adaptive Neuro-fuzzy Inference System as Cache Memory Replacement Policy

    Directory of Open Access Journals (Sweden)

    CHUNG, Y. M.

    2014-02-01

    Full Text Available To date, no cache memory replacement policy that can perform efficiently for all types of workloads is yet available. Replacement policies used in level 1 cache memory may not be suitable in level 2. In this study, we focused on developing an adaptive neuro-fuzzy inference system (ANFIS) as a replacement policy for improving level 2 cache performance in terms of miss ratio. The recency and frequency of referenced blocks were used as input data for ANFIS to make decisions on replacement. MATLAB was employed as a training tool to obtain the trained ANFIS model. The trained ANFIS model was implemented on SimpleScalar. Simulations on SimpleScalar showed that the miss ratio improved by as high as 99.95419% and 99.95419% for instruction level 2 cache, and up to 98.04699% and 98.03467% for data level 2 cache compared with least recently used and least frequently used, respectively.
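As a rough illustration of using recency and frequency as replacement inputs, here is a hypothetical fixed-formula victim selector; the paper's point is precisely that the trained ANFIS learns this input-to-decision mapping from data rather than hard-coding one formula:

```python
def pick_victim(blocks, now):
    """Hypothetical stand-in for a learned replacement decision: combine the
    recency and frequency inputs into one score and evict the lowest-scoring
    block. ANFIS would infer this mapping instead of using a fixed formula."""
    def score(b):
        recency = 1.0 / (1 + now - b["last_access"])  # newer -> closer to 1
        return b["count"] * recency                    # weight by frequency
    return min(blocks, key=score)["tag"]

blocks = [
    {"tag": "A", "last_access": 90, "count": 1},   # recent but rarely used
    {"tag": "B", "last_access": 99, "count": 5},   # recent and popular
    {"tag": "C", "last_access": 10, "count": 2},   # old and unpopular
]
print(pick_victim(blocks, now=100))  # C: worst on both inputs
```

LRU would also evict C here, but LFU would evict A; a policy that blends both inputs can avoid each policy's pathological workloads, which is the motivation for learning the blend.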

  9. Probabilistic Caching Placement in the Presence of Multiple Eavesdroppers

    Directory of Open Access Journals (Sweden)

    Fang Shi

    2018-01-01

    Full Text Available The wireless caching has attracted a lot of attention in recent years, since it can reduce the backhaul cost significantly and improve the user-perceived experience. The existing works on the wireless caching and transmission mainly focus on the communication scenarios without eavesdroppers. When the eavesdroppers appear, it is of vital importance to investigate the physical-layer security for the wireless caching aided networks. In this paper, a caching network is studied in the presence of multiple eavesdroppers, which can overhear the secure information transmission. We model the locations of eavesdroppers by a homogeneous Poisson Point Process (PPP), and the eavesdroppers jointly receive and decode contents through the maximum ratio combining (MRC) reception, which yields the worst case of wiretap. Moreover, the main performance metric is measured by the average probability of successful transmission, which is the probability of finding and successfully transmitting all the requested files within a radius R. We study the system secure transmission performance by deriving a single integral result, which is significantly affected by the probability of caching each file. Therefore, we extend to build the optimization problem of the probability of caching each file, in order to optimize the system secure transmission performance. This optimization problem is nonconvex, and we turn to use the genetic algorithm (GA) to solve the problem. Finally, simulation and numerical results are provided to validate the proposed studies.
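The quantity being optimized, the chance that a request is served from cache when each file is placed with its own probability, reduces in a simplified single-cache view (omitting the PPP geometry and secrecy constraints of the paper) to a weighted sum:

```python
import random

def expected_hit_probability(popularity, cache_probs):
    """Probability a random request finds its file cached, when file i is
    stored independently with probability cache_probs[i]."""
    return sum(q * p for q, p in zip(popularity, cache_probs))

def sample_placement(cache_probs, rng):
    """Draw one concrete cache content from the placement probabilities."""
    return [i for i, p in enumerate(cache_probs) if rng.random() < p]

popularity = [0.5, 0.3, 0.2]    # request distribution over 3 files
cache_probs = [1.0, 0.5, 0.0]   # capacity spent mostly on popular files
print(expected_hit_probability(popularity, cache_probs))  # 0.5*1 + 0.3*0.5 = 0.65
```

The paper's optimization chooses the `cache_probs` vector (under a capacity constraint and with secrecy in the objective); because that problem is nonconvex, the authors resort to a genetic algorithm rather than a closed-form solution.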

  10. Horizontally scaling dCache SRM with the Terracotta platform

    International Nuclear Information System (INIS)

    Perelmutov, T; Crawford, M; Moibenko, A; Oleynik, G

    2011-01-01

    The dCache disk caching file system has been chosen by a majority of LHC experiments' Tier 1 centers for their data storage needs. It is also deployed at many Tier 2 centers. The Storage Resource Manager (SRM) is a standardized grid storage interface and a single point of remote entry into dCache, and hence is a critical component. SRM must scale to increasing transaction rates and remain resilient against changing usage patterns. The initial implementation of the SRM service in dCache suffered from an inability to support clustered deployment, and its performance was limited by the hardware of a single node. Using the Terracotta platform [1], we added the ability to horizontally scale the dCache SRM service to run on multiple nodes in a cluster configuration, coupled with network load balancing. This gives site administrators the ability to increase the performance and reliability of SRM service to face the ever-increasing requirements of LHC data handling. In this paper we will describe the previous limitations of the SRM server architecture and how the Terracotta platform allowed us to readily convert the single-node service into a highly scalable clustered application.

  11. Método y sistema de modelado de memoria cache

    OpenAIRE

    Posadas Cobo, Héctor; Villar Bonet, Eugenio; Díaz Suárez, Luis

    2010-01-01

    A method for modeling a data cache of a target processor, in order to simulate the behavior of that data cache during the execution of software code on a platform comprising the target processor. The simulation is performed on a native platform whose processor differs from the target processor containing the data cache being modeled, and the modeling is carried out by executing on the native platform...

  12. Fundamental Parallel Algorithms for Private-Cache Chip Multiprocessors

    DEFF Research Database (Denmark)

    Arge, Lars Allan; Goodrich, Michael T.; Nelson, Michael

    2008-01-01

    about the way cores are interconnected, for we assume that all inter-processor communication occurs through the memory hierarchy. We study several fundamental problems, including prefix sums, selection, and sorting, which often form the building blocks of other parallel algorithms. Indeed, we present...... two sorting algorithms, a distribution sort and a mergesort. Our algorithms are asymptotically optimal in terms of parallel cache accesses and space complexity under reasonable assumptions about the relationships between the number of processors, the size of memory, and the size of cache blocks....... In addition, we study sorting lower bounds in a computational model, which we call the parallel external-memory (PEM) model, that formalizes the essential properties of our algorithms for private-cache CMPs....
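Prefix sums, named above as a building block, decompose naturally for private-cache processors: each core scans its own block, a short scan over the per-block totals produces offsets, and each core then applies its offset. A sequential Python sketch of that decomposition (in the PEM algorithms, phases 1 and 3 run in parallel, one block per core):

```python
def blocked_prefix_sums(data, block_size):
    """Three-phase inclusive prefix sum over fixed-size blocks."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    # Phase 1: independent local scans and block totals (parallelizable).
    local, totals = [], []
    for b in blocks:
        acc, scanned = 0, []
        for x in b:
            acc += x
            scanned.append(acc)
        local.append(scanned)
        totals.append(acc)
    # Phase 2: exclusive scan over the (few) block totals.
    offsets, running = [], 0
    for t in totals:
        offsets.append(running)
        running += t
    # Phase 3: add each block's offset to its local scan (parallelizable).
    return [v + off for scanned, off in zip(local, offsets) for v in scanned]

print(blocked_prefix_sums([1, 2, 3, 4, 5], 2))  # [1, 3, 6, 10, 15]
```

Each block stays resident in one core's private cache through phases 1 and 3, so the only shared traffic is the short vector of block totals; this access pattern is what the PEM cost model counts.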

  13. Water- and air-quality and surficial bed-sediment monitoring of the Sweetwater Reservoir watershed, San Diego County, California, 2003-09

    Science.gov (United States)

    Mendez, Gregory O.; Majewski, Michael S.; Foreman, William T.; Morita, Andrew Y.

    2015-01-01

    In 1998, the U.S. Geological Survey, in cooperation with the Sweetwater Authority, began a study to assess the overall health of the Sweetwater watershed in San Diego County, California. This study was designed to provide a data set that could be used to evaluate potential effects from the construction and operation of State Route 125 within the broader context of the water quality and air quality in the watershed. The study included regular sampling of water, air, and surficial bed sediment at Sweetwater Reservoir (SWR) for chemical constituents, including volatile organic compounds (VOCs), base-neutral and acid- extractable organic compounds (BNAs) that include polycyclic aromatic hydrocarbons (PAHs), pesticides, and metals. Additionally, water samples were collected for anthropogenic organic indicator compounds in and around SWR. Background water samples were collected at Loveland Reservoir for VOCs, BNAs, pesticides, and metals. Surficial bed-sediment samples were collected for PAHs, organochlorine pesticides, and metals at Sweetwater and Loveland Reservoirs.

  14. A Scalable proxy cache for Grid Data Access

    International Nuclear Information System (INIS)

    Cristian Cirstea, Traian; Just Keijser, Jan; Arthur Koeroo, Oscar; Starink, Ronald; Alan Templon, Jeffrey

    2012-01-01

    We describe a prototype grid proxy cache system developed at Nikhef, motivated by a desire to construct the first building block of a future https-based Content Delivery Network for grid infrastructures. Two goals drove the project: firstly to provide a “native view” of the grid for desktop-type users, and secondly to improve performance for physics-analysis type use cases, where multiple passes are made over the same set of data (residing on the grid). We further constrained the design by requiring that the system should be made of standard components wherever possible. The prototype that emerged from this exercise is a horizontally-scalable, cooperating system of web server / cache nodes, fronted by a customized webDAV server. The webDAV server is custom only in the sense that it supports http redirects (providing horizontal scaling) and that the authentication module has, as back end, a proxy delegation chain that can be used by the cache nodes to retrieve files from the grid. The prototype was deployed at Nikhef and tested at a scale of several terabytes of data and approximately one hundred fast cores of computing. Both small and large files were tested, in a number of scenarios, and with various numbers of cache nodes, in order to understand the scaling properties of the system. For properly-dimensioned cache-node hardware, the system showed speedup of several integer factors for the analysis-type use cases. These results and others are presented and discussed.

  15. Evidence for cache surveillance by a scatter-hoarding rodent

    NARCIS (Netherlands)

    Hirsch, B.T.; Kays, R.; Jansen, P.A.

    2013-01-01

    The mechanisms by which food-hoarding animals are capable of remembering the locations of numerous cached food items over long time spans has been the focus of intensive research. The ‘memory enhancement hypothesis’ states that hoarders reinforce spatial memory of their caches by repeatedly

  16. A high level implementation and performance evaluation of level-I asynchronous cache on FPGA

    Directory of Open Access Journals (Sweden)

    Mansi Jhamb

    2017-07-01

    Full Text Available To bridge the ever-increasing performance gap between the processor and the main memory in a cost-effective manner, novel cache designs and implementations are indispensable. The cache is responsible for a major part (approx. 50%) of processor energy consumption. This paper presents a high-level implementation of a micropipelined asynchronous architecture of an L1 cache. Because each cache memory implementation is a time-consuming and error-prone process, a synthesizable and configurable model proves to be of immense help, as it aids in generating a range of caches in a reproducible and quick fashion. The micropipelined cache, implemented using C-Elements, acts as a distributed message-passing system. The RTL cache model implemented in this paper, comprising data and instruction caches, has a wide array of configurable parameters. In addition to timing robustness, our implementation has high average cache throughput and low latency. The implemented architecture comprises two direct-mapped, write-through caches for data and instruction. The architecture is implemented in a Field Programmable Gate Array (FPGA) chip using Very High Speed Integrated Circuit Hardware Description Language (VHSIC HDL) along with advanced synthesis and place-and-route tools.
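As a behavioral illustration of the two direct-mapped, write-through caches the paper implements in VHDL, the following Python sketch models the tag check and the write-through path. The sizes and names are illustrative assumptions, not the paper's configurable RTL parameters.

```python
class DirectMappedCache:
    """Behavioral sketch of a direct-mapped, write-through cache."""

    def __init__(self, num_lines=64, line_size=16):
        self.num_lines = num_lines
        self.line_size = line_size
        self.tags = [None] * num_lines   # one tag per cache line
        self.hits = 0
        self.misses = 0

    def _index_tag(self, addr):
        line_addr = addr // self.line_size
        return line_addr % self.num_lines, line_addr // self.num_lines

    def read(self, addr, memory):
        index, tag = self._index_tag(addr)
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.misses += 1
            self.tags[index] = tag       # fill the line on a miss
        return memory.get(addr, 0)

    def write(self, addr, value, memory):
        index, tag = self._index_tag(addr)
        self.tags[index] = tag           # allocate/update the line
        memory[addr] = value             # write-through: memory always updated
```

Sequential reads through a region show the effect of spatial locality: only the first access to each line misses.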

  17. Analisis Algoritma Pergantian Cache Pada Proxy Web Server Internet Dengan Simulasi

    OpenAIRE

    Nurwarsito, Heru

    2007-01-01

    The number of Internet clients keeps growing over time, so Internet access response becomes increasingly slow. To help access speed, a cache on the Proxy Server is needed. This study aims to analyze the performance of a Proxy Server on the Internet with respect to its cache replacement algorithm. The analysis of cache replacement algorithms on the Proxy Server was designed using a simulation model of an Internet network consisting of a Web server, Proxy ...

  18. Advanced Reservoir Characterization and Development through High-Resolution 3C3D Seismic and Horizontal Drilling: Eva South Morrow Sand Unit, Texas County, Oklahoma

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler,David M.; Miller, William A.; Wilson, Travis C.

    2002-03-11

    The Eva South Morrow Sand Unit is located in western Texas County, Oklahoma. The field produces from an upper Morrow sandstone, termed the Eva sandstone, deposited in a transgressive valley-fill sequence. The field is defined as a combination structural-stratigraphic trap; the reservoir lies in a convex up-dip bend in the valley and is truncated on the west side by the Teepee Creek fault. Although the field has been a successful waterflood since 1993, reservoir heterogeneity and compartmentalization have impeded overall sweep efficiency. A 4.25-square-mile high-resolution, three-component three-dimensional (3C3D) seismic survey was acquired in order to improve reservoir characterization and pinpoint the optimal location of a new horizontal producing well, the ESU 13-H.

  19. Nature as a treasure map! Teaching geoscience with the help of earth caches?!

    Science.gov (United States)

    Zecha, Stefanie; Schiller, Thomas

    2015-04-01

    This presentation looks at how earth caches influence the learning process in the field of geoscience in non-formal education. The development of mobile technologies using Global Positioning System (GPS) data to pinpoint geographical locations, together with the evolving Web 2.0 supporting the creation and consumption of content, suggests a potential for collaborative informal learning linked to location. With the help of the GIS in smartphones, you can go directly into nature, search for information on your smartphone, and learn something about nature. Earth caches, which are organized and supervised geocaches with special information about physical geography highlights, are a very good opportunity for this. Interested people can inform themselves about aspects of geoscience through earth caches. The main question of this presentation is how these caches are created in relation to learning processes. As it is not possible to analyze all existing earth caches, we focused on Bavaria and a certain feature of earth caches. At the end, the authors show limits and potentials for the use of earth caches and give some remarks for the future.

  20. A Novel Architecture of Metadata Management System Based on Intelligent Cache

    Institute of Scientific and Technical Information of China (English)

    SONG Baoyan; ZHAO Hongwei; WANG Yan; GAO Nan; XU Jin

    2006-01-01

    This paper introduces a novel architecture of a metadata management system based on an intelligent cache, called the Metadata Intelligent Cache Controller (MICC). By using an intelligent cache to control the metadata system, MICC can deal with different scenarios, such as splitting and merging queries into sub-queries for metadata sets available locally, in order to reduce the access time of remote queries. An application can find partial results in the local cache, while the remaining portion of the metadata can be fetched from remote locations. Using the existing metadata, it can not only enhance the fault tolerance and load balancing of the system effectively, but also improve the efficiency of access while ensuring the access quality.

  1. Organizing the pantry: cache management improves quality of overwinter food stores in a montane mammal

    Science.gov (United States)

    Jakopak, Rhiannon P.; Hall, L. Embere; Chalfoun, Anna D.

    2017-01-01

    Many mammals create food stores to enhance overwinter survival in seasonal environments. Strategic arrangement of food within caches may facilitate the physical integrity of the cache or improve access to high-quality food to ensure that cached resources meet future nutritional demands. We used the American pika (Ochotona princeps), a food-caching lagomorph, to evaluate variation in haypile (cache) structure (i.e., horizontal layering by plant functional group) in Wyoming, United States. Fifty-five percent of 62 haypiles contained at least 2 discrete layers of vegetation. Adults and juveniles layered haypiles in similar proportions. The probability of layering increased with haypile volume, but not haypile number per individual or nearby forage diversity. Vegetation cached in layered haypiles was also higher in nitrogen compared to vegetation in unlayered piles. We found that American pikas frequently structured their food caches, structured caches were larger, and the cached vegetation in structured piles was of higher nutritional quality. Improving access to stable, high-quality vegetation in haypiles, a critical overwinter food resource, may allow individuals to better persist amidst harsh conditions.

  2. Tier 3 batch system data locality via managed caches

    Science.gov (United States)

    Fischer, Max; Giffels, Manuel; Jung, Christopher; Kühn, Eileen; Quast, Günter

    2015-05-01

    Modern data processing increasingly relies on data locality for performance and scalability, whereas the common HEP approaches aim for uniform resource pools with minimal locality, recently even across site boundaries. To combine the advantages of both, the High-Performance Data Analysis (HPDA) Tier 3 concept opportunistically establishes data locality via coordinated caches. In accordance with HEP Tier 3 activities, the design incorporates two major assumptions: First, only a fraction of the data is accessed regularly and is thus the deciding factor for overall throughput. Second, data access may fall back to non-local, making permanent local data availability an inefficient resource usage strategy. Based on this, the HPDA design generically extends available storage hierarchies into the batch system. Using the batch system itself for scheduling file locality, an array of independent caches on the worker nodes is dynamically populated with high-profile data. Cache state information is exposed to the batch system both for managing caches and scheduling jobs. As a result, users directly work with a regular, adequately sized storage system. However, their automated batch processes are presented with local replications of data whenever possible.

  3. Cache Oblivious Distribution Sweeping

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.

    2002-01-01

    We adapt the distribution sweeping method to the cache oblivious model. Distribution sweeping is the name used for a general approach for divide-and-conquer algorithms where the combination of solved subproblems can be viewed as a merging process of streams. We demonstrate by a series of algorith...

  4. Do Clark's nutcrackers demonstrate what-where-when memory on a cache-recovery task?

    Science.gov (United States)

    Gould, Kristy L; Ort, Amy J; Kamil, Alan C

    2012-01-01

    What-where-when (WWW) memory during cache recovery was investigated in six Clark's nutcrackers. During caching, both red- and blue-colored pine seeds were cached by the birds in holes filled with sand. Either a short (3 day) retention interval (RI) or a long (9 day) RI was followed by a recovery session during which caches were replaced with either a single seed or wooden bead depending upon the color of the cache and length of the retention interval. Knowledge of what was in the cache (seed or bead), where it was located, and when the cache had been made (3 or 9 days ago) were the three WWW memory components under investigation. Birds recovered items (bead or seed) at above chance levels, demonstrating accurate spatial memory. They also recovered seeds more than beads after the long RI, but not after the short RI, when they recovered seeds and beads equally often. The differential recovery after the long RI demonstrates that nutcrackers may have the capacity for WWW memory during this task, but it is not clear why it was influenced by RI duration.

  5. Efficacy of Code Optimization on Cache-Based Processors

    Science.gov (United States)

    VanderWijngaart, Rob F.; Saphir, William C.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    In this paper a number of techniques for improving the cache performance of a representative piece of numerical software is presented. Target machines are popular processors from several vendors: MIPS R5000 (SGI Indy), MIPS R8000 (SGI PowerChallenge), MIPS R10000 (SGI Origin), DEC Alpha EV4 + EV5 (Cray T3D & T3E), IBM RS6000 (SP Wide-node), Intel PentiumPro (Ames' Whitney), Sun UltraSparc (NERSC's NOW). The optimizations all attempt to increase the locality of memory accesses. But they meet with rather varied and often counterintuitive success on the different computing platforms. We conclude that it may be genuinely impossible to obtain portable performance on the current generation of cache-based machines. At the least, it appears that the performance of modern commodity processors cannot be described with parameters defining the cache alone.

  6. Cache Aided Decode-and-Forward Relaying Networks: From the Spatial View

    Directory of Open Access Journals (Sweden)

    Junjuan Xia

    2018-01-01

    Full Text Available We investigate the cache technique from the spatial view and study its impact on relaying networks. In particular, we consider a dual-hop relaying network, where decode-and-forward (DF) relays can assist the data transmission from the source to the destination. In addition to the traditional dual-hop relaying, we also consider the cache from the spatial view, where the source can prestore the data among the memories of the nodes around the destination. For the DF relaying networks without and with cache, we study the system performance by deriving the analytical expressions of the outage probability and symbol error rate (SER). We also derive the asymptotic outage probability and SER in the high transmit-power regime, from which we find that the system diversity order can be rapidly increased by using cache and the system performance can be significantly improved. Simulation and numerical results are presented to verify the proposed studies and show that system power resources can be efficiently saved by using the cache technique.
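The outage behavior of the baseline dual-hop DF link (without cache) can be illustrated with a small Monte Carlo estimate. This sketch assumes Rayleigh fading, i.e., exponentially distributed instantaneous SNR on each hop; the paper's exact channel model and parameters may differ. DF is declared in outage when either hop's SNR falls below the threshold.

```python
import random

def df_outage_probability(mean_snr_hop1, mean_snr_hop2, threshold_snr,
                          trials=200000, seed=1):
    """Monte Carlo estimate of dual-hop DF relaying outage probability
    under an assumed Rayleigh-fading model on both hops."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        # Exponential SNR with the given per-hop mean (Rayleigh fading)
        snr1 = rng.expovariate(1.0 / mean_snr_hop1)
        snr2 = rng.expovariate(1.0 / mean_snr_hop2)
        if min(snr1, snr2) < threshold_snr:   # either hop fails to decode
            outages += 1
    return outages / trials
```

Under these assumptions the closed form is 1 - exp(-th/m1 - th/m2), so the estimate can be sanity-checked against it.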

  7. A Cache Considering Role-Based Access Control and Trust in Privilege Management Infrastructure

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shaomin; WANG Baoyi; ZHOU Lihua

    2006-01-01

    PMI (privilege management infrastructure) is used to perform access control to resources in an E-commerce or E-government system. With the ever-increasing need for secure transactions, the need for systems that offer a wide variety of QoS (quality-of-service) features is also growing. In order to improve the QoS of a PMI system, a cache based on RBAC (Role-Based Access Control) and trust is proposed. Our system is realized based on Web services. How to design the cache based on RBAC and trust in the access control model is described in detail. The algorithms for querying role permissions in the cache and for adding records to the cache are presented. The policy for updating the cache is also introduced.
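The basic idea of caching role permissions on the PMI side can be sketched minimally as below. The TTL-based staleness check and all names are assumptions for illustration; the paper's actual update policy additionally weighs trust, which is omitted here.

```python
import time

class RolePermissionCache:
    """Illustrative role-permission cache for a PMI-style system."""

    def __init__(self, ttl_seconds=300.0):
        self.ttl = ttl_seconds
        self._entries = {}            # role -> (permissions, timestamp)

    def query(self, role):
        """Return cached permissions for a role, or None on miss/expiry."""
        entry = self._entries.get(role)
        if entry is None:
            return None
        permissions, stamp = entry
        if time.time() - stamp > self.ttl:   # stale: force re-fetch from PMI
            del self._entries[role]
            return None
        return permissions

    def add(self, role, permissions):
        """Record a freshly fetched role-permission set."""
        self._entries[role] = (set(permissions), time.time())
```

On a miss the caller would consult the PMI attribute authority and then `add` the result, trading a bounded staleness window for lower query latency.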

  8. Magpies can use local cues to retrieve their food caches.

    Science.gov (United States)

    Feenders, Gesa; Smulders, Tom V

    2011-03-01

    Much importance has been placed on the use of spatial cues by food-hoarding birds in the retrieval of their caches. In this study, we investigate whether food-hoarding birds can be trained to use local cues ("beacons") in their cache retrieval. We test magpies (Pica pica) in an active hoarding-retrieval paradigm, where local cues are always reliable, while spatial cues are not. Our results show that the birds use the local cues to retrieve their caches, even when occasionally contradicting spatial information is available. The design of our study does not allow us to test rigorously whether the birds prefer using local over spatial cues, nor to investigate the process through which they learn to use local cues. We furthermore provide evidence that magpies develop landmark preferences, which improve their retrieval accuracy. Our findings support the hypothesis that birds are flexible in their use of memory information, using a combination of the most reliable or salient information to retrieve their caches. © Springer-Verlag 2010

  9. Analyzing data distribution on disk pools for dCache

    Energy Technology Data Exchange (ETDEWEB)

    Halstenberg, S; Jung, C; Ressmann, D [Forschungszentrum Karlsruhe, Steinbuch Centre for Computing, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany)

    2010-04-01

    Most Tier-1 centers of LHC Computing Grid are using dCache as their storage system. dCache uses a cost model incorporating CPU and space costs for the distribution of data on its disk pools. Storage resources at Tier-1 centers are usually upgraded once or twice a year according to given milestones. One of the effects of this procedure is the accumulation of heterogeneous hardware resources. For a dCache system, a heterogeneous set of disk pools complicates the process of weighting CPU and space costs for an efficient distribution of data. In order to evaluate the data distribution on the disk pools, the distribution is simulated in Java. The results are discussed and suggestions for improving the weight scheme are given.
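The weighting problem the authors simulate can be illustrated with a toy version of a combined space-plus-CPU cost function for pool selection. The formula and field names below are assumptions for illustration, not dCache's actual cost model.

```python
def select_pool(pools, space_weight=1.0, cpu_weight=1.0):
    """Pick the disk pool with the lowest combined cost.

    Simplified sketch of a dCache-style cost model: fuller pools have a
    higher space cost, busier pools (more active movers) a higher CPU
    cost, and the weights trade the two off against each other.
    """
    def cost(pool):
        space_cost = pool["used_bytes"] / pool["total_bytes"]
        cpu_cost = pool["active_movers"] / pool["max_movers"]
        return space_weight * space_cost + cpu_weight * cpu_cost
    return min(pools, key=cost)
```

With heterogeneous hardware, tuning the two weights changes which pools attract new data, which is exactly the behavior the paper's simulation explores.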

  10. Massively parallel algorithms for trace-driven cache simulations

    Science.gov (United States)

    Nicol, David M.; Greenberg, Albert G.; Lubachevsky, Boris D.

    1991-01-01

    Trace-driven cache simulation is central to computer design. A trace is a very long sequence of reference lines from main memory. At the t-th instant, reference x_t is hashed into a set of cache locations, the contents of which are then compared with x_t. If at the t-th instant x_t is not present in the cache, then it is said to be a miss, and is loaded into the cache set, possibly forcing the replacement of some other memory line, and making x_t present for the (t+1)-st instant. The problem of parallel simulation of a subtrace of N references directed to a C-line cache set is considered, with the aim of determining which references are misses and related statistics. A simulation method is presented for the Least Recently Used (LRU) policy, which regardless of the set size C runs in time O(log N) using N processors on the exclusive-read, exclusive-write (EREW) parallel model. A simpler LRU simulation algorithm is given that runs in O(C log N) time using N/log N processors. Timings are presented of the second algorithm's implementation on the MasPar MP-1, a machine with 16384 processors. A broad class of reference-based line replacement policies is considered, which includes LRU as well as the Least Frequently Used and Random replacement policies. A simulation method is presented for any such policy that on any trace of length N directed to a C-line set runs in O(C log N) time with high probability using N processors on the EREW model. The algorithms are simple, have very little space overhead, and are well suited for SIMD implementation.
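For reference, the sequential version of the single-set LRU simulation that the paper parallelizes can be written directly. This sketch classifies each reference in a trace as a hit or a miss for one fully associative, C-line set.

```python
from collections import OrderedDict

def lru_misses(trace, capacity):
    """Classify each reference as hit (False) or miss (True) for a
    single C-line set under LRU replacement; sequential reference
    version of the simulation the paper runs in parallel."""
    cache = OrderedDict()              # insertion order tracks recency
    outcome = []
    for line in trace:
        if line in cache:
            cache.move_to_end(line)    # refresh recency on a hit
            outcome.append(False)
        else:
            outcome.append(True)
            cache[line] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return outcome
```

This O(N) sequential pass is the baseline against which the O(log N)-time, N-processor EREW algorithm is compared.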

  11. California scrub-jays reduce visual cues available to potential pilferers by matching food colour to caching substrate.

    Science.gov (United States)

    Kelley, Laura A; Clayton, Nicola S

    2017-07-01

    Some animals hide food to consume later; however, these caches are susceptible to theft by conspecifics and heterospecifics. Caching animals can use protective strategies to minimize sensory cues available to potential pilferers, such as caching in shaded areas and in quiet substrate. Background matching (where object patterning matches the visual background) is commonly seen in prey animals to reduce conspicuousness, and caching animals may also use this tactic to hide caches, for example, by hiding coloured food in a similar coloured substrate. We tested whether California scrub-jays (Aphelocoma californica) camouflage their food in this way by offering them caching substrates that either matched or did not match the colour of food available for caching. We also determined whether this caching behaviour was sensitive to social context by allowing the birds to cache when a conspecific potential pilferer could be both heard and seen (acoustic and visual cues present), or unseen (acoustic cues only). When caching events could be both heard and seen by a potential pilferer, birds cached randomly in matching and non-matching substrates. However, they preferentially hid food in the substrate that matched the food colour when only acoustic cues were present. This is a novel cache protection strategy that also appears to be sensitive to social context. We conclude that studies of cache protection strategies should consider the perceptual capabilities of the cacher and potential pilferers. © 2017 The Author(s).

  12. Replication Strategy for Spatiotemporal Data Based on Distributed Caching System.

    Science.gov (United States)

    Xiong, Lian; Yang, Liu; Tao, Yang; Xu, Juan; Zhao, Lun

    2018-01-14

    The replica strategy in distributed cache can effectively reduce user access delay and improve system performance. However, developing a replica strategy suitable for varied application scenarios is still quite challenging, owing to differences in user access behavior and preferences. In this paper, a replication strategy for spatiotemporal data (RSSD) based on a distributed caching system is proposed. By taking advantage of the spatiotemporal locality and correlation of user access, RSSD mines high popularity and associated files from historical user access information, and then generates replicas and selects appropriate cache node for placement. Experimental results show that the RSSD algorithm is simple and efficient, and succeeds in significantly reducing user access delay.
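The popularity-mining half of an RSSD-style strategy can be sketched as below: files with high historical access counts receive extra replicas. The thresholds and replica counts are illustrative assumptions, and the spatiotemporal-correlation part of the paper's algorithm is omitted.

```python
from collections import Counter

def plan_replicas(access_log, base_replicas=1, popular_threshold=10,
                  extra_replicas=2):
    """Assign a replica count per file from historical accesses.

    Hypothetical sketch: every file gets a base number of replicas,
    and files whose access count reaches the popularity threshold get
    extra replicas to spread load and cut access delay.
    """
    popularity = Counter(access_log)
    plan = {}
    for file_id, hits in popularity.items():
        bonus = extra_replicas if hits >= popular_threshold else 0
        plan[file_id] = base_replicas + bonus
    return plan
```

A full implementation would then place those replicas on cache nodes close to the users who generated the accesses.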

  13. Optimal Caching in Multicast 5G Networks with Opportunistic Spectrum Access

    KAUST Repository

    Emara, Mostafa; Elsawy, Hesham; Sorour, Sameh; Al-Ghadhban, Samir; Alouini, Mohamed-Slim; Al-Naffouri, Tareq Y.

    2018-01-01

    Cache-enabled small base station (SBS) densification is foreseen as a key component of 5G cellular networks. This architecture enables storing popular files at the network edge (i.e., SBS caches), which empowers local communication and alleviates

  14. LPPS: A Distributed Cache Pushing Based K-Anonymity Location Privacy Preserving Scheme

    Directory of Open Access Journals (Sweden)

    Ming Chen

    2016-01-01

    Full Text Available Recent years have witnessed the rapid growth of location-based services (LBSs) for mobile social network applications. To enable location-based services, mobile users are required to report their location information to the LBS servers and receive answers to location-based queries. Location privacy leaks happen when such servers are compromised, which has been a primary concern for information security. To address this issue, we propose the Location Privacy Preservation Scheme (LPPS) based on distributed cache pushing. Unlike existing solutions, LPPS deploys distributed cache proxies to cover users' mostly visited locations and proactively pushes cache content to mobile users, which can reduce the risk of leaking users' location information. The proposed LPPS includes three major processes. First, we propose an algorithm to find the optimal deployment of proxies to cover popular locations. Second, we present cache strategies for location-based queries based on the Markov chain model and propose update and replacement strategies for cache content maintenance. Third, we introduce a privacy protection scheme which is proved to achieve a k-anonymity guarantee for location-based services. Extensive experiments illustrate that the proposed LPPS achieves a decent service coverage ratio and cache hit ratio with lower communication overhead compared to existing solutions.
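The Markov-chain side of such a cache-pushing strategy can be illustrated with a first-order transition model that predicts a user's next location, so the proxy there can be pre-populated. This is a minimal sketch under that assumption; the k-anonymity machinery and the proxy-deployment algorithm are omitted.

```python
from collections import defaultdict, Counter

def next_location_prediction(history):
    """Build a first-order Markov predictor from a location history.

    Hypothetical sketch: count observed transitions between consecutive
    locations and, for each location, predict the most frequent
    successor as the place whose cache proxy should be pushed next.
    """
    transitions = defaultdict(Counter)
    for here, there in zip(history, history[1:]):
        transitions[here][there] += 1
    return {loc: counts.most_common(1)[0][0]
            for loc, counts in transitions.items()}
```

Given the prediction, a cache proxy at the likely next location can be populated before the user arrives, so the query never reaches the central LBS server.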

  15. Enabling MPEG-2 video playback in embedded systems through improved data cache efficiency

    Science.gov (United States)

    Soderquist, Peter; Leeser, Miriam E.

    1999-01-01

    Digital video decoding, enabled by the MPEG-2 Video standard, is an important future application for embedded systems, particularly PDAs and other information appliances. Many such systems require portability and wireless communication capabilities, and thus face severe limitations in size and power consumption. This places a premium on integration and efficiency, and favors software solutions for video functionality over specialized hardware. The processors in most embedded systems currently lack the computational power needed to perform video decoding, but a related and equally important problem is the required data bandwidth, and the need to cost-effectively ensure an adequate data supply. MPEG data sets are very large, and generate significant amounts of excess memory traffic for standard data caches, up to 100 times the amount required for decoding. Meanwhile, cost and power limitations restrict cache sizes in embedded systems. Some systems, including many media processors, eliminate caches in favor of memories under direct, painstaking software control in the manner of digital signal processors. Yet MPEG data has locality which caches can exploit if properly optimized, providing fast, flexible, and automatic data supply. We propose a set of enhancements which target the specific needs of the heterogeneous data types within the MPEG decoder working set. These optimizations significantly improve the efficiency of small caches, reducing cache-memory traffic by almost 70 percent, and can make an enhanced 4 KB cache perform better than a standard 1 MB cache. This performance improvement can enable high-resolution, full frame rate video playback in cheaper, smaller systems than would otherwise be possible.

  16. The Optimization of In-Memory Space Partitioning Trees for Cache Utilization

    Science.gov (United States)

    Yeo, Myung Ho; Min, Young Soo; Bok, Kyoung Soo; Yoo, Jae Soo

    In this paper, a novel cache-conscious indexing technique based on space partitioning trees is proposed. Many researchers have recently investigated efficient cache-conscious indexing techniques that improve the retrieval performance of in-memory database management systems. However, most studies considered data partitioning and targeted fast information retrieval. Existing data partitioning-based index structures significantly degrade performance due to redundant accesses of overlapped spaces. In particular, R-tree-based index structures suffer from the propagation of MBR (Minimum Bounding Rectangle) information when data are updated frequently. In this paper, we propose an in-memory space partitioning index structure for optimal cache utilization. The proposed index structure is compared with existing index structures in terms of update performance, insertion performance, and cache utilization rate in a variety of environments. The results demonstrate that the proposed index structure offers better performance than existing index structures.

  17. Proposal and development of a reconfigurable associativity algorithm in cache memories.

    OpenAIRE

    Roberto Borges Kerr Junior

    2008-01-01

    The constant evolution of processors is steadily increasing the overhead of memory accesses. To try to avoid this problem, processor designers employ several techniques, among them the use of cache memories in the memory hierarchy of computers. Cache memories, on the other hand, cannot fully meet these needs, so some technique that would make it possible to exploit the cache memory better is of interest. To solve this problem, authors pro...

  18. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real-time systems need a time-predictable computing platform to enable static worst-case execution time (WCET) analysis. All performance-enhancing features need to be WCET analyzable. However, standard data caches containing heap-allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis-friendly design. Aiming for a time-predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  19. Caching at the Mobile Edge: a Practical Implementation

    DEFF Research Database (Denmark)

    Poderys, Justas; Artuso, Matteo; Lensbøl, Claus Michael Oest

    2018-01-01

    Thanks to recent advances in mobile networks, it is becoming increasingly popular to access heterogeneous content from mobile terminals. There are, however, unique challenges in mobile networks that affect the perceived quality of experience (QoE) at the user end. One such challenge is the higher latency that users typically experience in mobile networks compared to wired ones. Cloud-based radio access networks with content caches at the base stations are seen as a key contributor in reducing the latency required to access content and thus improve the QoE at the mobile user terminal. In this paper ... for the mobile user obtained by caching content at the base stations. This is quantified with a comparison to non-cached content by means of ping tests (10–11% shorter times), a higher response rate for web traffic (1.73–3.6 times higher), and an improvement in the jitter (6% reduction).

  20. EqualChance: Addressing Intra-set Write Variation to Increase Lifetime of Non-volatile Caches

    Energy Technology Data Exchange (ETDEWEB)

    Mittal, Sparsh [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

    To address the limitations of SRAM such as high leakage and low density, researchers have explored the use of non-volatile memory (NVM) devices, such as ReRAM (resistive RAM) and STT-RAM (spin transfer torque RAM), for designing on-chip caches. A crucial limitation of NVMs, however, is that their write endurance is low, and the large intra-set write variation introduced by existing cache management policies may further exacerbate this problem, thereby reducing the cache lifetime significantly. We present EqualChance, a technique to increase cache lifetime by reducing intra-set write variation. EqualChance works by periodically changing the physical cache-block location of a write-intensive data item within a set to achieve wear-leveling. Simulations using workloads from the SPEC CPU2006 suite and the HPC (high-performance computing) field show that EqualChance improves the cache lifetime by 4.29X. Also, its implementation overhead is small, and it incurs very small performance and energy loss.
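The core move of an EqualChance-style policy, relocating a write-intensive block to a less-worn way within its set, can be sketched as follows. The trigger policy (in the paper, periodic) and the data migration itself are abstracted away, and the selection rule here (move to the least-written way) is an illustrative simplification.

```python
def wear_level_set(write_counts, hot_way):
    """Pick a new physical way for a write-intensive block in a set.

    Hypothetical sketch of intra-set wear-leveling: the hot block moves
    to the way that has absorbed the fewest writes so far, spreading
    writes more evenly across the set.
    """
    coldest_way = min(range(len(write_counts)),
                      key=write_counts.__getitem__)
    if coldest_way == hot_way:
        return hot_way      # already on the least-worn way; no move
    return coldest_way      # migrate (swap) to the least-worn way
```

Because NVM lifetime is bounded by the most-written block, equalizing per-way write counts directly extends the cache's usable lifetime.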

  1. ARC Cache: A solution for lightweight Grid sites in ATLAS

    CERN Document Server

    Garonne, Vincent; The ATLAS collaboration

    2016-01-01

    Many Grid sites have the need to reduce operational manpower, and running a storage element consumes a large amount of effort. In addition, setting up a new Grid site including a storage element involves a steep learning curve and large investment of time. For these reasons so-called storage-less sites are becoming more popular as a way to provide Grid computing resources with less operational overhead. ARC CE is a widely-used and mature Grid middleware which was designed from the start to be used on sites with no persistent storage element. Instead, it maintains a local self-managing cache of data which retains popular data for future jobs. As the cache is simply an area on a local posix shared filesystem with no external-facing service, it requires no extra maintenance. The cache can be scaled up as required by increasing the size of the filesystem or adding new filesystems. This paper describes how ARC CE and its cache are an ideal solution for lightweight Grid sites in the ATLAS experiment, and the integr...

  2. Cache-Conscious Radix-Decluster Projections

    NARCIS (Netherlands)

    S. Manegold (Stefan); P.A. Boncz (Peter); N.J. Nes (Niels); M.L. Kersten (Martin)

    2004-01-01

    textabstractAs CPUs become more powerful with Moore's law and memory latencies stay constant, the impact of the memory access performance bottleneck continues to grow on relational operators like join, which can exhibit random access on a memory region larger than the hardware caches. While

  3. Cooperative Coding and Caching for Streaming Data in Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Liu Jiangchuan

    2010-01-01

    Full Text Available This paper studies distributed caching management for the current proliferation of streaming applications in multihop wireless networks. Many caching management schemes to date use a randomized network coding approach, which provides an elegant solution for ubiquitous data access in such systems. However, the encoding, essentially a combination operation, makes the coded data difficult to change. In particular, to accommodate new data, the system may have to first decode all the combined data segments, remove some unimportant ones, and then re-encode the data segments again. This procedure is clearly expensive for continuously evolving data storage. As such, we introduce a novel Cooperative Coding and Caching scheme, which allows decoding-free data removal through a triangle-like codeword organization. Its decoding performance is very close to that of conventional network coding, with only a sublinear overhead. Our scheme offers a promising solution to caching management for streaming data.

  4. dCache data storage system implementations at a Tier-2 centre

    Energy Technology Data Exchange (ETDEWEB)

    Tsigenov, Oleg; Nowack, Andreas; Kress, Thomas [III. Physikalisches Institut B, RWTH Aachen (Germany)

    2009-07-01

    The experimental high energy physics groups of the RWTH Aachen University operate one of the largest Grid Tier-2 sites in the world and offer more than 2000 modern CPU cores and about 550 TB of disk space mainly to the CMS experiment and to a lesser extent to the Auger and IceCube collaborations. Running such a large data cluster requires a flexible storage system with high performance. We use dCache for this purpose and are integrated into the dCache support team to the benefit of the German Grid sites. Recently, a storage pre-production cluster has been built to study the setup and the behavior of novel dCache features within Chimera without interfering with the production system. This talk gives an overview of the practical experience gained with dCache on both the production and the testbed clusters and discusses future plans.

  5. Replication Strategy for Spatiotemporal Data Based on Distributed Caching System

    Science.gov (United States)

    Xiong, Lian; Tao, Yang; Xu, Juan; Zhao, Lun

    2018-01-01

    The replica strategy in a distributed cache can effectively reduce user access delay and improve system performance. However, developing a replica strategy suitable for varied application scenarios is still quite challenging, owing to differences in user access behavior and preferences. In this paper, a replication strategy for spatiotemporal data (RSSD) based on a distributed caching system is proposed. By taking advantage of the spatiotemporal locality and correlation of user access, RSSD mines high-popularity and associated files from historical user access information, and then generates replicas and selects appropriate cache nodes for placement. Experimental results show that the RSSD algorithm is simple and efficient, and succeeds in significantly reducing user access delay. PMID:29342897
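
    The popularity-mining half of this idea can be sketched as follows. This is a minimal illustration only, with hypothetical access records; RSSD additionally exploits spatiotemporal locality and file-to-file correlation when choosing cache nodes.

```python
from collections import Counter

def choose_replicas(access_log, k):
    """Rank files by historical access popularity and return the k
    hottest as replica candidates. (Only the popularity-mining step;
    the paper's strategy also weighs spatiotemporal correlation when
    placing replicas on cache nodes.)"""
    counts = Counter(file_id for _, file_id in access_log)
    return [f for f, _ in counts.most_common(k)]

# Hypothetical (node, file) access records:
log = [("n1", "a"), ("n2", "a"), ("n1", "b"),
       ("n3", "a"), ("n2", "c"), ("n3", "b")]
print(choose_replicas(log, 2))  # ['a', 'b']
```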

  6. Energy-Efficient Caching for Mobile Edge Computing in 5G Networks

    Directory of Open Access Journals (Sweden)

    Zhaohui Luo

    2017-05-01

    Full Text Available Mobile Edge Computing (MEC), which is considered a promising and emerging paradigm to provide caching capabilities in proximity to mobile devices in 5G networks, enables fast delivery of popular content for delay-sensitive applications despite the limited backhaul capacity of mobile networks. Most existing studies focus on cache allocation, mechanism design and coding design for caching. However, an uninterrupted, fixed grid power supply in support of a MEC server (MECS) is costly and even infeasible, especially when the load changes dynamically over time. In this paper, we investigate the energy consumption problem of the MECS in cellular networks. Given the average download latency constraints, we take the MECS's energy consumption, backhaul capacities and content popularity distributions into account and formulate a joint optimization framework to minimize the energy consumption of the system. As this is a complicated joint optimization problem, we apply a genetic algorithm to solve it. Simulation results show that the proposed solution can effectively determine the near-optimal caching placement and obtain better performance in terms of energy efficiency gains compared with conventional caching placement strategies. In particular, it is shown that the proposed scheme can significantly reduce the joint cost when backhaul capacity is low.
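
    A genetic algorithm for caching placement can be sketched as below. All inputs (popularities, capacity, energy costs) are illustrative assumptions, not the paper's model, and the fitness function is a simplified energy-per-request proxy rather than the paper's joint cost.

```python
import random

random.seed(1)

# Hypothetical inputs: per-content request popularity, cache capacity
# in files, and per-request energy costs (none from the paper).
POP = [0.40, 0.25, 0.15, 0.10, 0.06, 0.04]
CAPACITY = 2
E_CACHE, E_BACKHAUL = 1.0, 5.0

def energy(genome):
    """Expected energy per request; over-capacity placements are rejected."""
    if sum(genome) > CAPACITY:
        return float("inf")
    return sum(p * (E_CACHE if g else E_BACKHAUL) for p, g in zip(POP, genome))

def evolve(generations=60, size=20, mutation=0.1):
    """Elitist genetic algorithm over 0/1 placement vectors."""
    pop = [[random.randint(0, 1) for _ in POP] for _ in range(size)]
    for _ in range(generations):
        pop.sort(key=energy)
        elite = pop[: size // 2]              # keep the best half
        children = []
        while len(children) < size - len(elite):
            a, b = random.sample(elite, 2)    # one-point crossover
            cut = random.randrange(1, len(POP))
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < mutation else g
                             for g in child])
        pop = elite + children
    return min(pop, key=energy)

best = evolve()
print(best, energy(best))  # the optimum caches the two most popular files
```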

  7. Cooperative Coding and Caching for Streaming Data in Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2010-01-01

    Full Text Available This paper studies distributed caching management for the current flourishing of streaming applications in multihop wireless networks. Many caching management schemes to date use a randomized network coding approach, which provides an elegant solution for ubiquitous data access in such systems. However, the encoding, essentially a combination operation, makes the coded data difficult to change. In particular, to accommodate new data, the system may have to first decode all the combined data segments, remove some unimportant ones, and then re-encode the data segments again. This procedure is clearly expensive for continuously evolving data storage. As such, we introduce a novel Cooperative Coding and Caching (C3) scheme, which allows decoding-free data removal through a triangle-like codeword organization. Its decoding performance is very close to that of conventional network coding, with only a sublinear overhead. Our scheme offers a promising solution to caching management for streaming data.

  8. CACHING DATA STORED IN SQL SERVER FOR OPTIMIZING THE PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Demian Horia

    2016-12-01

    Full Text Available This paper presents the architecture of a web site, with different techniques used to optimize the performance of loading the web content. The architecture presented here is for an e-commerce site developed on Windows with MVC, IIS and Microsoft SQL Server. Caching the data is one technique used by browsers, by the web servers themselves, or by proxy servers. Caching is performed without the users' knowledge, yet the cache must still provide users with the most recent information from the server. This means that the caching mechanism has to be aware of any modification of the data on the server. Different kinds of information are presented in an e-commerce site for each product, such as images, product codes, descriptions, properties, and stock levels.
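
    One common way to keep a cache "aware" of server-side changes is simple time-based expiry. The sketch below is a generic TTL read-through cache with a hypothetical in-memory table standing in for SQL Server; it illustrates the idea only and is not the paper's implementation, which also involves browser, IIS and proxy caching layers.

```python
import time

class TTLCache:
    """Minimal read-through cache: entries expire after ttl seconds,
    so updates on the database server become visible once an entry
    ages out. (A sketch of the staleness trade-off only.)"""

    def __init__(self, ttl, loader):
        self.ttl, self.loader, self.store = ttl, loader, {}

    def get(self, key):
        entry = self.store.get(key)
        if entry is not None and time.monotonic() - entry[1] < self.ttl:
            return entry[0]                  # fresh hit
        value = self.loader(key)             # miss or stale: reload
        self.store[key] = (value, time.monotonic())
        return value

# Hypothetical "database" table standing in for SQL Server:
db = {"sku-42": {"price": 10}}
cache = TTLCache(ttl=0.2, loader=lambda k: dict(db[k]))
print(cache.get("sku-42")["price"])  # 10, loaded from the database
db["sku-42"]["price"] = 12
print(cache.get("sku-42")["price"])  # still 10, served from the cache
time.sleep(0.25)
print(cache.get("sku-42")["price"])  # 12, reloaded after expiry
```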

  9. 3D Seismic Reflection Amplitude and Instantaneous Frequency Attributes in Mapping Thin Hydrocarbon Reservoir Lithofacies: Morrison NE Field and Morrison Field, Clark County, KS

    Science.gov (United States)

    Raef, Abdelmoneam; Totten, Matthew; Vohs, Andrew; Linares, Aria

    2017-12-01

    Thin hydrocarbon reservoir facies pose resolution challenges and waveform-signature opportunities in seismic reservoir characterization and prospect identification. In this study, we present a case study, where instantaneous frequency variation in response to a thin hydrocarbon pay zone is analyzed and integrated with other independent information to explain drilling results and optimize future drilling decisions. In Morrison NE Field, some wells with poor economics have resulted from well placements made without regard to reservoir heterogeneities. The study area in Clark County, Kansas, USA, was covered by a surface 3D seismic reflection survey in 2010. The target horizon is the Viola limestone, which continues to produce from 7 of the 12 wells drilled within the survey area. Seismic attribute extraction and analysis were conducted with emphasis on instantaneous attributes and amplitude anomalies to better understand and predict reservoir heterogeneities and their control on hydrocarbon entrapment settings. We have identified a higher instantaneous frequency, lower amplitude seismic facies that is in good agreement with distinct lithofacies that exhibit better (higher porosity) reservoir properties, as inferred from well-log analysis and petrographic inspection of well cuttings. This study presents a pre-drilling, data-driven approach of identifying sub-resolution reservoir seismic facies in a carbonate formation. This workflow will assist in placing new development wells in other locations within the area. Our low amplitude, high instantaneous frequency seismic reservoir facies have been corroborated by findings based on well logs, petrographic analysis data, and drilling results.

  10. Dynamic Video Streaming in Caching-enabled Wireless Mobile Networks

    OpenAIRE

    Liang, C.; Hu, S.

    2017-01-01

    Recent advances in software-defined mobile networks (SDMNs), in-network caching, and mobile edge computing (MEC) can have great effects on video services in next generation mobile networks. In this paper, we jointly consider SDMNs, in-network caching, and MEC to enhance the video service in next generation mobile networks. With the objective of maximizing the mean measurement of video quality, an optimization problem is formulated. Due to the coupling of video data rate, computing resource, a...

  11. On the Feasibility of Prefetching and Caching for Online TV Services: A Measurement Study on Hulu

    Science.gov (United States)

    Krishnappa, Dilip Kumar; Khemmarat, Samamon; Gao, Lixin; Zink, Michael

    Lately, researchers have been looking at ways to reduce the delay on video playback through mechanisms like prefetching and caching for Video-on-Demand (VoD) services. The usage of prefetching and caching also has the potential to reduce the amount of network bandwidth usage, as most popular requests are served from a local cache rather than the server containing the original content. In this paper, we investigate the advantages of having such a prefetching and caching scheme for a free hosting service of professionally created video (movies and TV shows) named "hulu". We look into the advantages of using a prefetching scheme where the most popular videos of the week, as provided by the hulu website, are prefetched and compare this approach with a conventional LRU caching scheme with limited storage space and a combined scheme of prefetching and caching. Results from our measurement and analysis show that employing a basic caching scheme at the proxy yields a hit ratio of up to 77.69%, but requires storage of about 236GB. Further analysis shows that a prefetching scheme where the top-100 popular videos of the week are downloaded to the proxy yields a hit ratio of 44% with a storage requirement of 10GB. An LRU caching scheme with a storage limitation of 20GB can achieve a hit ratio of 55%, but downloads 4713 videos to achieve such a high hit ratio, compared to 100 videos in the prefetching scheme, whereas a scheme with both prefetching and caching with the same storage yields a hit ratio of 59% with a download requirement of 4439 videos. We find that a scheme combining prefetching with caching, trading off storage, yields a better hit ratio and bandwidth savings than individual caching or prefetching schemes.
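
    The LRU hit-ratio measurements above come from replaying a request trace through a cache. A minimal trace-driven simulator of that kind can be sketched as follows; the trace below is a toy example, not Hulu data.

```python
from collections import OrderedDict

def lru_hit_ratio(requests, capacity):
    """Replay a request trace through an LRU cache holding `capacity`
    videos and return the hit ratio, mirroring the paper's
    trace-driven comparison of caching policies."""
    cache, hits = OrderedDict(), 0
    for video in requests:
        if video in cache:
            hits += 1
            cache.move_to_end(video)        # mark most recently used
        else:
            cache[video] = True
            if len(cache) > capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits / len(requests)

trace = ["a", "b", "a", "c", "a", "b", "d", "a", "c", "b"]
print(lru_hit_ratio(trace, capacity=2))  # 0.2
```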

  12. Performance Evaluation of Moving Small-Cell Network with Proactive Cache

    Directory of Open Access Journals (Sweden)

    Young Min Kwon

    2016-01-01

    Full Text Available Due to rapid growth in mobile traffic, mobile network operators (MNOs) are considering the deployment of moving small-cells (mSCs). An mSC is a user-centric network which provides voice and data services during mobility. mSCs can receive and forward data traffic via wireless backhaul and sidehaul links. In addition, due to the predictive nature of user demand, mSCs can proactively cache the predicted contents in off-peak-traffic periods. Due to these characteristics, MNOs consider mSCs a cost-efficient solution to not only enhance the system capacity but also provide guaranteed quality of service (QoS) requirements to moving user equipment (UE) in peak-traffic periods. In this paper, we conduct extensive system-level simulations to analyze the performance of mSCs with varying cache size and content popularity and their effect on wireless backhaul load. The performance evaluation confirms that the QoS of moving small-cell UE (mSUE) notably improves by using mSCs together with proactive caching. We also show that the effective use of a proactive cache significantly reduces the wireless backhaul load and increases the overall network capacity.

  13. I-Structure software cache for distributed applications

    Directory of Open Access Journals (Sweden)

    Alfredo Cristóbal Salas

    2004-01-01

    Full Text Available In this article, we describe the I-Structure software cache for distributed-memory environments (D-ISSC), which takes advantage of data locality while maintaining the latency-tolerance capability of I-Structure memory systems. The programming facilities of MPI programs hide synchronization problems from the programmer. Our experimental evaluation using a benchmark suite indicates that PC clusters with I-Structure and its D-ISSC caching mechanism are more robust. The system can speed up both regular and irregular communication-intensive applications.

  14. Content Delivery in Fog-Aided Small-Cell Systems with Offline and Online Caching: An Information—Theoretic Analysis

    Directory of Open Access Journals (Sweden)

    Seyyed Mohammadreza Azimi

    2017-07-01

    Full Text Available The storage of frequently requested multimedia content at small-cell base stations (BSs) can reduce the load of macro-BSs without relying on high-speed backhaul links. In this work, the optimal operation of a system consisting of a cache-aided small-cell BS and a macro-BS is investigated for both offline and online caching settings. In particular, a binary fading one-sided interference channel is considered in which the small-cell BS, whose transmission is interfered with by the macro-BS, has a limited-capacity cache. The delivery time per bit (DTB) is adopted as a measure of the coding latency, that is, the duration of the transmission block required for reliable delivery. For offline caching, assuming a static set of popular contents, the minimum achievable DTB is characterized through information-theoretic achievability and converse arguments as a function of the cache capacity and of the capacity of the backhaul link connecting cloud and small-cell BS. For online caching, under a time-varying set of popular contents, the long-term (average) DTB is evaluated for both proactive and reactive caching policies. Furthermore, a converse argument is developed to characterize the minimum achievable long-term DTB for online caching in terms of the minimum achievable DTB for offline caching. The performance of both online and offline caching is finally compared using numerical results.

  15. Optimal and Scalable Caching for 5G Using Reinforcement Learning of Space-Time Popularities

    Science.gov (United States)

    Sadeghi, Alireza; Sheikholeslami, Fatemeh; Giannakis, Georgios B.

    2018-02-01

    Small basestations (SBs) equipped with caching units have the potential to handle the unprecedented demand growth in heterogeneous networks. Through low-rate, backhaul connections with the backbone, SBs can prefetch popular files during off-peak traffic hours, and service them to the edge at peak periods. To intelligently prefetch, each SB must learn what and when to cache, while taking into account SB memory limitations, the massive number of available contents, the unknown popularity profiles, as well as the space-time popularity dynamics of user file requests. In this work, local and global Markov processes model user requests, and a reinforcement learning (RL) framework is put forth for finding the optimal caching policy when the transition probabilities involved are unknown. Joint consideration of global and local popularity demands along with cache-refreshing costs allows for a simple, yet practical asynchronous caching approach. The novel RL-based caching relies on a Q-learning algorithm to implement the optimal policy in an online fashion, thus enabling the cache control unit at the SB to learn, track, and possibly adapt to the underlying dynamics. To endow the algorithm with scalability, a linear function approximation of the proposed Q-learning scheme is introduced, offering faster convergence as well as reduced complexity and memory requirements. Numerical tests corroborate the merits of the proposed approach in various realistic settings.
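
    A toy version of Q-learning for caching can be sketched as below. Everything here is an assumption for illustration (two contents, one cache slot, a 2-state Markov request model); it only shows the tabular Q-learning mechanics, not the paper's global/local model or its linear function approximation.

```python
import random

random.seed(0)

FILES = [0, 1]                      # two contents, one cache slot (assumed)
# Hypothetical request dynamics: the next request tends to repeat the
# previous one (a 2-state Markov popularity model, not the paper's).
P_REPEAT = {0: 0.9, 1: 0.8}

def next_request(prev):
    return prev if random.random() < P_REPEAT[prev] else 1 - prev

Q = [[0.0, 0.0], [0.0, 0.0]]        # Q[state][action]; state = last request
alpha, gamma, eps = 0.1, 0.9, 0.1   # step size, discount, exploration

state = 0
for _ in range(5000):
    # epsilon-greedy: pick which file to hold in the cache next
    if random.random() < eps:
        action = random.choice(FILES)
    else:
        action = 0 if Q[state][0] >= Q[state][1] else 1
    req = next_request(state)
    reward = 1.0 if req == action else 0.0    # reward = cache hit
    Q[state][action] += alpha * (reward + gamma * max(Q[req]) - Q[state][action])
    state = req

# Greedy policy per state: with sticky request dynamics, the learned
# policy caches the file that was just requested.
policy = [max(FILES, key=lambda a: Q[s][a]) for s in FILES]
print(policy)
```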

  16. A general approach for cache-oblivious range reporting and approximate range counting

    DEFF Research Database (Denmark)

    Afshani, Peyman; Hamilton, Chris; Zeh, Norbert

    2010-01-01

    We present cache-oblivious solutions to two important variants of range searching: range reporting and approximate range counting. Our main contribution is a general approach for constructing cache-oblivious data structures that provide relative (1+ε)-approximations for a general class of range c...

  17. Ground-water quality, levels, and flow direction near Fort Cobb Reservoir, Caddo County, Oklahoma, 1998-2000

    Science.gov (United States)

    Becker, Carol J.

    2001-01-01

    Fort Cobb Reservoir in northwest Caddo County, Oklahoma, is managed by the Bureau of Reclamation for water supply, recreation, flood control, and wildlife. Excessive amounts of nitrogen in the watershed have the potential to cause long-term eutrophication of the reservoir and increase already elevated concentrations of nitrogen in the Rush Springs aquifer. The U.S. Geological Survey in cooperation with the Bureau of Reclamation studied ground water in the area surrounding a swine feeding operation located less than 2 miles upgradient from Fort Cobb Reservoir in Caddo County, Oklahoma. Objectives of the study were to (1) determine if the operation was contributing nitrogen to the ground water and (2) measure changes in ground-water levels and determine the local ground-water flow direction in the area surrounding the swine feeding operation. Nitrate concentrations (28.1 and 31.5 milligrams per liter) were largest in two ground-water samples from a well upgradient of the wastewater lagoon. Nitrate concentrations ranged from 4.30 to 8.20 milligrams per liter in samples from downgradient wells. Traces of ammonia and nitrite were detected in a downgradient well, but not in upgradient wells. δ15N values indicate atmospheric nitrogen, synthetic fertilizer, or plants were the predominant sources of nitrate in ground water from the downgradient wells. The δ15N values in these samples are depleted in nitrogen-15, indicating that animal waste was not a significant contributor of nitrate. Manganese concentrations (1,150 and 965 micrograms per liter) in samples from a downgradient well were substantially larger than concentrations in samples from other wells, exceeding the secondary drinking-water standard of 50 micrograms per liter. Larger concentrations of bicarbonate, magnesium, fluoride, and iron and a higher pH were also measured in water from a downgradient well.
Ground-water levels in an observation well were higher from April to mid-July and lower during the late summer

  18. TaPT: Temperature-Aware Dynamic Cache Optimization for Embedded Systems

    Directory of Open Access Journals (Sweden)

    Tosiron Adegbija

    2017-12-01

    Full Text Available Embedded systems have stringent design constraints, which have necessitated much prior research focused on optimizing energy consumption and/or performance. Since embedded systems typically have fewer cooling options, rising temperature, and thus temperature optimization, is an emergent concern. Most embedded systems only dissipate heat by passive convection, due to the absence of dedicated thermal management hardware mechanisms. The embedded system's temperature not only affects the system's reliability, but can also affect the performance, power, and cost. Thus, embedded systems require efficient thermal management techniques. However, thermal management can conflict with other optimization objectives, such as execution time and energy consumption. In this paper, we focus on managing the temperature using a synergy of cache optimization and dynamic frequency scaling, while also optimizing the execution time and energy consumption. This paper provides new insights on the impact of cache parameters on efficient temperature-aware cache tuning heuristics. In addition, we present temperature-aware phase-based tuning, TaPT, which determines Pareto-optimal clock frequency and cache configurations for fine-grained execution time, energy, and temperature tradeoffs. TaPT enables autonomous system optimization and also allows designers to specify temperature constraints and optimization priorities. Experiments show that TaPT can effectively reduce execution time, energy, and temperature, while imposing minimal hardware overhead.
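
    Selecting Pareto-optimal configurations over several objectives can be sketched with a simple dominance filter. The measurements below are hypothetical; this shows only the generic Pareto idea, not TaPT's tuning heuristic.

```python
def pareto_front(points):
    """Keep the Pareto-optimal configurations, where each point is a
    tuple of objectives to minimise, e.g. (execution time, energy,
    temperature). A point is dropped if another point is at least as
    good in every objective (and not identical)."""
    return [p for p in points
            if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                       for q in points)]

# Hypothetical (time, energy, temperature) measurements per configuration:
configs = [(10, 5, 60), (8, 7, 55), (12, 4, 70), (11, 6, 65)]
print(pareto_front(configs))  # [(10, 5, 60), (8, 7, 55), (12, 4, 70)]
```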

  19. Web Cache Prefetching as an Aspect: Towards a Dynamic-Weaving Based Solution

    DEFF Research Database (Denmark)

    Segura-Devillechaise, Marc; Menaud, Jean-Marc; Muller, Gilles

    2003-01-01

    Given the high proportion of HTTP traffic in the Internet, Web caches are crucial to reduce user access time, network latency, and bandwidth consumption. Prefetching in a Web cache can further enhance these benefits. For the best performance, however, the prefetching policy must match user and Web...

  20. Cache-aware mapping of streaming applications on a multiprocessor system-on-chip

    NARCIS (Netherlands)

    Moonen, A.J.M.; Bekooij, M.J.G.; Berg, van den R.M.J.; Meerbergen, van J.; Sciuto, D.; Peng, Z.

    2008-01-01

    Efficient use of the memory hierarchy is critical for achieving high performance in a multiprocessor system-on-chip. An external memory that is shared between processors is a bottleneck in current and future systems. Cache misses and a large cache miss penalty contribute to a low processor

  1. Fast and Cache-Oblivious Dynamic Programming with Local Dependencies

    DEFF Research Database (Denmark)

    Bille, Philip; Stöckel, Morten

    2012-01-01

    are widely used in bioinformatics to compare DNA and protein sequences. These problems can all be solved using essentially the same dynamic programming scheme over a two-dimensional matrix, where each entry depends locally on at most 3 neighboring entries. We present a simple, fast, and cache-oblivious algorithm for this type of local dynamic programming suitable for comparing large-scale strings. Our algorithm outperforms the previous state-of-the-art solutions. Surprisingly, our new simple algorithm is competitive with a complicated, optimized, and tuned implementation of the best cache-aware algorithm...
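
    The local-dependency structure described (each entry depending on its left, upper, and upper-left neighbours) is exactly that of the textbook edit-distance recurrence. A plain row-by-row sketch is shown below for illustration; it is not the authors' cache-oblivious traversal.

```python
def edit_distance(a, b):
    """Dynamic program over a 2D matrix in which each entry depends
    only on 3 neighbouring entries. One rolling row suffices for the
    distance value."""
    prev = list(range(len(b) + 1))          # row 0: distance from empty prefix
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # match/substitution
        prev = cur
    return prev[-1]

print(edit_distance("ACGT", "ACCT"))       # 1
print(edit_distance("kitten", "sitting"))  # 3
```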

  2. Cache and energy efficient algorithms for Nussinov's RNA Folding.

    Science.gov (United States)

    Zhao, Chunchun; Sahni, Sartaj

    2017-12-06

    An RNA folding/RNA secondary structure prediction algorithm determines the non-nested/pseudoknot-free structure by maximizing the number of complementary base pairs and minimizing the energy. Several implementations of Nussinov's classical RNA folding algorithm have been proposed. Our focus is to obtain run time and energy efficiency by reducing the number of cache misses. Three cache-efficient algorithms, ByRow, ByRowSegment and ByBox, for Nussinov's RNA folding are developed. Using a simple LRU cache model, we show that the Classical algorithm of Nussinov has the highest number of cache misses, followed by the algorithms Transpose (Li et al.), ByRow, ByRowSegment, and ByBox (in this order). Extensive experiments conducted on four computational platforms (Xeon E5, AMD Athlon 64 X2, Intel i7 and PowerPC A2) using two programming languages (C and Java) show that our cache-efficient algorithms are also efficient in terms of run time and energy. Our benchmarking shows that, depending on the computational platform and programming language, either ByRow or ByBox gives the best run time and energy performance. The C version of these algorithms reduces run time by as much as 97.2% and energy consumption by as much as 88.8% relative to Classical, and by as much as 56.3% and 57.8% relative to Transpose. The Java versions reduce run time by as much as 98.3% relative to Classical and by as much as 75.2% relative to Transpose. Transpose achieves run time and energy efficiency at the expense of memory, as it takes twice the memory required by Classical. The memory required by ByRow, ByRowSegment, and ByBox is the same as that of Classical. As a result, using the same amount of memory, the algorithms proposed by us can solve problems up to 40% larger than those solvable by Transpose.
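
    For reference, the classical Nussinov recurrence that all of these variants reorder can be sketched as follows. This is the straightforward O(n^3) table fill (the "Classical" order), not any of the paper's cache-efficient traversals; the minimum-loop parameter is an assumption set to 0 here.

```python
def nussinov(seq, min_loop=0):
    """Classical Nussinov recurrence: N[i][j] is the maximum number of
    complementary base pairs in seq[i..j]. Either i is unpaired, or i
    pairs with some k, splitting the interval into two subproblems."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"),
             ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    N = [[0] * n for _ in range(n)]
    for span in range(1, n):                 # increasing interval length
        for i in range(n - span):
            j = i + span
            best = N[i + 1][j]               # i unpaired
            for k in range(i + 1, j + 1):    # i paired with k
                if (seq[i], seq[k]) in pairs and k - i > min_loop:
                    left = N[i + 1][k - 1] if k > i + 1 else 0
                    right = N[k + 1][j] if k < j else 0
                    best = max(best, 1 + left + right)
            N[i][j] = best
    return N[0][n - 1]

print(nussinov("GGGAAAUCC"))  # 3
```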

  3. dCache: Big Data storage for HEP communities and beyond

    International Nuclear Information System (INIS)

    Millar, A P; Bernardt, C; Fuhrmann, P; Mkrtchyan, T; Petersen, A; Schwank, K; Behrmann, G; Litvintsev, D; Rossi, A

    2014-01-01

    With over ten years in production use, the dCache data storage system has evolved to match the ever-changing landscape of storage technologies, with new solutions to both existing problems and new challenges. In this paper, we present three areas of innovation in dCache: providing efficient access to data with NFS v4.1 pNFS, adoption of CDMI and WebDAV as an alternative to SRM for managing data, and integration with alternative authentication mechanisms.

  4. Using shadow page cache to improve isolated drivers performance.

    Science.gov (United States)

    Zheng, Hao; Dong, Xiaoshe; Wang, Endong; Chen, Baoke; Zhu, Zhengdong; Liu, Chengzhe

    2015-01-01

    With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining the capture of the driver's write operations with a driver-private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. By delaying setting frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm that uses a shadow page cache to improve the performance of isolated drivers, and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without significantly impacting Chariot's reliability.

  5. Lack of caching of direct-seeded Douglas fir seeds by deer mice

    International Nuclear Information System (INIS)

    Sullivan, T.P.

    1978-01-01

    Seed caching by deer mice was investigated by radiotagging seeds in forest and clear-cut areas in coastal British Columbia. Deer mice tend to cache very few Douglas fir seeds in the fall when the seed is uniformly distributed and is at densities comparable with those used in direct-seeding programs. (author)

  6. Decision-cache based XACML authorisation and anonymisation for XML documents

    OpenAIRE

    Ulltveit-Moe, Nils; Oleshchuk, Vladimir A

    2012-01-01

    Author's version of an article in the journal: Computer Standards and Interfaces. Also available from the publisher at: http://dx.doi.org/10.1016/j.csi.2011.10.007 This paper describes a decision cache for the eXtensible Access Control Markup Language (XACML) that supports fine-grained authorisation and anonymisation of XML based messages and documents down to XML attribute and element level. The decision cache is implemented as an XACML obligation service, where a specification of the XML...

  7. Turbidity and Total Suspended Solids on the Lower Cache River Watershed, AR.

    Science.gov (United States)

    Rosado-Berrios, Carlos A; Bouldin, Jennifer L

    2016-06-01

    The Cache River Watershed (CRW) in Arkansas is part of one of the largest remaining bottomland hardwood forests in the US. Although wetlands are known to improve water quality, the Cache River is listed as impaired due to sedimentation and turbidity. This study measured turbidity and total suspended solids (TSS) at seven sites of the lower CRW; six sites were located on the Bayou DeView tributary of the Cache River. Turbidity and TSS levels ranged from 1.21 to 896 NTU and 0.17 to 386.33 mg/L, respectively, and had an increasing trend over the 3-year study. However, a decreasing trend from upstream to downstream in the Bayou DeView tributary was noted. Sediment loading calculated from high precipitation events and mean TSS values indicates that contributions from the Cache River main channel were approximately 6.6 times greater than contributions from Bayou DeView. Land use surrounding this river channel affects water quality, as wetlands provide a filter for sediments in the Bayou DeView channel.
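
    A sediment-load estimate of the kind referenced is a unit conversion from discharge and TSS concentration. The sketch below uses illustrative numbers, not values from the study.

```python
def sediment_load_kg_per_day(discharge_m3s, tss_mg_per_l):
    """Instantaneous sediment load from discharge and TSS.
    1 mg/L = 1 g/m^3, so g/s = (m^3/s) * (mg/L); with 86 400 s/day
    and 1000 g/kg, kg/day = m^3/s * mg/L * 86.4."""
    return discharge_m3s * tss_mg_per_l * 86.4

# Illustrative numbers only (not the study's measurements):
print(sediment_load_kg_per_day(50.0, 120.0))  # about 5.2e5 kg/day
```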

  8. On the Performance of the Cache Coding Protocol

    Directory of Open Access Journals (Sweden)

    Behnaz Maboudi

    2018-03-01

    Full Text Available Network coding approaches typically consider an unrestricted recoding of coded packets in the relay nodes to increase performance. However, this can expose the system to pollution attacks that cannot be detected during transmission, until the receivers attempt to recover the data. To prevent these attacks while allowing for the benefits of coding in mesh networks, the cache coding protocol was proposed. This protocol only allows recoding at the relays when the relay has received enough coded packets to decode an entire generation of packets. At that point, the relay node recodes and signs the recoded packets with its own private key, allowing the system to detect and minimize the effect of pollution attacks and making the relays accountable for changes on the data. This paper analyzes the delay performance of cache coding to understand the security-performance trade-off of this scheme. We introduce an analytical model for the case of two relays in an erasure channel relying on an absorbing Markov chain and an approximate model to estimate the performance in terms of the number of transmissions before successfully decoding at the receiver. We confirm our analysis using simulation results. We show that cache coding can overcome the security issues of unrestricted recoding with only a moderate decrease in system performance.
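
    The absorbing-Markov-chain analysis mentioned can be illustrated with a toy two-hop erasure chain: the fundamental matrix N = (I - Q)^-1 of the transient part gives expected visit counts, and its row sums give the expected number of transmissions before delivery. The transition structure and erasure probabilities below are assumptions for illustration, not the paper's model.

```python
# Transient states: "at source", "at relay"; absorbing state: "delivered".
e_src, e_rel = 0.3, 0.2        # illustrative per-hop erasure probabilities

# Transient-to-transient matrix Q (rows/cols: source, relay):
Q = [[e_src, 1 - e_src],
     [0.0,   e_rel]]

# Fundamental matrix N = (I - Q)^-1; row sums are the expected number
# of steps (transmissions) before absorption. For this 2x2
# upper-triangular case, M = I - Q = [[x, y], [0, z]] inverts directly.
x, y = 1 - Q[0][0], -Q[0][1]
z = 1 - Q[1][1]
N = [[1 / x, -y / (x * z)],
     [0.0, 1 / z]]
expected = [sum(row) for row in N]
print(expected[0])   # expected transmissions starting at the source
```

Starting at the source this reduces to 1/(1 - e_src) + 1/(1 - e_rel) attempts, matching the intuition that each hop is a geometric retry.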

  9. Cache-Oblivious Search Trees via Binary Trees of Small Height

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Jacob, R.

    2002-01-01

    We propose a version of cache-oblivious search trees which is simpler than the previous proposal of Bender, Demaine and Farach-Colton and has the same complexity bounds. In particular, our data structure avoids the use of weight-balanced B-trees, and can be implemented as just a single array..., and range queries in worst case O(log_B n + k/B) memory transfers, where k is the size of the output. The basic idea of our data structure is to maintain a dynamic binary tree of height log n + O(1) using existing methods, embed this tree in a static binary tree, which in turn is embedded in an array in a cache-oblivious fashion, using the van Emde Boas layout of Prokop. We also investigate the practicality of cache obliviousness in the area of search trees, by providing an empirical comparison of different methods for laying out a search tree in memory...
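
    The van Emde Boas layout mentioned can be generated recursively: lay out a top subtree of half the height, then each bottom subtree in turn. The sketch below computes the layout order for a static complete binary tree (the embedding target); it does not implement the authors' dynamic structure.

```python
def veb_order(height):
    """Van Emde Boas layout of a complete binary tree of the given
    height. Nodes are identified by their 1-based BFS (heap) index;
    the returned list is the order in which nodes are stored in the
    array, so nearby subtrees occupy contiguous memory."""
    out = []

    def lay(root, h):
        if h == 1:
            out.append(root)
            return
        top_h = h // 2
        bot_h = h - top_h
        lay(root, top_h)                       # top tree first
        for i in range(2 ** top_h):            # then each bottom tree,
            lay(root * 2 ** top_h + i, bot_h)  # left to right

    lay(1, height)
    return out

print(veb_order(3))  # [1, 2, 4, 5, 3, 6, 7]
```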

  10. Field guide to Muddy Formation outcrops, Crook County, Wyoming

    Energy Technology Data Exchange (ETDEWEB)

    Rawn-Schatzinger, V.

    1993-11-01

    The objectives of this research program are to (1) determine the reservoir characteristics and production problems of shoreline barrier reservoirs; and (2) develop methods and methodologies to effectively characterize shoreline barrier reservoirs to predict flow patterns of injected and produced fluids. Two reservoirs were selected for detailed reservoir characterization studies -- Bell Creek field, Carter County, Montana, which produces from the Lower Cretaceous (Albian-Cenomanian) Muddy Formation, and Patrick Draw field, Sweetwater County, Wyoming, which produces from the Upper Cretaceous (Campanian) Almond Formation of the Mesaverde Group. An important component of the research project was to use information from outcrop exposures of the producing formations to study the spatial variations of reservoir properties and the degree to which outcrop information can be used in the construction of reservoir models. This report contains the data and analyses collected from outcrop exposures of the Muddy Formation, located in Crook County, Wyoming, 40 miles south of Bell Creek oil field. The outcrop data set contains permeability, porosity, petrographic, grain size and geologic data from 1-inch-diameter core plugs drilled from the outcrop face, as well as geological descriptions and sedimentological interpretations of the outcrop exposures. The outcrop data set provides information about facies characteristics and geometries and the spatial distribution of permeability and porosity on interwell scales. Appendices within this report include a micropaleontological analysis of selected outcrop samples, an annotated bibliography of papers on the Muddy Formation in the Powder River Basin, and over 950 permeability and porosity values measured from 1-inch-diameter core plugs drilled from the outcrop. All data contained in this report are available in electronic format upon request. The core plugs drilled from the outcrop are available for measurement.

  11. Evaluation of low-temperature geothermal potential in Cache Valley, Utah. Report of investigation No. 174

    Energy Technology Data Exchange (ETDEWEB)

    de Vries, J.L.

    1982-11-01

Field work consisted of locating 90 wells and springs throughout the study area, collecting water samples for later laboratory analyses, and field measurement of pH, temperature, bicarbonate alkalinity, and electrical conductivity. Na⁺, K⁺, Ca²⁺, Mg²⁺, SiO₂, Fe, SO₄²⁻, Cl⁻, F⁻, and total dissolved solids were determined in the laboratory. Temperature profiles were measured in 12 additional, unused wells. Thermal gradients calculated from the profiles were approximately the same as the average for the Basin and Range province, about 35 °C/km. One well produced a gradient of 297 °C/km, most probably as a result of a near-surface occurrence of warm water. Possible warm-water reservoir temperatures were calculated using both the silica and the Na-K-Ca geothermometers, with the results averaging about 50 to 100 °C. If mixing calculations were applied, taking into account the temperatures and silica contents of both warm springs or wells and the cold groundwater, reservoir temperatures up to about 200 °C were indicated. Considering measured surface water temperatures, calculated reservoir temperatures, thermal gradients, and the local geology, most of the Cache Valley, Utah area is unsuited for geothermal development. However, the areas of North Logan, Benson, and Trenton were found to have anomalously warm groundwater in comparison to the background temperature of 13.0 °C for the study area. The warm water has potential for isolated energy development but is not warm enough for major commercial development.
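The silica geothermometer used above can be evaluated with a one-line formula; the sketch below assumes the quartz (no steam loss) calibration attributed to Fournier (1977) and a hypothetical well sample, so treat the numbers as illustrative rather than as the report's own calculations.

```python
import math

def silica_quartz_temp_c(sio2_mg_per_kg):
    """Quartz (no steam loss) silica geothermometer after Fournier (1977):
    T(degC) = 1309 / (5.19 - log10 C) - 273.15, where C is dissolved SiO2
    in mg/kg; applicable up to roughly 250 degC."""
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

# Hypothetical sample with 48 mg/kg dissolved silica.
t = silica_quartz_temp_c(48.0)
```

A 48 mg/kg sample evaluates to roughly 100 °C, consistent with the 50 to 100 °C reservoir estimates quoted in the abstract.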

  12. Sediment accumulation and water volume in Loch Raven Reservoir, Baltimore County, Maryland

    Science.gov (United States)

    Banks, William S.L.; LaMotte, Andrew E.

    1999-01-01

Baltimore City and its metropolitan area are supplied with water from three reservoirs, Liberty Reservoir, Prettyboy Reservoir, and Loch Raven Reservoir. Prettyboy and Loch Raven Reservoirs are located on the Gunpowder Falls (figure 1). The many uses of the reservoir system necessitate coordination and communication among resource managers. The 1996 Amendments to the Safe Drinking Water Act require States to complete source-water assessments for public drinking-water supplies. As part of an ongoing effort to provide safe drinking water and as a direct result of these laws, the City of Baltimore and the Maryland Department of the Environment (MDE), in cooperation with other State and local agencies, are studying the Gunpowder Falls Basin and its role as a source of water supply to the Baltimore area. As a part of this study, the U.S. Geological Survey (USGS), in cooperation with the Maryland Geological Survey (MGS), with funding provided by the City of Baltimore and MDE, is examining sediment accumulation in Loch Raven Reservoir. The Baltimore City Department of Public Works periodically determines the amount of water that can be stored in its reservoirs. To make this determination, field crews measure the water depth along predetermined transects or ranges. These transects provide consistent locations where water depth, or bathymetric, measurements can be made. Range surveys are repeated to provide a record of the change in storage capacity due to sediment accumulation over time. Previous bathymetric surveys of Loch Raven Reservoir were performed in 1943, 1961, 1972, and 1985. Errors in data-collection and analysis methods have been assessed and documented (Baltimore City Department of Public Works, 1989). Few comparisons can be made among survey results because of changing data-collection techniques and analysis methods.

  13. Consistencia de ejecución: una propuesta no cache coherente

    OpenAIRE

    García, Rafael B.; Ardenghi, Jorge Raúl

    2005-01-01

The presence of one or more levels of cache memory in modern processors, whose purpose is to reduce the effective memory access time, takes on special relevance in a DSM-type multiprocessor environment, given the much higher cost of references to memory in remote modules. Clearly, the cache coherence protocol must respond to the memory consistency model adopted. The sequential consistency (SC) model, generally accepted as the most natural, together with a series of m...

  14. Randomized Caches Can Be Pretty Useful to Hard Real-Time Systems

    Directory of Open Access Journals (Sweden)

    Enrico Mezzetti

    2015-03-01

Full Text Available Cache randomization per se, and its viability for probabilistic timing analysis (PTA of critical real-time systems, are receiving increasingly close attention from the scientific community and the industrial practitioners. In fact, the very notion of introducing randomness and probabilities in time-critical systems has caused strenuous debates owing to the apparent clash that this idea has with the strictly deterministic view traditionally held for those systems. A paper recently appeared in LITES (Reineke, J. (2014). Randomized Caches Considered Harmful in Hard Real-Time Systems. LITES, 1(1), 03:1-03:13) provides a critical analysis of the weaknesses and risks entailed in using randomized caches in hard real-time systems. In order to provide the interested reader with a fuller, balanced appreciation of the subject matter, a critical analysis of the benefits brought about by that innovation should be provided also. This short paper addresses that need by revisiting the array of issues addressed in the cited work, in the light of the latest advances to the relevant state of the art. Accordingly, we show that the potential benefits of randomized caches do offset their limitations, causing them to be - when used in conjunction with PTA - a serious competitor to conventional designs.

  15. Greatly improved cache update times for conditions data with Frontier/Squid

    International Nuclear Information System (INIS)

    Dykstra, Dave; Lueking, Lee

    2009-01-01

The CMS detector project loads copies of conditions data onto over 100,000 computer cores worldwide by using a software subsystem called Frontier. This subsystem translates database queries into HTTP, looks up the results in a central database at CERN, and caches the results in an industry-standard HTTP proxy/caching server called Squid. One of the most challenging aspects of any cache system is coherency, that is, ensuring that changes made to the underlying data get propagated out to all clients in a timely manner. Recently, the Frontier system was enhanced to drastically reduce the time for changes to be propagated everywhere without heavily loading servers. The propagation time is now as low as 15 minutes for some kinds of data and no more than 60 minutes for the rest of the data. This was accomplished by taking advantage of an HTTP and Squid feature called If-Modified-Since. In order to use this feature, the Frontier server sends a Last-Modified timestamp, but since modification times are not normally tracked by Oracle databases, a PL/SQL program was developed to track the modification times of database tables. We discuss the details of this caching scheme and the obstacles overcome, including database and Squid bugs.
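The If-Modified-Since revalidation described here can be captured in a few lines. The cache and backend below are hypothetical stand-ins (not Frontier's or Squid's actual code): the origin answers 304 when the caller's cached Last-Modified value is still current, and 200 with a fresh body otherwise.

```python
class ConditionalCache:
    """Toy revalidating cache using HTTP If-Modified-Since semantics.

    `origin(key, ims)` stands in for an HTTP backend: it receives the
    cached Last-Modified value (or None) and returns
    (status, last_modified, body)."""

    def __init__(self, origin):
        self.origin = origin
        self.store = {}  # key -> (last_modified, body)

    def get(self, key):
        cached = self.store.get(key)
        ims = cached[0] if cached else None          # If-Modified-Since header
        status, last_modified, body = self.origin(key, ims)
        if status == 304:                            # unchanged: serve cached copy
            return cached[1]
        self.store[key] = (last_modified, body)      # refresh the cache
        return body


def demo_origin(key, ims):
    """Fake backend whose data was last modified at a fixed time."""
    last_mod = "Wed, 01 Jan 2020 00:00:00 GMT"
    if ims == last_mod:
        return 304, last_mod, None                   # nothing newer: empty 304 reply
    return 200, last_mod, b"conditions-data"


cache = ConditionalCache(demo_origin)
first = cache.get("/frontier/query1")    # full 200 fetch
second = cache.get("/frontier/query1")   # cheap revalidation via 304
```

The second lookup transfers no body from the origin, which is the effect that lets Squid caches stay coherent without reloading full query results.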

  16. Greatly improved cache update times for conditions data with Frontier/Squid

    Energy Technology Data Exchange (ETDEWEB)

    Dykstra, Dave; Lueking, Lee, E-mail: dwd@fnal.go [Computing Division, Fermilab, Batavia, IL (United States)

    2010-04-01

    The CMS detector project loads copies of conditions data to over 100,000 computer cores worldwide by using a software subsystem called Frontier. This subsystem translates database queries into HTTP, looks up the results in a central database at CERN, and caches the results in an industry-standard HTTP proxy/caching server called Squid. One of the most challenging aspects of any cache system is coherency, that is, ensuring that changes made to the underlying data get propagated out to all clients in a timely manner. Recently, the Frontier system was enhanced to drastically reduce the time for changes to be propagated everywhere without heavily loading servers. The propagation time is now as low as 15 minutes for some kinds of data and no more than 60 minutes for the rest of the data. This was accomplished by taking advantage of an HTTP and Squid feature called If-Modified-Since. In order to use this feature, the Frontier server sends a Last-Modified timestamp, but since modification times are not normally tracked by Oracle databases, a PL/SQL program was developed to track the modification times of database tables. We discuss the details of this caching scheme and the obstacles overcome including database and Squid bugs.

  17. The Prado Dam and Reservoir, Riverside and San Bernardino Counties, California

    Science.gov (United States)

    1989-10-31

Contents fragments: County's Renewed Push for Water Conservation; Riverside County Reaction, Late 1940s; Development of ... is sloped to the topography to reduce erosion below the concrete-lined section. The emergency spillway had a designed pond elevation of 556 feet, and a ... means of pumping water downstream (Nick Richardson, personal communication 1989).

  18. Sex, estradiol, and spatial memory in a food-caching corvid.

    Science.gov (United States)

    Rensel, Michelle A; Ellis, Jesse M S; Harvey, Brigit; Schlinger, Barney A

    2015-09-01

    Estrogens significantly impact spatial memory function in mammalian species. Songbirds express the estrogen synthetic enzyme aromatase at relatively high levels in the hippocampus and there is evidence from zebra finches that estrogens facilitate performance on spatial learning and/or memory tasks. It is unknown, however, whether estrogens influence hippocampal function in songbirds that naturally exhibit memory-intensive behaviors, such as cache recovery observed in many corvid species. To address this question, we examined the impact of estradiol on spatial memory in non-breeding Western scrub-jays, a species that routinely participates in food caching and retrieval in nature and in captivity. We also asked if there were sex differences in performance or responses to estradiol. Utilizing a combination of an aromatase inhibitor, fadrozole, with estradiol implants, we found that while overall cache recovery rates were unaffected by estradiol, several other indices of spatial memory, including searching efficiency and efficiency to retrieve the first item, were impaired in the presence of estradiol. In addition, males and females differed in some performance measures, although these differences appeared to be a consequence of the nature of the task as neither sex consistently out-performed the other. Overall, our data suggest that a sustained estradiol elevation in a food-caching bird impairs some, but not all, aspects of spatial memory on an innate behavioral task, at times in a sex-specific manner. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Using Shadow Page Cache to Improve Isolated Drivers Performance

    Directory of Open Access Journals (Sweden)

    Hao Zheng

    2015-01-01

Full Text Available With the advantage of the reusability property of virtualization technology, users can reuse various types and versions of existing operating systems and drivers in a virtual machine, so as to customize their application environment. In order to prevent users' virtualization environments from being impacted by driver faults in the virtual machine, Chariot examines the correctness of a driver's write operations by combining capture of the driver's write operations with a driver-private access control table. However, this method needs to keep the write permission of the shadow page table as read-only, so as to capture the isolated driver's write operations through page faults, which adversely affects the performance of the driver. Based on delaying the setting of frequently used shadow pages' write permissions to read-only, this paper proposes an algorithm using a shadow page cache to improve the performance of isolated drivers and carefully studies the relationship between driver performance and the size of the shadow page cache. Experimental results show that, through the shadow page cache, the performance of isolated drivers can be greatly improved without impacting Chariot's reliability too much.

  20. The development of caching and object permanence in Western scrub-jays (Aphelocoma californica): which emerges first?

    Science.gov (United States)

    Salwiczek, Lucie H; Emery, Nathan J; Schlinger, Barney; Clayton, Nicola S

    2009-08-01

Recent studies on the food-caching behavior of corvids have revealed complex physical and social skills, yet little is known about the ontogeny of food caching in relation to the development of cognitive capacities. Piagetian object permanence is the understanding that objects continue to exist even when they are no longer visible. Here, the authors focus on Piagetian Stages 3 and 4, because they are hallmarks in the cognitive development of both young children and animals. Our aim is to determine, in a food-caching corvid, the Western scrub-jay, (1) whether Piagetian Stage 4 competence and tentative caching (i.e., hiding an item invisibly and retrieving it without delay) emerge concomitantly or consecutively; (2) whether experiencing the reappearance of hidden objects enhances the timing of the appearance of object permanence; and (3) how the development of object permanence is related to behavioral development and sensorimotor intelligence. Our findings suggest that object permanence Stage 4 emerges before tentative caching, and independent of environmental influences, but that once the birds have developed simple object permanence, social learning might advance the interval after which tentative caching commences. Copyright 2009 APA, all rights reserved.

  1. Web proxy cache replacement strategies simulation, implementation, and performance evaluation

    CERN Document Server

    ElAarag, Hala; Cobb, Jake

    2013-01-01

This work presents a study of cache replacement strategies designed for static web content. Proxy servers can improve performance by caching static web content such as cascading style sheets, JavaScript source files, and large files such as images. This topic is particularly important in wireless ad hoc networks, in which mobile devices act as proxy servers for a group of other mobile devices. Opening chapters present an introduction to web requests and the characteristics of web objects, web proxy servers and Squid, and artificial neural networks. This is followed by a comprehensive review of ...

  2. Language-Based Caching of Dynamically Generated HTML

    DEFF Research Database (Denmark)

    Brabrand, Claus; Møller, Anders; Olesen, Steffan

    2002-01-01

    Increasingly, HTML documents are dynamically generated by interactive Web services. To ensure that the client is presented with the newest versions of such documents it is customary to disable client caching causing a seemingly inevitable performance penalty. In the system, dynamic HTML documents...

  3. dCache: implementing a high-end NFSv4.1 service using a Java NIO framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

dCache is a high performance scalable storage system widely used by the HEP community. In addition to a set of home-grown protocols we also provide industry standard access mechanisms like WebDAV and NFSv4.1. This support places dCache as a direct competitor to commercial solutions. Nevertheless, conforming to a protocol is not enough; our implementations must perform comparably to or even better than commercial systems. To achieve this, dCache uses two high-end IO frameworks from well-known application servers: GlassFish and JBoss. This presentation describes how we implemented an rfc1831- and rfc2203-compliant ONC RPC (Sun RPC) service based on the Grizzly NIO framework, part of the GlassFish application server. This ONC RPC service is the key component of dCache's NFSv4.1 implementation, but is independent of dCache and available for other projects. We will also show some details of dCache's NFSv4.1 implementation, describe some of the Java NIO techniques used and, finally, present details of our performance e...

  4. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: (1) transfer of a raw sample from the tool to the SHEC subsystem and (2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  5. New distributive web-caching technique for VOD services

    Science.gov (United States)

    Kim, Iksoo; Woo, Yoseop; Hwang, Taejune; Choi, Jintak; Kim, Youngjune

    2002-12-01

At present, among the most popular services on the internet are on-demand services, including VOD, EOD and NOD. But the main problems for on-demand service are the excessive load on the server and the insufficiency of network resources. The service providers therefore require a powerful, expensive server, and clients are faced with long end-to-end delays and network congestion. This paper presents a new distributive web-caching technique for fluent VOD services using distributed proxies in a Head-end Network (HNET). The HNET consists of a Switching-Agent (SA) as a control node, some Head-end Nodes (HEN) as proxies, and clients connected to the HENs. Each HEN forms a LAN. Clients request VOD services from the server through a HEN and the SA. The SA is the heart of the HNET; all operations using the proposed distributive caching technique are performed under its control. This technique stores parts of a requested video on the corresponding HENs when clients connected to each HEN request an identical video. Clients then access those HENs (proxies) alternately to acquire video streams. Eventually, this leads to equi-loaded proxies (HENs). We adopt a cache replacement strategy that combines LRU and LFU, removes streams cached on other HENs before server streams, and replaces the first block of a video last to reduce end-to-end delay.
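The LRU/LFU combination at the core of that replacement strategy can be sketched as a small cache model. The weighting below (evict the least-frequently-hit entry, breaking ties by least recent use) is an illustrative assumption; the paper's full policy additionally prefers evicting streams cached on other HENs over server streams.

```python
from collections import OrderedDict

class LruLfuCache:
    """Toy cache mixing frequency (LFU) with recency (LRU) for eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}             # key -> value
        self.freq = {}              # key -> access count
        self.order = OrderedDict()  # keys, least recently used first

    def get(self, key):
        if key not in self.items:
            return None
        self.freq[key] += 1
        self.order.move_to_end(key)      # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key not in self.items and len(self.items) >= self.capacity:
            # The first key with the minimum frequency is also the least
            # recently used among them, because the OrderedDict iterates
            # in LRU-first order and min() keeps the first minimum seen.
            victim = min(self.order, key=lambda k: self.freq[k])
            del self.items[victim]
            del self.freq[victim]
            del self.order[victim]
        self.items[key] = value
        self.freq[key] = self.freq.get(key, 0) + 1
        self.order[key] = None
        self.order.move_to_end(key)
```

With capacity 2, inserting "a" and "b", touching "a", then inserting "c" evicts "b": it is both less frequently and less recently used than "a".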

  6. Servidor proxy caché: comprensión y asimilación tecnológica

    Directory of Open Access Journals (Sweden)

    Carlos E. Gómez

    2012-01-01

Full Text Available Internet access providers usually include the notion of Internet accelerators to reduce the average time a browser takes to obtain the requested files. For system administrators it is difficult to choose the configuration of a caching proxy server, since it is necessary to decide the values to use for several variables. This article presents how the process of understanding and technological assimilation of the caching proxy service, a service of high organizational impact, was approached. This article is also a product of the research project "Análisis de configuraciones de servidores proxy caché", in which relevant aspects of the performance of Squid as a caching proxy server were studied.

  7. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirement of the files, providing load-balancing, and reducing delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.

  8. Improved characterization of reservoir behavior by integration of reservoir performances data and rock type distributions

    Energy Technology Data Exchange (ETDEWEB)

Davies, D.K.; Vessell, R.K. [David K. Davies & Associates, Kingwood, TX (United States)]; Doublet, L.E. [Texas A&M Univ., College Station, TX (United States)] [and others]

    1997-08-01

An integrated geological/petrophysical and reservoir engineering study was performed for a large, mature waterflood project (>250 wells, ~80% water cut) at the North Robertson (Clear Fork) Unit, Gaines County, Texas. The primary goal of the study was to develop an integrated reservoir description for "targeted" (economic) 10-acre (4-hectare) infill drilling and future recovery operations in a low-permeability, carbonate (dolomite) reservoir. Integration of the results from geological/petrophysical studies and reservoir performance analyses provides a rapid and effective method for developing a comprehensive reservoir description. This reservoir description can be used for reservoir flow simulation, performance prediction, infill targeting, waterflood management, and for optimizing well developments (patterns, completions, and stimulations). The following analyses were performed as part of this study: (1) Geological/petrophysical analyses (core and well log data): "rock typing" based on qualitative and quantitative visualization of pore-scale features; reservoir layering based on "rock typing" and hydraulic flow units; and development of a "core-log" model to estimate permeability using porosity and other properties derived from well logs (the core-log model is based on "rock types"). (2) Engineering analyses (production and injection history, well tests): material balance decline type curve analyses to estimate total reservoir volume, formation flow characteristics (flow capacity, skin factor, and fracture half-length), and indications of well/boundary interference; and estimated ultimate recovery analyses to yield movable oil (or injectable water) volumes, as well as indications of well and boundary interference.

  9. Optical RAM-enabled cache memory and optical routing for chip multiprocessors: technologies and architectures

    Science.gov (United States)

    Pleros, Nikos; Maniotis, Pavlos; Alexoudi, Theonitsa; Fitsios, Dimitris; Vagionas, Christos; Papaioannou, Sotiris; Vyrsokinos, K.; Kanellos, George T.

    2014-03-01

    The processor-memory performance gap, commonly referred to as "Memory Wall" problem, owes to the speed mismatch between processor and electronic RAM clock frequencies, forcing current Chip Multiprocessor (CMP) configurations to consume more than 50% of the chip real-estate for caching purposes. In this article, we present our recent work spanning from Si-based integrated optical RAM cell architectures up to complete optical cache memory architectures for Chip Multiprocessor configurations. Moreover, we discuss on e/o router subsystems with up to Tb/s routing capacity for cache interconnection purposes within CMP configurations, currently pursued within the FP7 PhoxTrot project.

  10. Minimizing cache misses in an event-driven network server: A case study of TUX

    DEFF Research Database (Denmark)

    Bhatia, Sapan; Consel, Charles; Lawall, Julia Laetitia

    2006-01-01

We analyze the performance of CPU-bound network servers and demonstrate experimentally that the degradation in the performance of these servers under high-concurrency workloads is largely due to inefficient use of the hardware caches. We then describe an approach to speeding up event-driven network servers by optimizing their use of the L2 CPU cache in the context of the TUX Web server, known for its robustness to heavy load. Our approach is based on a novel cache-aware memory allocator and a specific scheduling strategy that together ensure that the total working data set of the server stays...

  11. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available Presently, the popularity of cloud computing is gradually increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining with specific reference to the single cache system. From the findings of the research, it was observed that the security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.

  12. Delivery Time Minimization in Edge Caching: Synergistic Benefits of Subspace Alignment and Zero Forcing

    KAUST Repository

    Kakar, Jaber

    2017-10-29

An emerging trend of next generation communication systems is to provide network edges with additional capabilities such as additional storage resources in the form of caches to reduce file delivery latency. To investigate this aspect, we study the fundamental limits of a cache-aided wireless network consisting of one central base station, $M$ transceivers and $K$ receivers from a latency-centric perspective. We use the normalized delivery time (NDT) to capture the per-bit latency for the worst-case file request pattern at high signal-to-noise ratios (SNR), normalized with respect to a reference interference-free system with unlimited transceiver cache capabilities. For various special cases with $M \in \{1,2\}$ and $K \in \{1,2,3\}$ that satisfy $M+K \leq 4$, we establish the optimal tradeoff between cache storage and latency. This is facilitated through establishing a novel converse (for arbitrary $M$ and $K$) and an achievability scheme on the NDT. Our achievability scheme is a synergistic combination of multicasting, zero-forcing beamforming and interference alignment.

  13. Evict on write, a management strategy for a prefetch unit and/or first level cache in a multiprocessor system with speculative execution

    Science.gov (United States)

    Gara, Alan; Ohmacht, Martin

    2014-09-16

In a multiprocessor system with at least two levels of cache, a speculative thread may run on a core processor in parallel with other threads. When the thread seeks to do a write to main memory, this access is written through the first level cache to the second level cache. After the write-through, the corresponding line is deleted from the first level cache and/or prefetch unit, so that any further accesses to the same location in main memory have to be retrieved from the second level cache. The second level cache keeps track of multiple versions of data where more than one speculative thread is running in parallel, while the first level cache does not hold any of the versions during speculation. A switch allows choosing between modes of operation of a speculation-blind first level cache.
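The evict-on-write rule described above can be captured in a small model. The dict-backed structures below are illustrative stand-ins for the hardware caches, not the patented design itself: a speculative write goes through L1 to L2, then the L1 copy is dropped so later reads must consult the version-tracking L2.

```python
class EvictOnWriteL1:
    """Toy model of an evict-on-write first level cache."""

    def __init__(self, l2):
        self.lines = {}  # addr -> data currently cached in L1
        self.l2 = l2     # plain dict standing in for the L2 cache

    def read(self, addr):
        if addr in self.lines:        # L1 hit
            return self.lines[addr]
        value = self.l2[addr]         # L1 miss: fill from L2
        self.lines[addr] = value
        return value

    def speculative_write(self, addr, value):
        self.l2[addr] = value         # write through to L2
        self.lines.pop(addr, None)    # evict on write: drop the L1 copy


l2 = {0x40: "old"}
l1 = EvictOnWriteL1(l2)
l1.read(0x40)                    # fills L1 from L2
l1.speculative_write(0x40, "new")
assert 0x40 not in l1.lines      # the line was evicted from L1
assert l1.read(0x40) == "new"    # subsequent read refetches from L2
```

Dropping the L1 line is what forces every post-write access through the L2, where per-thread speculative versions can be tracked.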

  14. Using XRootD to provide caches for CernVM-FS

    CERN Document Server

    Domenighini, Matteo

    2017-01-01

CernVM-FS recently added the possibility of using plugins for cache management. In order to investigate the capabilities and limits of this possibility, an XRootD plugin was written and benchmarked; as a byproduct, a POSIX plugin was also generated. The tests revealed that the plugin interface introduces no significant performance overhead; moreover, the XRootD plugin's performance was found to be worse than that of the built-in cache manager and the POSIX plugin. Further tests of the XRootD component revealed that its performance is dependent on the server disk speed.

  15. Hydroacoustic Estimates of Fish Density Distributions in Cougar Reservoir, 2011

    Energy Technology Data Exchange (ETDEWEB)

    Ploskey, Gene R.; Zimmerman, Shon A.; Hennen, Matthew J.; Batten, George W.; Mitchell, T. D.

    2012-09-01

    Day and night mobile hydroacoustic surveys were conducted once each month from April through December 2011 to quantify the horizontal and vertical distributions of fish throughout Cougar Reservoir, Lane County, Oregon.

  16. Geothermal development plan: Maricopa county

    Energy Technology Data Exchange (ETDEWEB)

    White, D.H.

    1981-01-01

Maricopa County is the area of Arizona receiving top priority since it contains over half of the state's population. The county is located entirely within the Basin and Range physiographic region, in which geothermal resources are known to occur. Several approaches were taken to match potential users to geothermal resources. One approach involved matching some of the largest facilities in the county to nearby geothermal resources. Other approaches involved identifying industrial processes whose heat requirements are less than the average assessed geothermal reservoir temperature of 110 °C (230 °F). Since many of the industries are located on or near geothermal resources, geothermal energy potentially could be adapted to many industrial processes.

  17. Cache Timing Analysis of eStream Finalists

    DEFF Research Database (Denmark)

    Zenner, Erik

    2009-01-01

    Cache Timing Attacks have attracted a lot of cryptographic attention due to their relevance for the AES. However, their applicability to other cryptographic primitives is less well researched. In this talk, we give an overview over our analysis of the stream ciphers that were selected for phase 3...

  18. A Survey on Mobile Edge Networks: Convergence of Computing, Caching and Communications

    OpenAIRE

    Wang, Shuo; Zhang, Xing; Zhang, Yan; Wang, Lin; Yang, Juwo; Wang, Wenbo

    2017-01-01

    As the explosive growth of smart devices and the advent of many new applications, traffic volume has been growing exponentially. The traditional centralized network architecture cannot accommodate such user demands due to heavy burden on the backhaul links and long latency. Therefore, new architectures which bring network functions and contents to the network edge are proposed, i.e., mobile edge computing and caching. Mobile edge networks provide cloud computing and caching capabilities at th...

  19. Cache Timing Analysis of LFSR-based Stream Ciphers

    DEFF Research Database (Denmark)

    Zenner, Erik; Leander, Gregor; Hawkes, Philip

    2009-01-01

    Cache timing attacks are a class of side-channel attacks that is applicable against certain software implementations. They have generated significant interest when demonstrated against the Advanced Encryption Standard (AES), but have more recently also been applied against other cryptographic...

  20. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    OpenAIRE

    Amany AlShawi

    2016-01-01

The popularity of cloud computing is increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. From the findings of the research, it was observed that security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers...

  1. Architectural Development and Performance Analysis of a Primary Data Cache with Read Miss Address Prediction Capability

    National Research Council Canada - National Science Library

    Christensen, Kathryn

    1998-01-01

    .... The Predictive Read Cache (PRC) further improves the overall memory hierarchy performance by tracking the data read miss patterns of memory accesses, developing a prediction for the next access and prefetching the data into the faster cache memory...

  2. Dynamic Allocation of SPM Based on Time-Slotted Cache Conflict Graph for System Optimization

    Science.gov (United States)

    Wu, Jianping; Ling, Ming; Zhang, Yang; Mei, Chen; Wang, Huan

This paper proposes a novel dynamic Scratch-pad Memory allocation strategy to optimize the energy consumption of the memory sub-system. Firstly, the whole program execution process is sliced into several time slots according to the temporal dimension; thereafter, a Time-Slotted Cache Conflict Graph (TSCCG) is introduced to model the behavior of Data Cache (D-Cache) conflicts within each time slot. Then, Integer Nonlinear Programming (INP) is implemented, which avoids the time-consuming linearization process, to select the most profitable data pages. A Virtual Memory System (VMS) is adopted to remap those data pages, which would cause severe Cache conflicts within a time slot, to SPM. In order to minimize the swapping overhead of dynamic SPM allocation, a novel SPM controller with a tightly coupled DMA is introduced to issue the swapping operations without the CPU's intervention. Last but not least, this paper quantitatively discusses the fluctuation of system energy profit for different MMU page sizes as well as Time Slot durations. According to our design space exploration, the proposed method can optimize all of the data segments, including global data, heap and stack data in general, and reduce the total energy consumption by 27.28% on average, up to 55.22%, with a marginal performance improvement. Compared to the conventional static CCG (Cache Conflict Graph), our approach obtains 24.7% energy profit on average, up to 30.5%, with a slight boost in performance.
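The time-slotted conflict-graph idea can be illustrated with a toy sketch. Note the direct-mapped, one-line-per-set cache model and all names below are simplifying assumptions for illustration, not details from the paper:

```python
from collections import defaultdict

def conflict_graphs(trace, num_sets, slot_len):
    """Build one cache-conflict graph per time slot from a (page, address)
    access trace. Edge weights count how often two data pages displaced each
    other from the same cache set (direct-mapped cache assumed)."""
    graphs = []
    for start in range(0, len(trace), slot_len):
        edges = defaultdict(int)
        last_page_in_set = {}
        for page, addr in trace[start:start + slot_len]:
            s = addr % num_sets                      # set index
            prev = last_page_in_set.get(s)
            if prev is not None and prev != page:    # a conflict miss
                edges[tuple(sorted((prev, page)))] += 1
            last_page_in_set[s] = page
        graphs.append(dict(edges))
    return graphs
```

Pages joined by heavy edges within a slot would be the most profitable candidates to remap into the scratch-pad for that slot.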

  3. Exploitation of pocket gophers and their food caches by grizzly bears

    Science.gov (United States)

    Mattson, D.J.

    2004-01-01

I investigated the exploitation of pocket gophers (Thomomys talpoides) by grizzly bears (Ursus arctos horribilis) in the Yellowstone region of the United States with the use of data collected during a study of radiomarked bears in 1977-1992. My analysis focused on the importance of pocket gophers as a source of energy and nutrients, effects of weather and site features, and importance of pocket gophers to grizzly bears in the western contiguous United States prior to historical extirpations. Pocket gophers and their food caches were infrequent in grizzly bear feces, although foraging for pocket gophers accounted for about 20-25% of all grizzly bear feeding activity during April and May. Compared with roots individually excavated by bears, pocket gopher food caches were less digestible but more easily dug out. Exploitation of gopher food caches by grizzly bears was highly sensitive to site and weather conditions and peaked during and shortly after snowmelt. This peak coincided with maximum success by bears in finding pocket gopher food caches. Exploitation was most frequent and extensive on gently sloping nonforested sites with abundant spring beauty (Claytonia lanceolata) and yampah (Perideridia gairdneri). Pocket gophers are rare in forests, and spring beauty and yampah roots are known to be important foods of both grizzly bears and burrowing rodents. Although grizzly bears commonly exploit pocket gophers only in the Yellowstone region, this behavior was probably widespread in mountainous areas of the western contiguous United States prior to extirpations of grizzly bears within the last 150 years.

  4. Hybrid caches: design and data management

    OpenAIRE

    Valero Bresó, Alejandro

    2013-01-01

Cache memories have usually been implemented with Static Random-Access Memory (SRAM) technology since it is the fastest electronic memory technology. However, this technology suffers from high leakage currents, which is a major design concern because leakage energy consumption increases as the transistor size shrinks. Alternative technologies are being considered to reduce this consumption. Among them, embedded Dynamic RAM (eDRAM) technology provides minimal area and le...

  5. Improved Oil Recovery in Fluvial Dominated Deltaic Reservoirs of Kansas - Near-Term

    International Nuclear Information System (INIS)

    Green, Don W.; McCune, A.D.; Michnick, M.; Reynolds, R.; Walton, A.; Watney, L.; Willhite, G. Paul

    1999-01-01

The objective of this project is to address waterflood problems of the type found in Morrow sandstone reservoirs in southwestern Kansas and in Cherokee Group reservoirs in southeastern Kansas. Two demonstration sites operated by different independent oil operators are involved in this project. The Stewart Field is located in Finney County, Kansas and is operated by PetroSantander, Inc. The Nelson Lease is located in Allen County, Kansas, in the N.E. Savonburg Field and is operated by James E. Russell Petroleum, Inc. General topics to be addressed are (1) reservoir management and performance evaluation, (2) waterflood optimization, and (3) the demonstration of recovery processes involving off-the-shelf technologies which can be used to enhance waterflood recovery, increase reserves, and reduce the abandonment rate of these reservoir types. In the Stewart Project, the reservoir management portion of the project conducted during Budget Period 1 involved performance evaluation. This included (1) reservoir characterization and the development of a reservoir database, (2) volumetric analysis to evaluate production performance, (3) reservoir modeling, (4) laboratory work, (5) identification of operational problems, (6) identification of unrecovered mobile oil and estimation of recovery factors, and (7) identification of the most efficient and economical recovery process. To accomplish these objectives the initial budget period was subdivided into three major tasks. The tasks were (1) geological and engineering analysis, (2) laboratory testing, and (3) unitization. Due to the presence of different operators within the field, it was necessary to unitize the field in order to demonstrate a field-wide improved recovery process. This work was completed and the project moved into Budget Period 2.

  6. Efficient Resource Scheduling by Exploiting Relay Cache for Cellular Networks

    Directory of Open Access Journals (Sweden)

    Chun He

    2015-01-01

Full Text Available In relay-enhanced cellular systems, throughput of User Equipment (UE) is constrained by the bottleneck of the two-hop link: the backhaul link (the first hop) and the access link (the second hop). To maximize the throughput, resource allocation should be coordinated between these two hops. A common resource scheduling algorithm, Adaptive Distributed Proportional Fair, only ensures that the throughput of the first hop is greater than or equal to that of the second hop, but it cannot guarantee a good balance of throughput and fairness between the two hops. In this paper, we propose a Two-Hop Balanced Distributed Scheduling (TBS) algorithm that exploits the relay cache for non-real-time data traffic. The evolved Node B (eNB) adaptively adjusts the number of Resource Blocks (RBs) allocated to the backhaul link and direct links based on the cache information of relays. Each relay allocates RBs for relay UEs based on the size of the relay UE’s Transport Block. We also design a relay UE ACK feedback mechanism to update the data in the relay cache. Simulation results show that the proposed TBS can effectively improve resource utilization and achieve a good trade-off between system throughput and fairness by balancing the throughput of the backhaul and access links.
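The cache-driven balancing step might look roughly like the following sketch; the linear occupancy rule here is an illustrative assumption, not the TBS algorithm itself:

```python
def split_rbs(total_rbs, cache_levels, capacity):
    """Split Resource Blocks between the backhaul and access links based on
    relay cache occupancy: the emptier the relay caches, the more RBs the
    eNB steers toward the backhaul link to refill them."""
    fill = sum(cache_levels) / (len(cache_levels) * capacity)  # 0.0 .. 1.0
    backhaul_rbs = round(total_rbs * (1.0 - fill))
    access_rbs = total_rbs - backhaul_rbs
    return backhaul_rbs, access_rbs
```

For example, with two empty relay caches all 100 RBs go to the backhaul link; with both caches half full the split is even.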

  7. Caching Over-The-Top Services, the Netflix Case

    DEFF Research Database (Denmark)

    Jensen, Stefan; Jensen, Michael; Gutierrez Lopez, Jose Manuel

    2015-01-01

    Problem (LLB-CFL). The solution search processes are implemented based on Genetic Algorithms (GA), designing genetic operators highly targeted towards this specific problem. The proposed methods are applied to a case study focusing on the demand and cache specifications of Netflix, and framed into a real...

  8. Cache-Oblivious Planar Orthogonal Range Searching and Counting

    DEFF Research Database (Denmark)

    Arge, Lars; Brodal, Gerth Stølting; Fagerberg, Rolf

    2005-01-01

    present the first cache-oblivious data structure for planar orthogonal range counting, and improve on previous results for cache-oblivious planar orthogonal range searching. Our range counting structure uses O(Nlog2 N) space and answers queries using O(logB N) memory transfers, where B is the block...... size of any memory level in a multilevel memory hierarchy. Using bit manipulation techniques, the space can be further reduced to O(N). The structure can also be modified to support more general semigroup range sum queries in O(logB N) memory transfers, using O(Nlog2 N) space for three-sided queries...... and O(Nlog22 N/log2log2 N) space for four-sided queries. Based on the O(Nlog N) space range counting structure, we develop a data structure that uses O(Nlog2 N) space and answers three-sided range queries in O(logB N+T/B) memory transfers, where T is the number of reported points. Based...

  9. An ESL Approach for Energy Consumption Analysis of Cache Memories in SoC Platforms

    Directory of Open Access Journals (Sweden)

    Abel G. Silva-Filho

    2011-01-01

Full Text Available The design of complex circuits such as SoCs presents two great challenges to designers. One is speeding up the modeling of system functionality; the second is implementing the system in an architecture that meets performance and power consumption requirements. Thus, developing new high-level specification mechanisms that reduce design effort through automatic architecture exploration is a necessity. This paper proposes an Electronic-System-Level (ESL) approach for system modeling and cache energy consumption analysis of SoCs, called PCacheEnergyAnalyzer. It uses as input a high-level UML 2.0 profile model of the system, and it generates a simulation model of a multicore platform that can be analyzed for cache tuning. PCacheEnergyAnalyzer performs static/dynamic energy consumption analysis of caches on platforms that may have different processors. Architecture exploration is achieved by letting designers choose different processors for platform generation and different mechanisms for cache optimization. PCacheEnergyAnalyzer has been validated with several applications from the MiBench, MediaBench, and PowerStone benchmarks, and results show that it provides analysis with reduced simulation effort.

  10. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge, no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and to create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on the infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  11. Application of advanced reservoir characterization, simulation and production optimization strategies to maximize recovery in slope and basin clastic reservoirs, West Texas (Delaware Basin). Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, S.P.; Asquith, G.B.; Barton, M.D.; Cole, A.G.; Gogas, J.; Malik, M.A.; Clift, S.J.; Guzman, J.I.

    1997-11-01

The objective of this project is to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover a higher percentage of the original oil in place through strategic placement of infill wells and geologically based field development. This project involves reservoir characterization of two Late Permian slope and basin clastic reservoirs in the Delaware Basin, West Texas, followed by a field demonstration in one of the fields. The fields being investigated are Geraldine Ford and Ford West fields in Reeves and Culberson Counties, Texas. Project objectives are divided into two major phases, reservoir characterization and implementation. The objectives of the reservoir characterization phase of the project were to provide a detailed understanding of the architecture and heterogeneity of the two fields, the Ford Geraldine unit and Ford West field. Reservoir characterization utilized 3-D seismic data, high-resolution sequence stratigraphy, subsurface field studies, outcrop characterization, and other techniques. Once reservoir characterization was completed, a pilot area of approximately 1 mi{sup 2} at the northern end of the Ford Geraldine unit was chosen for reservoir simulation. This report summarizes the results of the second year of reservoir characterization.

  12. Memory for multiple cache locations and prey quantities in a food-hoarding songbird

    Directory of Open Access Journals (Sweden)

    Nicola eArmstrong

    2012-12-01

Full Text Available Most animals can discriminate between pairs of numbers that are each less than four without training. However, North Island robins (Petroica longipes), a food-hoarding songbird endemic to New Zealand, can discriminate between quantities of items as high as eight without training. Here we investigate whether robins are capable of other complex quantity discrimination tasks. We test whether their ability to discriminate between small quantities declines with (1) the number of cache sites containing prey rewards and (2) the length of time separating cache creation and retrieval (retention interval). Results showed that subjects generally performed above chance expectations. They were equally able to discriminate between different combinations of prey quantities that were hidden from view in 2, 3 and 4 cache sites, after retention intervals of 1, 10 and 60 seconds. Overall, the results indicate that North Island robins can process complex quantity information involving more than two discrete quantities of items over retention intervals of up to one minute without training.

  13. Instant Varnish Cache how-to

    CERN Document Server

    Moutinho, Roberto

    2013-01-01

Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Get the job done and learn as you go. Easy-to-follow, step-by-step recipes which will get you started with Varnish Cache. Practical examples will help you to get set up quickly and easily. This book is aimed at system administrators and web developers who need to scale websites without throwing money at a large and costly infrastructure. It's assumed that you have some knowledge of the HTTP protocol, how browsers and servers communicate with each other, and basic Linux systems.

  14. Tannin concentration enhances seed caching by scatter-hoarding rodents: An experiment using artificial ‘seeds’

    Science.gov (United States)

    Wang, Bo; Chen, Jin

    2008-11-01

Tannins are very common among plant seeds but their effects on the fate of seeds, for example, via mediation of the feeding preferences of scatter-hoarding rodents, are poorly understood. In this study, we created a series of artificial 'seeds' that only differed in tannin concentration and the type of tannin, and placed them in a pine forest in the Shangri-La Alpine Botanical Garden, Yunnan Province of China. Two rodent species (Apodemus latronum and A. chevrieri) showed significant preferences for 'seeds' with different tannin concentrations. A significantly higher proportion of seeds with low tannin concentration were consumed in situ compared with seeds with a higher tannin concentration. Meanwhile, the tannin concentration was significantly positively correlated with the proportion of seeds cached. The different types of tannin (hydrolysable tannin vs condensed tannin) did not differ significantly in their effect on the proportion of seeds eaten in situ vs seeds cached. Tannin concentrations had no significant effect on the distance that cached seeds were carried, which suggests that rodents may respond to different seed traits in deciding whether or not to cache seeds and how far they will transport seeds.

  15. CACHE: an extended BASIC program which computes the performance of shell and tube heat exchangers

    International Nuclear Information System (INIS)

    Tallackson, J.R.

    1976-03-01

An extended BASIC program, CACHE, has been written to calculate steady state heat exchange rates in the core auxiliary heat exchangers (CAHE) designed to remove afterheat from High-Temperature Gas-Cooled Reactors (HTGR). Computationally, these are unbaffled counterflow shell and tube heat exchangers. The computational method is straightforward. The exchanger is subdivided into a user-selected number of lengthwise segments; heat exchange in each segment is calculated in sequence and summed. The program takes the temperature dependencies of all thermal conductivities, viscosities and heat capacities into account, provided these are expressed algebraically. CACHE is easily adapted to compute steady state heat exchange rates in any unbaffled counterflow exchanger. As now used, CACHE calculates heat removal by liquid water from high-temperature helium and helium mixed with nitrogen, oxygen and carbon monoxide. A second program, FULTN, is described. FULTN computes the geometrical parameters required as input to CACHE. As reported herein, FULTN computes the internal dimensions of the Fulton Station CAHE. The two programs are chained to operate as one. Complete user information is supplied. The basic equations, variable lists, annotated program lists, and sample outputs with explanatory notes are included.
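The segment-wise method CACHE uses can be sketched in modern code. This is a simplified Python illustration assuming a constant overall heat-transfer conductance UA, unlike the original program's temperature-dependent properties:

```python
def counterflow_segments(t_hot_in, t_cold_in, c_hot, c_cold, ua, n=50):
    """Steady-state counterflow exchanger: subdivide into n lengthwise
    segments, compute heat exchange in each segment in sequence, and sum.
    c_hot/c_cold are capacity rates (mass flow x heat capacity, W/K)."""
    ua_seg = ua / n

    def march(t_cold_out_guess):
        # Start at the hot-inlet end, where the cold stream *exits*.
        th, tc = t_hot_in, t_cold_out_guess
        q_total = 0.0
        for _ in range(n):
            q = ua_seg * (th - tc)   # heat exchanged in this segment
            th -= q / c_hot          # hot stream cools downstream
            tc -= q / c_cold         # cold temp drops toward its inlet value
            q_total += q
        return tc, q_total           # tc should end up at t_cold_in

    # Shoot on the unknown cold-outlet temperature by bisection.
    lo, hi = t_cold_in, t_hot_in
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        tc_end, q_total = march(mid)
        if tc_end > t_cold_in:
            hi = mid
        else:
            lo = mid
    return mid, q_total
```

For a balanced exchanger (equal capacity rates) with NTU = 1, this converges to an effectiveness of 0.5, matching the analytical counterflow result.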

  16. The Potential Role of Cache Mechanism for Complicated Design Optimization

    International Nuclear Information System (INIS)

    Noriyasu, Hirokawa; Fujita, Kikuo

    2002-01-01

This paper discusses the potential role of a cache mechanism for complicated design optimization. While design optimization is an application of mathematical programming techniques to engineering design problems over numerical computation, its progress has been coevolutionary. The trend in such progress indicates that more complicated applications become the next target of design optimization beyond the growth of computational resources. As the progress of the past two decades required response surface techniques, decomposition techniques, etc., a new framework must be introduced for the future of design optimization methods. This paper proposes a possibility of what we call a cache mechanism for mediating the coming challenge, and briefly demonstrates some of its promise through the idea of Voronoi-diagram-based cumulative approximation as an example of its implementation, with applications to the development of strict robust design and the extension of design optimization for product variety.

  17. Data from selected Almond Formation outcrops -- Sweetwater County, Wyoming

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, S.R.; Rawn-Schatzinger, V.

    1993-12-01

The objectives of this research program are to: (1) determine the reservoir characteristics and production problems of shoreline barrier reservoirs; and (2) develop methods and methodologies to effectively characterize shoreline barrier reservoirs to predict flow patterns of injected and produced fluids. Two reservoirs were selected for detailed reservoir characterization studies -- Bell Creek field, Carter County, Montana, that produces from the Lower Cretaceous (Albian-Cenomanian) Muddy Formation, and Patrick Draw field, Sweetwater County, Wyoming, that produces from the Upper Cretaceous (Campanian) Almond Formation of the Mesaverde Group. An important component of the research project was to use information from outcrop exposures of the producing formations to study the spatial variations of reservoir properties and the degree to which outcrop information can be used in the construction of reservoir models. A report similar to this one presents the Muddy Formation outcrop data and analyses performed in the course of this study (Rawn-Schatzinger, 1993). Two outcrop localities, RG and RH, previously described by Roehler (1988) provided good exposures of the Upper Almond shoreline barrier facies and were studied during 1990--1991. Core from core well No. 2, drilled approximately 0.3 miles downdip of outcrop RG, was obtained for study. The results of the core study will be reported in a separate volume. Outcrops RH and RG, located about 2 miles apart, were selected for detailed description and drilling of core plugs. One 257-ft-thick section was measured at outcrop RG, and three sections approximately 145 ft thick, located 490 and 655 feet apart, were measured at outcrop RH. Cross-sections of these described profiles were constructed to determine lateral facies continuity and changes. This report contains the data and analyses from the studied outcrops.

  18. 5G Network Communication, Caching, and Computing Algorithms Based on the Two‐Tier Game Model

    Directory of Open Access Journals (Sweden)

    Sungwook Kim

    2018-02-01

Full Text Available In this study, we developed hybrid control algorithms in smart base stations (SBSs) along with devised communication, caching, and computing techniques. In the proposed scheme, SBSs are equipped with computing power and data storage to collectively offload the computation from mobile user equipment and to cache the data from clouds. To combine in a refined manner the communication, caching, and computing algorithms, game theory is adopted to characterize competitive and cooperative interactions. The main contribution of our proposed scheme is to illuminate the ultimate synergy behind a fully integrated approach, while providing excellent adaptability and flexibility to satisfy the different performance requirements. Simulation results demonstrate that the proposed approach can outperform existing schemes by approximately 5% to 15% in terms of bandwidth utilization, access delay, and system throughput.

  19. Implementation of a cache for a MIPS processor on an FPGA

    OpenAIRE

    Riera Villanueva, Marc

    2013-01-01

First, the MIPS architecture, the memory hierarchy and the functioning of the cache are explained briefly. Then, the design and implementation of a memory hierarchy for a MIPS processor implemented in VHDL on an FPGA are explained.

  20. A Software Managed Stack Cache for Real-Time Systems

    DEFF Research Database (Denmark)

    Jordan, Alexander; Abbaspourseyedi, Sahar; Schoeberl, Martin

    2016-01-01

    In a real-time system, the use of a scratchpad memory can mitigate the difficulties related to analyzing data caches, whose behavior is inherently hard to predict. We propose to use a scratchpad memory for stack allocated data. While statically allocating stack frames for individual functions...

  1. Achieving cost/performance balance ratio using tiered storage caching techniques: A case study with CephFS

    Science.gov (United States)

    Poat, M. D.; Lauret, J.

    2017-10-01

As demand for widely accessible storage capacity increases and usage is on the rise, steady IO performance is desired but tends to suffer within multi-user environments. Typical deployments use standard hard drives, as the cost per GB is quite low. On the other hand, HDD-based solutions for storage are not known to scale well with process concurrency, and soon enough, high rates of IOPS create a “random access” pattern that kills performance. Though not all SSDs are alike, SSDs are an established technology often used to address this exact “random access” problem. In this contribution, we will first discuss the IO performance of many different SSD drives (tested in a comparable and standalone manner). We will then discuss the performance and integrity of at least three low-level disk caching techniques (Flashcache, dm-cache, and bcache), including individual policies, procedures, and IO performance. Furthermore, the STAR online computing infrastructure currently hosts a POSIX-compliant Ceph distributed storage cluster - while caching is not a native feature of CephFS (it only exists in the Ceph Object store), we will show how one can implement a caching mechanism profiting from an implementation at a lower level. As our illustration, we will present our CephFS setup, IO performance tests, and overall experience from such a configuration. We hope this work will serve the community’s interest in disk-caching mechanisms, with applicable uses such as distributed storage systems, where an overall IO performance gain is sought.

  2. Storageless and caching Tier-2 models in the UK context

    Science.gov (United States)

    Cadellin Skipsey, Samuel; Dewhurst, Alastair; Crooks, David; MacMahon, Ewan; Roy, Gareth; Smith, Oliver; Mohammed, Kashif; Brew, Chris; Britton, David

    2017-10-01

Operational and other pressures have led to WLCG experiments moving increasingly to a stratified model for Tier-2 resources, where “fat” Tier-2s (“T2Ds”) and “thin” Tier-2s (“T2Cs”) provide different levels of service. In the UK, this distinction is also encouraged by the terms of the current GridPP5 funding model. In anticipation of this, testing has been performed on the implications, and potential implementation, of such a distinction in our resources. In particular, this presentation presents the results of testing of storage T2Cs, where the “thin” nature is expressed by the site having either no local data storage, or only a thin caching layer; data is streamed or copied from a “nearby” T2D when needed by jobs. In OSG, this model has been adopted successfully for CMS AAA sites; but the network topology and capacity in the USA is significantly different to that in the UK (and much of Europe). We present the result of several operational tests: the in-production University College London (UCL) site, which runs ATLAS workloads using storage at the Queen Mary University of London (QMUL) site; the Oxford site, which has had scaling tests performed against T2Ds in various locations in the UK (to test network effects); and the Durham site, which has been testing the specific ATLAS caching solution of “Rucio Cache” integration with ARC’s caching layer.

  3. Optimal Replacement Policies for Non-Uniform Cache Objects with Optional Eviction

    National Research Council Canada - National Science Library

    Bahat, Omri; Makowski, Armand M

    2002-01-01

    .... However, since the introduction of optimal replacement policies for conventional caching, the problem of finding optimal replacement policies under the factors indicated has not been studied in any systematic manner...

  4. Effective caching of shortest paths for location-based services

    DEFF Research Database (Denmark)

    Jensen, Christian S.; Thomsen, Jeppe Rishede; Yiu, Man Lung

    2012-01-01

    Web search is ubiquitous in our daily lives. Caching has been extensively used to reduce the computation time of the search engine and reduce the network traffic beyond a proxy server. Another form of web search, known as online shortest path search, is popular due to advances in geo...
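A minimal illustration of the idea in Python. The toy graph, LRU policy, and cache size below are assumptions made for the sketch; production location-based services typically cache shareable sub-paths rather than whole query results:

```python
import heapq
from functools import lru_cache

# Hypothetical toy road network: node -> [(neighbor, distance), ...]
GRAPH = {
    "A": [("B", 2), ("C", 5)],
    "B": [("A", 2), ("C", 1), ("D", 4)],
    "C": [("A", 5), ("B", 1), ("D", 1)],
    "D": [("B", 4), ("C", 1)],
}

@lru_cache(maxsize=1024)  # repeated (src, dst) queries skip recomputation
def shortest_path(src, dst):
    """Dijkstra's algorithm; returns (distance, path as a tuple)."""
    dist, prev = {src: 0}, {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                       # stale queue entry, skip
        for v, w in GRAPH[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:                     # rebuild path from predecessors
        path.append(node)
        node = prev[node]
    path.append(src)
    return dist[dst], tuple(reversed(path))
```

The first call to shortest_path("A", "D") runs Dijkstra; an identical second query is answered from the LRU cache, which is the effect such systems aim to exploit at proxy scale.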

  5. On-farm irrigation reservoirs for surface water storage in eastern Arkansas: Trends in construction in response to aquifer depletion

    Science.gov (United States)

    Yaeger, M. A.; Reba, M. L.; Massey, J. H.; Adviento-Borbe, A.

    2017-12-01

On-farm surface water storage reservoirs have been constructed to address declines in the Mississippi River Valley Alluvial aquifer, the primary source of irrigation for most of the row crops grown in eastern Arkansas. These reservoirs and their associated infrastructure represent significant investments in financial and natural resources, and may cause producers to incur costs associated with foregone crop production and long-term maintenance. Thus, an analysis of reservoir construction trends in the Grand Prairie Critical Groundwater Area (GPCGA) and Cache River Critical Groundwater Area (CRCGA) was conducted to assist future water management decisions. Between 1996 and 2015, on average, 16 and 4 reservoirs were constructed per year, corresponding to cumulative new reservoir surface areas of 161 and 60 ha yr-1, for the GPCGA and the CRCGA, respectively. In terms of reservoir locations relative to aquifer status, after 1996, 84.5% of 309 total reservoirs constructed in the GPCGA and 91.0% of 78 in the CRCGA were located in areas with remaining saturated aquifer thicknesses of 50% or less. The majority of new reservoirs (74% in the GPCGA and 63% in the CRCGA) were constructed on previously productive cropland. The next most common land use, representing 11% and 15% of new reservoirs constructed in the GPCGA and CRCGA, respectively, was the combination of a field edge and a ditch, stream, or other low-lying area. Less than 10% of post-1996 reservoirs were constructed on predominately low-lying land, and the use of such lands decreased in both critical groundwater areas during the past 20 years. These disparities in reservoir construction rates, locations, and prior land uses are likely due to groundwater declines being first observed in the GPCGA, as well as the existence of two large-scale river diversion projects under construction in the GPCGA that feature on-farm storage as a means to offset groundwater use.

  6. Reservoir-induced landslides and risk control in Three Gorges Project on Yangtze River, China

    Directory of Open Access Journals (Sweden)

    Yueping Yin

    2016-10-01

    Full Text Available The Three Gorges region in China was basically a geohazard-prone area prior to construction of the Three Gorges Reservoir (TGR. After construction of the TGR, the water level was raised from 70 m to 175 m above sea level (ASL, and annual reservoir regulation has caused a 30-m water level difference after impoundment of the TGR since September 2008. This paper first presents the spatiotemporal distribution of landslides in six periods of 175 m ASL trial impoundments from 2008 to 2014. The results show that the number of landslides sharply decreased from 273 at the initial stage to less than ten at the second stage of impoundment. Based on this, the reservoir-induced landslides in the TGR region can be roughly classified into five failure patterns, i.e. accumulation landslide, dip-slope landslide, reversed bedding landslide, rockfall, and karst breccia landslide. The accumulation landslides and dip-slope landslides account for more than 90%. Taking the Shuping accumulation landslide (a sliding mass volume of 20.7 × 106 m3 in Zigui County and the Outang dip-slope landslide (a sliding mass volume of about 90 × 106 m3 in Fengjie County as two typical cases, the mechanisms of reactivation of the two landslides are analyzed. The monitoring data and factor of safety (FOS calculation show that the accumulation landslide is dominated by water level variation in the reservoir as most part of the mass body is under 175 m ASL, and the dip-slope landslide is controlled by the coupling effect of reservoir water level variation and precipitation as an extensive recharge area of rainfall from the rear and the front mass is below 175 m ASL. The characteristics of landslide-induced impulsive wave hazards after and before reservoir impoundment are studied, and the probability of occurrence of a landslide-induced impulsive wave hazard has increased in the reservoir region. Simulation results of the Ganjingzi landslide in Wushan County indicate the

  7. Study on data acquisition system based on reconfigurable cache technology

    Science.gov (United States)

    Zhang, Qinchuan; Li, Min; Jiang, Jun

    2018-03-01

Waveform capture rate is one of the key features of digital acquisition systems; it represents the waveform processing capability of the system per unit time. The higher the waveform capture rate, the greater the chance of capturing elusive events and the more reliable the test results. This paper first analyzes the impact of several factors on the waveform capture rate of the system, and then proposes a novel technology based on a reconfigurable cache to optimize the system architecture. Simulation results show that the signal-to-noise ratio of the signal and the capacity and structure of the cache have significant effects on the waveform capture rate. Finally, the technology is demonstrated in engineering practice, and the results show that the waveform capture rate of the system is improved substantially without a significant increase in system cost; the proposed technology has broad application prospects.

  8. Environmental Assessment: Conestoga Reservoir Maintenance and Aquatic Habitat Rehabilitation Project Lancaster County, Nebraska

    Science.gov (United States)

    2014-06-01

    use and camping facilities, a boat launch and mooring area, sanitary facilities, and wells for drinking water at Conestoga Reservoir. Additional...gently sloping to very steep, well drained, loamy clay soils that formed in glacial till. The Sharpsburg series is a deep, moderately drained soil...Unfortunately, due to the number of potential sources ( sanitary wastewater, storm water, Conestoga Reservoir Rehabilitation Project U.S. Army Corps of

  9. Cache-aware data structure model for parallelism and dynamic load balancing

    International Nuclear Information System (INIS)

    Sridi, Marwa

    2016-01-01

This PhD thesis is dedicated to the implementation of innovative parallel methods in the framework of fast transient fluid-structure dynamics. It improves existing methods within the EUROPLEXUS software in order to optimize the shared-memory parallel strategy, complementary to the original distributed-memory approach, bringing the two together into a global hybrid strategy for clusters of multi-core nodes. Starting from a sound analysis of the state of the art concerning data-structuring techniques correlated to the hierarchical memory organization of current multi-processor architectures, the proposed work introduces an approach suitable for explicit time integration (i.e. with no linear system to solve at each step). A data structure of type 'structure of arrays' is retained for global data storage, providing flexibility and efficiency for common operations on kinematic fields (displacement, velocity and acceleration). By contrast, in the particular case of elementary operations (generic internal-force computations, as well as flux computations between cell faces for fluid models), which are particularly time-consuming but localized in the program, a temporary data structure of type 'array of structures' is used instead, to force efficient filling of the cache memory and increase the performance of the resolution, for both serial and shared-memory parallel processing. Switching from the global structure to the temporary one is based on a cell-grouping strategy, following classic cache-blocking principles but with specific handling, in this work, of the neighboring data necessary for the efficient treatment of ALE fluxes for cells on group boundaries. The proposed approach is extensively tested from the points of view of both computation time and cache-memory access misses, weighing the gains obtained within the elementary operations against the potential overhead generated by the data-structure switch. Obtained results are very satisfactory, especially
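The layout switch the abstract describes can be sketched in miniature. The field names, the tiny arrays, and the update rule below are our own illustration of the gather/scatter idea, not EUROPLEXUS code:

```python
# Global storage: structure of arrays (one array per field, indexed by cell id).
displacement = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
velocity     = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5]
pressure     = [2.0, 2.1, 2.2, 2.3, 2.4, 2.5]

def gather_group(cell_ids):
    """Pack one cell group into a temporary array of structures, so that
    every field of a cell sits contiguously during element-level loops."""
    return [{"disp": displacement[i], "vel": velocity[i], "p": pressure[i]}
            for i in cell_ids]

def scatter_group(cell_ids, group):
    """Write the updated temporary records back to the global arrays."""
    for i, rec in zip(cell_ids, group):
        displacement[i], velocity[i], pressure[i] = rec["disp"], rec["vel"], rec["p"]

# Cache-blocked processing of one group of cells.
ids = [0, 2, 4]
group = gather_group(ids)
for rec in group:                     # element-level computation touches one record
    rec["disp"] += 0.01 * rec["vel"]  # placeholder update for illustration
scatter_group(ids, group)
```

In a compiled code the temporary records would be a contiguous buffer sized to fit in cache, which is what makes the group-wise gather worthwhile despite the copy.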

  10. Enhancement web proxy cache performance using Wrapper Feature Selection methods with NB and J48

    Science.gov (United States)

    Mahmoud Al-Qudah, Dua'a.; Funke Olanrewaju, Rashidah; Wong Azman, Amelia

    2017-11-01

Web proxy cache technique reduces response time by storing copies of pages between the client and server sides. If requested pages are cached in the proxy, there is no need to access the server. Because of the limited size and high cost of cache compared to other storage, a cache replacement algorithm is used to determine which page to evict when the cache is full. However, conventional replacement algorithms such as Least Recently Used (LRU), First In First Out (FIFO), Least Frequently Used (LFU), and randomized policies may discard important pages just before they are needed. Furthermore, conventional algorithms cannot be well optimized, since intelligently evicting a page before replacement requires additional decision-making. Hence, most researchers propose integrating intelligent classifiers with the replacement algorithm to improve its performance. This research proposes using automated wrapper feature selection methods to choose the best subset of features that are relevant and influence classifier prediction accuracy. The results show that the wrapper feature selection methods, namely Best First (BFS), Incremental Wrapper Subset Selection (IWSS) embedded NB, and Particle Swarm Optimization (PSO), reduce the number of features and have a good impact on reducing computation time. Using PSO enhances NB classifier accuracy by 1.1%, 0.43%, and 0.22% over using NB with all features, using BFS, and using IWSS embedded NB, respectively. PSO raises J48 accuracy by 0.03%, 1.91%, and 0.04% over using the J48 classifier with all features, using IWSS embedded NB, and using BFS, respectively. Using IWSS embedded NB speeds up the NB and J48 classifiers much more than BFS and PSO do, reducing the computation time of NB by 0.1383 and of J48 by 2.998.
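For readers unfamiliar with the conventional policies named above, a minimal LRU cache can be sketched as follows (a generic illustration; the class name and capacity are our own, not from the paper):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU page cache: evicts the least recently used page when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # key -> page, ordered oldest -> newest

    def get(self, key):
        if key not in self.pages:
            return None             # miss: caller must fetch from the origin server
        self.pages.move_to_end(key)  # mark as most recently used
        return self.pages[key]

    def put(self, key, page):
        if key in self.pages:
            self.pages.move_to_end(key)
        self.pages[key] = page
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)  # evict the least recently used page

cache = LRUCache(2)
cache.put("/a", "page A")
cache.put("/b", "page B")
cache.get("/a")            # touch /a so /b becomes the LRU entry
cache.put("/c", "page C")  # evicts /b
```

The classifier-based approaches studied in the paper aim to replace exactly this kind of fixed eviction rule with a learned prediction of which pages will be re-requested.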

  11. Fox squirrels match food assessment and cache effort to value and scarcity.

    Directory of Open Access Journals (Sweden)

    Mikel M Delgado

Full Text Available Scatter hoarders must allocate time to assess items for caching, and to carry and bury each cache. Such decisions should be driven by economic variables, such as the value of the individual food items, the scarcity of these items, competition for food items and risk of pilferage by conspecifics. The fox squirrel, an obligate scatter-hoarder, assesses cacheable food items using two overt movements, head flicks and paw manipulations. These behaviors allow an examination of squirrel decision processes when storing food for winter survival. We measured wild squirrels' time allocations and frequencies of assessment and investment behaviors during periods of food scarcity (summer) and abundance (fall), giving the squirrels a series of 15 items (alternating five hazelnuts and five peanuts). Assessment and investment per cache increased when resource value was higher (hazelnuts) or resources were scarcer (summer), but decreased as scarcity declined (end of sessions). This is the first study to show that assessment behaviors change in response to factors that indicate daily and seasonal resource abundance, and that these factors may interact in complex ways to affect food storing decisions. Food-storing tree squirrels may be a useful and important model species to understand the complex economic decisions made under natural conditions.

  12. 75 FR 6257 - Watts Bar Reservoir Land Management Plan, Loudon, Meigs, Rhea, and Roane Counties, TN

    Science.gov (United States)

    2010-02-08

    ... TENNESSEE VALLEY AUTHORITY Watts Bar Reservoir Land Management Plan, Loudon, Meigs, Rhea, and... Watts Bar Reservoir in Tennessee. On November 19, 2009, the TVA Board of Directors (TVA Board) decided... the final environmental impact statement (FEIS) for the Watts Bar Reservoir Land Management Plan...

  13. Killing and caching of an adult White-tailed deer, Odocoileus virginianus, by a single Gray Wolf, Canis lupus

    Science.gov (United States)

    Nelson, Michael E.

    2011-01-01

    A single Gray Wolf (Canis lupus) killed an adult male White-tailed Deer (Odocoileus virginianus) and cached the intact carcass in 76 cm of snow. The carcass was revisited and entirely consumed between four and seven days later. This is the first recorded observation of a Gray Wolf caching an entire adult deer.

  14. A Novel Two-Tier Cooperative Caching Mechanism for the Optimization of Multi-Attribute Periodic Queries in Wireless Sensor Networks

    Science.gov (United States)

    Zhou, ZhangBing; Zhao, Deng; Shu, Lei; Tsang, Kim-Fung

    2015-01-01

Wireless sensor networks, serving as an important interface between physical environments and computational systems, have been used extensively for supporting domain applications, where multiple-attribute sensory data are queried from the network continuously and periodically. Usually, certain sensory data may not vary significantly within a certain time duration for certain applications. In this setting, sensory data gathered at a certain time slot can be used for answering concurrent queries and may be reused for answering the forthcoming queries when the variation of these data is within a certain threshold. Leveraging this observation, a popularity-based cooperative caching mechanism is proposed in this article, where the popularity of sensory data is calculated according to the queries issued in recent time slots. This popularity reflects the likelihood that sensory data will be of interest to forthcoming queries. Generally, sensory data with the highest popularity are cached at the sink node, while sensory data less likely to be of interest to forthcoming queries are cached in the head nodes of divided grid cells. Leveraging these cooperatively cached sensory data, queries are answered by composing these two-tier cached data. Experimental evaluation shows that this approach can reduce the network communication cost significantly and increase the network capability. PMID:26131665
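The popularity-driven caching idea can be sketched in miniature. The eviction rule, the decay scheme, and all names below are our assumptions for illustration, not the protocol from the paper:

```python
from collections import defaultdict

class PopularityCache:
    """Keep the most frequently queried sensor attributes cached at the sink."""

    def __init__(self, sink_capacity):
        self.sink_capacity = sink_capacity
        self.popularity = defaultdict(float)  # attribute -> recent query count
        self.sink = {}                        # attribute -> cached reading

    def query(self, attr, fetch):
        self.popularity[attr] += 1.0
        if attr in self.sink:
            return self.sink[attr]            # answered from the sink cache
        value = fetch(attr)                   # e.g. gather via a grid-cell head node
        self.sink[attr] = value
        if len(self.sink) > self.sink_capacity:
            coldest = min(self.sink, key=lambda a: self.popularity[a])
            del self.sink[coldest]            # evict the least popular attribute
        return value

    def end_of_slot(self, decay=0.5):
        """Age the scores at each time-slot boundary (assumed decay scheme)."""
        for attr in self.popularity:
            self.popularity[attr] *= decay

cache = PopularityCache(sink_capacity=1)
readings = {"temp": 21.5, "humidity": 0.4}
cache.query("temp", readings.get)
cache.query("temp", readings.get)
cache.query("humidity", readings.get)  # fetched, then evicted as least popular
```

In the two-tier scheme of the paper, the evicted attributes would remain cached at the grid-cell head nodes rather than being discarded outright.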

  15. Cache-Oblivious Red-Blue Line Segment Intersection

    DEFF Research Database (Denmark)

    Arge, Lars; Mølhave, Thomas; Zeh, Norbert

    2008-01-01

We present an optimal cache-oblivious algorithm for finding all intersections between a set of non-intersecting red segments and a set of non-intersecting blue segments in the plane. Our algorithm uses $O(\frac{N}{B}\log_{M/B}\frac{N}{B}+T/B)$ memory transfers, where N is the total number of segments, M and B are the memory and block transfer sizes of any two consecutive levels of any multilevel memory hierarchy, and T is the number of intersections.

  16. Storage Capacity and Sedimentation of Loch Lomond Reservoir, Santa Cruz, California, 1998

    Science.gov (United States)

    McPherson, Kelly R.; Harmon, Jerry G.

    2000-01-01

In 1998, a bathymetric survey was done to determine the storage capacity, and the loss of capacity owing to sedimentation, of Loch Lomond Reservoir in Santa Cruz County, California. Results of the survey indicate that the maximum capacity of the reservoir was 8,991 acre-feet in November 1998. Previous investigations indicated storage capacities of less than 8,991 acre-feet; the storage capacities determined from those investigations probably were underestimated because of limitations of the methods and equipment used. The volume of sedimentation in a reservoir is considered equal to the decrease in storage capacity. To determine sedimentation in Loch Lomond Reservoir, the change in storage capacity was estimated for an upstream reach of the reservoir. The change in storage capacity was determined by comparing a 1998 thalweg profile (valley floor) of the reservoir with thalweg profiles from previous investigations; results of the comparison indicate that sedimentation is occurring in the upstream reach. Cross sections from 1998 and 1982 were compared to determine the magnitude of sedimentation in the upstream reach of the reservoir. Results of the comparison, which were determined from changes in the cross-sectional areas, indicate that the capacity of the reservoir decreased by 55 acre-feet.

  17. Spatial-temporal variations of natural suitability of human settlement environment in the Three Gorges Reservoir Area—A case study in Fengjie County, China

    Science.gov (United States)

    Luo, Jieqiong; Zhou, Tinggang; Du, Peijun; Xu, Zhigang

    2018-01-01

With rapid environmental degradation and socio-economic development, the human settlement environment (HSE) has experienced dramatic changes and attracted attention from different communities. Consequently, the spatial-temporal evaluation of natural suitability of the human settlement environment (NSHSE) has become essential for understanding the patterns and dynamics of HSE, and for coordinating sustainable development among regional populations, resources, and environments. This study aims to explore the spatial-temporal evolution of NSHSE patterns in 1997, 2005, and 2009 in Fengjie County near the Three Gorges Reservoir Area (TGRA). A spatially weighted NSHSE model was established by integrating multi-source data (e.g., census data, meteorological data, remote sensing images, DEM data, and GIS data) into one framework, where the Ordinary Least Squares (OLS) linear regression model was applied to calculate the weights of indices in the NSHSE model. Results show that the trend of natural suitability has been first downward and then upward, as evidenced by the NSHSE disparities among the southern, northern, and central areas of Fengjie County. Results also reveal clustered NSHSE patterns for all 30 townships. Meanwhile, NSHSE has a significant influence on population distribution, with 71.49% of the total population living in moderately and highly suitable districts.

  18. Limnological Conditions and Occurrence of Taste-and-Odor Compounds in Lake William C. Bowen and Municipal Reservoir #1, Spartanburg County, South Carolina, 2006-2009

    Science.gov (United States)

    Journey, Celeste A.; Arrington, Jane M.; Beaulieu, Karen M.; Graham, Jennifer L.; Bradley, Paul M.

    2011-01-01

    Limnological conditions and the occurrence of taste-and-odor compounds were studied in two reservoirs in Spartanburg County, South Carolina, from May 2006 to June 2009. Lake William C. Bowen and Municipal Reservoir #1 are relatively shallow, meso-eutrophic, warm monomictic, cascading impoundments on the South Pacolet River. Overall, water-quality conditions and phytoplankton community assemblages were similar between the two reservoirs but differed seasonally. Median dissolved geosmin concentrations in the reservoirs ranged from 0.004 to 0.006 microgram per liter. Annual maximum dissolved geosmin concentrations tended to occur between March and May. In this study, peak dissolved geosmin production occurred in April and May 2008, ranging from 0.050 to 0.100 microgram per liter at the deeper reservoir sites. Peak dissolved geosmin production was not concurrent with maximum cyanobacterial biovolumes, which tended to occur in the summer (July to August), but was concurrent with a peak in the fraction of genera with known geosmin-producing strains in the cyanobacteria group. Nonetheless, annual maximum cyanobacterial biovolumes rarely resulted in cyanobacteria dominance of the phytoplankton community. In both reservoirs, elevated dissolved geosmin concentrations were correlated to environmental factors indicative of unstratified conditions and reduced algal productivity, but not to nutrient concentrations or ratios. With respect to potential geosmin sources, elevated geosmin concentrations were correlated to greater fractions of genera with known geosmin-producing strains in the cyanobacteria group and to biovolumes of a specific geosmin-producing cyanobacteria genus (Oscillatoria), but not to actinomycetes concentrations. Conversely, environmental factors that correlated with elevated cyanobacterial biovolumes were indicative of stable water columns (stratified conditions), warm water temperatures, reduced nitrogen concentrations, longer residence times, and high

  19. Small County: Development of a Virtual Environment for Instruction in Geological Characterization of Petroleum Reservoirs

    Science.gov (United States)

    Banz, B.; Bohling, G.; Doveton, J.

    2008-12-01

    Traditional programs of geological education continue to be focused primarily on the evaluation of surface or near-surface geology accessed at outcrops and shallow boreholes. However, most students who graduate to careers in geology work almost entirely on subsurface problems, interpreting drilling records and petrophysical logs from exploration and production wells. Thus, college graduates commonly find themselves ill-prepared when they enter the petroleum industry and require specialized training in drilling and petrophysical log interpretation. To aid in this training process, we are developing an environment for interactive instruction in the geological aspects of petroleum reservoir characterization employing a virtual subsurface closely reflecting the geology of the US mid-continent, in the fictional setting of Small County, Kansas. Stochastic simulation techniques are used to generate the subsurface characteristics, including the overall geological structure, distributions of facies, porosity, and fluid saturations, and petrophysical logs. The student then explores this subsurface by siting exploratory wells and examining drilling and petrophysical log records obtained from those wells. We are developing the application using the Eclipse Rich Client Platform, which allows for the rapid development of a platform-agnostic application while providing an immersive graphical interface. The application provides an array of views to enable relevant data display and student interaction. One such view is an interactive map of the county allowing the student to view the locations of existing well bores and select pertinent data overlays such as a contour map of the elevation of an interesting interval. Additionally, from this view a student may choose the site of a new well. Another view emulates a drilling log, complete with drilling rate plot and iconic representation of examined drill cuttings. 
From here, students are directed to stipulate subsurface lithology and

  20. Ordering sparse matrices for cache-based systems

    International Nuclear Information System (INIS)

    Biswas, Rupak; Oliker, Leonid

    2001-01-01

The Conjugate Gradient (CG) algorithm is the oldest and best-known Krylov subspace method used to solve sparse linear systems. Most of the floating-point operations within each CG iteration are spent performing sparse matrix-vector multiplication (SPMV). We examine how various ordering and partitioning strategies affect the performance of CG and SPMV when different programming paradigms are used on current commercial cache-based computers. By contrast, a multithreaded implementation on the cacheless Cray MTA demonstrates high efficiency and scalability without any special ordering or partitioning
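The SPMV kernel that dominates each CG iteration is commonly stored in compressed sparse row (CSR) form. A minimal sketch of such a kernel (our illustration, not the paper's code) shows where the ordering-sensitive indirect accesses occur:

```python
def spmv_csr(row_ptr, col_idx, values, x):
    """y = A @ x for a CSR matrix. Cache behaviour depends on the ordering:
    a good row/column ordering keeps the indirect accesses x[col_idx[j]]
    close together, improving reuse of cached entries of x."""
    n = len(row_ptr) - 1
    y = [0.0] * n
    for i in range(n):
        acc = 0.0
        for j in range(row_ptr[i], row_ptr[i + 1]):
            acc += values[j] * x[col_idx[j]]  # indirect, ordering-sensitive access
        y[i] = acc
    return y

# Tiny example: A = [[2.0, 0.0], [1.0, 3.0]], x = [1.0, 1.0]
y = spmv_csr([0, 1, 3], [0, 0, 1], [2.0, 1.0, 3.0], [1.0, 1.0])
```

Reordering the matrix (and permuting x accordingly) changes the pattern of `col_idx` and hence the cache behaviour, which is exactly what the ordering strategies in the abstract manipulate.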

  1. Modeling Permeability Alteration in Diatomite Reservoirs During Steam Drive, SUPRI TR-113

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Suniti Kumar; Kovscek, Anthony R.

    1999-08-09

There is an estimated 10 billion barrels of original oil in place (OOIP) in diatomaceous reservoirs in Kern County, California. These reservoirs have low permeability, ranging from 0.1 to 10 mD. Injection-pressure-controlled steam drive has been found to be an effective way to recover oil from these reservoirs. However, steam drive in these reservoirs has its own complications. The rock matrix is primarily silica (SiO2), which is soluble in hot water, with a solubility that varies with temperature and pH. Consequently, the rock matrix in diatomite may dissolve into the aqueous phase as the temperature at a location increases, or precipitate from the aqueous phase onto the rock grains as the temperature decreases. Thus, during steam drive, silica redistribution occurs in the reservoir along with oil recovery. This silica redistribution causes the permeability and porosity of the reservoir to change. Understanding and quantifying the effects of silica redistribution on reservoir permeability may prove to be a key aspect of designing a steam drive project in these formations.

  2. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

Full Text Available The texture-based volume rendering is a memory-intensive algorithm. Its performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory, resulting in incoherent memory access patterns and low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g., a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. Also, a pipelined color-blending approach is introduced, and the power of warp-level GPU operations is leveraged to improve the efficiency of parallel executions on the GPU. In addition, the rendering performance of Warp Marching is view-independent, and it outperforms existing empty-space-skipping techniques in scenarios that need to render large dynamic volumes into a low-resolution image. Through a series of micro-benchmarks and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.
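The influence of intra-warp sample spacing on cache hit rate can be illustrated with a toy direct-mapped cache simulation (entirely our construction; real GPU texture caches are set-associative and far more complex):

```python
def hit_rate(addresses, num_lines=64, line_size=16):
    """Hit rate of a toy direct-mapped cache over a sequence of byte addresses."""
    lines = [None] * num_lines
    hits = 0
    for addr in addresses:
        tag = addr // line_size   # which cache line the address falls in
        slot = tag % num_lines    # direct-mapped placement
        if lines[slot] == tag:
            hits += 1
        else:
            lines[slot] = tag
    return hits / len(addresses)

# 32 "threads" of a warp each take 8 samples along their rays.
# Coherent mapping: neighbouring threads sample adjacent addresses each step.
coherent = [step * 1024 + t for step in range(8) for t in range(32)]
# Scattered mapping: each thread strides through memory independently.
scattered = [t * 1024 + step * 97 for step in range(8) for t in range(32)]

rate_coherent = hit_rate(coherent)    # neighbouring threads share cache lines
rate_scattered = hit_rate(scattered)  # independent strides thrash the cache
```

The coherent pattern lets the 32 samples of a step share a couple of cache lines, while the scattered pattern touches a fresh line per sample, which mirrors the warp-level sample-spacing argument of the abstract in a much-simplified form.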

  3. MonetDB/X100 - A DBMS in the CPU cache

    NARCIS (Netherlands)

    M. Zukowski (Marcin); P.A. Boncz (Peter); N.J. Nes (Niels); S. Héman (Sándor)

    2005-01-01

    textabstractX100 is a new execution engine for the MonetDB system, that improves execution speed and overcomes its main memory limitation. It introduces the concept of in-cache vectorized processing that strikes a balance between the existing column-at-a-time MIL execution primitives of MonetDB and

  4. Geopressured-geothermal drilling and testing plan. General Crude Oil--Dept. of Energy Pleasant Bayou No. 1 well, Brazoria County, Texas

    Energy Technology Data Exchange (ETDEWEB)

    1978-05-01

    As a result of geopressured resource assessment studies in the Gulf Coast region, the Brazoria fairway, located in Brazoria County, Texas was determined to be an optimum area for additional studies. A plan is presented for drilling, completion, and testing of one geopressured-geothermal well and two disposal wells in Brazoria County, Texas. The objectives of the well drilling and testing program are to determine the following parameters: reservoir permeability, porosity, thickness, rock material properties, depth, temperature, and pressure; reservoir fluid content, specific gravity, resistivity, viscosity, and hydrocarbons in solution; reservoir fluid production rates, pressure, temperature, production decline, and pressure decline; geopressured well and surface equipment design requirements for high-volume production and possible sand production; specific equipment design for surface operations, hydrocarbons distribution, and effluent disposal; and possibilities of reservoir compaction and/or surface subsidence. (JGB)

  5. Reservoir characterization of the Ordovician Red River Formation in southwest Williston Basin Bowman County, ND and Harding County, SD

    Energy Technology Data Exchange (ETDEWEB)

    Sippel, M.A.; Luff, K.D.; Hendricks, M.L.; Eby, D.E.

    1998-07-01

This topical report is a compilation of characterizations by different disciplines of the Red River Formation in the southwest portion of the Williston Basin and the oil reservoirs which it contains in an area which straddles the state line between North Dakota and South Dakota. Goals of the report are to increase understanding of the reservoir rocks, oil-in-place, heterogeneity, and methods for improved recovery. The report is divided by discipline into five major sections: (1) geology, (2) petrography-petrophysical, (3) engineering, (4) case studies and (5) geophysical. Interwoven in these sections are results from demonstration wells which were drilled or selected for special testing to evaluate important concepts for field development and enhanced recovery. The Red River study area has been successfully explored with two-dimensional (2D) seismic. Improved reservoir characterization utilizing 3-dimensional (3D) seismic has been investigated for identification of structural and stratigraphic reservoir compartments. These seismic characterization tools are integrated with geological and engineering studies. Targeted drilling from predictions using 3D seismic for porosity development was successful in developing significant reserves at close distances to old wells. Short-lateral and horizontal drilling technologies were tested for improved completion efficiency. Lateral completions should improve economics for both primary and secondary recovery where low permeability is a problem and higher density drilling is limited by drilling cost. Low water injectivity and widely spaced wells have restricted the application of waterflooding in the past. Water injection tests were performed in both a vertical and a horizontal well. Data from these tests were used to predict long-term injection and oil recovery.

  6. On-chip COMA cache-coherence protocol for microgrids of microthreaded cores

    NARCIS (Netherlands)

    Zhang, L.; Jesshope, C.

    2008-01-01

    This paper describes an on-chip COMA cache coherency protocol to support the microthread model of concurrent program composition. The model gives a sound basis for building multi-core computers as it captures concurrency, abstracts communication and identifies resources, such as processor groups

  7. OneService - Generic Cache Aggregator Framework for Service Depended Cloud Applications

    NARCIS (Netherlands)

    Tekinerdogan, B.; Oral, O.A.

    2017-01-01

    Current big data cloud systems often use different data migration strategies from providers to customers. This often results in increased bandwidth usage and herewith a decrease of the performance. To enhance the performance often caching mechanisms are adopted. However, the implementations of these

  8. Model checking a cache coherence protocol for a Java DSM implementation

    NARCIS (Netherlands)

    J. Pang; W.J. Fokkink (Wan); R. Hofman (Rutger); R. Veldema

    2007-01-01

    textabstractJackal is a fine-grained distributed shared memory implementation of the Java programming language. It aims to implement Java's memory model and allows multithreaded Java programs to run unmodified on a distributed memory system. It employs a multiple-writer cache coherence

  9. INTELLIGENT COMPUTING SYSTEM FOR RESERVOIR ANALYSIS AND RISK ASSESSMENT OF THE RED RIVER FORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Sippel; William C. Carrigan; Kenneth D. Luff; Lyn Canter

    2003-11-12

    Integrated software has been written that comprises the tool kit for the Intelligent Computing System (ICS). The software tools in ICS have been developed for characterization of reservoir properties and evaluation of hydrocarbon potential using a combination of inter-disciplinary data sources such as geophysical, geologic and engineering variables. The ICS tools provide a means for logical and consistent reservoir characterization and oil reserve estimates. The tools can be broadly characterized as (1) clustering tools, (2) neural solvers, (3) multiple-linear regression, (4) entrapment-potential calculator and (5) file utility tools. ICS tools are extremely flexible in their approach and use, and applicable to most geologic settings. The tools are primarily designed to correlate relationships between seismic information and engineering and geologic data obtained from wells, and to convert or translate seismic information into engineering and geologic terms or units. It is also possible to apply ICS in a simple framework that may include reservoir characterization using only engineering, seismic, or geologic data in the analysis. ICS tools were developed and tested using geophysical, geologic and engineering data obtained from an exploitation and development project involving the Red River Formation in Bowman County, North Dakota and Harding County, South Dakota. Data obtained from 3D seismic surveys, and 2D seismic lines encompassing nine prospective field areas were used in the analysis. The geologic setting of the Red River Formation in Bowman and Harding counties is that of a shallow-shelf, carbonate system. Present-day depth of the Red River formation is approximately 8000 to 10,000 ft below ground surface. This report summarizes production results from well demonstration activity, results of reservoir characterization of the Red River Formation at demonstration sites, descriptions of ICS tools and strategies for their application.

  10. Geological and petrophysical characterization of the Ferron Sandstone for 3-D simulation of a fluvial-deltaic reservoir. Deliverable 2.5.4, Ferron Sandstone lithologic strip logs, Emery & Sevier Counties, Utah: Volume I

    Energy Technology Data Exchange (ETDEWEB)

    Allison, M.L.

    1995-12-08

    Strip logs for 491 wells were produced from a digital subsurface database of lithologic descriptions of the Ferron Sandstone Member of the Mancos Shale. This subsurface database covers wells from the parts of Emery and Sevier Counties in central Utah that occur between Ferron Creek on the north and Last Chance Creek on the south. The lithologic descriptions were imported into a logging software application designed for the display of stratigraphic data. Strip logs were produced at a scale of one inch equals 20 feet. The strip logs were created as part of a study by the Utah Geological Survey to develop a comprehensive, interdisciplinary, and qualitative characterization of a fluvial-deltaic reservoir using the Ferron Sandstone as a surface analogue. The study was funded by the U.S. Department of Energy (DOE) under the Geoscience/Engineering Reservoir Characterization Program.

  11. Model checking a cache coherence protocol of a Java DSM implementation

    NARCIS (Netherlands)

    Pang, J.; Fokkink, W.J.; Hofman, R.; Veldema, R.S.

    2007-01-01

    Jackal is a fine-grained distributed shared memory implementation of the Java programming language. It aims to implement Java's memory model and allows multithreaded Java programs to run unmodified on a distributed memory system. It employs a multiple-writer cache coherence protocol. In this paper,

  12. Security in the CernVM File System and the Frontier Distributed Database Caching System

    International Nuclear Information System (INIS)

    Dykstra, D; Blomer, J

    2014-01-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  13. Security in the CernVM File System and the Frontier Distributed Database Caching System

    Science.gov (United States)

    Dykstra, D.; Blomer, J.

    2014-06-01

    Both the CernVM File System (CVMFS) and the Frontier Distributed Database Caching System (Frontier) distribute centrally updated data worldwide for LHC experiments using http proxy caches. Neither system provides privacy or access control on reading the data, but both control access to updates of the data and can guarantee the authenticity and integrity of the data transferred to clients over the internet. CVMFS has since its early days required digital signatures and secure hashes on all distributed data, and recently Frontier has added X.509-based authenticity and integrity checking. In this paper we detail and compare the security models of CVMFS and Frontier.

  14. An Economic Model for Self-tuned Cloud Caching

    OpenAIRE

    Dash, Debabrata; Kantere, Verena; Ailamaki, Anastasia

    2009-01-01

    Cloud computing, the new trend for service infrastructures, requires user multi-tenancy as well as minimal capital expenditure. In a cloud that services large amounts of data that are massively collected and queried, such as scientific data, users typically pay for query services. The cloud supports caching of data in order to provide quality query services. User payments cover query execution costs and maintenance of cloud infrastructure, and incur cloud profit. The challenge resides in provi...

  15. Cache Performance Optimization for SoC Video Applications

    OpenAIRE

    Lei Li; Wei Zhang; HuiYao An; Xing Zhang; HuaiQi Zhu

    2014-01-01

    Chip Multiprocessors (CMPs) are adopted by industry to deal with the speed limit of the single processor, but memory access has become the performance bottleneck, especially in multimedia applications. In this paper, a set of management policies is proposed to improve the cache performance for an SoC platform for video applications. By analyzing the behavior of the Video Engine, memory-friendly writeback and efficient prefetch policies are adopted. The experiment platform is simulated by ...

  16. Transient Variable Caching in Java’s Stack-Based Intermediate Representation

    Directory of Open Access Journals (Sweden)

    Paul Týma

    1999-01-01

    Full Text Available Java's stack-based intermediate representation (IR) is typically coerced to execute on register-based architectures. Unoptimized compiled code dutifully replicates transient variable usage designated by the programmer, and common optimization practices tend to introduce further usage (e.g., CSE, loop-invariant code motion). On register-based machines, transient variables are often cached within registers (when available), saving the expense of actually accessing memory. In stack-based environments, however, because of the need to push and pop the transient values, further performance improvement is possible. This paper presents Transient Variable Caching (TVC), a technique for eliminating transient variable overhead whenever possible. This optimization would find a likely home in optimizers attached to the back of popular Java compilers. Side effects of the algorithm include significant instruction reordering and the introduction of many stack-manipulation operations. This combination has proven to greatly impede the ability to decompile stack-based IR code sequences. The code that results from the transform is faster, smaller, and greatly impedes decompilation.
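    The store/load-versus-stack-manipulation trade-off that TVC exploits can be sketched with a toy stack machine. The instruction set and programs below are invented for illustration (they are not Java bytecode), but the pattern is the same: a store/load pair for a transient local is replaced by a stack operation (here, DUP) that touches no memory slot.

```python
# Toy stack-machine sketch of the idea behind Transient Variable Caching.
def run(program, locals_):
    stack, mem_ops = [], 0
    for op, *arg in program:
        if op == "push":
            stack.append(arg[0])
        elif op == "store":              # pop into a local slot (memory access)
            locals_[arg[0]] = stack.pop(); mem_ops += 1
        elif op == "load":               # push a local slot (memory access)
            stack.append(locals_[arg[0]]); mem_ops += 1
        elif op == "dup":                # duplicate top of stack (no memory access)
            stack.append(stack[-1])
        elif op == "add":
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
    return stack, mem_ops

# t = 5; result = t + t
naive = [("push", 5), ("store", "t"), ("load", "t"), ("load", "t"), ("add",)]
cached = [("push", 5), ("dup",), ("add",)]
```

    Both programs leave the same result on the stack, but the transformed version performs no local-slot accesses, at the cost of the extra stack manipulation and reordering the abstract describes.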

  17. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California: A Framework for Objectively Leveraging Weather and Climate Forecasts in a Decision Support Environment

    Science.gov (United States)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.

    2017-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach of FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability but not increase downstream flood risk. Lake Mendocino is a dual use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members for a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino yet not increase flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain given the demonstration of improved skill through a reliable reforecast of adequate historical duration and consistent with operationally available numerical weather predictions.
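    The core of the EFO decision logic described above can be sketched simply: for a forecast day, compute the fraction of ensemble members whose simulated storage exceeds a critical threshold, and trigger a release when that forecasted risk exceeds a tolerance. The member values and the risk tolerance below are invented, not the CNRFC hindcast data.

```python
# Hedged sketch of risk-based release decisions from an ensemble forecast.
def forecasted_risk(ensemble_storage, threshold):
    """Fraction of ensemble members exceeding the storage threshold."""
    exceed = sum(1 for s in ensemble_storage if s > threshold)
    return exceed / len(ensemble_storage)

def release_decision(ensemble_storage, threshold, risk_tolerance):
    """Release water when forecasted risk exceeds the allowed tolerance."""
    return forecasted_risk(ensemble_storage, threshold) > risk_tolerance

# 10 hypothetical ensemble members (storage, acre-feet) vs. a 111,000 af pool
members = [95_000, 104_000, 112_500, 99_000, 118_000, 101_000,
           97_500, 113_200, 108_000, 96_000]
risk = forecasted_risk(members, 111_000)   # 3 of 10 members exceed the pool
```

    In the real framework the ensemble is routed through a water management model over the full 15-day horizon, but the decision rule has this same probabilistic shape.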

  18. Broadcasted Location-Aware Data Cache for Vehicular Application

    Directory of Open Access Journals (Sweden)

    Fukuda Akira

    2007-01-01

    Full Text Available There has been increasing interest in the exploitation of advances in information technology, for example, mobile computing and wireless communications in ITS (intelligent transport systems. Classes of applications that can benefit from such an infrastructure include traffic information, roadside businesses, weather reports, entertainment, and so on. There are several wireless communication methods currently available that can be utilized for vehicular applications, such as cellular phone networks, DSRC (dedicated short-range communication, and digital broadcasting. While a cellular phone network is relatively slow and a DSRC has a very small communication area, one-segment digital terrestrial broadcasting service was launched in Japan in 2006, high-performance digital broadcasting for mobile hosts has been available recently. However, broadcast delivery methods have the drawback that clients need to wait for the required data items to appear on the broadcast channel. In this paper, we propose a new cache system to effectively prefetch and replace broadcast data using "scope" (an available area of location-dependent data and "mobility specification" (a schedule according to the direction in which a mobile host moves. We numerically evaluate the cache system on the model close to the traffic road environment, and implement the emulation system to evaluate this location-aware data delivery method for a concrete vehicular application that delivers geographic road map data to a car navigation system.

  19. Reservoir characterization based on tracer response and rank analysis of production and injection rates

    Energy Technology Data Exchange (ETDEWEB)

    Refunjol, B.T. [Lagoven, S.A., Pdvsa (Venezuela); Lake, L.W. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    Quantification of the spatial distribution of properties is important for many reservoir-engineering applications. But, before applying any reservoir-characterization technique, the type of problem to be tackled and the information available should be analyzed. This is important because difficulties arise in reservoirs where production records are the only information for analysis. This paper presents the results of a practical technique to determine preferential flow trends in a reservoir. The technique is a combination of reservoir geology, tracer data, and Spearman rank correlation coefficient analysis. The Spearman analysis, in particular, will prove to be important because it appears to be insightful and uses injection/production data that are prevalent in circumstances where other data are nonexistent. The technique is applied to the North Buck Draw field, Campbell County, Wyoming. This work provides guidelines to assess information about reservoir continuity in interwell regions from widely available measurements of production and injection rates at existing wells. The information gained from the application of this technique can contribute to both the daily reservoir management and the future design, control, and interpretation of subsequent projects in the reservoir, without the need for additional data.
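    The Spearman rank correlation step described above can be sketched as follows: rank each well's rate history, then correlate the ranks of an injector/producer pair; a coefficient near +1 suggests a preferential flow connection. The rate series below are invented, and ties are ignored for simplicity (a production analysis would use averaged ranks).

```python
# Sketch of Spearman rank correlation between injection and production rates.
def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman coefficient, no-ties formula: 1 - 6*sum(d^2)/(n(n^2-1))."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

injection = [100, 120, 90, 150, 5 * 26]    # injector rate history (invented)
production = [55, 60, 50, 80, 70]          # nearby producer's rates (invented)
rho = spearman(injection, production)
```

    Here the producer's ranks track the injector's exactly, so rho is 1.0; uncorrelated pairs would score near zero, flagging poor interwell continuity.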

  20. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2002-03-31

    The West Carney Field in Lincoln County, Oklahoma is one of the few newly discovered oil fields in Oklahoma. Although profitable, the field exhibits several unusual characteristics. These include decreasing water-oil ratios, decreasing gas-oil ratios, decreasing bottomhole pressures during shut-ins in some wells, and transient behavior for water production in many wells. This report explains the unusual characteristics of West Carney Field based on detailed geological and engineering analyses. We propose a geological history that explains the presence of mobile water and oil in the reservoir. The combination of matrix and fractures in the reservoir explains the reservoir's flow behavior. We confirm our hypothesis by matching observed performance with a simulated model and develop procedures for correlating core data to log data so that the analysis can be extended to other, similar fields where core coverage may be limited.

  1. Wolves, Canis lupus, carry and cache the collars of radio-collared White-tailed Deer, Odocoileus virginianus, they killed

    Science.gov (United States)

    Nelson, Michael E.; Mech, L. David

    2011-01-01

    Wolves (Canis lupus) in northeastern Minnesota cached six radio-collars (four in winter, two in spring-summer) of 202 radio-collared White-tailed Deer (Odocoileus virginianus) they killed or consumed from 1975 to 2010. A Wolf bedded on top of one collar cached in snow. We found one collar each at a Wolf den and Wolf rendezvous site, 2.5 km and 0.5 km respectively, from each deer's previous locations.

  2. Ground-water and geohydrologic conditions in Queens County, Long Island, New York

    Science.gov (United States)

    Soren, Julian

    1971-01-01

    Queens County is a heavily populated borough of New York City, at the western end of Long Island, N.Y., in which large amounts of ground water are used, mostly for public supply. Ground water, pumped from local aquifers by privately owned water-supply companies, supplied the water needs of about 750,000 of the nearly 2 million residents of the county in 1967; the balance was supplied by New York City from surface sources outside the county in upstate New York. The county's aquifers consist of sand and gravel of Late Cretaceous and of Pleistocene ages, and the aquifers comprise a wedge-shaped ground-water reservoir lying on a southeastward-sloping floor of Precambrian(?) bedrock. Beds of clay and silt generally confine water in the deeper parts of the reservoir; water in the deeper aquifers ranges from poorly confined to well confined. Wisconsin-age glacial deposits in the uppermost part of the reservoir contain ground water under water-table conditions. Ground-water pumpage averaged about 60 mgd (million gallons per day) in Queens County from about 1900 to 1967. Much of the water was used in adjacent Kings County, another borough of New York City, prior to 1950. The large ground-water withdrawal has resulted in a widespread and still-growing cone of depression in the water table, reflecting a loss of about 61 billion gallons of fresh water from storage. Significant drawdown of the water table probably began with rapid urbanization of Queens County in the 1920's. The county has been extensively paved, and storm and sanitary sewers divert water, which formerly entered the ground, to tidewater north and south of the county. Natural recharge to the aquifers has been reduced to about one-half of the preurban rate and is below the withdrawal rate. Ground-water levels have declined more than 40 feet from the earliest-known levels, in 1903, to 1967, and the water table is below sea level in much of the county. The aquifers are being contaminated by the movement of

  3. Analytical derivation of traffic patterns in cache-coherent shared-memory systems

    DEFF Research Database (Denmark)

    Stuart, Matthias Bo; Sparsø, Jens

    2011-01-01

    This paper presents an analytical method to derive the worst-case traffic pattern caused by a task graph mapped to a cache-coherent shared-memory system. Our analysis allows designers to rapidly evaluate the impact of different mappings of tasks to IP cores on the traffic pattern. The accuracy...

  4. The Cost of Cache-Oblivious Searching

    DEFF Research Database (Denmark)

    Bender, Michael A.; Brodal, Gerth Stølting; Fagerberg, Rolf

    2011-01-01

    of the block sizes are limited to be powers of 2. The paper gives modified versions of the van Emde Boas layout, where the expected number of memory transfers between any two levels of the memory hierarchy is arbitrarily close to [lg e+O(lg lg B/lg B)]log  B N+O(1). This factor approaches lg e≈1.443 as B...... increases. The expectation is taken over the random placement in memory of the first element of the structure. Because searching in the disk-access machine (DAM) model can be performed in log  B N+O(1) block transfers, this result establishes a separation between the (2-level) DAM model and cache...

  5. Bathymetry and capacity of Shawnee Reservoir, Oklahoma, 2016

    Science.gov (United States)

    Ashworth, Chad E.; Smith, S. Jerrod; Smith, Kevin A.

    2017-02-13

    Shawnee Reservoir (locally known as Shawnee Twin Lakes) is a man-made reservoir on South Deer Creek with a drainage area of 32.7 square miles in Pottawatomie County, Oklahoma. The reservoir consists of two lakes connected by an equilibrium channel. The southern lake (Shawnee City Lake Number 1) was impounded in 1935, and the northern lake (Shawnee City Lake Number 2) was impounded in 1960. Shawnee Reservoir serves as a municipal water supply, and water is transferred about 9 miles by gravity to a water treatment plant in Shawnee, Oklahoma. Secondary uses of the reservoir are for recreation, fish and wildlife habitat, and flood control. Shawnee Reservoir has a normal-pool elevation of 1,069.0 feet (ft) above North American Vertical Datum of 1988 (NAVD 88). The auxiliary spillway, which defines the flood-pool elevation, is at an elevation of 1,075.0 ft.The U.S. Geological Survey (USGS), in cooperation with the City of Shawnee, has operated a real-time stage (water-surface elevation) gage (USGS station 07241600) at Shawnee Reservoir since 2006. For the period of record ending in 2016, this gage recorded a maximum stage of 1,078.1 ft on May 24, 2015, and a minimum stage of 1,059.1 ft on April 10–11, 2007. This gage did not report reservoir storage prior to this report (2016) because a sufficiently detailed and thoroughly documented bathymetric (reservoir-bottom elevation) survey and corresponding stage-storage relation had not been published. A 2011 bathymetric survey with contours delineated at 5-foot intervals was published in Oklahoma Water Resources Board (2016), but that publication did not include a stage-storage relation table. The USGS, in cooperation with the City of Shawnee, performed a bathymetric survey of Shawnee Reservoir in 2016 and released the bathymetric-survey data in 2017. The purposes of the bathymetric survey were to (1) develop a detailed bathymetric map of the reservoir and (2) determine the relations between stage and reservoir storage
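    A stage-storage relation of the kind the 2016 survey produced is typically applied by interpolating storage between tabulated stages. The sketch below shows that lookup with linear interpolation; the table values are invented for illustration and are not the published Shawnee Reservoir relation.

```python
# Hypothetical stage-storage lookup with linear interpolation.
def storage_from_stage(stage, table):
    """table: list of (stage_ft, storage_acre_ft) pairs sorted by stage."""
    stages = [s for s, _ in table]
    if not stages[0] <= stage <= stages[-1]:
        raise ValueError("stage outside tabulated range")
    for (s0, v0), (s1, v1) in zip(table, table[1:]):
        if s0 <= stage <= s1:
            frac = (stage - s0) / (s1 - s0)
            return v0 + frac * (v1 - v0)

# Invented table (stage in ft NAVD 88, storage in acre-feet)
table = [(1059.0, 500.0), (1064.0, 1500.0), (1069.0, 3000.0), (1075.0, 5200.0)]
normal_pool = storage_from_stage(1069.0, table)
midpoint = storage_from_stage(1066.5, table)   # halfway through one interval
```

    With such a relation in hand, the real-time stage gage can report storage directly, which is what the bathymetric survey made possible.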

  6. Researching of Covert Timing Channels Based on HTTP Cache Headers in Web API

    Directory of Open Access Journals (Sweden)

    Denis Nikolaevich Kolegov

    2015-12-01

    Full Text Available In this paper, it is shown how covert timing channels based on HTTP cache headers can be implemented using the different Web APIs of the Google Drive, Dropbox, and Facebook Internet services.

  7. A Cross-Layer Framework for Designing and Optimizing Deeply-Scaled FinFET-Based Cache Memories

    Directory of Open Access Journals (Sweden)

    Alireza Shafaei

    2015-08-01

    Full Text Available This paper presents a cross-layer framework for designing and optimizing energy-efficient cache memories made of deeply-scaled FinFET devices. The proposed design framework spans the device, circuit, and architecture levels and considers both super- and near-threshold modes of operation. Initially, at the device level, seven FinFET devices on a 7-nm process technology are designed, in which only one geometry-related parameter (e.g., fin width, gate length, gate underlap) is changed per device. Next, at the circuit level, standard 6T and 8T SRAM cells made of these 7-nm FinFET devices are characterized and compared in terms of static noise margin, access latency, leakage power consumption, etc. Finally, cache memories with all different combinations of devices and SRAM cells are evaluated at the architecture level using a modified version of the CACTI tool with FinFET support and other considerations for deeply-scaled technologies. Using this design framework, it is observed that an L1 cache memory made of longer-channel FinFET devices operating in the near-threshold regime achieves the minimum-energy operation point.

  8. Federated or cached searches: providing expected performance from multiple invasive species databases

    Science.gov (United States)

    Graham, Jim; Jarnevich, Catherine S.; Simpson, Annie; Newman, Gregory J.; Stohlgren, Thomas J.

    2011-01-01

    Invasive species are a universal global problem, but the information to identify them, manage them, and prevent invasions is stored around the globe in a variety of formats. The Global Invasive Species Information Network is a consortium of organizations working toward providing seamless access to these disparate databases via the Internet. A distributed network of databases can be created using the Internet and a standard web service protocol. There are two options to provide this integration. First, federated searches have been proposed to allow users to search "deep" web documents such as databases for invasive species. A second method is to create a cache of data from the databases for searching. We compare these two methods, and show that federated searches will not provide the performance and flexibility required by users; a central cache of the data is required to improve performance.
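    The two integration options can be contrasted with a minimal sketch: a federated search pays one round-trip per remote database on every request, while a cache harvests all records once into a single in-memory index. The database names, species records, and latency model below are all invented.

```python
import time

# Invented stand-ins for remote invasive-species databases.
REMOTE_DBS = {
    "db_a": {"kudzu": "Pueraria montana", "nutria": "Myocastor coypus"},
    "db_b": {"zebra mussel": "Dreissena polymorpha"},
}

def federated_search(name, delay=0.0):
    """Query every remote database per request (delay simulates a round-trip)."""
    hits = []
    for db, records in REMOTE_DBS.items():
        time.sleep(delay)                 # one network round-trip per database
        if name in records:
            hits.append((db, records[name]))
    return hits

def build_cache():
    """Harvest all records once into a single locally searchable index."""
    cache = {}
    for db, records in REMOTE_DBS.items():
        for name, sci in records.items():
            cache.setdefault(name, []).append((db, sci))
    return cache

cache = build_cache()
```

    The cache trades freshness (it must be re-harvested periodically) for per-query latency that no longer depends on the number or availability of the remote databases, which is the performance argument the abstract makes.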

  9. Integrated Modeling and Carbonate Reservoir Analysis, Upper Jurassic Smackover Formation, Fishpond Field, Southwest Alabama

    Science.gov (United States)

    Owen, Alexander Emory

    This field case study focuses on Upper Jurassic (Oxfordian) Smackover hydrocarbon reservoir characterization, modeling and evaluation at Fishpond Field, Escambia County, Alabama, eastern Gulf Coastal Plain of North America. The field is located in the Conecuh Embayment area, south of the Little Cedar Creek Field in Conecuh County and east of Appleton Field in Escambia County. In the Conecuh Embayment, Smackover microbial buildups commonly developed on Paleozoic basement paleohighs in an inner to middle carbonate ramp setting. The microbial and associated facies identified in Fishpond Field are: (F-1) peloidal wackestone, (F-2) peloidal packstone, (F-3) peloidal grainstone, (F-4) peloidal grainstone/packstone, (F-5) microbially-influenced wackestone, (F-6) microbially-influenced packstone, (F-7) microbial boundstone, (F-8) oolitic grainstone, (F-9) shale, and (F-10) dolomitized wackestone/packstone. The Smackover section consists of an alternation of carbonate facies, including F-1 through F-8. The repetitive vertical trend in facies indicates variations in depositional conditions in the area as a result of changes in water depth, energy conditions, salinity, and/or water chemistry due to temporal variations or changes in relative sea level. Accommodation for sediment accumulation also was produced by a change in base level due to differential movement of basement rocks as a result of faulting and/or subsidence due to burial compaction and extension. These changes in base level contributed to the development of a microbial buildup that ranges between 130-165 ft in thickness. The Fishpond Field carbonate reservoir includes a lower microbial buildup interval, a middle grainstone/packstone interval and an upper microbial buildup interval. The Fishpond Field has sedimentary and petroleum system characteristics similar to the neighboring Appleton and Little Cedar Creek Fields, but also has distinct differences from these Smackover fields. The characteristics of the

  10. CSU Final Report on the Math/CS Institute CACHE: Communication-Avoiding and Communication-Hiding at the Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Strout, Michelle [Colorado State University

    2014-06-10

    The CACHE project entails researching and developing new versions of numerical algorithms that result in data reuse that can be scheduled in a communication avoiding way. Since memory accesses take more time than any computation and require the most power, the focus on turning data reuse into data locality is critical to improving performance and reducing power usage in scientific simulations. This final report summarizes the accomplishments at Colorado State University as part of the CACHE project.

  11. Chemical and physical characteristics of water and sediment in Scofield Reservoir, Carbon County, Utah

    Science.gov (United States)

    Waddell, Kidd M.; Darby, D.W.; Theobald, S.M.

    1985-01-01

    Evaluations based on the nutrient content of the inflow, outflow, water in storage, and the dissolved-oxygen depletion during the summer indicate that the trophic state of Scofield Reservoir is borderline between mesotrophic and eutrophic and may become highly eutrophic unless corrective measures are taken to limit nutrient inflow. Sediment deposition in Scofield Reservoir during 1943-79 is estimated to be 3,000 acre-feet and has decreased the original storage capacity of the reservoir by 4 percent. The sediment contains some coal, and age dating of those sediments (based on the radioisotope lead-210) indicates that most of the coal was deposited prior to about 1950. Scofield Reservoir is dimictic, with turnovers occurring in the spring and autumn. Water in the reservoir circulates completely to the bottom during turnovers. The concentration of dissolved oxygen decreases with depth except during parts of the turnover periods. Below an altitude of about 7,590 feet, where 20 percent of the water is stored, the concentration of dissolved oxygen was less than 2 milligrams per liter during most of the year. During the summer stratification period, the depletion of dissolved oxygen in the deeper layers is coincident with supersaturated conditions in the shallow layers; this is attributed to plant photosynthesis and bacterial respiration in the reservoir. During October 1, 1979-August 31, 1980, the discharge-weighted average concentration of dissolved solids was 195 milligrams per liter in the combined inflow from Fish, Pondtown, and Mud Creeks, and was 175 milligrams per liter in the outflow (and to the Price River). The smaller concentration in the outflow was due primarily to precipitation of calcium carbonate in the reservoir; about 80 percent of the decrease can be accounted for through loss as calcium carbonate. The estimated discharge-weighted average concentration of total nitrogen (dissolved plus suspended) in the combined inflow of Fish, Pondtown, and Mud Creeks was 1
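    The discharge-weighted average used in this abstract weights each concentration measurement by the streamflow that carried it, i.e., sum(Q*C)/sum(Q). The flows and concentrations below are invented, not the Scofield data.

```python
# Sketch of a discharge-weighted mean concentration.
def discharge_weighted_mean(flows_cfs, concentrations_mg_l):
    """Weight each concentration by its flow: sum(Q*C) / sum(Q)."""
    total_load = sum(q * c for q, c in zip(flows_cfs, concentrations_mg_l))
    return total_load / sum(flows_cfs)

flows = [10.0, 30.0, 60.0]      # daily mean flows (invented)
concs = [300.0, 200.0, 150.0]   # dissolved-solids concentrations (invented)
mean_c = discharge_weighted_mean(flows, concs)
```

    Note that the high-flow, low-concentration days dominate the weighted mean, which is why it differs from a simple arithmetic average of the concentrations (here about 217 mg/L).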

  12. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    International Nuclear Information System (INIS)

    Brun, R; Duellmann, D; Ganis, G; Janyst, L; Peters, A J; Rademakers, F; Sindrilaru, E; Hanushevsky, A

    2011-01-01

    The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and has shown significant performance benefits with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyse the possible performance gains, operational impact on site services, and applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
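    The proxy-cache idea can be reduced to a toy sketch: a size-limited least-recently-used (LRU) cache sitting in front of slow remote file reads, so repeated accesses within the cached working set avoid the wide-area fetch. The fetch function, paths, and capacity below are invented; the real XROOT proxy handles partial files, coherence, and concurrency that this sketch ignores.

```python
from collections import OrderedDict

# Toy LRU file proxy cache.
class FileProxyCache:
    def __init__(self, capacity, fetch):
        self.capacity = capacity
        self.fetch = fetch               # called only on a cache miss
        self.store = OrderedDict()       # path -> contents, in LRU order
        self.hits = self.misses = 0

    def read(self, path):
        if path in self.store:
            self.hits += 1
            self.store.move_to_end(path)        # mark most recently used
        else:
            self.misses += 1
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)  # evict least recently used
            self.store[path] = self.fetch(path)
        return self.store[path]

cache = FileProxyCache(capacity=2, fetch=lambda p: f"contents of {p}")
for path in ["/a.root", "/b.root", "/a.root", "/c.root", "/b.root"]:
    cache.read(path)
```

    The hit/miss counters are exactly the "cache efficiency" figure of merit the abstract mentions: efficiency depends on how well the analysis access pattern fits the cache capacity.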

  13. Mitochondrial genomic analysis of late onset Alzheimer's disease reveals protective haplogroups H6A1A/H6A1B: the Cache County Study on Memory in Aging.

    Directory of Open Access Journals (Sweden)

    Perry G Ridge

    Full Text Available Alzheimer's disease (AD) is the most common cause of dementia, and AD risk clusters within families. Part of the familial aggregation of AD is accounted for by excess maternal vs. paternal inheritance, a pattern consistent with mitochondrial inheritance. The role of specific mitochondrial DNA (mtDNA) variants and haplogroups in AD risk is uncertain. We determined the complete mitochondrial genome sequence of 1007 participants in the Cache County Study on Memory in Aging, a population-based prospective cohort study of dementia in northern Utah. AD diagnoses were made with a multi-stage protocol that included clinical examination and review by a panel of clinical experts. We used TreeScanning, a statistically robust approach based on haplotype networks, to analyze the mtDNA sequence data. Participants with major mitochondrial haplotypes H6A1A and H6A1B showed a reduced risk of AD (p=0.017, corrected for multiple comparisons). The protective haplotypes were defined by three variants: m.3915G>A, m.4727A>G, and m.9380G>A. These three variants characterize two different major haplogroups. Together, m.4727A>G and m.9380G>A define H6A1, and it has been suggested that m.3915G>A defines H6A. Additional variants differentiate H6A1A and H6A1B; however, none of these variants had a significant relationship with AD case-control status. Our findings provide evidence of a reduced risk of AD for individuals with mtDNA haplotypes H6A1A and H6A1B. These findings are the results of the largest study to date with complete mtDNA genome sequence data, yet the functional significance of the associated haplotypes remains unknown and replication in other studies is necessary.

  14. Interdisciplinary study of reservoir compartments and heterogeneity. Final report, October 1, 1993--December 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Van Kirk, C.

    1998-01-01

    A case study approach using Terry Sandstone production from the Hambert-Aristocrat Field, Weld County, Colorado was used to document the process of integration. One specific project goal is to demonstrate how a multidisciplinary approach can be used to detect reservoir compartmentalization and improve reserve estimates. The final project goal is to derive a general strategy for integration for independent operators. Teamwork is the norm for the petroleum industry where teams of geologists, geophysicists, and petroleum engineers work together to improve profits through a better understanding of reservoir size, compartmentalization, and orientation as well as reservoir flow characteristics. In this manner, integration of data narrows the uncertainty in reserve estimates and enhances reservoir management decisions. The process of integration has proven to be iterative. Integration has helped identify reservoir compartmentalization and reduce the uncertainty in the reserve estimates. This research report documents specific examples of integration and the economic benefits of integration.

  15. Quantification of Interbasin Transfers into the Addicks Reservoir during Hurricane Harvey

    Science.gov (United States)

    Sebastian, A.; Juan, A.; Gori, A.; Maulsby, F.; Bedient, P. B.

    2017-12-01

    Between August 25 and 30, Hurricane Harvey dropped unprecedented rainfall over southeast Texas, causing widespread flooding in the City of Houston. Water levels in the Addicks and Barker reservoirs, built in the 1940s to protect downtown Houston, exceeded previous records by approximately 2 meters. Concerns regarding the structural integrity of the dams and damage to neighborhoods within the reservoir pool resulted in controlled releases into Buffalo Bayou, flooding an estimated 4,000 additional structures downstream of the dams. In 2016, during the Tax Day flood, it became apparent that overflows from Cypress Creek in northern Harris County substantially contribute to water levels in Addicks. Prior to this event, little was known about the hydrodynamics of this overflow area or about the additional stress placed on the Addicks and Barker reservoirs by the volume of overflow. However, this information is critical for determining flood risk in the Addicks watershed and, ultimately, Buffalo Bayou. In this study, we utilize a recently developed HEC-RAS 2D model of the interbasin transfer that occurs between Cypress Creek Watershed and Addicks Reservoir to quantify the volume and rate at which water from Cypress Creek enters the reservoir during extreme events. Ultimately, the results of this study will help inform the official hydrologic models used by HCFCD to determine reservoir operation during future storm events and better inform residents living in or above the reservoir pool about their potential flood risk.

  16. A Unified Buffering Management with Set Divisible Cache for PCM Main Memory

    Institute of Scientific and Technical Information of China (English)

    Mei-Ying Bian; Su-Kyung Yoon; Jeong-Geun Kim; Sangjae Nam; Shin-Dug Kim

    2016-01-01

    This research proposes a phase-change memory (PCM) based main memory system with an effective combination of a superblock-based adaptive buffering structure and its associated set divisible last-level cache (LLC). To achieve high performance similar to that of dynamic random-access memory (DRAM) based main memory, the superblock-based adaptive buffer (SABU) is comprised of dual DRAM buffers, i.e., an aggressive superblock-based pre-fetching buffer (SBPB) and an adaptive sub-block reusing buffer (SBRB), and a set divisible LLC based on a cache space optimization scheme. According to our experiments, the longer PCM access latency can typically be hidden using our proposed SABU, which can significantly reduce the number of writes to the PCM main memory, by 26.44%. The SABU approach can reduce PCM access latency to as little as 0.43 times that of conventional DRAM main memory. Meanwhile, the average memory energy consumption can be reduced by 19.7%.
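The paper's SABU is a dual DRAM buffering structure; as a rough illustration of why a DRAM write buffer in front of PCM reduces PCM writes, here is a minimal sketch (class, parameters, and access trace are invented for illustration, not taken from the paper):

```python
from collections import OrderedDict

class WriteBuffer:
    """Toy LRU write-coalescing DRAM buffer in front of a PCM backing
    store: dirty blocks are merged in DRAM, and PCM is written only on
    eviction or flush."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.buf = OrderedDict()   # block address -> data
        self.pcm_writes = 0        # writes that actually reach PCM

    def write(self, addr, data):
        if addr in self.buf:
            self.buf.move_to_end(addr)    # coalesce: overwrite in DRAM
        elif len(self.buf) >= self.capacity:
            self.buf.popitem(last=False)  # evict LRU block to PCM
            self.pcm_writes += 1
        self.buf[addr] = data

    def flush(self):
        self.pcm_writes += len(self.buf)
        self.buf.clear()

buf = WriteBuffer(capacity=4)
for addr in [0, 1, 0, 1, 0, 1, 2, 3]:   # hot blocks 0 and 1 coalesce
    buf.write(addr, b"x")
buf.flush()
print(buf.pcm_writes)  # 4 PCM writes instead of 8
```

Repeated writes to hot blocks are absorbed in DRAM, so only four of the eight writes ever reach the (slow, wear-limited) PCM.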

  17. Groundwater-level data from an earthen dam site in southern Westchester County, New York

    Science.gov (United States)

    Noll, Michael L.; Chu, Anthony

    2018-05-01

    In 2005, the U.S. Geological Survey began a cooperative study with the New York City Department of Environmental Protection to characterize the local groundwater-flow system and identify potential sources of seeps on the southern embankment of the Hillview Reservoir in Westchester County, New York. Groundwater levels were collected at 49 wells at Hillview Reservoir, and 1 well in northern Bronx County, from April 2005 through November 2016. Groundwater levels were measured discretely with a chalked steel or electric tape, or continuously with a digital pressure transducer, or both, in accordance with U.S. Geological Survey groundwater-measurement standards. These groundwater-level data were plotted as time series and are presented in this report as hydrographs. Twenty-eight of the 50 hydrographs have continuous-record and discrete field groundwater-level measurements; the remaining 22 contain only discrete measurements.

  18. Soil erosion and sediment fluxes analysis: a watershed study of the Ni Reservoir, Spotsylvania County, VA, USA.

    Science.gov (United States)

    Pope, Ian C; Odhiambo, Ben K

    2014-03-01

    Anthropogenic forces that alter the physical landscape are known to cause significant soil erosion, which has a negative impact on surface water bodies such as rivers, lakes/reservoirs, and coastal zones; sediment control has thus become one of the central aspects of catchment management planning. The revised universal soil loss equation empirical model, erosion pins, and isotopic sediment core analyses were used to evaluate watershed erosion, stream bank erosion, and reservoir sediment accumulation rates for Ni Reservoir in central Virginia. Land use and land cover seem to be the dominant control on watershed soil erosion, with barren land and human-disturbed areas contributing the most sediment, and forest and herbaceous areas contributing the least. Results show a 7% increase in human development from 2001 (14%) to 2009 (21.6%), corresponding to an increase in soil loss of 0.82 Mg ha⁻¹ year⁻¹ over the same time period. ²¹⁰Pb-based sediment accumulation rates at three locations in Ni Reservoir were 1.020, 0.364, and 0.543 g cm⁻² year⁻¹, respectively, indicating that sediment accumulation and distribution in the reservoir are influenced by reservoir configuration and significant contributions from bedload. All three locations indicate an increase in modern sediment accumulation rates. Erosion pin results show variability in stream bank erosion, with values ranging from 4.7 to 11.3 cm year⁻¹. These results indicate that urban growth and the decline in vegetative cover have increased sediment fluxes from the watershed and pose a significant threat to the long-term sustainability of the Ni Reservoir as urbanization continues to increase.
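The revised universal soil loss equation (RUSLE) used in this study is a simple product of five empirical factors. A minimal sketch, with purely hypothetical factor values (not the study's data) contrasting forested and barren cover:

```python
def rusle_soil_loss(R, K, LS, C, P):
    """Revised Universal Soil Loss Equation: average annual soil loss
    A (Mg ha^-1 yr^-1) as the product of five empirical factors.
    R: rainfall-runoff erosivity, K: soil erodibility,
    LS: slope length/steepness, C: cover-management, P: support practice."""
    return R * K * LS * C * P

# Hypothetical factor values for two land covers on the same hillslope:
forest = rusle_soil_loss(R=300, K=0.03, LS=1.2, C=0.003, P=1.0)
barren = rusle_soil_loss(R=300, K=0.03, LS=1.2, C=0.45, P=1.0)
print(round(forest, 3), round(barren, 2))  # 0.032 4.86
```

The two-orders-of-magnitude difference comes entirely from the cover-management factor C, which is why land cover dominates the watershed erosion estimates.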

  19. Broadcasted Location-Aware Data Cache for Vehicular Application

    Directory of Open Access Journals (Sweden)

    Kenya Sato

    2007-05-01

    Full Text Available There has been increasing interest in the exploitation of advances in information technology, for example, mobile computing and wireless communications in ITS (intelligent transport systems). Classes of applications that can benefit from such an infrastructure include traffic information, roadside businesses, weather reports, entertainment, and so on. There are several wireless communication methods currently available that can be utilized for vehicular applications, such as cellular phone networks, DSRC (dedicated short-range communication), and digital broadcasting. While a cellular phone network is relatively slow and DSRC covers only a very small communication area, high-performance digital broadcasting for mobile hosts has been available since the one-segment digital terrestrial broadcasting service was launched in Japan in 2006. However, broadcast delivery methods have the drawback that clients need to wait for the required data items to appear on the broadcast channel. In this paper, we propose a new cache system to effectively prefetch and replace broadcast data using “scope” (an available area of location-dependent data) and “mobility specification” (a schedule according to the direction in which a mobile host moves). We numerically evaluate the cache system on a model close to the traffic road environment, and implement an emulation system to evaluate this location-aware data delivery method for a concrete vehicular application that delivers geographic road map data to a car navigation system.
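The core idea of prioritizing broadcast items whose "scope" lies near the host's predicted path can be sketched as a scoring rule. This is only an illustration in the spirit of the abstract, not the authors' algorithm; all names and parameters are invented:

```python
import math

def prefetch_score(item_center, item_radius, position, heading, speed, horizon):
    """Toy scoring rule: sample the host's predicted positions over a
    time horizon (its 'mobility specification') and score a broadcast
    item by how close its circular 'scope' comes to that path.
    Closer to the path => higher (less negative) score."""
    px, py = position
    best = math.inf
    for t in range(horizon + 1):              # sample future positions
        fx = px + speed * t * math.cos(heading)
        fy = py + speed * t * math.sin(heading)
        d = math.hypot(item_center[0] - fx, item_center[1] - fy)
        best = min(best, max(0.0, d - item_radius))
    return -best

# A host at the origin heading east: an item ahead outranks one behind,
# so the cache would prefetch the former and evict the latter.
ahead  = prefetch_score((500, 0), 100, (0, 0), heading=0.0, speed=15, horizon=60)
behind = prefetch_score((-500, 0), 100, (0, 0), heading=0.0, speed=15, horizon=60)
print(ahead > behind)  # True
```

A replacement policy built on such a score evicts the lowest-scoring cached items first, which captures the paper's intuition that data behind a moving vehicle quickly loses value.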

  20. Geochemical analysis of atlantic rim water, carbon county, wyoming: New applications for characterizing coalbed natural gas reservoirs

    Science.gov (United States)

    McLaughlin, J.F.; Frost, C.D.; Sharma, Shruti

    2011-01-01

    Coalbed natural gas (CBNG) production typically requires the extraction of large volumes of water from target formations, thereby influencing any associated reservoir systems. We describe isotopic tracers that provide immediate data on the presence or absence of biogenic natural gas and identify whether methane-containing reservoirs are hydrologically confined. Isotopes of dissolved inorganic carbon and strontium, along with water quality data, were used to characterize the CBNG reservoirs and hydrogeologic systems of Wyoming's Atlantic Rim. Water was analyzed from a stream, springs, and CBNG wells. Strontium isotopic composition and major ion geochemistry identify two groups of surface water samples. Muddy Creek and Mesaverde Group spring samples are Ca-Mg-SO4-type water with higher 87Sr/86Sr, reflecting relatively young groundwater recharged from precipitation in the Sierra Madre. Groundwaters emitted from the Lewis Shale springs are Na-HCO3-type waters with lower 87Sr/86Sr, reflecting sulfate reduction and more extensive water-rock interaction. Methanogenically enriched δ13CDIC was used to distinguish coalbed waters from other natural waters. Enriched δ13CDIC, between -3.6 and +13.3‰, identified spring water that likely originates from Mesaverde coalbed reservoirs. Strongly positive δ13CDIC, between +12.6 and +22.8‰, identified those coalbed reservoirs that are confined, whereas lower δ13CDIC, between +0.0 and +9.9‰, identified wells within unconfined reservoir systems. Copyright © 2011. The American Association of Petroleum Geologists. All rights reserved.

  1. XRootd, disk-based, caching proxy for optimization of data access, data placement and data replication

    International Nuclear Information System (INIS)

    Bauerdick, L A T; Bloom, K; Bockelman, B; Bradley, D C; Dasu, S; Dost, J M; Sfiligoi, I; Tadel, A; Tadel, M; Wuerthwein, F; Yagil, A

    2014-01-01

    Following the success of the XRootd-based US CMS data federation, the AAA project investigated extensions of the federation architecture by developing two sample implementations of an XRootd, disk-based, caching proxy. The first one simply starts fetching a whole file as soon as a file open request is received and is suitable when completely random file access is expected or it is already known that a whole file will be read. The second implementation supports on-demand downloading of partial files. Extensions to the Hadoop Distributed File System have been developed to allow for an immediate fallback to network access when local HDFS storage fails to provide the requested block. Both cache implementations are in pre-production testing at UCSD.
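The on-demand partial-file approach can be sketched generically as a block-granularity cache: only the byte ranges a client actually requests are fetched from the origin, and repeated reads are served locally. This is an invented illustration of the idea, not XRootd's code:

```python
class CachingProxy:
    """Toy block-level caching proxy: fetch only the blocks a client
    asks for, cache them, and serve repeat reads from the cache."""
    def __init__(self, origin_read, block_size=4):
        self.origin_read = origin_read  # callable: block index -> bytes
        self.block_size = block_size
        self.cache = {}                 # block index -> bytes
        self.origin_fetches = 0

    def read(self, offset, length):
        first = offset // self.block_size
        last = (offset + length - 1) // self.block_size
        data = b""
        for b in range(first, last + 1):
            if b not in self.cache:     # cache miss: go to origin
                self.cache[b] = self.origin_read(b)
                self.origin_fetches += 1
            data += self.cache[b]
        start = offset - first * self.block_size
        return data[start:start + length]

origin = b"abcdefghijklmnop"
proxy = CachingProxy(lambda b: origin[b*4:(b+1)*4])
print(proxy.read(2, 6))        # b'cdefgh' -- fetches blocks 0 and 1
print(proxy.read(4, 4))        # b'efgh'   -- served entirely from cache
print(proxy.origin_fetches)    # 2
```

The whole-file variant described first in the abstract corresponds to fetching every block eagerly on open; the block-wise variant above trades that simplicity for less wasted bandwidth on sparse access patterns.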

  2. A New Caching Technique to Support Conjunctive Queries in P2P DHT

    Science.gov (United States)

    Kobatake, Koji; Tagashira, Shigeaki; Fujita, Satoshi

    P2P DHT (Peer-to-Peer Distributed Hash Table) is one of typical techniques for realizing an efficient management of shared resources distributed over a network and a keyword search over such networks in a fully distributed manner. In this paper, we propose a new method for supporting conjunctive queries in P2P DHT. The basic idea of the proposed technique is to share a global information on past trials by conducting a local caching of search results for conjunctive queries and by registering the fact to the global DHT. Such a result caching is expected to significantly reduce the amount of transmitted data compared with conventional schemes. The effect of the proposed method is experimentally evaluated by simulation. The result of experiments indicates that by using the proposed method, the amount of returned data is reduced by 60% compared with conventional P2P DHT which does not support conjunctive queries.
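The result-caching idea can be sketched as follows: keyword postings live in the DHT, and the intersection computed for an AND-query is registered back into the DHT under a canonical combined key, so identical later queries (in any keyword order) skip the per-keyword lookups. This is a single-process toy illustrating the idea, not the paper's protocol; all names are invented:

```python
import hashlib

class ToyDHT:
    """Flat dict standing in for a distributed hash table."""
    def __init__(self):
        self.table = {}
    def _key(self, s):
        return hashlib.sha1(s.encode()).hexdigest()
    def put(self, k, v):
        self.table[self._key(k)] = v
    def get(self, k):
        return self.table.get(self._key(k))

def conjunctive_query(dht, keywords):
    """Answer an AND-query; cache the result under a canonical key so
    later identical queries are served from the result cache."""
    ckey = "&".join(sorted(keywords))        # order-independent key
    cached = dht.get(ckey)
    if cached is not None:
        return cached, True                  # served from result cache
    sets = [dht.get(k) or set() for k in keywords]
    result = set.intersection(*sets) if sets else set()
    dht.put(ckey, result)                    # register result in the DHT
    return result, False

dht = ToyDHT()
dht.put("cache", {"doc1", "doc2", "doc3"})
dht.put("p2p", {"doc2", "doc3", "doc4"})
r1, hit1 = conjunctive_query(dht, ["cache", "p2p"])
r2, hit2 = conjunctive_query(dht, ["p2p", "cache"])
print(sorted(r1), hit1)   # ['doc2', 'doc3'] False
print(sorted(r2), hit2)   # ['doc2', 'doc3'] True
```

Sorting the keywords before hashing makes the cache hit regardless of query order, which is what lets past trials be shared globally.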

  3. Big Data Caching for Networking: Moving from Cloud to Edge

    OpenAIRE

    Zeydan, Engin; Baştuğ, Ejder; Bennis, Mehdi; Kader, Manhal Abdel; Karatepe, Alper; Er, Ahmet Salih; Debbah, Mérouane

    2016-01-01

    In order to cope with the relentless data tsunami in 5G wireless networks, current approaches such as acquiring new spectrum, deploying more base stations (BSs) and increasing nodes in mobile packet core networks are becoming ineffective in terms of scalability, cost and flexibility. In this regard, context-aware 5G networks with edge/cloud computing and exploitation of big data analytics can yield significant gains to mobile operators. In this article, proactive content caching in...

  4. APPLICATION OF INTEGRATED RESERVOIR MANAGEMENT AND RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Jack Bergeron; Tom Blasingame; Louis Doublet; Mohan Kelkar; George Freeman; Jeff Callard; David Moore; David Davies; Richard Vessell; Brian Pregger; Bill Dixon; Bryce Bezant

    2000-03-01

    Reservoir performance and characterization are vital parameters during the development phase of a project. Infill drilling of wells on a uniform spacing, without regard to characterization, does not optimize development because it fails to account for the complex nature of reservoir heterogeneities present in many low permeability reservoirs, especially carbonate reservoirs. These reservoirs are typically characterized by: (1) large, discontinuous pay intervals; (2) vertical and lateral changes in reservoir properties; (3) low reservoir energy; (4) high residual oil saturation; and (5) low recovery efficiency. Operational problems encountered in these types of reservoirs include: (1) poor or inadequate completions and stimulations; (2) early water breakthrough; (3) poor reservoir sweep efficiency in contacting oil throughout the reservoir as well as in the nearby well regions; (4) channeling of injected fluids due to preferential fracturing caused by excessive injection rates; and (5) limited data availability and poor data quality. Infill drilling operations need only target areas of the reservoir which will be economically successful. If the most productive areas of a reservoir can be accurately identified by combining the results of geological, petrophysical, reservoir performance, and pressure transient analyses, then this ''integrated'' approach can be used to optimize reservoir performance during secondary and tertiary recovery operations without resorting to ''blanket'' infill drilling methods. New and emerging technologies such as geostatistical modeling, rock typing, and rigorous decline type curve analysis can be used to quantify reservoir quality and the degree of interwell communication. These results can then be used to develop a 3-D simulation model for prediction of infill locations. The application of reservoir surveillance techniques to identify additional reservoir ''pay'' zones

  5. Environmental analysis of geopressured-geothermal prospect areas, Brazoria and Kenedy Counties, Texas

    Energy Technology Data Exchange (ETDEWEB)

    White, W.A.; McGraw, M.; Gustavson, T.C.

    1978-01-01

    Preliminary environmental data, including current land use, substrate lithology, soils, natural hazards, water resources, biological assemblages, meteorological data, and regulatory considerations have been collected and analyzed for approximately 150 km² of land: (1) near Chocolate Bayou, Brazoria County, Texas, where a geopressured-geothermal test well was drilled in 1978, and (2) near the rural community of Armstrong, Kenedy County, Texas, where future geopressured-geothermal test well development may occur. The study was designed to establish an environmental data base and to determine, within spatial constraints set by subsurface reservoir conditions, environmentally suitable sites for geopressured-geothermal wells.

  6. Delivery Time Minimization in Edge Caching: Synergistic Benefits of Subspace Alignment and Zero Forcing

    KAUST Repository

    Kakar, Jaber; Alameer, Alaa; Chaaban, Anas; Sezgin, Aydin; Paulraj, Arogyaswami

    2017-01-01

    We study the fundamental limits of a cache-aided wireless network consisting of one central base station, M transceivers, and K receivers from a latency-centric perspective. We use the normalized delivery time (NDT) to capture the per-bit latency for the worst-case file

  7. Reservoir characterization of Pennsylvanian sandstone reservoirs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, M.

    1995-02-01

    This final report summarizes the progress during the three years of a project on Reservoir Characterization of Pennsylvanian Sandstone Reservoirs. The report is divided into three sections: (i) reservoir description; (ii) scale-up procedures; (iii) outcrop investigation. The first section describes the methods by which a reservoir can be described in three dimensions. The next step in reservoir description is to scale up reservoir properties for flow simulation. The second section addresses the issue of scale-up of reservoir properties once the spatial descriptions of properties are created. The last section describes the investigation of an outcrop.
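The scale-up of reservoir properties mentioned in the second section can be illustrated with the classic averaging bounds for layered permeability. This is a generic textbook illustration, not the report's procedure; the layer values below are invented:

```python
def arithmetic_mean(ks):
    """Effective permeability for flow parallel to layers of equal
    thickness: the arithmetic average (upper bound)."""
    return sum(ks) / len(ks)

def harmonic_mean(ks):
    """Effective permeability for flow perpendicular to the layers:
    the harmonic average (lower bound)."""
    return len(ks) / sum(1.0 / k for k in ks)

layers_md = [500.0, 50.0, 5.0]   # hypothetical layer permeabilities (mD)
print(round(arithmetic_mean(layers_md), 1))  # 185.0
print(round(harmonic_mean(layers_md), 1))    # 13.5
```

The order-of-magnitude gap between the two averages for the same layers shows why the upscaled value handed to a flow simulator depends strongly on flow direction and bedding architecture.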

  8. 76 FR 8978 - Proposed Flood Elevation Determinations

    Science.gov (United States)

    2011-02-16

    [Flattened table fragment from the Federal Register notice, listing proposed flood elevations by city/town/county, source of flooding, location, and elevation in meters (MSL), existing and modified. Recoverable entries include the Cache Creek Settling Basin in the Unincorporated Areas of Yolo County, California, and the Town of Wolcottville, Unincorporated Areas of LaGrange County (entire shoreline, elevation +901).]

  9. Impact of cache memory on accelerating the execution of a face detection algorithm in embedded systems

    Directory of Open Access Journals (Sweden)

    Alejandro Cabrera Aldaya

    2012-06-01

    Full Text Available This paper analyzes the impact of cache memory on accelerating the execution of the Viola-Jones face detection algorithm on a processing system based on a Microblaze processor embedded in an FPGA. The algorithm is presented, a software implementation is described, and its most relevant functions and the locality characteristics of its instructions and data are analyzed. The impact of the instruction and data caches is analyzed, both in terms of capacity (between 2 and 16 kB) and line size (4 and 8 words). The results, obtained on a Spartan3A Starter Kit development board based on a Spartan3A XC3S700A FPGA, with the Microblaze processor running at 62.5 MHz and 64 MB of external DDR2 memory at 125 MHz, show a greater impact for the instruction cache than for the data cache, with optimal sizes of 8 kB for the instruction cache and between 4 and 16 kB for the data cache. With these caches, a 17-fold speedup is achieved relative to executing the algorithm from external memory. The cache line size has little influence on the speedup.

  10. Effectiveness of caching in a distributed digital library system

    DEFF Research Database (Denmark)

    Hollmann, J.; Ardø, Anders; Stenstrom, P.

    2007-01-01

Today independent publishers are offering digital libraries with fulltext archives. In an attempt to provide a single user-interface to a large set of archives, the studied Article-Database-Service offers a consolidated interface to a geographically distributed set of archives. While this approach ... as manifested by gateways that implement the interfaces to the many fulltext archives. A central research question in this approach is: What is the nature of locality in the user access stream to such a digital library? Based on access logs that drive the simulations, it is shown that client-side caching can...

  11. Reservoir management

    International Nuclear Information System (INIS)

    Satter, A.; Varnon, J.E.; Hoang, M.T.

    1992-01-01

    A reservoir's life begins with exploration leading to discovery followed by delineation of the reservoir, development of the field, production by primary, secondary and tertiary means, and finally to abandonment. Sound reservoir management is the key to maximizing economic operation of the reservoir throughout its entire life. Technological advances and rapidly increasing computer power are providing tools to better manage reservoirs and are increasing the gap between good and neutral reservoir management. The modern reservoir management process involves goal setting, planning, implementing, monitoring, evaluating, and revising plans. Setting a reservoir management strategy requires knowledge of the reservoir, availability of technology, and knowledge of the business, political, and environmental climate. Formulating a comprehensive management plan involves depletion and development strategies, data acquisition and analyses, geological and numerical model studies, production and reserves forecasts, facilities requirements, economic optimization, and management approval. This paper provides management, engineers, geologists, geophysicists, and field operations staff with a better understanding of the practical approach to reservoir management using a multidisciplinary, integrated team approach


  13. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

    Full Text Available With the rapid development of video surveillance technology, especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in a cloud-based video surveillance system, replicas occupy a large amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, considering the characteristics of video data comprehensively, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behaviors of the users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
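The location-correlation idea behind the cache strategy can be sketched simply: when a user plays back footage from one camera, footage from physically adjacent cameras around the same time window is a likely next request and can be prefetched. This is an invented illustration of that intuition, not the paper's algorithm:

```python
def prefetch_candidates(adjacency, camera, t, window=30):
    """Toy location/time-correlated prefetch: after a playback request
    for `camera` at time `t` (seconds), return (camera, start, end)
    windows worth prefetching from neighbouring cameras."""
    return [(nbr, t - window, t + window) for nbr in adjacency.get(camera, [])]

# Hypothetical camera layout along a corridor: cam1 -- cam2 -- cam3
adjacency = {"cam1": ["cam2"], "cam2": ["cam1", "cam3"], "cam3": ["cam2"]}
print(prefetch_candidates(adjacency, "cam2", t=1000))
# [('cam1', 970, 1030), ('cam3', 970, 1030)]
```

A real system would weight neighbours by historical transition frequency rather than treating all adjacent cameras equally, but the adjacency-plus-time-window structure is the essence of the correlation being exploited.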

  14. dCache, towards Federated Identities & Anonymized Delegation

    Science.gov (United States)

    Ashish, A.; Millar, AP; Mkrtchyan, T.; Fuhrmann, P.; Behrmann, G.; Sahakyan, M.; Adeyemi, O. S.; Starek, J.; Litvintsev, D.; Rossi, A.

    2017-10-01

    For over a decade, dCache has relied on the authentication and authorization infrastructure (AAI) offered by VOMS, Kerberos, Xrootd, etc. Although the established infrastructure has worked well and provided sufficient security, the implementation of procedures and the underlying software is often seen as a burden, especially by smaller communities trying to adopt existing HEP software stacks [1]. Moreover, scientists are increasingly dependent on service portals for data access [2]. In this paper, we describe how federated identity management systems can facilitate the transition from traditional AAI infrastructure to novel solutions like OpenID Connect. We investigate the advantages offered by OpenID Connect with regard to ‘delegation of authentication’ and ‘credential delegation for offline access’. Additionally, we demonstrate how macaroons can provide a more fine-grained authorization mechanism that supports anonymized delegation.
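The macaroon construction mentioned above can be sketched generically: the token's signature is an HMAC chain over an identifier and a list of caveats, so any holder can append further restrictions, but only the minting service (which holds the root key) can verify the result. This is a toy illustration of the general macaroon idea, not dCache's implementation; the key and caveat strings are invented:

```python
import hmac, hashlib

def _chain(key, msg):
    return hmac.new(key, msg.encode(), hashlib.sha256).digest()

def mint(root_key, identifier, caveats):
    """Mint a toy macaroon: sig = HMAC-chain over identifier + caveats."""
    sig = _chain(root_key, identifier)
    for c in caveats:
        sig = _chain(sig, c)          # each caveat folds into the chain
    return identifier, caveats, sig

def verify(root_key, macaroon, satisfied):
    """Recompute the chain with the root key; every caveat must hold."""
    identifier, caveats, sig = macaroon
    expect = _chain(root_key, identifier)
    for c in caveats:
        if not satisfied(c):
            return False
        expect = _chain(expect, c)
    return hmac.compare_digest(expect, sig)

key = b"service-root-key"
m = mint(key, "user:alice", ["path:/data/exp1", "activity:READ"])
print(verify(key, m, lambda c: True))                   # True
print(verify(key, m, lambda c: c != "activity:READ"))   # False
```

Because caveats only ever narrow what the token permits, Alice can hand a further-restricted copy to a portal without revealing her identity or her original credential, which is the "anonymized delegation" property.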

  15. Something different - caching applied to calculation of impedance matrix elements

    CSIR Research Space (South Africa)

    Lysko, AA

    2012-09-01

    Full Text Available ... of the multipliers, the approximating functions are used to obtain any required parameters, such as input impedance or gain pattern, etc. The method is relatively straightforward but, especially for small to medium matrices, requires spending time on filling ... of computing the impedance matrix for the method of moments, or a similar method such as the boundary element method (BEM) [22], with the help of the flowchart shown in Figure 1. [Flowchart steps partially preserved in the record: Input Parameters; (a) search the cached data for a match; (b) a match found ...]

  16. The ''Clinton-Cataract'' potential of Norfolk County--how significant is it

    Energy Technology Data Exchange (ETDEWEB)

    MacDougal, T A

    1973-01-01

    The greatest impact upon the natural gas industry in Norfolk County from future urbanization is the increase of potentially new markets for the distributing utility. In 1958 Norfolk County was a net exporter of natural gas: it produced 1.479 billion cu ft and consumed only 0.316 billion, for a net export of 1.163 billion cu ft. Thirteen years later, in 1971, Norfolk produced 1.797 billion cu ft, 43% of which was supplied from Lake Erie, and consumed 2.900 billion cu ft, for a net import of 1.103 billion cu ft. With the increased stress on clean air within the heavy industrial sector, the demand for natural gas as a non-polluting fuel should increase substantially in the Nanticoke industrial region. Some of the increased demand for natural gas in the three market sectors could be met through the development of the 766,712 acres which have not been tested. As an economic spin-off, the improved storage potential of the ''Clinton-Cataract'' reservoirs through high-energy fracturing could be utilized as local gas storage reservoirs to meet peak-load market demands.

  17. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin)

    Energy Technology Data Exchange (ETDEWEB)

    Andrew G. Cole; George B. Asquith; Jose I. Guzman; Mark D. Barton; Mohammad A. Malik; Shirley P. Dutton; Sigrid J. Clift

    1998-04-01

    The objective of this Class III project is to demonstrate that detailed reservoir characterization of clastic reservoirs in basinal sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover more of the original oil in place by strategic infill-well placement and geologically based enhanced oil recovery. The study focused on the Ford Geraldine unit, which produces from the upper Bell Canyon Formation (Ramsey sandstone). Reservoirs in this and other Delaware Mountain Group fields have low producibility (average recovery <14 percent of the original oil in place) because of a high degree of vertical and lateral heterogeneity caused by depositional processes and post-depositional diagenetic modification. Outcrop analogs were studied to better interpret the depositional processes that formed the reservoirs at the Ford Geraldine unit and to determine the dimensions of reservoir sandstone bodies. Facies relationships and bedding architecture within a single genetic unit exposed in outcrop in Culberson County, Texas, suggest that the sandstones were deposited in a system of channels and levees with attached lobes that initially prograded basinward, aggraded, and then turned around and stepped back toward the shelf. Channel sandstones are 10 to 60 ft thick and 300 to 3,000 ft wide. The flanking levees have a wedge-shaped geometry and are composed of interbedded sandstone and siltstone; thickness varies from 3 to 20 ft and length from several hundred to several thousands of feet. The lobe sandstones are broad lens-shaped bodies; thicknesses range up to 30 ft with aspect ratios (width/thickness) of 100 to 10,000. Lobe sandstones may be interstratified with laminated siltstones.

  18. Pattern recognition for cache management in distributed medical imaging environments.

    Science.gov (United States)

    Viana-Ferreira, Carlos; Ribeiro, Luís; Matos, Sérgio; Costa, Carlos

    2016-02-01

    Traditionally, medical imaging repositories have been supported by indoor infrastructures with huge operational costs. This paradigm is changing thanks to cloud outsourcing, which not only brings technological advantages but also facilitates inter-institutional workflows. However, communication latency is one main problem in this kind of approach, since we are dealing with tremendous volumes of data. To minimize the impact of this issue, caching and prefetching are commonly used. The effectiveness of these mechanisms is highly dependent on their capability of accurately selecting the objects that will be needed soon. This paper describes a pattern recognition system based on artificial neural networks with incremental learning to evaluate, from a set of usage patterns, which one fits the user behavior at a given time. The accuracy of the pattern recognition model under distinct training conditions was also evaluated. The solution was tested with a real-world dataset and a synthesized dataset, showing that incremental learning is advantageous. Even with very immature initial models, trained with just 1 week of data samples, the overall accuracy was very similar to the value obtained when using 75% of the long-term data for training the models. Preliminary results demonstrate an effective reduction in communication latency when using the proposed solution to feed a prefetching mechanism. The proposed approach is very interesting for cache replacement and prefetching policies due to the good results obtained since the first deployment moments.
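The "classify the current usage pattern, then keep learning from each new sample" loop can be sketched with a much simpler learner than the paper's neural networks. This nearest-centroid sketch only illustrates the incremental classify-and-update structure; the feature names and pattern labels are invented:

```python
class IncrementalCentroids:
    """Minimal incremental pattern learner: classify a feature vector by
    nearest centroid, then fold each labelled sample into its centroid's
    running mean (the paper uses neural networks; this is only a sketch
    of the incremental-learning loop)."""
    def __init__(self):
        self.centroids = {}   # pattern label -> (mean vector, count)

    def classify(self, x):
        def sqdist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda k: sqdist(self.centroids[k][0]))

    def update(self, label, x):
        mean, n = self.centroids.get(label, ([0.0] * len(x), 0))
        mean = [(m * n + v) / (n + 1) for m, v in zip(mean, x)]
        self.centroids[label] = (mean, n + 1)

# Hypothetical features: (images accessed in session, mean gap in seconds)
model = IncrementalCentroids()
model.update("sequential-study", (120.0, 1.0))
model.update("spot-check", (3.0, 40.0))
print(model.classify((100.0, 2.0)))              # sequential-study
model.update("sequential-study", (100.0, 2.0))   # keep learning online
```

The classification drives the prefetcher (e.g. fetch the whole study ahead of time for "sequential-study" behavior), while the update step is what lets an immature model improve after deployment, matching the paper's finding that one week of samples already performs well.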

  19. 75 FR 13301 - Los Vaqueros Reservoir Expansion, Contra Costa and Alameda Counties, CA

    Science.gov (United States)

    2010-03-19

    AGENCY: Bureau of Reclamation, Interior. ACTION: Notice of availability of ... Copies may be obtained from Ms. Sharon McHale, Bureau of Reclamation, 2800 Cottage Way, Sacramento, CA 95825; by calling 916-...; or at the Bureau of Reclamation, Denver Office Library, Building 67, Room 167, Denver Federal ...

  20. A Comparison between Fixed Priority and EDF Scheduling accounting for Cache Related Pre-emption Delays

    Directory of Open Access Journals (Sweden)

    Will Lunniss

    2014-04-01

    Full Text Available In multitasking real-time systems, the choice of scheduling algorithm is an important factor to ensure that response time requirements are met while maximising limited system resources. Two popular scheduling algorithms include fixed priority (FP) and earliest deadline first (EDF). While they have been studied in great detail before, they have not been compared when taking into account cache related pre-emption delays (CRPD). Memory and cache are split into a number of blocks containing instructions and data. During a pre-emption, cache blocks from the pre-empting task can evict those of the pre-empted task. When the pre-empted task is resumed, if it then has to re-load the evicted blocks, CRPD are introduced which then affect the schedulability of the task. In this paper we compare FP and EDF scheduling algorithms in the presence of CRPD using the state-of-the-art CRPD analysis. We find that when CRPD is accounted for, the performance gains offered by EDF over FP, while still notable, are diminished. Furthermore, we find that under scenarios that cause relatively high CRPD, task layout optimisation techniques can be applied to allow FP to schedule tasksets at a similar processor utilisation to EDF, thus making the choice of the task layout in memory as important as the choice of scheduling algorithm. This is very relevant for industry, as it is much cheaper and simpler to adjust the task layout through the linker than it is to switch the scheduling algorithm.

  1. Geometric Algorithms for Private-Cache Chip Multiprocessors

    DEFF Research Database (Denmark)

    Ajwani, Deepak; Sitchinava, Nodari; Zeh, Norbert

    2010-01-01

We study techniques for obtaining efficient algorithms for geometric problems on private-cache chip multiprocessors. We show how to obtain optimal algorithms for interval stabbing counting, 1-D range counting, weighted 2-D dominance counting, and for computing 3-D maxima, 2-D lower envelopes, and 2-D convex hulls. These results are obtained by analyzing adaptations of either the PEM merge sort algorithm or PRAM algorithms. For the second group of problems—orthogonal line segment intersection reporting, batched range reporting, and related problems—more effort is required. What distinguishes these problems from the ones in the previous group is the variable output size, which requires I/O-efficient load balancing strategies based on the contribution of the individual input elements to the output size. To obtain nearly optimal algorithms for these problems, we introduce a parallel distribution...

  2. Observations of territorial breeding common ravens caching eggs of greater sage-grouse

    Science.gov (United States)

    Howe, Kristy B.; Coates, Peter S.

    2015-01-01

    Previous investigations using continuous video monitoring of greater sage-grouse Centrocercus urophasianus nests have unambiguously identified common ravens Corvus corax as an important egg predator within the western United States. The quantity of greater sage-grouse eggs an individual common raven consumes during the nesting period and the extent to which common ravens actively hunt greater sage-grouse nests are largely unknown. However, some evidence suggests that territorial breeding common ravens, rather than nonbreeding transients, are most likely responsible for nest depredations. We describe greater sage-grouse egg depredation observations obtained opportunistically from three common raven nests located in Idaho and Nevada where depredated greater sage-grouse eggs were found at or in the immediate vicinity of the nest site, including the caching of eggs in nearby rock crevices. We opportunistically monitored these nests by counting and removing depredated eggs and shell fragments from the nest sites during each visit to determine the extent to which the common raven pairs preyed on greater sage-grouse eggs. To our knowledge, our observations represent the first evidence that breeding, territorial pairs of common ravens cache greater sage-grouse eggs and are capable of depredating multiple greater sage-grouse nests.

  3. XRootd, disk-based, caching-proxy for optimization of data-access, data-placement and data-replication

    CERN Document Server

    Tadel, Matevz

    2013-01-01

Following the smashing success of the XRootd-based USCMS data federation, the AAA project investigated extensions of the federation architecture by developing two sample implementations of an XRootd, disk-based, caching proxy. The first one simply starts fetching a whole file as soon as a file-open request is received and is suitable when completely random file access is expected or it is already known that the whole file will be read. The second implementation supports on-demand downloading of partial files. Extensions to the Hadoop file system have been developed to allow for an immediate fallback to network access when local HDFS storage fails to provide the requested block. Tools needed to analyze and to tweak block replication factors and to inject downloaded blocks into a running HDFS installation have also been developed. Both cache implementations are in operation at UCSD and several tests were also performed at UNL and UW-M. Operational experience and applications to automatic storage healing and opportunistic compu...
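The second, on-demand implementation can be pictured as a block-granular cache that serves hits locally and falls back to the remote source on a miss. The sketch below is a toy model with a hypothetical API; it is unrelated to the actual XRootd or HDFS code.

```python
class CachingProxy:
    """Toy on-demand, block-granular caching proxy. Blocks are fetched from
    the remote source on first access and served from the local cache on
    every later access (names and API are illustrative only)."""
    def __init__(self, remote_read, block_size=4):
        self.remote_read = remote_read      # fn(offset, length) -> bytes
        self.block_size = block_size
        self.cache = {}                     # block index -> bytes
        self.remote_fetches = 0
    def read(self, offset, length):
        end = offset + length
        first = offset // self.block_size
        last = (end - 1) // self.block_size
        out = b""
        for b in range(first, last + 1):
            if b not in self.cache:         # miss: fall back to the remote source
                self.remote_fetches += 1
                self.cache[b] = self.remote_read(b * self.block_size,
                                                 self.block_size)
            out += self.cache[b]
        skip = offset - first * self.block_size
        return out[skip:skip + length]
```

A second read over already-cached blocks triggers no further remote fetches, which is the behaviour that makes such a proxy attractive for repeated partial-file access.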

  4. Population genetic structure and its implications for adaptive variation in memory and the hippocampus on a continental scale in food-caching black-capped chickadees.

    Science.gov (United States)

    Pravosudov, V V; Roth, T C; Forister, M L; Ladage, L D; Burg, T M; Braun, M J; Davidson, B S

    2012-09-01

    Food-caching birds rely on stored food to survive the winter, and spatial memory has been shown to be critical in successful cache recovery. Both spatial memory and the hippocampus, an area of the brain involved in spatial memory, exhibit significant geographic variation linked to climate-based environmental harshness and the potential reliance on food caches for survival. Such geographic variation has been suggested to have a heritable basis associated with differential selection. Here, we ask whether population genetic differentiation and potential isolation among multiple populations of food-caching black-capped chickadees is associated with differences in memory and hippocampal morphology by exploring population genetic structure within and among groups of populations that are divergent to different degrees in hippocampal morphology. Using mitochondrial DNA and 583 AFLP loci, we found that population divergence in hippocampal morphology is not significantly associated with neutral genetic divergence or geographic distance, but instead is significantly associated with differences in winter climate. These results are consistent with variation in a history of natural selection on memory and hippocampal morphology that creates and maintains differences in these traits regardless of population genetic structure and likely associated gene flow. Published 2012. This article is a US Government work and is in the public domain in the USA.

  5. The Caregiver Contribution to Heart Failure Self-Care (CACHS): Further Psychometric Testing of a Novel Instrument.

    Science.gov (United States)

    Buck, Harleah G; Harkness, Karen; Ali, Muhammad Usman; Carroll, Sandra L; Kryworuchko, Jennifer; McGillion, Michael

    2017-04-01

Caregivers (CGs) contribute important assistance with heart failure (HF) self-care, including daily maintenance, symptom monitoring, and management. Until CGs' contributions to self-care can be quantified, it is impossible to characterize them, account for their impact on patient outcomes, or perform meaningful cost analyses. The purpose of this study was to conduct psychometric testing and item reduction on the recently developed 34-item Caregiver Contribution to Heart Failure Self-care (CACHS) instrument using classical and item response theory methods. Fifty CGs (mean age 63 years ±12.84; 70% female) recruited from a HF clinic completed the CACHS in 2014, and results were evaluated using classical test theory and item response theory. Items would be deleted for low (<.05) or high (>.95) endorsement, low (<.7) corrected item-total correlations, significant pairwise correlation coefficients, floor or ceiling effects, relatively low latent trait and item information function levels (<.5), and differential item functioning. After analysis, 14 items were excluded, resulting in a 20-item instrument (self-care maintenance eight items; monitoring seven items; and management five items). Most items demonstrated moderate to high discrimination (median 2.13, minimum .77, maximum 5.05), and appropriate item difficulty (-2.7 to 1.4). Internal consistency reliability was excellent (Cronbach α = .94, average inter-item correlation = .41) with no ceiling effects. The newly developed 20-item version of the CACHS is supported by rigorous instrument development and represents a novel instrument to measure CGs' contribution to HF self-care. © 2016 Wiley Periodicals, Inc.
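Of the classical-test-theory statistics reported, Cronbach's α has a simple closed form that can be computed directly from the item-score matrix. A minimal sketch (pure Python; the data layout, one score list per item, is an assumption for illustration):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from an item-score matrix given as one score list
    per item (all lists equally long, one entry per respondent):
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    def variance(xs):                 # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(it) for it in items)
                          / variance(totals))
```

Two perfectly correlated items yield α = 1, while items that do not covary drive α toward (or below) zero, which is why α serves as an internal-consistency index.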

  6. Cultural Resources Investigation of Eau Galle Reservoir, Pierce and St. Croix Counties, Wisconsin,

    Science.gov (United States)

    1985-05-01

descriptions) 6. Roads, paths, and trails 7. Ditches, irrigation, tiling 8. Stream/channel alteration (of any description) 9. Extensive dredging...not be limited to, the following sections. These sections do not necessarily need to be discrete sections; however, they should be readily discernible...Collection and Treatment System at Granada, Martin County, Minnesota; KBM, Inc.; Archaeological Field Services, Inc.; Principal Investigator. 1979 An

  7. Fortescue reservoir development and reservoir studies

    Energy Technology Data Exchange (ETDEWEB)

    Henzell, S.T.; Hicks, G.J.; Horden, M.J.; Irrgang, H.R.; Janssen, E.J.; Kable, C.W.; Mitchell, R.A.H.; Morrell, N.W.; Palmer, I.D.; Seage, N.W.

    1985-03-01

The Fortescue field in the Gippsland Basin, offshore southeastern Australia is being developed from two platforms (Fortescue A and Cobia A) by Esso Australia Ltd. (operator) and BHP Petroleum. The Fortescue reservoir is a stratigraphic trap at the top of the Latrobe Group of sediments. It overlies the western flank of the Halibut and Cobia fields and is separated from them by a non-net sequence of shales and coals which form a hydraulic barrier between the two systems. Development drilling into the Fortescue reservoir commenced in April 1983 with production coming onstream in May 1983. Fortescue, with booked reserves of 44 stock tank gigalitres (280 million stock tank barrels) of 43° API oil, is the seventh major oil reservoir to be developed in the offshore Gippsland Basin by Esso/BHP. In mid-1984, after drilling a total of 20 exploration and development wells, and after approximately one year of production, a detailed three-dimensional, two-phase reservoir simulation study was performed to examine the recovery efficiency, drainage patterns, pressure performance and production rate potential of the reservoir. The model was validated by history matching an extensive suite of Repeat Formation Test (RFT) pressure data. The results confirmed the reserves basis, and demonstrated that the ultimate oil recovery from the reservoir is not sensitive to production rate. This result is consistent with studies on other high quality Latrobe Group reservoirs in the Gippsland Basin which contain undersaturated crudes and receive very strong water drive from the Basin-wide aquifer system. With the development of the simulation model during the development phase, it has been possible to more accurately define the optimal well pattern for the remainder of the development.

  8. Water resources of Racine and Kenosha Counties, southeastern Wisconsin

    Science.gov (United States)

    Hutchinson, R.D.

    1970-01-01

    Urbanization and changes in regional development in Racine and Kenosha Counties are increasing the need for water-resources information useful for planning and management. The area is fortunate in having abundant supplies of generally good quality water available for present and projected future needs. Lake Michigan and ground-water reservoirs have great potential for increased development. Lake Michigan assures the urbanized area in the eastern part of the two counties of a nearly inexhaustible water supply. In 1967 the cities of Racine and Kenosha pumped an average of 32.6 mgd (million gallons per day) from the lake. Water from Lake Michigan is of the calcium magnesium bicarbonate type, but it is less hard than water from other sources. Discharge from Racine and Kenosha Counties into Lake Michigan is low and has little effect on the lake. The Root and Pike Rivers and a number of smaller streams contribute a mean flow of about 125 cfs (cubic feet per second) to the lake. Ground water, approximately 5 cfs, enters the lake as discharge from springs or as seeps. The Des Plaines, Root, and Pike Rivers drain areas of relatively impermeable silty clay that promotes rapid surface runoff and provides little sustained base flow. Sewage sometimes accounts for most of the base flow of the Root River. In contrast, the Fox River, which drains the western half of the area, has steady and dependable flow derived from the sand and gravel and the Niagara aquifers. Sewage-plant effluent released to the Fox River in 1964 was about 5 percent of the total flow. A 5-mile reach of the Root River loses about 30,000 gpd (gallons per day) per mile to the local ground-water reservoir and is a possible source of ground-water contamination. Thirty-five of the 43 lakes in the area are the visible parts of the groundwater table, and their stages fluctuate with changes in ground-water levels. The rest of the lakes are perched above the ground-water table. Flooding is a recurring but generally

  9. Flood Frequency Analysis of Future Climate Projections in the Cache Creek Watershed

    Science.gov (United States)

    Fischer, I.; Trihn, T.; Ishida, K.; Jang, S.; Kavvas, E.; Kavvas, M. L.

    2014-12-01

Effects of climate change on hydrologic flow regimes, particularly extreme events, necessitate modeling of future flows to best inform water resources management. Future flow projections may be modeled through the joint use of carbon emission scenarios, general circulation models and watershed models. This research effort ran 13 simulations for carbon emission scenarios (taken from the A1, A2 and B1 families) over the 21st century (2001-2100) for the Cache Creek watershed in Northern California. Atmospheric data from general circulation models, CCSM3 and ECHAM5, were dynamically downscaled to a 9 km resolution using MM5, a regional mesoscale model, before being input into the physically based watershed environmental hydrology (WEHY) model. Ensemble mean and standard deviation of simulated flows describe the expected hydrologic system response. Frequency histograms and cumulative distribution functions characterize the range of hydrologic responses that may occur. The modeled flow results comprise a dataset suitable for time series and frequency analysis allowing for more robust system characterization, including indices such as the 100 year flood return period. These results are significant for water quality management as the Cache Creek watershed is severely impacted by mercury pollution from historic mining activities. Extreme flow events control mercury fate and transport affecting the downstream water bodies of the Sacramento River and Sacramento-San Joaquin Delta which provide drinking water to over 25 million people.
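Return-period indices such as the 100-year flood are commonly estimated from an annual-maximum flow series. A minimal sketch using the Weibull plotting position, one of several standard choices (this is a generic illustration, not the frequency analysis performed in the study):

```python
def weibull_return_periods(annual_peaks):
    """Weibull plotting-position estimate of return periods: the m-th
    largest of n annual peak flows is assigned T = (n + 1) / m years."""
    n = len(annual_peaks)
    ranked = sorted(annual_peaks, reverse=True)
    return [(flow, (n + 1) / m) for m, flow in enumerate(ranked, start=1)]
```

With only a short record, the largest observed event gets a return period of n + 1 years, so estimating a 100-year flood from simulated century-long series (as in the study) requires fitting a frequency distribution rather than reading plotting positions directly.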

  10. Advantageous Reservoir Characterization Technology in Extra Low Permeability Oil Reservoirs

    Directory of Open Access Journals (Sweden)

    Yutian Luo

    2017-01-01

This paper took extra low permeability reservoirs in the Dagang Liujianfang Oilfield as an example and analyzed different types of microscopic pore structures by SEM, casting of thin sections, fluorescence microscopy, and other methods. With the adoption of rate-controlled mercury penetration, NMR, and some other advanced techniques, and based on the evaluation parameters of throat radius, volume percentage of mobile fluid, start-up pressure gradient, and clay content, the classification and assessment method for extra low permeability reservoirs was improved and the parameter boundaries of the advantageous reservoirs were established. The physical properties of reservoirs at different depths differ: the clay mineral variation range is 7.0%, the throat radius variation range is 1.81 μm, the start-up pressure gradient variation range is 0.23 MPa/m, and the movable fluid percentage variation range is 17.4%. Class IV reservoirs account for 9.56%, class II reservoirs for 12.16%, and class III reservoirs for 78.29%. According to a comparison of different development methods, class II reservoirs are most suitable for waterflooding development, and class IV reservoirs are most suitable for gas injection development. Combining gas injection in the upper section of the reservoir with water injection in the lower section is expected to achieve the best results.
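A threshold-based classing scheme over the four named evaluation parameters can be sketched as follows. The cutoff values below are hypothetical placeholders chosen for illustration only; they are not the parameter boundaries established in the study.

```python
def classify_reservoir(throat_radius_um, movable_fluid_pct,
                       startup_gradient_mpa_per_m, clay_pct):
    """Illustrative threshold-based reservoir classing on the four
    evaluation parameters named in the record. All cutoffs are
    hypothetical placeholders, NOT the study's boundaries."""
    score = 0
    score += throat_radius_um > 1.0            # wider throats flow better
    score += movable_fluid_pct > 50.0          # more mobile fluid, better recovery
    score += startup_gradient_mpa_per_m < 0.1  # lower start-up pressure gradient
    score += clay_pct < 5.0                    # less clay, less formation damage
    return {4: "II", 3: "III", 2: "III", 1: "IV", 0: "IV"}[score]
```

The point of such a scheme is that each parameter contributes independently, so a reservoir interval can be down-ranked by any single unfavorable property even when the others are good.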

  11. High-speed mapping of water isotopes and residence time in Cache Slough Complex, San Francisco Bay Delta, CA

    Data.gov (United States)

    Department of the Interior — Real-time, high frequency (1-second sample interval) GPS location, water quality, and water isotope (δ2H, δ18O) data was collected in the Cache Slough Complex (CSC),...

  12. Towards Cache-Enabled, Order-Aware, Ontology-Based Stream Reasoning Framework

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Rui; Praggastis, Brenda L.; Smith, William P.; McGuinness, Deborah L.

    2016-08-16

    While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems including C-SPARQL use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance" which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
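The eviction idea, keying on explicit expiration timestamps rather than arrival order alone, with a "semantic importance" score as tie-breaker, can be sketched as a small priority-queue cache. The class and scoring below are illustrative only and do not reflect the C-SPARQL or framework APIs.

```python
import heapq

class ExpiryAwareCache:
    """Sketch of a cache whose eviction uses expiration timestamps assigned
    by the stream source, breaking ties by evicting the least 'semantically
    important' entry first. Stale heap entries for overwritten keys are
    tolerated and skipped (names and API are illustrative)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []                  # (expire_at, importance, key)
        self.data = {}
    def expire(self, now):
        # drop every entry whose expiration timestamp has passed
        while self.heap and self.heap[0][0] <= now:
            _, _, k = heapq.heappop(self.heap)
            self.data.pop(k, None)
    def put(self, key, value, expire_at, importance=0.0, now=0.0):
        self.expire(now)
        if key not in self.data and len(self.data) >= self.capacity:
            # evict the soonest-to-expire, least-important live entry
            while self.heap:
                _, _, k = heapq.heappop(self.heap)
                if k in self.data:
                    del self.data[k]
                    break
        self.data[key] = value
        heapq.heappush(self.heap, (expire_at, importance, key))
    def get(self, key, now=0.0):
        self.expire(now)
        return self.data.get(key)
```

Because the heap orders on (expiration, importance), an entry with a long remaining lifetime survives capacity pressure even if it arrived earlier, which is exactly the difference from the arrival-time-only sliding windows criticized in the record.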

  13. Reservoir characterization and final pre-test analysis in support of the compressed-air-energy-storage Pittsfield aquifer field test in Pike County, Illinois

    Energy Technology Data Exchange (ETDEWEB)

    Wiles, L.E.; McCann, R.A.

    1983-06-01

The work reported is part of a field experimental program to demonstrate and evaluate compressed air energy storage in a porous-media aquifer reservoir near Pittsfield, Illinois. The reservoir is described. Numerical modeling of the reservoir was performed concurrently with site development. The numerical models were applied to predict the thermohydraulic performance of the porous-media reservoir. This reservoir characterization and pre-test analysis included evaluation of bubble development, water coning, thermal development, and near-wellbore desaturation. The work was undertaken to define the time required to develop an air storage bubble of adequate size, to assess the specification of instrumentation and above-ground equipment, and to develop and evaluate operational strategies for air cycling. A parametric analysis was performed for the field test reservoir. (LEW)

  14. Application of Integrated Reservoir Management and Reservoir Characterization to Optimize Infill Drilling

    Energy Technology Data Exchange (ETDEWEB)

    P. K. Pande

    1998-10-29

Initial drilling of wells on a uniform spacing, without regard to reservoir performance and characterization, must become a process of the past. Such efforts do not optimize reservoir development, as they fail to account for the complex nature of reservoir heterogeneities present in many low permeability reservoirs, and carbonate reservoirs in particular. These reservoirs are typically characterized by large, discontinuous pay intervals; vertical and lateral changes in reservoir properties; low reservoir energy; high residual oil saturation; and low recovery efficiency.

15. The role of reservoir characterization in the reservoir management process (as reflected in the Department of Energy's reservoir management demonstration program)

    Energy Technology Data Exchange (ETDEWEB)

Fowler, M.L. [BDM-Petroleum Technologies, Bartlesville, OK (United States)]; Young, M.A.; Madden, M.P. [BDM-Oklahoma, Bartlesville, OK (United States)] [and others]

    1997-08-01

Optimum reservoir recovery and profitability result from guidance of reservoir practices provided by an effective reservoir management plan. Success in developing the best, most appropriate reservoir management plan requires knowledge and consideration of (1) the reservoir system including rocks and rock-fluid interactions (i.e., a characterization of the reservoir) as well as wellbores and associated equipment and surface facilities; (2) the technologies available to describe, analyze, and exploit the reservoir; and (3) the business environment under which the plan will be developed and implemented. Reservoir characterization is essential for gaining the knowledge of the reservoir needed to build the reservoir management plan. Reservoir characterization efforts can be appropriately scaled by considering the reservoir management context under which the plan is being built. Reservoir management plans de-optimize with time as technology and the business environment change or as new reservoir information indicates the reservoir characterization models on which the current plan is based are inadequate. BDM-Oklahoma and the Department of Energy have implemented a program of reservoir management demonstrations to encourage operators with limited resources and experience to learn, implement, and disperse sound reservoir management techniques through cooperative research and development projects whose objectives are to develop reservoir management plans. In each of the three projects currently underway, careful attention to reservoir management context assures a reservoir characterization approach that is sufficient, but not in excess of what is necessary, to devise and implement an effective reservoir management plan.

  16. I/O-Optimal Distribution Sweeping on Private-Cache Chip Multiprocessors

    DEFF Research Database (Denmark)

    Ajwani, Deepak; Sitchinava, Nodar; Zeh, Norbert

    2011-01-01

The distribution sweeping framework was introduced recently, and a number of algorithms for problems on axis-aligned objects were obtained using this framework. The obtained algorithms were efficient but not optimal. In this paper, we improve the framework to obtain algorithms with the optimal I/O complexity of O(sortp(N) + K/PB) for a number of problems on axis-aligned objects; P denotes the number of cores/processors, B denotes the number of elements that fit in a cache line, N and K denote the sizes of the input and output, respectively, and sortp(N) denotes the I/O complexity of sorting N items using P processors in the PEM model.

  17. A Cache-Oblivious Implicit Dictionary with the Working Set Property

    DEFF Research Database (Denmark)

    Brodal, Gerth Stølting; Kejlberg-Rasmussen, Casper; Truelsen, Jakob

    2010-01-01

In this paper we present an implicit dictionary with the working set property, i.e. a dictionary supporting insert(e), delete(x) and predecessor(x) in O(log n) time and search(x) in O(log ℓ) time, where n is the number of elements stored in the dictionary and ℓ is the number of distinct elements searched for since the element with key x was last searched for. The dictionary stores the elements in an array of size n using no additional space. In the cache-oblivious model the operations insert(e), delete(x) and...

  18. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    Science.gov (United States)

    Patrick, Christina M.

    2011-01-01

This thesis presents a uniquely designed, end-to-end interference-minimizing, high-performance I/O stack that spans multi-level shared buffer-cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  19. The Preston Geothermal Resources; Renewed Interest in a Known Geothermal Resource Area

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Thomas R. [Univ. of Idaho, Idaho Falls, ID (United States); Worthing, Wade [Univ. of Idaho, Idaho Falls, ID (United States); Cannon, Cody [Univ. of Idaho, Idaho Falls, ID (United States); Palmer, Carl [Univ. of Idaho, Idaho Falls, ID (United States); Neupane, Ghanashyam [Idaho National Lab. (INL), Idaho Falls, ID (United States); McLing, Travis L [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Div.; Mattson, Earl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Div.; Dobson, Patric [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Div.; Conrad, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Div.

    2015-01-01

The Preston Geothermal prospect is located in northern Cache Valley approximately 8 kilometers north of the city of Preston, in southeast Idaho. The Cache Valley is a structural graben of the northern portion of the Basin and Range Province, just south of the border with the Eastern Snake River Plain (ESRP). This is a known geothermal resource area (KGRA) that was evaluated in the 1970s by the State of Idaho Department of Water Resources (IDWR) and by exploratory wells drilled by Sunedco Energy Development. The resource is poorly defined, but current interpretations suggest that it is associated with the Cache Valley structural graben. Thermal waters moving upward along steeply dipping, northwest-trending basin and range faults emanate in numerous hot springs in the area. Springs reach temperatures as high as 84° C. Traditional geothermometry models estimated reservoir temperatures of approximately 125° C in the 1970s study. In January 2014, interest in the area was renewed when a water well drilled to 79 m (260 ft) yielded a bottom-hole temperature of 104° C (217° F). The well was sampled in June 2014 to investigate the chemical composition of the water for geothermometry modeling of reservoir temperature. Traditional magnesium-corrected Na-K-Ca geothermometry estimates that this new well taps water from a thermal reservoir at 227° C (440° F). Even without the application of improved predictive methods, the results indicate much higher temperatures present at much shallower depths than previously thought. These new data provide strong support for further investigation and sampling of wells and springs in the northern Cache Valley, proposed for the summer of 2015. The water samples will be analyzed using a new multicomponent equilibrium geothermometry (MEG) tool called Reservoir Temperature Estimate (RTEst) to obtain an improved estimate of the reservoir temperature. The new data suggest that other KGRAs and overlooked areas may need
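For reference, the uncorrected Na-K-Ca geothermometer of Fournier and Truesdell (1973) has a closed form that can be sketched directly. This is a sketch only: the magnesium correction mentioned in the record (Fournier and Potter, 1979) is omitted, and the constants should be verified against the original publication before any real use.

```python
from math import log10, sqrt

def na_k_ca_temperature(na_mgkg, k_mgkg, ca_mgkg):
    """Na-K-Ca geothermometer (Fournier and Truesdell, 1973),
    concentrations in mg/kg:
      T(degC) = 1647 / (log10(Na/K) + beta*(log10(sqrt(Ca)/Na) + 2.06)
                        + 2.47) - 273.15
    with beta = 4/3, falling back to beta = 1/3 when the bracketed term
    is negative or the 4/3 estimate exceeds 100 degC (standard convention;
    the Mg correction used in the study is not applied here)."""
    bracket = log10(sqrt(ca_mgkg) / na_mgkg) + 2.06
    beta = 1.0 / 3.0 if bracket < 0 else 4.0 / 3.0
    t = 1647.0 / (log10(na_mgkg / k_mgkg) + beta * bracket + 2.47) - 273.15
    if beta > 1.0 and t > 100.0:     # recompute with beta = 1/3 above 100 degC
        t = 1647.0 / (log10(na_mgkg / k_mgkg) + bracket / 3.0 + 2.47) - 273.15
    return t
```

For an illustrative water with Na = 460, K = 20, Ca = 10 mg/kg, the sketch yields roughly 160° C, showing how cation ratios alone, without a downhole measurement, anchor a reservoir temperature estimate.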

  20. Flood of February 1980 along the Agua Fria River, Maricopa County, Arizona

    Science.gov (United States)

    Thomsen, B.W.

    1980-01-01

    The flood of February 20, 1980, along the Agua Fria River below Waddell Dam, Maricopa County, Ariz., was caused by heavy rains during February 13-20. The runoff filled Lake Pleasant and resulted in the largest release--66,600 cubic feet per second--from the reservoir since it was built in 1927; the maximum inflow to the reservoir was about 73,300 cubic feet per second. The area inundated by the releases includes about 28 miles along the channel from the mouth of the Agua Fria River to the Beardsley Canal flume crossing 5 miles downstream from Waddell Dam. The flood of 1980 into Lake Pleasant has a recurrence interval of about 47 years, whereas the flood of record (1919) has a recurrence interval of about 100 years. (USGS)

  1. Using dCache in Archiving Systems oriented to Earth Observation

    Science.gov (United States)

    Garcia Gil, I.; Perez Moreno, R.; Perez Navarro, O.; Platania, V.; Ozerov, D.; Leone, R.

    2012-04-01

The object of the LAST activity (Long term data Archive Study on new Technologies) is to perform an independent study on best practices and assessment of different archiving technologies mature for operation in the short and mid-term time frame, or available in the long term, with emphasis on technologies better suited to satisfy the requirements of ESA, LTDP and other European and Canadian EO partners in terms of digital information preservation and data accessibility and exploitation. During the last phase of the project, testing of several archiving solutions was performed in order to evaluate their suitability. In particular, dCache was evaluated: it provides a file-system tree view of the data repository, exchanging data with backend (tertiary) storage systems, and offers space management, pool attraction, dataset replication, hot-spot determination, and recovery from disk or node failures. Connected to a tertiary storage system, dCache simulates unlimited direct-access storage space; data exchanges to and from the underlying HSM are performed automatically and invisibly to the user. dCache was created to meet the requirements of large computing centers and universities with large amounts of data, which pooled their efforts and founded EMI (European Middleware Initiative). dCache is now mature enough to be deployed and is used by several major research centers (e.g., the LHC, storing up to 50 TB/day). This solution has not been used so far in Earth Observation, and the results of the study are summarized in this article, focusing on its capabilities, over a simulated environment, to get in line with the ESA requirements for geographically distributed storage. The challenge of a geographically distributed storage system can be summarized as the way to provide maximum quality of storage and dissemination services at minimum cost.

  2. Efficiently GPU-accelerating long kernel convolutions in 3-D DIRECT TOF PET reconstruction via memory cache optimization

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Sungsoo; Mueller, Klaus [Stony Brook Univ., NY (United States). Center for Visual Computing; Matej, Samuel [Pennsylvania Univ., Philadelphia, PA (United States). Dept. of Radiology

    2011-07-01

The DIRECT represents a novel approach to 3-D Time-of-Flight (TOF) PET reconstruction. Its novelty stems from the fact that it performs all iterative predictor-corrector operations directly in image space. The projection operations now amount to convolutions in image space, using long TOF (resolution) kernels. While for spatially invariant kernels the computational complexity can be algorithmically overcome by replacing spatial convolution with multiplication in Fourier space, spatially variant kernels cannot use this shortcut. Therefore in this paper, we describe a GPU-accelerated approach for this task. However, the intricate parallel architecture of GPUs poses its own challenges, and careful memory and thread management is the key to obtaining optimal results. As convolution is mainly memory-bound, we focus on the former, proposing two types of memory caching schemes that ensure the best cache-memory re-use by the parallel threads. In contrast to our previous two-stage algorithm, the schemes presented here are both single-stage, which is more accurate. (orig.)
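Why the Fourier shortcut fails for spatially variant kernels is easy to see in a direct-space sketch: the kernel is a function of the output position, so no single pointwise multiplication in Fourier space represents the operator. The code below is an illustrative CPU-side model, not the paper's GPU implementation.

```python
def variant_convolve(signal, kernel_at):
    """Direct-space convolution with a spatially variant kernel:
    kernel_at(i) returns the (odd-length) kernel used at output position i,
    so the operator is not one shift-invariant filter and has no single
    Fourier-domain equivalent. Boundaries are zero-padded."""
    out = [0.0] * len(signal)
    for i in range(len(signal)):
        kernel = kernel_at(i)
        r = len(kernel) // 2
        for t, w in enumerate(kernel):
            j = i + t - r                # input sample tapped by this weight
            if 0 <= j < len(signal):
                out[i] += w * signal[j]
    return out
```

Every output sample re-reads a neighborhood of the input, which is why the operation is memory-bound and why caching schemes that keep those neighborhoods resident for groups of threads pay off on a GPU.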

  3. Post Waterflood CO2 Miscible Flood in Light Oil, Fluvial-Dominated Deltaic Reservoir (Pre-Work and Project Proposal), Class I

    Energy Technology Data Exchange (ETDEWEB)

    Bou-Mikael, Sami

    2002-02-05

    This project outlines a proposal to improve the recovery of light oil from waterflooded fluvial dominated deltaic (FDD) reservoir through a miscible carbon dioxide (CO2) flood. The site is the Port Neches Field in Orange County, Texas. The field is well explored and well exploited. The project area is 270 acres within the Port Neches Field.

  4. Behavior characterization of the shared last-level cache in a chip multiprocessor

    OpenAIRE

    Benedicte Illescas, Pedro

    2014-01-01

This project consists in analyzing different aspects of the memory hierarchy and understanding their influence on overall system performance. The aspects that will be analyzed are the cache replacement algorithms, the memory mapping schemes, and the memory page policies.

  5. Analysis of real-time reservoir monitoring : reservoirs, strategies, & modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Mani, Seethambal S.; van Bloemen Waanders, Bart Gustaaf; Cooper, Scott Patrick; Jakaboski, Blake Elaine; Normann, Randy Allen; Jennings, Jim (University of Texas at Austin, Austin, TX); Gilbert, Bob (University of Texas at Austin, Austin, TX); Lake, Larry W. (University of Texas at Austin, Austin, TX); Weiss, Chester Joseph; Lorenz, John Clay; Elbring, Gregory Jay; Wheeler, Mary Fanett (University of Texas at Austin, Austin, TX); Thomas, Sunil G. (University of Texas at Austin, Austin, TX); Rightley, Michael J.; Rodriguez, Adolfo (University of Texas at Austin, Austin, TX); Klie, Hector (University of Texas at Austin, Austin, TX); Banchs, Rafael (University of Texas at Austin, Austin, TX); Nunez, Emilio J. (University of Texas at Austin, Austin, TX); Jablonowski, Chris (University of Texas at Austin, Austin, TX)

    2006-11-01

The project objective was to detail better ways to assess and exploit intelligent oil and gas field information through improved modeling, sensor technology, and process control to increase ultimate recovery of domestic hydrocarbons. To meet this objective we investigated the use of permanent downhole sensor systems (Smart Wells) whose data are fed in real time into computational reservoir models that are integrated with optimized production control systems. The project utilized a three-pronged approach: (1) a value-of-information analysis to address the economic advantages, (2) reservoir simulation modeling and control optimization to prove the capability, and (3) evaluation of new-generation sensor packaging to survive the borehole environment for long periods of time. The Value of Information (VOI) decision tree method was developed and used to assess the economic advantage of using the proposed technology; the VOI demonstrated the increased subsurface resolution gained through additional sensor data. Our findings show that VOI studies are a practical means of ascertaining the value associated with a technology, in this case the application of sensors to production. The procedure acknowledges the uncertainty in predictions but nevertheless assigns monetary value to the predictions. The best aspect of the procedure is that it builds consensus within interdisciplinary teams. The reservoir simulation and modeling aspect of the project was developed to show the capability of exploiting sensor information both for reservoir characterization and to optimize control of the production system. Our findings indicate history matching is improved as more information is added to the objective function, clearly indicating that sensor information can help in reducing the uncertainty associated with reservoir characterization. Additional findings and approaches used are described in detail within the report. The next generation sensors aspect of the project evaluated sensors and packaging
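The VOI decision-tree idea reduces to comparing expected values with and without the information. A toy sketch, with hypothetical probabilities and payoffs not taken from the report:

```python
def expected_value(outcomes):
    """Expected monetary value of a decision branch:
    sum of probability * payoff."""
    return sum(p * v for p, v in outcomes)

# Hypothetical two-state prospect (values in $M): 50% good reservoir
# (+10), 50% poor (-2). Without information we either develop blind
# or walk away, whichever has the higher expected value.
ev_without = max(expected_value([(0.5, 10.0), (0.5, -2.0)]), 0.0)

# With perfect sensor information we develop only in the good state.
ev_with = 0.5 * 10.0 + 0.5 * 0.0
voi = ev_with - ev_without
print(ev_without, ev_with, voi)  # 4.0 5.0 1.0
```

The value of (perfect) information is the expected gain from being able to condition the development decision on what the sensors reveal.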

  6. The water-quality monitoring program for the Baltimore reservoir system, 1981-2007—Description, review and evaluation, and framework integration for enhanced monitoring

    Science.gov (United States)

    Koterba, Michael T.; Waldron, Marcus C.; Kraus, Tamara E.C.

    2011-01-01

    The City of Baltimore, Maryland, and parts of five surrounding counties obtain their water from Loch Raven and Liberty Reservoirs. A third reservoir, Prettyboy, is used to resupply Loch Raven Reservoir. Management of the watershed conditions for each reservoir is a shared responsibility by agreement among City, County, and State jurisdictions. The most recent (2005) Baltimore Reservoir Watershed Management Agreement (RWMA) called for continued and improved water-quality monitoring in the reservoirs and selected watershed tributaries. The U.S. Geological Survey (USGS) conducted a retrospective review of the effectiveness of monitoring data obtained and analyzed by the RWMA jurisdictions from 1981 through 2007 to help identify possible improvements in the monitoring program to address RWMA water-quality concerns. Long-term water-quality concerns include eutrophication and sedimentation in the reservoirs, and elevated concentrations of (a) nutrients (nitrogen and phosphorus) being transported from the major tributaries to the reservoirs, (b) iron and manganese released from reservoir bed sediments during periods of deep-water anoxia, (c) mercury in higher trophic order game fish in the reservoirs, and (d) bacteria in selected reservoir watershed tributaries. Emerging concerns include elevated concentrations of sodium, chloride, and disinfection by-products (DBPs) in the drinking water from both supply reservoirs. Climate change and variability also could be emerging concerns, affecting seasonal patterns, annual trends, and drought occurrence, which historically have led to declines in reservoir water quality. Monitoring data increasingly have been used to support the development of water-quality models. The most recent (2006) modeling helped establish an annual sediment Total Maximum Daily Load to Loch Raven Reservoir, and instantaneous and 30-day moving average water-quality endpoints for chlorophyll-a (chl-a) and dissolved oxygen (DO) in Loch Raven and Prettyboy

  7. Large reservoirs: Chapter 17

    Science.gov (United States)

    Miranda, Leandro E.; Bettoli, Phillip William

    2010-01-01

    Large impoundments, defined as those with surface area of 200 ha or greater, are relatively new aquatic ecosystems in the global landscape. They represent important economic and environmental resources that provide benefits such as flood control, hydropower generation, navigation, water supply, commercial and recreational fisheries, and various other recreational and esthetic values. Construction of large impoundments was initially driven by economic needs, and ecological consequences received little consideration. However, in recent decades environmental issues have come to the forefront. In the closing decades of the 20th century societal values began to shift, especially in the developed world. Society is no longer willing to accept environmental damage as an inevitable consequence of human development, and it is now recognized that continued environmental degradation is unsustainable. Consequently, construction of large reservoirs has virtually stopped in North America. Nevertheless, in other parts of the world construction of large reservoirs continues. The emergence of systematic reservoir management in the early 20th century was guided by concepts developed for natural lakes (Miranda 1996). However, we now recognize that reservoirs are different and that reservoirs are not independent aquatic systems inasmuch as they are connected to upstream rivers and streams, the downstream river, other reservoirs in the basin, and the watershed. Reservoir systems exhibit longitudinal patterns both within and among reservoirs. Reservoirs are typically arranged sequentially as elements of an interacting network, filter water collected throughout their watersheds, and form a mosaic of predictable patterns. Traditional approaches to fisheries management such as stocking, regulating harvest, and in-lake habitat management do not always produce desired effects in reservoirs. As a result, managers may expend resources with little benefit to either fish or fishing. 
Some locally

  8. Understanding the True Stimulated Reservoir Volume in Shale Reservoirs

    KAUST Repository

    Hussain, Maaruf

    2017-06-06

Successful exploitation of shale reservoirs largely depends on the effectiveness of the hydraulic fracturing stimulation program. Favorable results have been attributed to intersection and reactivation of pre-existing fractures by hydraulically induced fractures that connect the wellbore to a larger fracture surface area within the reservoir rock volume. Thus, accurate estimation of the stimulated reservoir volume (SRV) becomes critical for reservoir performance simulation and production analysis. Micro-seismic (MS) events have been commonly used as a proxy to map out the SRV geometry, which could be erroneous because not all MS events are related to hydraulic fracture propagation. The case studies discussed here utilized a fully 3-D simulation approach to estimate the SRV. The simulation approach presented in this paper takes into account the real-time changes in the reservoir's geomechanics as a function of fluid pressures. It consists of four separate coupled modules: geomechanics, hydrodynamics, a geomechanical joint model for interfacial resolution, and adaptive re-meshing. Reservoir stress conditions, rock mechanical properties, and injected fluid pressure dictate how fracture elements open or slide. A critical stress intensity factor was used as the fracture criterion governing the generation of new fractures or the propagation of existing fractures and their directions. Our simulations were run on a Cray XC-40 HPC system. The studies' outcomes proved the approach of using MS data as a proxy for SRV to be significantly flawed. Many of the observed stimulated natural fractures are stress related, and very few, those closer to the injection field, are connected. The situation is worsened in a highly laminated shale reservoir, where hydraulic fracture propagation is significantly hampered. High contrast in the in-situ stresses caused strike-slip to develop, thereby shortening the extent of the SRV. However, far-field natural fractures that were not connected to
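The critical stress intensity factor criterion mentioned above can be sketched as follows; the geometry factor, net pressure, and toughness value are illustrative numbers, not taken from the study:

```python
import math

def mode_i_sif(stress, half_length, geometry=1.0):
    """Mode-I stress intensity factor K_I = Y * sigma * sqrt(pi * a)
    for a crack of half-length a; Y is a geometry factor."""
    return geometry * stress * math.sqrt(math.pi * half_length)

def propagates(stress, half_length, k_ic, geometry=1.0):
    """Critical-SIF criterion: a fracture element extends (or a new
    fracture forms) once K_I reaches the fracture toughness K_IC."""
    return mode_i_sif(stress, half_length, geometry) >= k_ic

# Illustrative numbers: 2 MPa net pressure on a 0.5 m half-length crack
# against a toughness of 1.5 MPa*sqrt(m).
print(propagates(2.0e6, 0.5, 1.5e6))  # True
```

In the coupled simulation this check is evaluated per fracture element each time step, with the stress field supplied by the geomechanics module.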

  9. Post Waterflood CO2 Miscible Flood in Light Oil, Fluvial-Dominated Deltaic Reservoir (Pre-Work and Project Proposal), Class I; FINAL

    International Nuclear Information System (INIS)

    Bou-Mikael, Sami

    2002-01-01

This project outlines a proposal to improve the recovery of light oil from a waterflooded, fluvial-dominated deltaic (FDD) reservoir through a miscible carbon dioxide (CO2) flood. The site is the Port Neches Field in Orange County, Texas. The field is well explored and well exploited. The project area is 270 acres within the Port Neches Field.

  10. Effect of reservoir heterogeneity on air injection performance in a light oil reservoir

    Directory of Open Access Journals (Sweden)

    Hu Jia

    2018-03-01

Full Text Available Air injection is a good option for developing light oil reservoirs. As is well known, reservoir heterogeneity has a great effect on various EOR processes, and this also applies to air injection. However, the oil recovery mechanisms and physical processes of air injection in heterogeneous reservoirs with a dip angle are still not well understood. Reported settings of reservoir heterogeneity in physical or simulation models of air injection simply use different layer permeabilities for the porous media. In practice, reservoir heterogeneity follows the principles of geostatistics. How much contrast in permeability actually challenges air injection in a light oil reservoir? This should be investigated using layered porous-media settings of the classical Dykstra-Parsons style. Unfortunately, no work has addressed this issue for air injection in light oil reservoirs. In this paper, reservoir heterogeneity is quantified using different reservoir permeability distributions according to the classical Dykstra-Parsons coefficient method. The aim of this work is to investigate the effect of reservoir heterogeneity on the physical process and production performance of air injection in a light oil reservoir through a numerical reservoir simulation approach. The basic model is calibrated based on a previous study. In total, eleven pseudo-components are included in the model, and ten reactions are proposed to represent the reaction scheme. Results show that the oil recovery factor decreases with increasing reservoir heterogeneity for both air and N2 injection from the updip location, which works against air injection from the updip location. Reservoir heterogeneity can sometimes have a positive effect, improving sweep efficiency and enhancing production performance for air injection. Injection of air with high O2 content can benefit the oil recovery factor, but it also leads to early O2 breakthrough in heterogeneous reservoirs. Well
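The classical Dykstra-Parsons coefficient used above to quantify heterogeneity can be estimated from a sample of layer permeabilities; this is a generic sketch assuming the standard log-normal construction, not the paper's exact workflow:

```python
import math

def dykstra_parsons(perms):
    """Classical Dykstra-Parsons coefficient V = (k50 - k84.1) / k50,
    estimated from a layer-permeability sample by fitting a log-normal
    distribution (equivalently V = 1 - exp(-sigma_ln_k))."""
    logs = [math.log(k) for k in perms]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
    sigma = math.sqrt(var)
    k50 = math.exp(mu)             # median permeability
    k841 = math.exp(mu - sigma)    # permeability exceeded by 84.1% of layers
    return (k50 - k841) / k50

# V = 0 for a homogeneous layer set; V approaches 1 as contrast grows.
print(dykstra_parsons([100.0] * 4))                       # 0.0
print(round(dykstra_parsons([10, 50, 100, 500, 1000]), 2))
```

A heterogeneity study of the kind described would build layered models spanning a range of V values and compare recovery across them.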

  11. Advanced Oil Recovery Technologies for Improved Recovery from Slope Basin Clastic Reservoirs, Nash Draw Brushy Canyon Pool, Eddy County, New Mexico

    International Nuclear Information System (INIS)

    Murphy, Mark B.

    1999-01-01

    The overall objective of this project is to demonstrate that a development program based on advanced reservoir management methods can significantly improve oil recovery at the Nash Draw Pool (NDP). The plan includes developing a control area using standard reservoir management techniques and comparing its performance to an area developed using advanced reservoir management methods. Specific goals are (1) to demonstrate that an advanced development drilling and pressure maintenance program can significantly improve oil recovery compared to existing technology applications and (2) to transfer these advanced methodologies to oil and gas producers in the Permian Basin and elsewhere throughout the U.S. oil and gas industry

  12. Mercury and methylmercury concentrations and loads in the Cache Creek watershed, California

    Energy Technology Data Exchange (ETDEWEB)

    Domagalski, Joseph L.; Alpers, Charles N.; Slotton, Darell G.; Suchanek, Thomas H.; Ayers, Shaun M

    2004-07-05

Concentrations and loads of total mercury and methylmercury were measured in streams draining abandoned mercury mines and in the proximity of geothermal discharge in the Cache Creek watershed of California during a 17-month period from January 2000 through May 2001. Rainfall and runoff were lower than long-term averages during the study period. The greatest loading of mercury and methylmercury from upstream sources to downstream receiving waters, such as San Francisco Bay, generally occurred during or after winter rainfall events. During the study period, loads of mercury and methylmercury from geothermal sources tended to be greater than those from abandoned mining areas, a pattern attributable to the lack of large precipitation events capable of mobilizing significant amounts of either mercury-laden sediment or dissolved mercury and methylmercury from mine waste. Streambed sediments of Cache Creek are a significant source of mercury and methylmercury to downstream receiving bodies of water. Much of the mercury in these sediments is the result of deposition over the last 100-150 years by either storm-water runoff from abandoned mines or continuous discharges from geothermal areas. Several geochemical constituents were useful as natural tracers for mining and geothermal areas, including the aqueous concentrations of boron, chloride, lithium and sulfate, and the stable isotopes of hydrogen and oxygen in water. Stable isotopes of water in areas draining geothermal discharges showed a distinct trend toward enrichment of ¹⁸O compared with meteoric waters, whereas much of the runoff from abandoned mines indicated a stable isotopic pattern more consistent with local meteoric water.

  13. Mercury and methylmercury concentrations and loads in the Cache Creek watershed, California

    International Nuclear Information System (INIS)

    Domagalski, Joseph L.; Alpers, Charles N.; Slotton, Darell G.; Suchanek, Thomas H.; Ayers, Shaun M.

    2004-01-01

Concentrations and loads of total mercury and methylmercury were measured in streams draining abandoned mercury mines and in the proximity of geothermal discharge in the Cache Creek watershed of California during a 17-month period from January 2000 through May 2001. Rainfall and runoff were lower than long-term averages during the study period. The greatest loading of mercury and methylmercury from upstream sources to downstream receiving waters, such as San Francisco Bay, generally occurred during or after winter rainfall events. During the study period, loads of mercury and methylmercury from geothermal sources tended to be greater than those from abandoned mining areas, a pattern attributable to the lack of large precipitation events capable of mobilizing significant amounts of either mercury-laden sediment or dissolved mercury and methylmercury from mine waste. Streambed sediments of Cache Creek are a significant source of mercury and methylmercury to downstream receiving bodies of water. Much of the mercury in these sediments is the result of deposition over the last 100-150 years by either storm-water runoff from abandoned mines or continuous discharges from geothermal areas. Several geochemical constituents were useful as natural tracers for mining and geothermal areas, including the aqueous concentrations of boron, chloride, lithium and sulfate, and the stable isotopes of hydrogen and oxygen in water. Stable isotopes of water in areas draining geothermal discharges showed a distinct trend toward enrichment of ¹⁸O compared with meteoric waters, whereas much of the runoff from abandoned mines indicated a stable isotopic pattern more consistent with local meteoric water.

  14. Amplitude various angles (AVA) phenomena in thin layer reservoir: Case study of various reservoirs

    Energy Technology Data Exchange (ETDEWEB)

Nurhandoko, Bagus Endar B., E-mail: bagusnur@bdg.centrin.net.id, E-mail: bagusnur@rock-fluid.com [Wave Inversion and Subsurface Fluid Imaging Research Laboratory (WISFIR), Basic Science Center A, 4th floor, Physics Dept., FMIPA, Institut Teknologi Bandung (Indonesia); Rock Fluid Imaging Lab., Bandung (Indonesia)]; Susilowati, E-mail: bagusnur@bdg.centrin.net.id, E-mail: bagusnur@rock-fluid.com [Rock Fluid Imaging Lab., Bandung (Indonesia)]

    2015-04-16

Amplitude variation with offset is widely used in petroleum exploration as well as in petroleum field development. Generally, the treatment of amplitude at various angles assumes the reservoir layer is quite thick, which amounts to assuming a very high-frequency wave. In nature, however, the seismic wave is band limited and of quite low frequency. Therefore, amplitude variation with angle in thin-layer reservoirs, together with a low-frequency assumption, is important to consider. A thin-layer reservoir is one whose thickness is about a quarter wavelength or less. In this paper, I study the reflection phenomena of elastic waves, considering interference from the thin reservoir layer and the transmitted wave. I apply the Zoeppritz equations to model the wave reflected from the top of the reservoir, the wave reflected from its bottom, and the elastic wave transmitted through the reservoir. Results show that AVA phenomena in thin-layer reservoirs are frequency dependent: the thin layer causes interference between the waves reflected from its top and bottom. In practice, however, these phenomena are frequently neglected, and inattention to the interference caused by a thin layer may lead to inaccurate reservoir characterization from AVA. The relation between AVA reservoir classes and reservoir character differs between thin and thick reservoirs. I present several AVA phenomena, including cross plots, for various thin-reservoir types based on rock physics data from Indonesia.
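The frequency dependence described above can be illustrated with a simple two-interface interference model; this sketch neglects transmission losses and internal multiples and is not the paper's full Zoeppritz treatment:

```python
import cmath
import math

def thin_bed_reflectivity(r_top, r_bot, thickness, velocity, freq):
    """Magnitude of the composite reflection from a thin bed: the top
    reflection interferes with the bottom reflection delayed by the
    two-way traveltime through the bed (transmission losses and
    internal multiples neglected; an illustrative sketch)."""
    twt = 2.0 * thickness / velocity                      # two-way time, s
    phase = cmath.exp(-1j * 2.0 * math.pi * freq * twt)   # delay of bottom event
    return abs(r_top + r_bot * phase)

# Opposite-polarity top/bottom reflections: the composite response is
# frequency and thickness dependent and vanishes as the bed thins.
for t in (20.0, 10.0, 2.0):   # bed thickness in metres
    print(round(thin_bed_reflectivity(0.1, -0.1, t, 2500.0, 30.0), 4))
```

The same bed gives a different composite amplitude at each frequency, which is the band-limited interference effect the abstract warns is often neglected.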

  15. Amplitude various angles (AVA) phenomena in thin layer reservoir: Case study of various reservoirs

    International Nuclear Information System (INIS)

Nurhandoko, Bagus Endar B. (Wave Inversion and Subsurface Fluid Imaging Research Laboratory (WISFIR), Basic Science Center A, 4th floor, Physics Dept., FMIPA, Institut Teknologi Bandung (Indonesia); Rock Fluid Imaging Lab., Bandung (Indonesia)); Susilowati

    2015-01-01

Amplitude variation with offset is widely used in petroleum exploration as well as in petroleum field development. Generally, the treatment of amplitude at various angles assumes the reservoir layer is quite thick, which amounts to assuming a very high-frequency wave. In nature, however, the seismic wave is band limited and of quite low frequency. Therefore, amplitude variation with angle in thin-layer reservoirs, together with a low-frequency assumption, is important to consider. A thin-layer reservoir is one whose thickness is about a quarter wavelength or less. In this paper, I study the reflection phenomena of elastic waves, considering interference from the thin reservoir layer and the transmitted wave. I apply the Zoeppritz equations to model the wave reflected from the top of the reservoir, the wave reflected from its bottom, and the elastic wave transmitted through the reservoir. Results show that AVA phenomena in thin-layer reservoirs are frequency dependent: the thin layer causes interference between the waves reflected from its top and bottom. In practice, however, these phenomena are frequently neglected, and inattention to the interference caused by a thin layer may lead to inaccurate reservoir characterization from AVA. The relation between AVA reservoir classes and reservoir character differs between thin and thick reservoirs. I present several AVA phenomena, including cross plots, for various thin-reservoir types based on rock physics data from Indonesia.

  16. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Non-relational "NoSQL" databases such as Cassandra and CouchDB are best known for their ability to scale to large numbers of clients spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects, is based on traditional SQL databases but also has the same high scalability and wide-area distributability for an important subset of applications. This paper compares the architectures, behavior, performance, and maintainability of the two different approaches and identifies the criteria for choosing which approach to prefer over the other.

  17. Caching-Aided Collaborative D2D Operation for Predictive Data Dissemination in Industrial IoT

    OpenAIRE

    Orsino, Antonino; Kovalchukov, Roman; Samuylov, Andrey; Moltchanov, Dmitri; Andreev, Sergey; Koucheryavy, Yevgeni; Valkama, Mikko

    2018-01-01

    Industrial automation deployments constitute challenging environments where moving IoT machines may produce high-definition video and other heavy sensor data during surveying and inspection operations. Transporting massive contents to the edge network infrastructure and then eventually to the remote human operator requires reliable and high-rate radio links supported by intelligent data caching and delivery mechanisms. In this work, we address the challenges of contents dissemination in chara...

  18. Agricultural Influences on Cache Valley, Utah Air Quality During a Wintertime Inversion Episode

    Science.gov (United States)

    Silva, P. J.

    2017-12-01

    Several of northern Utah's intermountain valleys are classified as non-attainment for fine particulate matter. Past data indicate that ammonium nitrate is the major contributor to fine particles and that the gas phase ammonia concentrations are among the highest in the United States. During the 2017 Utah Winter Fine Particulate Study, USDA brought a suite of online and real-time measurement methods to sample particulate matter and potential gaseous precursors from agricultural emissions in the Cache Valley. Instruments were co-located at the State of Utah monitoring site in Smithfield, Utah from January 21st through February 12th, 2017. A Scanning mobility particle sizer (SMPS) and aerodynamic particle sizer (APS) acquired size distributions of particles from 10 nm - 10 μm in 5-min intervals. A URG ambient ion monitor (AIM) gave hourly concentrations for gas and particulate ions and a Chromatotec Trsmedor gas chromatograph obtained 10 minute measurements of gaseous sulfur species. High ammonia concentrations were detected at the Smithfield site with concentrations above 100 ppb at times, indicating a significant influence from agriculture at the sampling site. Ammonia is not the only agricultural emission elevated in Cache Valley during winter, as reduced sulfur gas concentrations of up to 20 ppb were also detected. Dimethylsulfide was the major sulfur-containing gaseous species. Analysis indicates that particle growth and particle nucleation events were both observed by the SMPS. Relationships between gas and particulate concentrations and correlations between the two will be discussed.

  19. Design issues and caching strategies for CD-ROM-based multimedia storage

    Science.gov (United States)

    Shastri, Vijnan; Rajaraman, V.; Jamadagni, H. S.; Venkat-Rangan, P.; Sampath-Kumar, Srihari

    1996-03-01

    CD-ROMs have proliferated as a distribution media for desktop machines for a large variety of multimedia applications (targeted for a single-user environment) like encyclopedias, magazines and games. With CD-ROM capacities up to 3 GB being available in the near future, they will form an integral part of Video on Demand (VoD) servers to store full-length movies and multimedia. In the first section of this paper we look at issues related to the single- user desktop environment. Since these multimedia applications are highly interactive in nature, we take a pragmatic approach, and have made a detailed study of the multimedia application behavior in terms of the I/O request patterns generated to the CD-ROM subsystem by tracing these patterns. We discuss prefetch buffer design and seek time characteristics in the context of the analysis of these traces. We also propose an adaptive main-memory hosted cache that receives caching hints from the application to reduce the latency when the user moves from one node of the hyper graph to another. In the second section we look at the use of CD-ROM in a VoD server and discuss the problem of scheduling multiple request streams and buffer management in this scenario. We adapt the C-SCAN (Circular SCAN) algorithm to suit the CD-ROM drive characteristics and prove that it is optimal in terms of buffer size management. We provide computationally inexpensive relations by which this algorithm can be implemented. We then propose an admission control algorithm which admits new request streams without disrupting the continuity of playback of the previous request streams. The algorithm also supports operations such as fast forward and replay. Finally, we discuss the problem of optimal placement of MPEG streams on CD-ROMs in the third section.
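The base C-SCAN ordering that the paper adapts can be sketched generically; the CD-ROM-specific seek timing, buffer-size relations, and admission control belong to the paper itself:

```python
def c_scan_order(head, requests):
    """Base C-SCAN (circular SCAN) ordering: service all requests at or
    ahead of the head position in one direction, then wrap around to
    the lowest-numbered request and continue in the same direction.
    A sketch of the generic algorithm, not the paper's CD-ROM variant."""
    ahead = sorted(r for r in requests if r >= head)
    behind = sorted(r for r in requests if r < head)
    return ahead + behind

print(c_scan_order(50, [10, 90, 60, 30, 70]))  # [60, 70, 90, 10, 30]
```

Servicing in a single sweep direction with a wrap-around keeps per-stream service intervals predictable, which is what makes the buffer-size analysis tractable.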

  20. An Integrated Study of the Grayburg/San Andres Reservoir, Foster and South Cowden Fields, Ector County, Texas, Class II

    Energy Technology Data Exchange (ETDEWEB)

    Trentham, Robert C.; Weinbrandt, Richard; Robinson, William C.; Widner, Kevin

    2001-05-03

    The objectives of the project were to: (1) Thoroughly understand the 60-year history of the field. (2) Develop a reservoir description using geology and 3D seismic. (3) Isolate the upper Grayburg in wells producing from multiple intervals to stop cross flow. (4) Re-align and optimize the upper Grayburg waterflood. (5) Determine well condition, identify re-frac candidates, evaluate the effectiveness of well work and obtain bottom hole pressure data for simulation utilizing pressure transient testing field wide. (6) Quantitatively integrate all the data to guide the field operations, including identification of new well locations utilizing reservoir simulation.

  1. Multi-data reservoir history matching for enhanced reservoir forecasting and uncertainty quantification

    KAUST Repository

    Katterbauer, Klemens

    2015-04-01

    Reservoir simulations and history matching are critical for fine-tuning reservoir production strategies, improving understanding of the subsurface formation, and forecasting remaining reserves. Production data have long been incorporated for adjusting reservoir parameters. However, the sparse spatial sampling of this data set has posed a significant challenge for efficiently reducing uncertainty of reservoir parameters. Seismic, electromagnetic, gravity and InSAR techniques have found widespread applications in enhancing exploration for oil and gas and monitoring reservoirs. These data have however been interpreted and analyzed mostly separately, rarely exploiting the synergy effects that could result from combining them. We present a multi-data ensemble Kalman filter-based history matching framework for the simultaneous incorporation of various reservoir data such as seismic, electromagnetics, gravimetry and InSAR for best possible characterization of the reservoir formation. We apply an ensemble-based sensitivity method to evaluate the impact of each observation on the estimated reservoir parameters. Numerical experiments for different test cases demonstrate considerable matching enhancements when integrating all data sets in the history matching process. Results from the sensitivity analysis further suggest that electromagnetic data exhibit the strongest impact on the matching enhancements due to their strong differentiation between water fronts and hydrocarbons in the test cases.
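The ensemble Kalman filter update at the core of such history matching can be sketched for a scalar parameter; in practice the forward operator is a full reservoir simulator, and all numbers here are illustrative:

```python
import random

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """One stochastic ensemble Kalman filter analysis step for a scalar
    parameter and a scalar observation. In real history matching the
    forward operator obs_op is a reservoir simulator run."""
    n = len(ensemble)
    preds = [obs_op(x) for x in ensemble]
    xbar = sum(ensemble) / n
    ybar = sum(preds) / n
    cov_xy = sum((x - xbar) * (y - ybar)
                 for x, y in zip(ensemble, preds)) / (n - 1)
    var_y = sum((y - ybar) ** 2 for y in preds) / (n - 1)
    gain = cov_xy / (var_y + obs_var)              # Kalman gain
    # each member assimilates a perturbed copy of the observation
    return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - y)
            for x, y in zip(ensemble, preds)]

rng = random.Random(0)
prior = [rng.gauss(5.0, 2.0) for _ in range(200)]       # prior ensemble
posterior = enkf_update(prior, obs=8.0, obs_op=lambda x: x,
                        obs_var=0.25, rng=rng)
# the posterior ensemble mean moves toward the observed value
```

Incorporating several data types, as the framework does, amounts to repeating this update with each data set's own forward operator and error variance.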

  2. A New Method for Fracturing Wells Reservoir Evaluation in Fractured Gas Reservoir

    Directory of Open Access Journals (Sweden)

    Jianchun Guo

    2014-01-01

Full Text Available Natural fractures are a geological phenomenon widely distributed in tight formations, and the stimulation effect in a fractured gas reservoir mainly depends on the communication of natural fractures. It is therefore necessary to evaluate such reservoirs and to find the wells with optimal natural fracture development. By analyzing the interactions and nonlinear relationships among the parameters, this work establishes a three-level index system for reservoir evaluation and proposes a new gas well reservoir evaluation model for fractured gas reservoirs on the basis of fuzzy logic theory and multilevel gray correlation. In this method, Gaussian membership functions quantify the degree of each factor in the decision-making system, and multilevel gray relations determine the weight of each parameter on the stimulation effect. Finally, through fuzzy arithmetic operations between the multilevel weights and the fuzzy evaluation matrix, a score, a rank, the reservoir quality, and the predicted production are obtained. Results of this new method show that the production coincidence rate of the evaluation reaches 80%, which provides a new way to evaluate fractured gas reservoirs.
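The Gaussian membership quantification and weighted aggregation described above can be sketched as follows; the factor centers, widths, and weights are illustrative placeholders, not the paper's calibrated values:

```python
import math

def gaussian_membership(x, center, width):
    """Gaussian membership degree in [0, 1] for one evaluation factor."""
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))

def weighted_score(memberships, weights):
    """Composite fuzzy score: weight each membership degree and
    normalize; weights would come from multilevel gray correlation."""
    return sum(m * w for m, w in zip(memberships, weights)) / sum(weights)

# Three hypothetical factors for one well: (observed value, ideal center, spread)
factors = [(0.8, 1.0, 0.3), (55.0, 60.0, 15.0), (0.4, 0.5, 0.2)]
weights = [0.5, 0.3, 0.2]   # placeholder gray-relational weights
memberships = [gaussian_membership(x, c, w) for x, c, w in factors]
print(round(weighted_score(memberships, weights), 3))
```

Ranking candidate wells by this composite score is the final step the abstract describes; the closer each factor sits to its ideal center, the higher the score.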

  3. Reservoir Engineering Management Program

    Energy Technology Data Exchange (ETDEWEB)

    Howard, J.H.; Schwarz, W.J.

    1977-12-14

    The Reservoir Engineering Management Program being conducted at Lawrence Berkeley Laboratory includes two major tasks: 1) the continuation of support to geothermal reservoir engineering related work, started under the NSF-RANN program and transferred to ERDA at the time of its formation; 2) the development and subsequent implementation of a broad plan for support of research in topics related to the exploitation of geothermal reservoirs. This plan is now known as the GREMP plan. Both the NSF-RANN legacies and GREMP are in direct support of the DOE/DGE mission in general and the goals of the Resource and Technology/Resource Exploitation and Assessment Branch in particular. These goals are to determine the magnitude and distribution of geothermal resources and reduce risk in their exploitation through improved understanding of generically different reservoir types. These goals are to be accomplished by: 1) the creation of a large data base about geothermal reservoirs, 2) improved tools and methods for gathering data on geothermal reservoirs, and 3) modeling of reservoirs and utilization options. The NSF legacies are more research and training oriented, and the GREMP is geared primarily to the practical development of the geothermal reservoirs. 2 tabs., 3 figs.

  4. Water quality and trend analysis of Colorado--Big Thompson system reservoirs and related conveyances, 1969 through 2000

    Science.gov (United States)

    Stevens, Michael R.

    2003-01-01

The U.S. Geological Survey, in an ongoing cooperative monitoring program with the Northern Colorado Water Conservancy District, Bureau of Reclamation, and City of Fort Collins, has collected water-quality data in north-central Colorado since 1969 in reservoirs and conveyances, such as canals and tunnels, related to the Colorado-Big Thompson Project, a water-storage, collection, and distribution system. Ongoing changes in water use among agricultural and municipal users on the eastern slope of the Rocky Mountains in Colorado, changing land use in reservoir watersheds, and other water-quality issues among Northern Colorado Water Conservancy District customers necessitated a reexamination of water-quality trends in the Colorado-Big Thompson system reservoirs and related conveyances. The sampling sites are on reservoirs, canals, and tunnels in the headwaters of the Colorado River (on the western side of the transcontinental diversion operations) and the headwaters of the Big Thompson River (on the eastern side of the transcontinental diversion operations). Carter Lake Reservoir and Horsetooth Reservoir are off-channel water-storage facilities, located in the foothills of the northern Colorado Front Range, for water supplied from the Colorado-Big Thompson Project. The length of water-quality record ranges from approximately 3 to 30 years depending on the site and the type of measurement or constituent. Changes in sampling frequency, analytical methods, and minimum reporting limits have occurred repeatedly over the period of record. The objective of this report was to complete a retrospective water-quality and trend analysis of reservoir profiles, nutrients, major ions, selected trace elements, chlorophyll-a, and hypolimnetic oxygen data from 1969 through 2000 in Lake Granby, Shadow Mountain Lake, and the Granby Pump Canal in Grand County, Colorado, and Horsetooth Reservoir, Carter Lake, Lake Estes, Alva B. Adams Tunnel, and Olympus Tunnel in Larimer County, Colorado

  5. Water resources of Rockland County, New York, 2005-07, with emphasis on the Newark Basin Bedrock Aquifer

    Science.gov (United States)

    Heisig, Paul M.

    2011-01-01

    Concerns over the state of water resources in Rockland County, NY, prompted an assessment of current (2005-07) conditions. The investigation included a review of all water resources but centered on the Newark basin aquifer, a fractured-bedrock aquifer over which nearly 300,000 people reside. Most concern has been focused on this aquifer because of (1) high summer pumping rates, with occasional entrained-air problems and an unexplained water-level decline at a monitoring well, (2) annual withdrawals that have approached or even exceeded previous estimates of aquifer recharge, and (3) numerous contamination problems that have caused temporary or long-term shutdown of production wells. Public water supply in Rockland County uses three sources of water in roughly equal parts: (1) the Newark basin sedimentary bedrock aquifer, (2) alluvial aquifers along the Ramapo and Mahwah Rivers, and (3) surface waters from Lake DeForest Reservoir and a smaller, new reservoir supply in the Highlands part of the county. Water withdrawals from the alluvial aquifer in the Ramapo River valley and the Lake DeForest Reservoir are subject to water-supply application permits that stipulate minimum flows that must be maintained downstream into New Jersey. There is a need, therefore, at a minimum, to prevent any loss of the bedrock-aquifer resource--to maintain it in terms of both sustainable use and water-quality protection. The framework of the Newark basin bedrock aquifer included characterization of (1) the structure and fracture occurrence associated with the Newark basin strata, (2) the texture and thickness of overlying glacial and alluvial deposits, (3) the presence of the Palisades sill and associated basaltic units on or within the Newark basin strata, and (4) the streams that drain the aquifer system. 
The greatest concern regarding sustainability of groundwater resources is the aquifer response to the seasonal increase in pumping rates from May through October (an average increase

  6. Petroleum potential of two sites in Deaf Smith and Swisher Counties, Texas Panhandle: Volume 1: Technical report

    International Nuclear Information System (INIS)

    Rose, P.R.

    1986-09-01

    This is the third in a series of regional geologic studies to assess the petroleum resources of two potentially acceptable sites under study for a nuclear waste disposal facility. Site 1 is in northeastern Deaf Smith County, Texas, and Site 2 is in northeastern Swisher County, Texas. Although potential reservoir zones are present under Site 1, the likelihood of hydrocarbon charge and structural or stratigraphic entrapment is low. The probability of a commercial petroleum discovery is estimated at 1:1000, and the expected net present value of potential production is about $700,000. Little future industry drilling activity is foreseen around Site 1. Five potential reservoir zones are present under Site 2, and some may contain hydrocarbons. Anticlines are adjacent to Site 2 on the northeast, southeast, and northwest, but the middle of the acreage block is synclinal, and its petroleum potential is very low. Discovery probability for the structures adjacent to Site 2 is higher, but the chance of developing commercial production is only about 2:100. Such accumulations might extend into the northeast and southeast corners of the block; the expected net present values of such conjectured reserves are about $1,100,000 and $650,000, respectively. Continued industry activity pursuant to these three structures is expected, including seismic surveys and drilling. Considering the potential loss of petroleum resources through withdrawal of acreage from exploration, and the possibility of adjacent drilling, Site 1 in Deaf Smith County is clearly preferable for location of the proposed nuclear waste disposal facility.

  7. Environmental Impact Statement: Peacekeeper Missile System Deactivation and Dismantlement

    Science.gov (United States)

    2000-12-01

    1996. The remaining supplies are products that have been recovered and reclaimed to a chemically pure state in accordance with ARI-700...Watersheds 10180008 10180009 10180011 10180012 10180013 Glendo Reservoir Middle North Platte River Lower Laramie River Horse Creek Pumpkin Creek North...Reservoir Middle North Platte River Lower Laramie River Horse Creek Pumpkin Creek North Platte River 10190007 10190008 10190009 10190015 10190016 Cache La

  8. Evaluation of an Empirical Reservoir Shape Function to Define Sediment Distributions in Small Reservoirs

    Directory of Open Access Journals (Sweden)

    Bogusław Michalec

    2015-08-01

    Full Text Available Understanding and defining the spatial distribution of sediment deposited in reservoirs is essential not only at the design stage but also during operation. The majority of research concerns the distribution of sediment deposition in medium and large water reservoirs, and most empirical methods do not provide satisfactory results when applied to the determination of sediment deposition in small reservoirs. Small reservoirs' volumes do not exceed 5 × 10⁶ m³, and their capacity-inflow ratio is less than 10%. Long-term silting measurements of three small reservoirs were used to evaluate the method described by Rahmanian and Banihashemi for predicting sediment distributions in small reservoirs. Rahmanian and Banihashemi stated that their model of the distribution of sediment deposition in water reservoirs works well for long-duration operation. In the presented study, the silting rate was used to determine long-duration operation. The silting rate is the quotient of the volume of sediment deposited in the reservoir and its original volume. It was stated that once the silting rate has reached 50%, the sediment deposition in the reservoir may be described by an empirical reservoir depth shape function (RDSF).
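
    The silting-rate criterion in this abstract is simple enough to state in code. A minimal sketch follows; the function names and example volumes are ours, and the RDSF itself is not reproduced:

```python
def silting_rate(deposited_volume, original_volume):
    """Silting rate: quotient of deposited sediment volume and the
    reservoir's original volume (both in the same units, e.g. MCM)."""
    return deposited_volume / original_volume

def rdsf_applicable(deposited_volume, original_volume, threshold=0.5):
    """Per the abstract, once silting reaches 50% the deposition pattern
    may be described by the reservoir depth shape function (RDSF)."""
    return silting_rate(deposited_volume, original_volume) >= threshold

# Hypothetical small reservoir: 2.0 MCM original capacity, 1.1 MCM deposited
print(silting_rate(1.1, 2.0))      # 0.55
print(rdsf_applicable(1.1, 2.0))   # True
```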

  9. Modeling reservoir geomechanics using discrete element method : Application to reservoir monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Alassi, Haitham Tayseer

    2008-09-15

    Understanding reservoir geomechanical behavior is becoming increasingly important for the petroleum industry. Reservoir compaction, which may result in surface subsidence and fault reactivation, occurs during reservoir depletion. Stress changes and possible fracture development inside and outside a depleting reservoir can be monitored using time-lapse (so-called '4D') seismic and/or passive seismic, and this can give valuable information about the conditions of a given reservoir during production. In this study we focus on using the (particle-based) Discrete Element Method (DEM) to model reservoir geomechanical behavior during depletion and fluid injection. We show that DEM can be used in modeling reservoir geomechanical behavior by comparing results obtained from DEM to those obtained from analytical solutions. The match of the displacement field between DEM and the analytical solution is good; however, there is a mismatch in the stress field, which is related to the way stress is measured in DEM. A good match is nevertheless obtained when the stress field is measured carefully. We also use DEM to model reservoir geomechanical behavior beyond the elasticity limit, where fractures can develop and faults can reactivate. A general technique has been developed to relate DEM parameters to rock properties. This is necessary in order to use correct reservoir geomechanical properties during modeling. For any type of particle packing there is a limitation: the maximum ratio between P- and S-wave velocities, Vp/Vs, that can be modeled is 3. The static behavior of a loose packing differs from its dynamic behavior; empirical relations based on numerical test observations are needed for the static behavior, while the dynamic behavior for both dense and loose packings can be given by analytical relations. Cosserat continuum theory is needed to derive relations for Vp and Vs. It is shown that by constraining the particle rotation, the S-wave velocity can be

  10. Estimation of Bank Erosion Due To Reservoir Operation in Cascade (Case Study: Citarum Cascade Reservoir

    Directory of Open Access Journals (Sweden)

    Sri Legowo

    2009-11-01

    Full Text Available Sedimentation is a crucial issue once accumulated sediment begins to fill the reservoir dead storage, as it will then influence long-term reservoir operation. The accumulated sediment requires serious attention because it may affect the storage capacity and other reservoir management activities. The continuous inflow of sediment to the reservoir will decrease the reservoir storage capacity, the reservoir's value in use, and the useful age of the reservoir. Because of that, the sedimentation rate needs to be delayed as much as possible. In this research, the delay of the sedimentation rate is considered based on the rate of landsliding of the reservoir slopes. The rate of slope sliding can be minimized through each reservoir's own operating measures, namely the regulation of fluctuations of the reservoir surface level so that they do not cause sudden drawdown or rise. The research model is compiled using the searching technique of Non-Linear Programming (NLP). The rate of bank erosion for the reservoirs varies from 0.0009 to 0.0048 MCM/year, which is not a significant value threatening the lifetime of the reservoirs. Meanwhile, the rate of watershed sediment has a significant value, i.e., 3.02 MCM/year for Saguling, which will fill the storage capacity within the next 40 years (from year 2008).

  11. Norms for CERAD Constructional Praxis Recall

    Science.gov (United States)

    Fillenbaum, Gerda G.; Burchett, Bruce M.; Unverzagt, Frederick W.; Rexroth, Daniel F.; Welsh-Bohmer, Kathleen

    2012-01-01

    Recall of the 4-item constructional praxis measure was a later addition to the Consortium to Establish a Registry for Alzheimer’s Disease (CERAD) neuropsychological battery. Norms for this measure, based on cognitively intact African Americans age ≥70 (Indianapolis-Ibadan Dementia Project, N=372), European American participants age ≥66 (Cache County Study of Memory, Health and Aging, N=507), and European American CERAD clinic controls age ≥50 (N=182), are presented here. Performance varied by site: by sex, education, and age (African Americans in Indianapolis); by education and age (Cache County European Americans); and by age only (CERAD European American controls). Performance declined with increased age, within age with less education, and was poorer for women. Means, standard deviations, and percentiles are presented separately for each sample. PMID:21992077

  12. Comparison of the Frontier Distributed Database Caching System with NoSQL Databases

    CERN Document Server

    Dykstra, David

    2012-01-01

    One of the main attractions of non-relational "NoSQL" databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also has high scalability and wide-area distributability for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.
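
    The scaling advantage described here comes from read-through caching placed close to the readers, so that repeated reads of the same Conditions data rarely touch the backing SQL database. The sketch below is not Frontier's actual implementation, only a generic illustration of the pattern; the class and variable names are ours:

```python
import time

class ReadThroughCache:
    """Toy read-through cache: serve from cache while an entry is fresh,
    otherwise query the backing (SQL-like) store and cache the result
    with a time-to-live."""
    def __init__(self, backend, ttl_seconds=300):
        self.backend = backend          # callable: key -> value
        self.ttl = ttl_seconds
        self._store = {}                # key -> (value, expiry_time)
        self.hits = self.misses = 0

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry is not None and entry[1] > now:
            self.hits += 1
            return entry[0]
        self.misses += 1
        value = self.backend(key)
        self._store[key] = (value, now + self.ttl)
        return value

# Many readers requesting the same key mostly hit the cache:
db_calls = []
cache = ReadThroughCache(lambda k: db_calls.append(k) or f"row-for-{k}")
for _ in range(1000):
    cache.get("run-42/alignment")
print(len(db_calls))   # 1 backend query; the other 999 reads were cache hits
```

    The time-to-live embodies the usual trade-off of such systems: a longer TTL means fewer backend queries but staler data for readers.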

  13. DEVELOPMENT OF RESERVOIR CHARACTERIZATION TECHNIQUES AND PRODUCTION MODELS FOR EXPLOITING NATURALLY FRACTURED RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Michael L. Wiggins; Raymon L. Brown; Faruk Civan; Richard G. Hughes

    2002-12-31

    For many years, geoscientists and engineers have undertaken research to characterize naturally fractured reservoirs. Geoscientists have focused on understanding the process of fracturing and the subsequent measurement and description of fracture characteristics. Engineers have concentrated on the fluid flow behavior in the fracture-porous media system and the development of models to predict the hydrocarbon production from these complex systems. This research attempts to integrate these two complementary views to develop a quantitative reservoir characterization methodology and flow performance model for naturally fractured reservoirs. The research has focused on estimating naturally fractured reservoir properties from seismic data, predicting fracture characteristics from well logs, and developing a naturally fractured reservoir simulator. It is important to develop techniques that can be applied to estimate the important parameters in predicting the performance of naturally fractured reservoirs. This project proposes a method to relate seismic properties to the elastic compliance and permeability of the reservoir based upon a sugar cube model. In addition, methods are presented to use conventional well logs to estimate localized fracture information for reservoir characterization purposes. The ability to estimate fracture information from conventional well logs is very important in older wells where data are often limited. Finally, a desktop naturally fractured reservoir simulator has been developed for the purpose of predicting the performance of these complex reservoirs. The simulator incorporates vertical and horizontal wellbore models, methods to handle matrix to fracture fluid transfer, and fracture permeability tensors. This research project has developed methods to characterize and study the performance of naturally fractured reservoirs that integrate geoscience and engineering data. This is an important step in developing exploitation strategies for

  14. SILTATION IN RESERVOIRS

    African Journals Online (AJOL)

    Keywords: reservoir model, siltation, sediment, catchment, sediment transport. 1. Introduction. Sediment ... rendered water storage structures useless in less than 25 years. ... reservoir, thus reducing the space available for water storage and ...

  15. Optimal Operation of Hydropower Reservoirs under Climate Change: The Case of Tekeze Reservoir, Eastern Nile

    Directory of Open Access Journals (Sweden)

    Fikru Fentaw Abera

    2018-03-01

    Full Text Available Optimal operation of reservoirs is essential for water resource planning and management, but it becomes challenging and complicated when dealing with climate change impacts. The objective of this paper was to assess existing and future hydropower operation at the Tekeze reservoir in the face of climate change. In this study, a calibrated and validated Soil and Water Assessment Tool (SWAT) was used to model runoff inflow into the Tekeze hydropower reservoir under present and future climate scenarios. Inflow to the reservoir was simulated using hydro-climatic data from an ensemble of downscaled climate data based on the Coordinated Regional Climate Downscaling Experiment over the African domain (CORDEX-Africa) with Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations under Representative Concentration Pathway RCP4.5 and RCP8.5 climate scenarios. Observed and projected inflows to the Tekeze hydropower reservoir were used as input to the US Army Corps of Engineers' Reservoir Evaluation System Prescriptive Reservoir Model (HEC-ResPRM), a reservoir operation model, to optimize hydropower reservoir release, storage, and pool level. Results indicated that climate change has a clear impact on reservoir inflow and showed increases in annual and monthly inflow into the reservoir, except in the dry months of May and June, under the RCP4.5 and RCP8.5 climate scenarios. HEC-ResPRM optimal operation results showed an increase in Tekeze reservoir power storage potential of up to 25% and 30% under the RCP4.5 and RCP8.5 climate scenarios, respectively. This implies that Tekeze hydropower production will be affected by climate change. This analysis can be used by water resources planners and managers to develop reservoir operation techniques that consider climate change impacts to increase power production.
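
    Reservoir operation models of this kind rest on a monthly storage mass balance. The toy simulation below is our own illustration of that bookkeeping, not HEC-ResPRM and not Tekeze data; all numbers are hypothetical:

```python
def simulate_reservoir(inflows, demand, capacity, s0):
    """Simple monthly mass balance: release is capped by available water,
    storage is capped by capacity, and the excess is spilled."""
    storage, releases, spills = s0, [], []
    for q in inflows:
        release = min(demand, storage + q)   # cannot release more than is available
        s = storage + q - release
        spill = max(0.0, s - capacity)       # overflow above capacity is spilled
        storage = s - spill
        releases.append(release)
        spills.append(spill)
    return storage, releases, spills

# Hypothetical hydrograph (MCM/month): wet season followed by dry season
final, rel, sp = simulate_reservoir(
    inflows=[50, 60, 40, 10, 5, 5], demand=30, capacity=60, s0=20)
print(final, rel, sp)
```

    Optimization frameworks such as the one in the abstract search over the release decisions that this balance merely records.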

  16. Improving reservoir history matching of EM heated heavy oil reservoirs via cross-well seismic tomography

    KAUST Repository

    Katterbauer, Klemens

    2014-01-01

    Enhanced recovery methods have become significant in the industry's drive to increase recovery rates from oil and gas reservoirs. For heavy oil reservoirs, the immobility of the oil at reservoir temperatures, caused by its high viscosity, limits the recovery rates and strains the economic viability of these fields. While thermal recovery methods, such as steam injection or THAI, have extensively been applied in the field, their success has so far been limited due to prohibitive heat losses and the difficulty in controlling the combustion process. Electromagnetic (EM) heating via high-frequency EM radiation has attracted attention due to its wide applicability in different environments, its efficiency, and the improved controllability of the heating process. While becoming a promising technology for heavy oil recovery, its effects on overall reservoir production and fluid displacements are poorly understood. Reservoir history matching has become a vital tool for the oil & gas industry to increase recovery rates. Limited research has been undertaken so far to capture the nonlinear reservoir dynamics and significantly varying flow rates of thermally heated heavy oil reservoirs that may notably change production rates and render conventional history matching frameworks more challenging. We present a new history matching framework for EM heated heavy oil reservoirs incorporating cross-well seismic imaging. Interfacing an EM heating solver to a reservoir simulator via Andrade’s equation, we couple the system to an ensemble Kalman filter based history matching framework incorporating a cross-well seismic survey module. With increasing power levels and heating applied to the heavy oil reservoirs, reservoir dynamics change considerably and may lead to widely differing production forecasts and increased uncertainty. We have shown that the incorporation of seismic observations into the EnKF framework can significantly enhance reservoir simulations, decrease forecasting

  17. Nagylengyel: an interesting reservoir. [Hungary]

    Energy Technology Data Exchange (ETDEWEB)

    Dedinszky, J

    1971-04-01

    The Nagylengyel oil field, discovered in 1951, has oil-producing formations mostly in the Upper Triassic dolomites, in the Norian-Rhaetian transition formations, in the Upper Cretaceous limestones and shales, and in the Miocene. The formation of the reservoir space occurred in many stages. A porous, cavernous, fractured reservoir is developed in the Norian principal dolomite; a cavernous, fractured reservoir exists in the Cretaceous limestone and shale; and a porous, fractured reservoir is developed in the Miocene. The derivation of the reservoir model and the conservative evaluation of the reservoir volume made it possible to use secondary recovery.

  18. Physical, chemical, and biological characteristics of selected headwater streams along the Allegheny Front, Blair County, Pennsylvania, July 2011–September 2013

    Science.gov (United States)

    Low, Dennis J.; Brightbill, Robin A.; Eggleston, Heather L.; Chaplin, Jeffrey J.

    2016-02-29

    The Altoona Water Authority (AWA) obtains all of its water supply from headwater streams that drain western Blair County, an area underlain in part by black shale of the Marcellus Formation. Development of the shale-gas reservoirs will require new access roads, stream crossings, drill-pad construction, and pipeline installation, activities that have the potential to alter existing stream channel morphology, increase runoff and sediment supply, alter streamwater chemistry, and affect aquatic habitat. The U.S. Geological Survey, in cooperation with the Altoona Water Authority and the Blair County Conservation District, investigated the water quality of 12 headwater streams and the biotic health of 10 headwater streams.

  19. County Spending

    Data.gov (United States)

    Montgomery County of Maryland — This dataset includes County spending data for Montgomery County government. It does not include agency spending. Data considered sensitive or confidential and will...

  20. Rational Rock Physics for Improved Velocity Prediction and Reservoir Properties Estimation for Granite Wash (Tight Sands in Anadarko Basin, Texas

    Directory of Open Access Journals (Sweden)

    Muhammad Z. A. Durrani

    2014-01-01

    Full Text Available Due to its complex nature, deriving elastic properties from seismic data for the prolific Granite Wash reservoir (Pennsylvanian age) in the western Anadarko Basin, Wheeler County (Texas), is quite a challenge. In this paper, we used rock physics tools to describe the diagenesis and to accurately estimate the seismic velocities of P and S waves in the Granite Wash reservoir. Hertz-Mindlin and Cementation (Dvorkin's) theories are applied to analyze the nature of the reservoir rocks (uncemented and cemented). In the implementation of rock physics diagnostics, three classical rock physics models (empirical relations, Kuster-Toksöz, and Berryman) are comparatively analyzed for velocity prediction, taking into account pore-shape geometry. An empirical VP-VS relationship, calibrated with core data, is also generated for shear-wave velocity prediction. Finally, we discuss the advantages of each rock physics model in detail. In addition, cross-plots of unconventional attributes help us in the clear separation of the anomalous zone and lithologic properties of sand and shale facies over conventional attributes.
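
    The Hertz-Mindlin model mentioned above has a standard closed form for the dry-rock moduli of a random grain pack at critical porosity, as tabulated in rock-physics references. A sketch with hypothetical quartz-like inputs follows; the parameter values used in the paper are not reproduced here:

```python
import math

def hertz_mindlin(G_grain, nu_grain, phi_c, C, P):
    """Hertz-Mindlin dry-rock bulk and shear moduli at critical porosity phi_c.
    G_grain, nu_grain: grain shear modulus and Poisson's ratio; C: coordination
    number; P: effective pressure (same units as G_grain)."""
    K_hm = (C**2 * (1 - phi_c)**2 * G_grain**2 * P
            / (18 * math.pi**2 * (1 - nu_grain)**2)) ** (1 / 3)
    G_hm = ((5 - 4 * nu_grain) / (5 * (2 - nu_grain))) * (
        3 * C**2 * (1 - phi_c)**2 * G_grain**2 * P
        / (2 * math.pi**2 * (1 - nu_grain)**2)) ** (1 / 3)
    return K_hm, G_hm

# Quartz-like grains (G = 44 GPa, nu = 0.06), critical porosity 0.4,
# coordination number 9, effective pressure 0.02 GPa (all hypothetical)
K, G = hertz_mindlin(44.0, 0.06, 0.4, 9, 0.02)
print(K, G)
```

    The cube-root dependence on effective pressure is the model's signature: doubling P stiffens the pack by only about 26%.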

  1. Model simulation of the Manasquan water-supply system in Monmouth County, New Jersey

    Science.gov (United States)

    Chang, Ming; Tasker, Gary D.; Nieswand, Steven

    2001-01-01

    Model simulation of the Manasquan Water Supply System in Monmouth County, New Jersey, was completed using historic hydrologic data to evaluate the effects of operational and withdrawal alternatives on the Manasquan reservoir and pumping system. Changes in the system operations can be simulated with the model using precipitation forecasts. The Manasquan Reservoir system model operates by using daily streamflow values, which were reconstructed from historical U.S. Geological Survey streamflow-gaging station records. The model is able to run in two modes--General Risk Analysis Model (GRAM) and Position Analysis Model (POSA). The GRAM simulation procedure uses reconstructed historical streamflow records to provide probability estimates of certain events, such as reservoir storage levels declining below a specific level, when given an assumed set of operating rules and withdrawal rates. POSA can be used to forecast the likelihood of specified outcomes, such as streamflows falling below statutory passing flows, associated with a specific working plan for the water-supply system over a period of months. The user can manipulate the model and generate graphs and tables of streamflows and storage, for example. This model can be used as a management tool to facilitate the development of drought warning and drought emergency rule curves and safe yield values for the water-supply system.
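
    The position-analysis idea can be approximated in a few lines: simulate forward from today's storage along each reconstructed historical inflow trace and report the fraction of traces in which storage ever falls below a critical level. The sketch below is our own illustration, not the Manasquan model; all names and numbers are hypothetical:

```python
def position_analysis(current_storage, historical_traces, demand, capacity, threshold):
    """For each historical inflow trace, simulate storage forward from today's
    level and flag whether it ever drops below the threshold (POSA-style risk)."""
    failures = 0
    for trace in historical_traces:
        s = current_storage
        low = False
        for q in trace:
            s = min(capacity, max(0.0, s + q - demand))
            if s < threshold:
                low = True
        failures += low
    return failures / len(historical_traces)

# Three hypothetical 4-month inflow traces (units arbitrary)
traces = [[8, 6, 5, 7], [2, 1, 1, 3], [5, 4, 2, 6]]
risk = position_analysis(current_storage=10, historical_traces=traces,
                         demand=6, capacity=30, threshold=5)
print(risk)   # fraction of traces that dipped below the threshold
```

    With many historical traces, this fraction becomes a probability estimate of the kind GRAM and POSA report.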

  2. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Science.gov (United States)

    Dykstra, Dave

    2012-12-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  3. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    International Nuclear Information System (INIS)

    Dykstra, Dave

    2012-01-01

    One of the main attractions of non-relational “NoSQL” databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  4. Comparison of the Frontier Distributed Database Caching System to NoSQL Databases

    Energy Technology Data Exchange (ETDEWEB)

    Dykstra, Dave [Fermilab

    2012-07-20

    One of the main attractions of non-relational NoSQL databases is their ability to scale to large numbers of readers, including readers spread over a wide area. The Frontier distributed database caching system, used in production by the Large Hadron Collider CMS and ATLAS detector projects for Conditions data, is based on traditional SQL databases but also adds high scalability and the ability to be distributed over a wide-area for an important subset of applications. This paper compares the major characteristics of the two different approaches and identifies the criteria for choosing which approach to prefer over the other. It also compares in some detail the NoSQL databases used by CMS and ATLAS: MongoDB, CouchDB, HBase, and Cassandra.

  5. A Study of the Optimal Planning Model for Reservoir Sustainable Management- A Case Study of Shihmen Reservoir

    Science.gov (United States)

    Chen, Y. Y.; Ho, C. C.; Chang, L. C.

    2017-12-01

    Reservoir management in Taiwan faces many challenges. Massive sediment loads caused by landslides are flushed into reservoirs, decreasing capacity, raising turbidity, and increasing supply risk. The sediment is usually accompanied by nutrients that cause eutrophication problems. Moreover, the uneven distribution of rainfall causes water-supply instability. Hence, ensuring the sustainable use of reservoirs has become an important task in reservoir management. The purpose of this study is to develop an optimal planning model for sustainable reservoir management that finds optimal operation rules for reservoir flood control and sediment sluicing. The model combines Genetic Algorithms with artificial neural networks for hydraulic analysis and reservoir sediment movement. The main objectives of the operation rules in this study are to prevent reservoir outflow from causing downstream overflow, to minimize the gap between the initial and final water levels of the reservoir, and to maximize sediment-sluicing efficiency. The case of Shihmen Reservoir was used to explore the differences between the optimal operating rules and the current operation of the reservoir. The results indicate that the optimal operating rules tend to open the desilting tunnel early and extend the opening duration during flood-discharge periods. The results also show that the sediment-sluicing efficiency of the optimal operating rules is 36%, 44%, and 54% during Typhoon Jangmi, Typhoon Fung-Wong, and Typhoon Sinlaku, respectively. The results demonstrate that the optimal operation rules play a role in extending the service life of Shihmen Reservoir and protecting the safety of downstream areas. The study introduces a low-cost strategy, alteration of reservoir operation rules, into sustainable reservoir management in place of pump dredging, in order to address the problem of reservoir sediment removal and its high cost.
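
    The optimization framework sketched in this abstract, a Genetic Algorithm searching over gate-operation schedules against a simulation-based objective, can be illustrated with a toy model. Everything below (the hydrograph, the fitness terms, the parameter values) is invented for illustration and greatly simplifies the paper's coupled GA-ANN model:

```python
import random

def fitness(schedule, inflow, capacity=100.0, s0=80.0):
    """Toy stand-in for the paper's objectives: reward sediment sluiced while
    the desilting tunnel is open, penalize downstream overflow, and penalize
    ending far from the initial water level."""
    storage, sluiced, overflow = s0, 0.0, 0.0
    for open_gate, q in zip(schedule, inflow):
        out = 25.0 if open_gate else 5.0        # tunnel open vs. closed outflow
        storage += q - out
        if open_gate:
            sluiced += 0.1 * q                  # sluicing is effective at high inflow
        if storage > capacity:
            overflow += storage - capacity
            storage = capacity
        storage = max(storage, 0.0)
    return sluiced - 2.0 * overflow - 0.05 * abs(storage - s0)

def genetic_algorithm(inflow, pop=30, gens=60, seed=1):
    rng = random.Random(seed)
    horizon = len(inflow)
    population = [[rng.randint(0, 1) for _ in range(horizon)] for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda c: fitness(c, inflow), reverse=True)
        parents = ranked[: pop // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, horizon)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(horizon)] ^= 1  # point mutation
            children.append(child)
        population = parents + children
    return max(population, key=lambda c: fitness(c, inflow))

typhoon_inflow = [10, 40, 90, 70, 30, 15]       # hypothetical flood hydrograph
best = genetic_algorithm(typhoon_inflow)
print(best, round(fitness(best, typhoon_inflow), 2))
```

    In line with the paper's finding, good schedules open the gate around the flood peak, trading a lower water level for reduced overflow and more sluiced sediment.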

  6. Multi-data reservoir history matching for enhanced reservoir forecasting and uncertainty quantification

    KAUST Repository

    Katterbauer, Klemens; Arango, Santiago; Sun, Shuyu; Hoteit, Ibrahim

    2015-01-01

    Reservoir simulations and history matching are critical for fine-tuning reservoir production strategies, improving understanding of the subsurface formation, and forecasting remaining reserves. Production data have long been incorporated

  7. Improved cache performance in Monte Carlo transport calculations using energy banding

    Science.gov (United States)

    Siegel, A.; Smith, K.; Felker, K.; Romano, P.; Forget, B.; Beckman, P.

    2014-04-01

    We present an energy banding algorithm for Monte Carlo (MC) neutral particle transport simulations which depend on large cross section lookup tables. In MC codes, read-only cross section data tables are accessed frequently, exhibit poor locality, and are typically too large to fit in fast memory. Thus, performance is often limited by long latencies to RAM, or by off-node communication latencies when the data footprint is very large and must be decomposed on a distributed memory machine. The proposed energy banding algorithm allows maximal temporal reuse of data in band sizes that can flexibly accommodate different architectural features. The energy banding algorithm is general and has a number of benefits compared to the traditional approach. In the present analysis we explore its potential to achieve improvements in time-to-solution on modern cache-based architectures.
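
    The core idea of energy banding, processing all particles whose next lookup falls in one band of the cross-section table before moving to the next band, can be sketched as follows. The band edges and particle energies are hypothetical, and the sketch only shows the grouping step, not a full transport sweep:

```python
import bisect

def banded_lookup_order(particle_energies, band_edges):
    """Group particle indices by energy band so that each band of the
    cross-section table is streamed through cache once per sweep, instead
    of the whole table being touched in random order."""
    bands = [[] for _ in range(len(band_edges) - 1)]
    for i, e in enumerate(particle_energies):
        b = bisect.bisect_right(band_edges, e) - 1
        b = min(max(b, 0), len(bands) - 1)      # clamp out-of-range energies
        bands[b].append(i)
    return bands

# Conventional order touches the whole table repeatedly; banded order
# processes all particles in band 0, then band 1, and so on.
edges = [0.0, 1.0, 10.0, 100.0]                 # MeV, hypothetical band edges
energies = [0.3, 55.0, 2.5, 0.7, 9.9, 42.0]
print(banded_lookup_order(energies, edges))     # [[0, 3], [2, 4], [1, 5]]
```

    Band widths would be tuned so that one band's slice of the table fits in last-level cache, which is the flexibility the abstract refers to.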

  8. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
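
    A software analogue of the single-nonlinear-node, delay-loop reservoir described here can be sketched as below. This is not the optical system itself: tanh stands in for the amplifier saturation, the simultaneous update of the virtual nodes simplifies the true time multiplexing, and all parameter values are illustrative:

```python
import numpy as np

def delay_loop_reservoir(u, n_virtual=50, feedback=0.8, input_scale=0.5, seed=0):
    """Schematic delay-loop reservoir: n_virtual time-multiplexed node states
    per input step, with a random input mask and a tanh nonlinearity."""
    rng = np.random.default_rng(seed)
    mask = rng.uniform(-1.0, 1.0, n_virtual)    # fixed random input mask
    x = np.zeros(n_virtual)
    states = np.empty((len(u), n_virtual))
    for t, ut in enumerate(u):
        # each virtual node is driven by its neighbour's delayed state
        x = np.tanh(feedback * np.roll(x, 1) + input_scale * mask * ut)
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a sine wave with a ridge readout
t = np.arange(300)
u = np.sin(0.2 * t)
X = delay_loop_reservoir(u)
y = np.roll(u, -1)                              # target: the next input value
train = slice(50, 250)                          # skip the initial transient
W = np.linalg.solve(X[train].T @ X[train] + 1e-6 * np.eye(X.shape[1]),
                    X[train].T @ y[train])
pred = X[250:299] @ W
nmse = np.mean((pred - y[250:299]) ** 2) / np.var(y[250:299])
print(f"test NMSE: {nmse:.4f}")
```

    Only the linear readout W is trained; the reservoir itself stays fixed, which is what makes hardware implementations such as the optical one attractive.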

  9. Sedimentological and Geomorphological Effects of Reservoir Flushing: The Cachi Reservoir, Costa Rica, 1996

    DEFF Research Database (Denmark)

    Brandt, Anders; Swenning, Joar

    1999-01-01

    Physical geography, hydrology, geomorphology, sediment transport, erosion, sedimentation, dams, reservoirs

  10. Optimising reservoir operation

    DEFF Research Database (Denmark)

    Ngo, Long le

    The application of optimization techniques to reservoir operation has become an essential element of water resources planning and management. Traditionally, reservoirs have been operated using heuristic procedures for water releases, supplemented to some extent by subjective decisions. Reservoir operation involves a wide range of stakeholders with very different objectives (e.g., irrigation, hydropower, and water supply), and optimization techniques are far better suited to arriving at balanced solutions to these often conflicting interests. The thesis proposes a number of measures by which traditional...

  11. Phenotypic plasticity in fish life-history traits in two neotropical reservoirs: Petit-Saut Reservoir in French Guiana and Brokopondo Reservoir in Suriname

    Directory of Open Access Journals (Sweden)

    Bernard de Mérona

    Fish species are known for their large phenotypic plasticity in life-history traits in relation to environmental characteristics. Plasticity allows species to increase their fitness in a given environment. Here we examined the life-history response of fish species after an abrupt change in their environment caused by the damming of rivers. Two reservoirs of different ages, both situated on the Guiana Shield, were investigated: the young Petit-Saut Reservoir in French Guiana (14 years) and the much older Brokopondo Reservoir in Suriname (44 years). Six life-history traits in 14 fish species were studied and compared to their values in the Sinnamary River prior to the completion of Petit-Saut Reservoir. The traits analyzed were maximum length, absolute and relative length at first maturation, proportion of mature oocytes in ripe gonads, batch fecundity, and mean size of mature oocytes. The results revealed a general increase in reproductive effort. All species showed a decrease in maximum length. Compared to the values observed before the dam constructions, eight species had larger oocytes and three species showed increased batch fecundity. These observed changes suggest a trend towards a pioneer strategy. The changes observed in Petit-Saut Reservoir also seemed to apply to the 30-year-older Brokopondo Reservoir, suggesting that these reservoirs remain in a state of immaturity for a long time.

  12. Transport of reservoir fines

    DEFF Research Database (Denmark)

    Yuan, Hao; Shapiro, Alexander; Stenby, Erling Halfdan

    Modeling the transport of reservoir fines is of great importance for evaluating the damage of production wells and injectivity decline. The conventional methodology accounts for neither the formation heterogeneity around the wells nor the reservoir fines' heterogeneity. We have developed an integral ... dispersion equation in modeling the transport and the deposition of reservoir fines. It successfully predicts the unsymmetrical concentration profiles and the hyperexponential deposition observed in experiments.

  13. Sediment management for reservoir

    International Nuclear Information System (INIS)

    Rahman, A.

    2005-01-01

    All natural lakes and reservoirs, whether on rivers, tributaries or off-channel storages, are doomed to silt up. Pakistan has two major reservoirs, Tarbela and Mangla, and a shallow lake created by Chashma Barrage. Tarbela and Mangla Lakes have been losing capacity ever since first impounding, Tarbela since 1974 and Mangla since 1967. Tarbela Reservoir receives average annual flows of about 62 MAF and sediment deposits of 0.11 MAF, whereas Mangla receives about 23 MAF of average annual flows and is losing its storage at an average rate of 34,000 acre-feet annually. The loss of storage is of great concern; studies for Tarbela were carried out by TAMS and Wallingford to sustain its capacity, whereas no study has yet been done for Mangla except as part of the study for Raised Mangla, which is only desk work. The delta of Tarbela Reservoir has advanced to about 6.59 miles (Pivot Point) from the power intakes. In case of liquefaction of the delta by a tremor with peak ground acceleration as low as 0.12 g, power tunnels 1, 2 and 3 would be blocked. The minimum pool of the reservoir is being raised so as to check the advance of the delta; the Mangla delta will follow the trend of Tarbela. Tarbela has a vast amount of data, as the reservoir is surveyed every year, whereas the Mangla Reservoir survey was done at five-year intervals, now proposed to be reduced to a three-year interval. In addition, suspended-sediment sampling of inflow streams, as well as some bed-load sampling, is being done by the Surface Water Hydrology Project of WAPDA. The problem of Chashma Reservoir has also been highlighted, as it is indiscriminately filled up and drawn down several times a year without regard to its response to this treatment. Sediment management of these reservoirs is essential, and the paper discusses the pros and cons of various alternatives. (author)

  14. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  15. Sudden water pollution accidents and reservoir emergency operations: impact analysis at Danjiangkou Reservoir.

    Science.gov (United States)

    Zheng, Hezhen; Lei, Xiaohui; Shang, Yizi; Duan, Yang; Kong, Lingzhong; Jiang, Yunzhong; Wang, Hao

    2018-03-01

    Danjiangkou Reservoir is the source reservoir of the Middle Route of the South-to-North Water Diversion Project (MRP). Any sudden water pollution accident in the reservoir would threaten the water supply of the MRP. We established a 3-D hydrodynamic and water quality model for the Danjiangkou Reservoir and proposed scientific suggestions on the prevention and emergency management of sudden water pollution accidents based on the simulated results. Simulations were performed for 20 hypothetical pollutant discharge locations and 3 assumed spill amounts, in order to model pollutant spreading under different reservoir operation types. The results showed that both the location and mass of the pollution affected water quality, whereas different reservoir operation types had little effect. Five joint regulation scenarios, which altered the hydrodynamic processes of water conveyance at the Danjiangkou and Taocha dams, were considered for controlling pollution dispersion. The results showed that the spread of a pollutant can be effectively controlled through the joint regulation of the two dams and that the collaborative operation of the Danjiangkou and Taocha dams is critical for ensuring the security of water quality along the MRP.
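    The transport physics behind such simulations can be illustrated in one dimension. The sketch below is not the study's 3-D model: it integrates a 1-D advection-dispersion equation for an instantaneous spill using an explicit upwind scheme, and the velocity, dispersion coefficient, grid, and spill size are all invented for illustration.

```python
import numpy as np

# 1-D advection-dispersion of an instantaneous spill:
#   dC/dt = -U dC/dx + D d2C/dx2
L, nx = 10_000.0, 200              # reach length [m], number of cells
dx = L / nx
U, D = 0.3, 5.0                    # flow velocity [m/s], dispersion [m2/s]
dt = 0.4 * min(dx / U, dx**2 / (2 * D))   # step satisfying both stability limits

C = np.zeros(nx)
C[20] = 100.0                      # spill released at x = 1 km (arbitrary units)

def step(C):
    adv = -U * (C - np.roll(C, 1)) / dx                         # upwind advection
    dif = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2  # dispersion
    Cn = C + dt * (adv + dif)
    Cn[0] = Cn[-1] = 0.0           # open boundaries
    return Cn

for _ in range(200):
    C = step(C)

peak_cell = int(np.argmax(C))      # plume centre, moved downstream from cell 20
```

    Even this toy shows the two levers the joint dam regulation exploits: the advection term (set by releases) moves the plume, while dispersion only flattens it.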

  16. Quantification of Libby Reservoir Levels Needed to Maintain or Enhance Reservoir Fisheries, 1983-1987 Methods and Data Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, Ian

    1989-12-01

    Libby Reservoir was created under an International Columbia River Treaty between the United States and Canada for cooperative water development of the Columbia River Basin. The authorized purposes of the dam are to provide power, flood control, navigation, and other benefits. Research began in May 1983 to determine how operations of Libby Dam impact the reservoir fishery and to suggest ways to lessen these impacts. This study is unique in that it was designed to accomplish its goal through detailed information gathering on every trophic level in the reservoir system and integration of this information into a quantitative computer model. The specific study objectives are to: quantify available reservoir habitat; determine abundance, growth, and distribution of fish within the reservoir and potential recruitment of salmonids from Libby Reservoir tributaries within the United States; determine abundance and availability of food organisms for fish in the reservoir; quantify fish use of available food items; develop relationships between reservoir drawdown and reservoir habitat for fish and fish food organisms; and estimate impacts of reservoir operation on the reservoir fishery. 115 refs., 22 figs., 51 tabs.

  17. Characteristics of volcanic reservoirs and distribution rules of effective reservoirs in the Changling fault depression, Songliao Basin

    Directory of Open Access Journals (Sweden)

    Pujun Wang

    2015-11-01

    In the Songliao Basin, volcanic oil and gas reservoirs are important exploration domains. Based on drilling, logging, and 3D seismic (1495 km2) data, together with 546 sets of measured physical properties and gas-testing productivity from 66 wells in the Changling fault depression, Songliao Basin, eruptive cycles and sub-lithofacies were distinguished after lithologic correction of the 19,384 m of volcanic well intervals, so that a quantitative analysis could be conducted on the relation between eruptive cycles, lithologies and lithofacies and the distribution of effective reservoirs. After the relationship was established between lithologies, lithofacies and cycles and reservoir physical properties and oil- and gas-bearing situations, the characteristics of volcanic reservoirs and the distribution rules of effective reservoirs were analyzed. It is indicated that 10 eruptive cycles of 3 sections are developed in this area, and the effective reservoirs are mainly distributed at the top cycles of eruptive sequences, with those of the 1st and 3rd Members of the Yingcheng Formation presenting the best reservoir properties. In this area there are mainly 11 types of volcanic rocks, among which rhyolite, rhyolitic tuff, rhyolitic tuff lava and rhyolitic volcanic breccia are the dominant lithologies of effective reservoirs. Four volcanic lithofacies (11 sub-lithofacies) are mainly developed in the target area, among which the upper sub-lithofacies of effusive facies and the thermal clastic sub-lithofacies of explosive facies are predominant in effective reservoirs. There is an obvious correspondence between the physical properties of volcanic reservoirs and the development degree of effective reservoirs. The distribution of effective reservoirs is controlled by reservoir physical properties, and the formation of effective reservoirs is influenced more by porosity than by permeability. It is concluded that deep volcanic gas exploration presents a good

  18. Reviving Abandoned Reservoirs with High-Pressure Air Injection: Application in a Fractured and Karsted Dolomite Reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Robert Loucks; Stephen C. Ruppel; Dembla Dhiraj; Julia Gale; Jon Holder; Jeff Kane; Jon Olson; John A. Jackson; Katherine G. Jackson

    2006-09-30

    Despite declining production rates, existing reservoirs in the United States contain vast volumes of remaining oil that is not being effectively recovered. This oil resource constitutes a huge target for the development and application of modern, cost-effective technologies for producing oil. Chief among the barriers to the recovery of this oil are the high costs of designing and implementing conventional advanced recovery technologies in these mature, in many cases pressure-depleted, reservoirs. An additional, increasingly significant barrier is the lack of vital technical expertise necessary for the application of these technologies. This lack of expertise is especially notable among the small operators and independents that operate many of these mature, yet oil-rich, reservoirs. We addressed these barriers to more effective oil recovery by developing, testing, applying, and documenting an innovative technology that can be used by even the smallest operator to significantly increase the flow of oil from mature U.S. reservoirs. The Bureau of Economic Geology and Goldrus Producing Company assembled a multidisciplinary team of geoscientists and engineers to evaluate the applicability of high-pressure air injection (HPAI) in revitalizing a nearly abandoned carbonate reservoir in the Permian Basin of West Texas. The Permian Basin, the largest oil-bearing basin in North America, contains more than 70 billion barrels of remaining oil in place and is an ideal venue to validate this technology. We have demonstrated the potential of HPAI for oil-recovery improvement in preliminary laboratory tests and a reservoir pilot project. To more completely test the technology, this project emphasized detailed characterization of reservoir properties, which were integrated to assess the effectiveness and economics of HPAI. The characterization phase of the project utilized geoscientists and petroleum engineers from the Bureau of Economic Geology and the Department of Petroleum

  19. Modeling Reservoir-River Networks in Support of Optimizing Seasonal-Scale Reservoir Operations

    Science.gov (United States)

    Villa, D. L.; Lowry, T. S.; Bier, A.; Barco, J.; Sun, A.

    2011-12-01

    HydroSCOPE (Hydropower Seasonal Concurrent Optimization of Power and the Environment) is a seasonal time-scale tool for scenario analysis and optimization of reservoir-river networks. Developed in MATLAB, HydroSCOPE is an object-oriented model that simulates basin-scale dynamics with an objective of optimizing reservoir operations to maximize revenue from power generation, reliability in the water supply, environmental performance, and flood control. HydroSCOPE is part of a larger toolset that is being developed through a Department of Energy multi-laboratory project. This project's goal is to provide conventional hydropower decision makers with better information to execute their day-ahead and seasonal operations and planning activities by integrating water balance and operational dynamics across a wide range of spatial and temporal scales. This presentation details the modeling approach and functionality of HydroSCOPE. HydroSCOPE consists of a river-reservoir network model and an optimization routine. The river-reservoir network model simulates the heat and water balance of river-reservoir networks for time-scales up to one year. The optimization routine software, DAKOTA (Design Analysis Kit for Optimization and Terascale Applications - dakota.sandia.gov), is seamlessly linked to the network model and is used to optimize daily volumetric releases from the reservoirs to best meet a set of user-defined constraints, such as maximizing revenue while minimizing environmental violations. The network model uses 1-D approximations for both the reservoirs and river reaches and is able to account for surface and sediment heat exchange as well as ice dynamics for both models. The reservoir model also accounts for inflow, density, and withdrawal zone mixing, and diffusive heat exchange. Routing for the river reaches is accomplished using a modified Muskingum-Cunge approach that automatically calculates the internal timestep and sub-reach lengths to match the conditions of
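    The routing step described above can be sketched compactly. The snippet below implements the classic fixed-parameter Muskingum recurrence; HydroSCOPE's modified Muskingum-Cunge variant additionally recomputes the coefficients and sub-reach lengths from channel hydraulics at each step, so the constant K, X, and the hydrograph here are purely illustrative.

```python
def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0):
    """Route an inflow hydrograph through one reach (fixed K [h], X [-], dt [h])."""
    denom = K * (1 - X) + dt / 2
    c0 = (dt / 2 - K * X) / denom
    c1 = (dt / 2 + K * X) / denom
    c2 = (K * (1 - X) - dt / 2) / denom   # c0 + c1 + c2 == 1
    out = [inflow[0]]
    for t in range(1, len(inflow)):
        out.append(c0 * inflow[t] + c1 * inflow[t - 1] + c2 * out[-1])
    return out

# A triangular flood wave: the routed peak is attenuated and delayed.
hydrograph = [10] * 3 + list(range(10, 110, 20)) + list(range(100, 0, -20)) + [10] * 10
routed = muskingum_route(hydrograph)
```

    The choice of dt relative to K matters: if dt/2 exceeds K(1 - X), or K X exceeds dt/2, a coefficient can go negative and the routed flows can oscillate, which is one reason the Cunge formulation adjusts its internal timestep.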

  20. Cesium reservoir and interconnective components

    International Nuclear Information System (INIS)

    1994-03-01

    The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW range. A thermionic converter must be supplied with cesium vapor for two reasons. Cesium atoms adsorbed on the surface of the emitter cause a reduction of the emitter work function to permit high current densities without excessive heating of the emitter. The second purpose of the cesium vapor is to provide space-charge neutralization in the emitter-collector gap so that the high current densities may flow across the gap unattenuated. The function of the cesium reservoir is to provide a source of cesium atoms and a reserve in the event that cesium is lost from the plasma by any mechanism. This can be done with a liquid cesium metal reservoir, in which case it is heated to the desired temperature with auxiliary heaters. In a TFE, however, it is desirable to have the reservoir passively heated by the nuclear fuel. In this case, the reservoir must operate at a temperature intermediate between the emitter and the collector, ruling out the use of liquid reservoirs. Integral reservoirs contained within the TFE will produce cesium vapor pressures in the desired range at typical electrode temperatures. The reservoir material that appears best able to meet requirements is graphite. Cesium intercalates easily into graphite, and the cesium pressure is insensitive to loading for a given intercalation stage. The goals of the cesium reservoir test program were to verify the performance of Cs-graphite reservoirs in the temperature-pressure range of interest for TFE operation, and to test the operation of these reservoirs after exposure to a fast neutron fluence corresponding to a seven-year mission lifetime. In addition, other materials were evaluated for possible use in the integral reservoir

  1. Watching the Creation of Southern California's Largest Reservoir

    Science.gov (United States)

    2001-01-01

    The new Diamond Valley Lake Reservoir near the city of Hemet in Riverside County is billed as the largest earthworks construction project in U.S. history. Construction began in 1995 and involved 31 million cubic meters of foundation excavation and 84 million cubic meters of embankment construction. This set of MISR images captures the most recent phase in the reservoir's activation. At the upper left is a natural-color view acquired by the instrument's vertical-viewing (nadir) camera on March 14, 2000 (Terra orbit 1273), shortly after the Metropolitan Water District began filling the reservoir with water from the Colorado River and Northern California. Water appears darker than the surrounding land. The image at the upper right was acquired nearly one year later on March 1, 2001 (Terra orbit 6399), and shows a clear increase in the reservoir's water content. When full, the lake will hold nearly a trillion liters of water. According to the Metropolitan Water District, the 7 kilometer x 3 kilometer reservoir nearly doubles Southern California's above-ground water storage capacity. In addition to routine water management, Diamond Valley Lake is designed to provide protection against drought and a six-month emergency supply in the event of earthquake damage to a major aqueduct. In the face of electrical power shortages, it is also expected to reduce dependence on the pumping of water from northern mountains during the high-demand summer months. An unexpected result of site excavation was the uncovering of mastodon and mammoth skeletons along with bones from extinct species not previously thought to have been indigenous to the area, such as the giant long-horned bison and North American lion. A museum and interpretive center is being built to protect these finds. The lower MISR image, from May 20, 2001 (Terra orbit 7564), is a false-color view combining data from the instrument's 26-degree forward view (displayed as blue) with data from the 26-degree backward view

  2. Fort Cobb Reservoir Watershed, Oklahoma and Thika River Watershed, Kenya Twinning Pilot Project

    Science.gov (United States)

    Moriasi, D.; Steiner, J.; Arnold, J.; Allen, P.; Dunbar, J.; Shisanya, C.; Gathenya, J.; Nyaoro, J.; Sang, J.

    2007-12-01

    The Fort Cobb Reservoir Watershed (FCRW) (830 km2) is a watershed within the HELP Washita Basin, located in Caddo and Washita Counties, OK. It is also a benchmark watershed under USDA's Conservation Effects Assessment Project, a national project to quantify environmental effects of USDA and other conservation programs. Population in south-western Oklahoma, in which FCRW is located, is sparse and decreasing. Agriculture focuses on commodity production (beef, wheat, and row crops) with high costs and low margins. Surface and groundwater resources supply public, domestic, and irrigation water. Fort Cobb Reservoir and contributing stream segments are listed on the Oklahoma 303(d) list as not meeting water quality standards, based on sedimentation, the trophic level of the lake associated with phosphorus loads, and nitrogen in some stream segments in some seasons. Preliminary results from a rapid geomorphic assessment indicated that unstable stream channels dominate the stream networks and make a significant but unknown contribution to suspended-sediment loadings. Impairment of the lake for municipal water supply, recreation, and fish and wildlife is an important factor in local economies. The Thika River Watershed (TRW) (867 km2) is located in central Kenya. Population in TRW is high and increasing, which has led to a poor land-population ratio, with population densities ranging from 250 people/km2 to over 500 people/km2. The poor land-population ratio has resulted in land sub-division, fragmentation, over-cultivation, overgrazing, and deforestation, which have serious implications for soil erosion, which in turn poses a threat to both agricultural production and downstream reservoirs. Agriculture focuses mainly on subsistence farming and some cash crops (dairy cattle, corn, beans, coffee, floriculture and pineapple). Surface and groundwater resources supply domestic, public, and hydroelectric power generation water. Thika River supplies 80% of the water for the city of

  3. Security in the Cache and Forward Architecture for the Next Generation Internet

    Science.gov (United States)

    Hadjichristofi, G. C.; Hadjicostis, C. N.; Raychaudhuri, D.

    The future Internet architecture will be composed predominantly of wireless devices. It is evident at this stage that the TCP/IP protocol that was developed decades ago will not properly support the required network functionalities, since contemporary communication profiles tend to be data-driven rather than host-based. To address this paradigm shift in data propagation, a next-generation architecture has been proposed: the Cache and Forward (CNF) architecture. This research investigates security aspects of this new Internet architecture. More specifically, we discuss content privacy, secure routing, key management and trust management. We identify security weaknesses of this architecture that need to be addressed, and we derive security requirements that should guide future research directions. Aspects of the research can be adopted as a stepping-stone as we build the future Internet.

  4. Comparison of total Hg results in sediment samples from Rio Grande reservoir determine by NAA and CV AAS

    International Nuclear Information System (INIS)

    Franklin, Robson L.

    2011-01-01

    The Rio Grande reservoir is located in the metropolitan area of Sao Paulo and is used for recreation and as a source for drinking water production. During the last decades, mercury contamination has been detected in the sediments of this reservoir, mainly in the eastern part, near the main affluent of the reservoir, in the Rio Grande da Serra and Ribeirao Pires counties. In the present study, bottom sediment samples were collected at four different sites over four sampling campaigns during the period of September 2008 to January 2010. The samples were dried at room temperature, ground and passed through a 2 mm sieve. Total Hg determination in the sediment samples was carried out by two different analytical techniques: neutron activation analysis (NAA) and cold vapor atomic absorption spectrometry (CV AAS). The methodology validation, in terms of precision and accuracy, was performed with reference materials and presented recoveries of 83 to 108%. The total Hg results obtained by both analytical techniques ranged from 3 to 71 mg kg-1 and were considered similar by statistical analysis, even though the NAA technique furnishes the total concentration while CV AAS using the 3015 digestion procedure characterizes only the bioavailable Hg. These results confirm that both analytical techniques were suitable to detect the Hg concentration levels in the Rio Grande sediments studied. The Hg levels in the sediment of the Rio Grande reservoir confirm an anthropogenic origin for this element in this ecosystem. (author)
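    A common way to judge whether two analytical techniques "give similar results" on the same samples is a paired t-test on the per-sample differences. The sketch below uses invented paired Hg values in the reported 3-71 mg/kg range; the numbers are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical paired total-Hg results (mg/kg) for six sediment samples,
# one value per sample from each technique.
naa    = np.array([3.2, 12.5, 28.0, 45.1, 70.6, 19.3])
cv_aas = np.array([3.0, 11.8, 29.1, 43.7, 68.9, 20.1])

d = naa - cv_aas
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

# Two-sided 5% critical value for 5 degrees of freedom is about 2.571;
# |t| below it means no significant difference between the techniques.
significant = abs(t_stat) > 2.571
```

    Pairing matters here: the two measurements of each sample share its (highly variable) true concentration, so testing the differences is far more sensitive than comparing the two unpaired groups.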

  5. CO2 Huff-n-Puff process in a light oil shallow shelf carbonate reservoir. Annual report, January 1, 1995--December 31, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Wehner, S.C.; Boomer, R.J.; Cole, R.; Preiditus, J.; Vogt, J.

    1996-09-01

    The application of cyclic CO{sub 2}, often referred to as the CO{sub 2} Huff-n-Puff process, may find its niche in the maturing waterfloods of the Permian Basin. Coupling the CO{sub 2} H-n-P process to miscible flooding applications could provide the needed revenue to sufficiently mitigate near-term negative cash flow concerns in the capital-intensive miscible projects. Texaco Exploration & Production Inc. and the U.S. Department of Energy have teamed up in an attempt to develop the CO{sub 2} Huff-n-Puff process in the Grayburg/San Andres formation, a light oil, shallow shelf carbonate reservoir within the Permian Basin. This cost-shared effort is intended to demonstrate the viability of this underutilized technology in a specific class of domestic reservoir. A significant amount of oil reserves is located in carbonate reservoirs. Specifically, the carbonates deposited in shallow shelf environments (SSC reservoirs) make up the largest percentage of known reservoirs within the Permian Basin of North America. Many of these known resources have been under waterflooding operations for decades and are at risk of abandonment if crude oil recoveries cannot be economically enhanced. The selected site for this demonstration project is the Central Vacuum Unit waterflood in Lea County, New Mexico.

  6. A study on the effectiveness of lockup-free caches for a Reduced Instruction Set Computer (RISC) processor

    OpenAIRE

    Tharpe, Leonard.

    1992-01-01

    Approved for public release; distribution is unlimited. This thesis presents a simulation and analysis of the Reduced Instruction Set Computer (RISC) architecture and the effects on RISC performance of a lockup-free cache interface. RISC architectures achieve high performance by having a small but sufficient instruction set, with most instructions executing in one clock cycle. Current RISC performance ranges from 1.5 to 2.0 cycles per instruction (CPI). The goal of RISC is to attain a CPI of 1.0. The major hind...
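    The CPI figures above can be tied together with the standard stall-cycle decomposition: effective CPI is the base CPI plus memory stall cycles per instruction. In the sketch below, a lockup-free (non-blocking) cache is modeled simply as hiding a fraction of the miss penalty behind continued execution; the miss rate, penalty, and overlap fraction are illustrative, not the thesis's measurements.

```python
def cpi(base_cpi, miss_rate, miss_penalty_cycles, overlap=0.0):
    """Effective CPI = base CPI + memory stall cycles per instruction.

    overlap is the fraction of miss latency hidden by a lockup-free
    cache that lets the pipeline keep issuing under a miss.
    """
    return base_cpi + miss_rate * miss_penalty_cycles * (1.0 - overlap)

blocking    = cpi(1.0, 0.05, 10)        # blocking cache: 1.0 + 0.5 = 1.5 CPI
lockup_free = cpi(1.0, 0.05, 10, 0.6)   # 60% of miss latency hidden -> 1.2 CPI
speedup = blocking / lockup_free        # at fixed clock and instruction count
```

    With instruction count and clock period fixed, execution time scales directly with CPI, so hiding miss latency is equivalent to the speedup ratio computed above.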

  7. Simulation of Reservoir Sediment Flushing of the Three Gorges Reservoir Using an Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Xueying Li

    2016-05-01

    Reservoir sedimentation and its effect on the environment are among the most serious worldwide problems in water resources development and utilization today. As one of the largest water conservancy projects, the Three Gorges Reservoir (TGR) has been controversial since its demonstration period, and sedimentation is the major concern. Due to the complex physical mechanisms of water and sediment transport, this study adopts the Error Back Propagation Training Artificial Neural Network (BP-ANN) to analyze the relationship between the sediment flushing efficiency of the TGR and its influencing factors. The factors are determined by analysis of a 1D unsteady flow and sediment mathematical model, and mainly include reservoir inflow, incoming sediment concentration, reservoir water level, and reservoir release. Considering the distinguishing features of reservoir sediment delivery in different seasons, the monthly average data from 2003, when the TGR was put into operation, to 2011 are used to train, validate, and test the BP-ANN model. The results indicate that, although the sample space is quite limited, the whole sediment delivery process can be schematized by the established BP-ANN model, which can be used to help sediment flushing and thus decrease reservoir sedimentation.
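    The BP-ANN mechanics (forward pass, error back-propagation, gradient-descent weight updates) can be sketched in plain NumPy. The four inputs mirror the influencing factors named above, but the data here are synthetic, the target is an arbitrary smooth function, and the network size and learning rate are invented; this is not the study's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the four predictors (inflow, incoming sediment
# concentration, water level, release), scaled to [0, 1].
X = rng.uniform(0, 1, (200, 4))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] + 0.4 * X[:, 3])[:, None]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H = 8                                  # hidden units
W1 = rng.normal(0, 0.5, (4, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)

lr = 0.5
for epoch in range(2000):
    h = sigmoid(X @ W1 + b1)           # forward pass
    out = h @ W2 + b2                  # linear output layer
    err = out - y                      # output error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * h * (1 - h)    # back-propagate through sigmoid
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1     # gradient-descent updates
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

    With only ~108 monthly samples available (2003-2011), a small hidden layer like this is also a reasonable guard against overfitting, which is presumably why a compact BP-ANN suffices for the flushing-efficiency relationship.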

  8. An Effective Reservoir Parameter for Seismic Characterization of Organic Shale Reservoir

    Science.gov (United States)

    Zhao, Luanxiao; Qin, Xuan; Zhang, Jinqiang; Liu, Xiwu; Han, De-hua; Geng, Jianhua; Xiong, Yineng

    2017-12-01

    Sweet spots identification for unconventional shale reservoirs involves detection of organic-rich zones with abundant porosity. However, commonly used elastic attributes, such as P- and S-impedances, often show poor correlations with porosity and organic matter content separately and thus make the seismic characterization of sweet spots challenging. Based on an extensive analysis of worldwide laboratory database of core measurements, we find that P- and S-impedances exhibit much improved linear correlations with the sum of volume fraction of organic matter and porosity than the single parameter of organic matter volume fraction or porosity. Importantly, from the geological perspective, porosity in conjunction with organic matter content is also directly indicative of the total hydrocarbon content of shale resources plays. Consequently, we propose an effective reservoir parameter (ERP), the sum of volume fraction of organic matter and porosity, to bridge the gap between hydrocarbon accumulation and seismic measurements in organic shale reservoirs. ERP acts as the first-order factor in controlling the elastic properties as well as characterizing the hydrocarbon storage capacity of organic shale reservoirs. We also use rock physics modeling to demonstrate why there exists an improved linear correlation between elastic impedances and ERP. A case study in a shale gas reservoir illustrates that seismic-derived ERP can be effectively used to characterize the total gas content in place, which is also confirmed by the production well.
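    The claim that impedance correlates better with the sum (ERP) than with either component alone can be reproduced with a toy model. Below, synthetic porosity and organic-matter volume fractions reduce a synthetic P-impedance by the same amount per unit volume, as the rock-physics argument suggests; all coefficients, ranges, and the noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
phi = rng.uniform(0.02, 0.12, n)       # porosity (volume fraction)
vom = rng.uniform(0.02, 0.20, n)       # organic-matter volume fraction
erp = phi + vom                        # effective reservoir parameter (ERP)

# Toy impedance model: both "soft" phases lower Ip equally, plus noise.
ip = 14.0 - 25.0 * erp + rng.normal(0, 0.3, n)   # km/s * g/cm3, illustrative

def r2(x, y):
    """Squared Pearson correlation."""
    return np.corrcoef(x, y)[0, 1] ** 2

r2_phi, r2_vom, r2_erp = r2(ip, phi), r2(ip, vom), r2(ip, erp)
```

    Because impedance responds to the combined soft-phase volume, each component alone acts as noise for the other, while their sum recovers a near-deterministic linear trend.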

  9. Integrated Approach to Drilling Project in Unconventional Reservoir Using Reservoir Simulation

    Science.gov (United States)

    Stopa, Jerzy; Wiśniowski, Rafał; Wojnarowski, Paweł; Janiga, Damian; Skrzypaszek, Krzysztof

    2018-03-01

    Accumulation and flow mechanisms in unconventional reservoirs differ from those in conventional ones. This requires a special approach to field management, with drilling and stimulation treatments as the major factors for further production. The integrated approach to unconventional reservoir production optimization couples the drilling project with full-scale reservoir simulation to determine the best well placement, well length, fracturing treatment design, and mid-length distance between wells. The full-scale reservoir simulation model emulates part of a Polish shale-gas field. The aim of this paper is to establish the influence of technical factors on gas production from a shale gas field. Due to low reservoir permeability, stimulation treatments should be directed towards maximizing the hydraulic contact. On the basis of the production scenarios, 15-stage hydraulic fracturing boosts gas production more than 1.5 times compared to 8 stages. Due to possible interference between the wells, it is necessary to determine the distance between the horizontal parts of the well trajectories. In order to determine the well spacing that maximizes the recovery factor of resources in the stimulated zone, a numerical algorithm based on a dynamic model was developed and implemented. Numerical testing and a comparative study show that the most favourable arrangement assumes the minimum allowable distance between the wells. This is related to the ratio of the drainage zone volume to the total volume of the stimulated zone.

  10. Reservoir floodplains support distinct fish assemblages

    Science.gov (United States)

    Miranda, Leandro E.; Wigen, S. L.; Dagel, Jonah D.

    2014-01-01

    Reservoirs constructed on floodplain rivers are unique because the upper reaches of the impoundment may include extensive floodplain environments. Moreover, reservoirs that experience large periodic water level fluctuations as part of their operational objectives seasonally inundate and dewater floodplains in their upper reaches, partly mimicking natural inundations of river floodplains. In four flood control reservoirs in Mississippi, USA, we explored the dynamics of connectivity between reservoirs and adjacent floodplains and the characteristics of fish assemblages that develop in reservoir floodplains relative to those that develop in reservoir bays. Although fish species richness in floodplains and bays were similar, species composition differed. Floodplains emphasized fish species largely associated with backwater shallow environments, often resistant to harsh environmental conditions. Conversely, dominant species in bays represented mainly generalists that benefit from the continuous connectivity between the bay and the main reservoir. Floodplains in the study reservoirs provided desirable vegetated habitats at lower water level elevations, earlier in the year, and more frequently than in bays. Inundating dense vegetation in bays requires raising reservoir water levels above the levels required to reach floodplains. Therefore, aside from promoting distinct fish assemblages within reservoirs and helping promote diversity in regulated rivers, reservoir floodplains are valued because they can provide suitable vegetated habitats for fish species at elevations below the normal pool, precluding the need to annually flood upland vegetation that would inevitably be impaired by regular flooding. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  11. A rationale for reservoir management economics

    International Nuclear Information System (INIS)

    Hickman, T.S.

    1995-01-01

    Significant economic benefits can be derived from the application of reservoir management. The key elements in economical reservoir management are the efficient use of available resources and optimization of reservoir exploitation through a multidisciplined approach. This paper describes various aspects of and approaches to reservoir management and provides case histories that support the findings.

  12. INCREASING WATERFLOOD RESERVES IN THE WILMINGTON OIL FIELD THROUGH IMPROVED RESERVOIR CHARACTERIZATION AND RESERVOIR MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Scott Walker; Chris Phillips; Roy Koerner; Don Clarke; Dan Moos; Kwasi Tagbor

    2002-02-28

    This project increased recoverable waterflood reserves in slope and basin reservoirs through improved reservoir characterization and reservoir management. The particular application of this project is in portions of Fault Blocks IV and V of the Wilmington Oil Field, in Long Beach, California, but the approach is widely applicable in slope and basin reservoirs. Transferring technology so that it can be applied in other sections of the Wilmington Field and by operators in other slope and basin reservoirs is a primary component of the project. This project used advanced reservoir characterization tools, including the pulsed acoustic cased-hole logging tool, geologic three-dimensional (3-D) modeling software, and commercially available reservoir management software to identify sands with remaining high oil saturation following waterflood. Production from the identified high oil saturated sands was stimulated by recompleting existing production and injection wells in these sands using conventional means as well as a short radius redrill candidate. Although these reservoirs have been waterflooded over 40 years, researchers have found areas of remaining oil saturation. Areas such as the top sand in the Upper Terminal Zone Fault Block V, the western fault slivers of Upper Terminal Zone Fault Block V, the bottom sands of the Tar Zone Fault Block V, and the eastern edge of Fault Block IV in both the Upper Terminal and Lower Terminal Zones all show significant remaining oil saturation. Each area of interest was uncovered emphasizing a different type of reservoir characterization technique or practice. This was not the original strategy but was necessitated by the different levels of progress in each of the project activities.

  13. Exploration and reservoir characterization; Technology Target Areas; TTA2 - Exploration and reservoir characterisation

    Energy Technology Data Exchange (ETDEWEB)

    2008-07-01

    In the future, research within exploration and reservoir characterization will play an even more important role for Norway, since resources are decreasing and new challenges like deep sea, harsh environments and, last but not least, environmental issues have to be considered. There are two major fields which have to be addressed within exploration and reservoir characterization: First, replacement of reserves by new discoveries and ultimate field recoveries in mature basins on the Norwegian Continental Shelf, e.g. at the Halten Terrace, have to be addressed. A wealth of data exists in the more mature areas. Interdisciplinary integration is a key feature of reservoir characterization, where available data and specialist knowledge need to be combined into a consistent reservoir description. A systematic approach for handling both uncertainties in data sources and uncertainties in basic models is needed. Fast simulation techniques are necessary to generate models spanning the event space, covering both underground-based and model-based uncertainties. Second, exploration in frontier areas like the Barents Sea region and the deeper Voering Basin has to be addressed. The scarcity of wells in these frontier areas leads to uncertainties in the geological understanding. Basin and depositional modelling are essential for predicting where source rocks and reservoir rocks are deposited, and if, when and which hydrocarbons are generated and trapped. Predictive models and improved process understanding are therefore crucial to meet these issues. Especially the challenges related to the salt deposits, e.g. sub-salt/sub-basalt reservoir definitions in the Nordkapp Basin, demand up-front research and technology developments. TTA2 stresses the need to focus on the development of new talents. We also see a strong need to push cooperation as far as possible in the present competitive environment. Projects that may require a substantial financial commitment have been identified. The following

  14. Reservoir Characterization, Production Characteristics, and Research Needs for Fluvial/Alluvial Reservoirs in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Cole, E.L.; Fowler, M.L.; Jackson, S.R.; Madden, M.P.; Raw-Schatzinger, V.; Salamy, S.P.; Sarathi, P.; Young, M.A.

    1999-04-28

    The Department of Energy's (DOE's) Oil Recovery Field Demonstration Program was initiated in 1992 to maximize the economically and environmentally sound recovery of oil from known domestic reservoirs and to preserve access to this resource. Cost-shared field demonstration projects are being initiated in geologically defined reservoir classes, which have been prioritized by their potential for incremental recovery and their risk of abandonment. This document defines the characteristics of the fifth geological reservoir class in the series, fluvial/alluvial reservoirs. The reservoirs of Class 5 include deposits of alluvial fans, braided streams, and meandering streams. Deposit morphologies vary as a complex function of climate and tectonics and are characterized by a high degree of heterogeneity to fluid flow as a result of extreme variations in water energy as the deposits formed.

  15. Post waterflood CO{sub 2} miscible flood in light oil, fluvial-dominated deltaic reservoir. Annual report, October 1, 1993--September 30, 1994

    Energy Technology Data Exchange (ETDEWEB)

    Bou-Mikael, S.

    1995-07-01

    Texaco Exploration and Production Inc. (TEPI) and the U.S. Department of Energy (DOE) entered into a cost sharing cooperative agreement to conduct an Enhanced Oil Recovery demonstration project at Port Neches. The field is located in Orange County near Beaumont, Texas. The project will demonstrate the effectiveness of the CO{sub 2} miscible process in Fluvial Dominated Deltaic reservoirs. It will also evaluate the use of horizontal CO{sub 2} injection wells to improve the overall sweep efficiency. A database of FDD reservoirs for the Gulf Coast region will be developed by LSU, using a screening model developed by the Texaco Research Center in Houston. Finally, the results and the information gained from this project will be disseminated throughout the oil industry via a series of SPE papers and industry open forums. Reservoir characterization efforts for the Marginulina sand are in progress utilizing conventional and advanced technologies including 3-D seismic. Sidewall and conventional cores were cut and analyzed, lab tests were conducted on reservoir fluids, and reservoir bottom-hole pressure and voidage were monitored. Texaco is utilizing the above data to develop a Stratamodel to best describe and characterize the reservoir and to use it as an input for the compositional simulator. The current compositional model is being revised to integrate the new data from the 3-D seismic and field performance under CO{sub 2} injection, to ultimately develop an accurate economic model. All facilities work has been completed and placed in service, including the CO{sub 2} pipeline and metering equipment, CO{sub 2} injection and production equipment, water injection equipment, well work and injection/production lines. The horizontal injection well was drilled and completed on January 15, 1994. CO{sub 2} purchases from Cardox continue at an average rate of 3600 MCFD. The CO{sub 2} is being injected at a line pressure of 1350 psi.

  16. Reservoir architecture and tough gas reservoir potential of fluvial crevasse-splay deposits

    NARCIS (Netherlands)

    Van Toorenenburg, K.A.; Donselaar, M.E.; Weltje, G.J.

    2015-01-01

    Unconventional tough gas reservoirs in low-net-to-gross fluvial stratigraphic intervals may constitute a secondary source of fossil energy to prolong the gas supply in the future. To date, however, production from these thin-bedded, fine-grained reservoirs has been hampered by the economic risks

  17. Development of gas and gas condensate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    In the study of gas reservoir development, the first-year topics are restricted to reservoir characterization. There are two types of reservoir characterization: one is reservoir formation characterization and the other is reservoir fluid characterization. For the reservoir formation characterization, the results of conditional simulation were compared with those of unconditional simulation. The results of conditional simulation have a higher confidence level than those of unconditional simulation because conditional simulation considers sample locations as well as distance correlation. In the reservoir fluid characterization, phase behavior calculations revealed that component grouping is more important than increasing the number of components. From the liquid volume fraction with pressure drop, the phase behavior of the reservoir fluid can be estimated. The calculation results of fluid recombination, constant composition expansion, and constant volume depletion match the experimental data very well. In the swelling test of the reservoir fluid with lean gas, the accuracy of the dew point pressure forecast depends on the component characterization. (author). 28 figs., 10 tabs.

  18. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
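The training scheme shared by both approaches — drive a fixed, untrained reservoir and fit only a linear readout — is easy to see in a minimal software echo state network (a sketch of the paradigm in plain NumPy, not of any photonic hardware; sizes and scalings are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir -- this part is never trained.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, (n_res,))         # input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))      # recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # spectral radius < 1 (fading memory)

def run_reservoir(u):
    """Drive the reservoir with a scalar input series u; return the state matrix."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in * u_t)
        states[t] = x
    return states

# Task: reproduce the input delayed by 5 steps (a short-term-memory benchmark).
u = rng.uniform(-1, 1, 1200)
delay, washout = 5, 200
X = run_reservoir(u)[washout:]                   # discard initial transient
y = u[washout - delay : len(u) - delay]          # delayed target

# Train only the linear readout, by ridge regression.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
nrmse = np.sqrt(np.mean((X @ W_out - y) ** 2)) / np.std(y)
print("NRMSE:", round(float(nrmse), 4))
```

The reservoir's transient response does the heavy lifting; only `W_out` is fitted, which is why physical substrates (optical or otherwise) whose internal weights cannot be adjusted are still usable.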

  19. Summary and Synthesis of Mercury Studies in the Cache Creek Watershed, California, 2000-01

    Science.gov (United States)

    Domagalski, Joseph L.; Slotton, Darell G.; Alpers, Charles N.; Suchanek, Thomas H.; Churchill, Ronald; Bloom, Nicolas; Ayers, Shaun M.; Clinkenbeard, John

    2004-01-01

    This report summarizes the principal findings of the Cache Creek, California, components of a project funded by the CALFED Bay–Delta Program entitled 'An Assessment of Ecological and Human Health Impacts of Mercury in the Bay–Delta Watershed.' A companion report summarizes the key findings of other components of the project based in the San Francisco Bay and the Delta of the Sacramento and San Joaquin Rivers. These summary documents present the more important findings of the various studies in a format intended for a wide audience. A more in-depth scientific presentation and discussion of the research is given in a series of detailed technical reports of the integrated mercury studies, available online.

  20. From the Island of the Blue Dolphins: A unique 19th century cache feature from San Nicolas Island, California

    Science.gov (United States)

    Erlandson, Jon M.; Thomas-Barnett, Lisa; Vellanoweth, René L.; Schwartz, Steven J.; Muhs, Daniel R.

    2013-01-01

    A cache feature salvaged from an eroding sea cliff on San Nicolas Island produced two redwood boxes containing more than 200 artifacts of Nicoleño, Native Alaskan, and Euro-American origin. Outside the boxes were four asphaltum-coated baskets, abalone shells, a sandstone dish, and a hafted stone knife. The boxes, made from split redwood planks, contained a variety of artifacts and numerous unmodified bones and teeth from marine mammals, fish, birds, and large land mammals. Nicoleño-style artifacts include 11 knives with redwood handles and stone blades, stone projectile points, steatite ornaments and effigies, a carved stone pipe, abraders and burnishing stones, bird bone whistles, bone and shell pendants, abalone shell dishes, and two unusual barbed shell fishhooks. Artifacts of Native Alaskan style include four bone toggling harpoons, two unilaterally barbed bone harpoon heads, bone harpoon fore-shafts, a ground slate blade, and an adze blade. Objects of Euro-American origin or materials include a brass button, metal harpoon blades, and ten flaked glass bifaces. The contents of the cache feature, dating to the early-to-mid nineteenth century, provide an extraordinary window on a time of European expansion and global economic development that created unique cultural interactions and social transformations.

  1. Reservoir fisheries of Asia

    International Nuclear Information System (INIS)

    Silva, S.S. De.

    1990-01-01

    At a workshop on reservoir fisheries research, papers were presented on the limnology of reservoirs, the changes that follow impoundment, fisheries management and modelling, and fish culture techniques. Separate abstracts have been prepared for three papers from this workshop

  2. National Dam Safety Program. Lakeview Estates Dam (MO 11004), Mississippi - Kaskaskia - St. Louis Basin, Warren County, Missouri. Phase I Inspection Report.

    Science.gov (United States)

    1979-09-01

    LAKEVIEW ESTATES DAM, WARREN COUNTY, MISSOURI, MISSOURI INVENTORY NO. 11004, PHASE I INSPECTION REPORT, NATIONAL DAM SAFETY...and impounds less than 1,000 acre-feet of water. Our inspection and evaluation indicate that the spillway of Lakeview Estates Dam does not meet...not be measured because of the high reservoir level, scalloping near the crest, and a berm just under the water surface. Limestone riprap in sizes from sand

  3. An index of reservoir habitat impairment

    Science.gov (United States)

    Miranda, L.E.; Hunt, K.M.

    2011-01-01

    Fish habitat impairment resulting from natural and anthropogenic watershed and in-lake processes has in many cases reduced the ability of reservoirs to sustain native fish assemblages and fisheries quality. Rehabilitation of impaired reservoirs is hindered by the lack of a method suitable for scoring impairment status. To address this limitation, an index of reservoir habitat impairment (IRHI) was developed by merging 14 metrics descriptive of common impairment sources, with each metric scored from 0 (no impairment) to 5 (high impairment) by fisheries scientists with local knowledge. With a plausible range of 5 to 25, distribution of the IRHI scores ranged from 5 to 23 over 482 randomly selected reservoirs dispersed throughout the USA. The IRHI reflected five impairment factors including siltation, structural habitat, eutrophication, water regime, and aquatic plants. The factors were weakly related to key reservoir characteristics including reservoir area, depth, age, and use type, suggesting that common reservoir descriptors are poor predictors of fish habitat impairment. The IRHI is rapid and inexpensive to calculate, provides an easily understood measure of the overall habitat impairment, allows comparison of reservoirs and therefore prioritization of restoration activities, and may be used to track restoration progress. The major limitation of the IRHI is its reliance on unstandardized professional judgment rather than standardized empirical measurements. © 2010 US Government.

  4. 4. International reservoir characterization technical conference

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    This volume contains the Proceedings of the Fourth International Reservoir Characterization Technical Conference held March 2-4, 1997 in Houston, Texas. The theme for the conference was Advances in Reservoir Characterization for Effective Reservoir Management. On March 2, 1997, the DOE Class Workshop kicked off with tutorials by Dr. Steve Begg (BP Exploration) and Dr. Ganesh Thakur (Chevron). Tutorial presentations are not included in these Proceedings but may be available from the authors. The conference consisted of the following topics: data acquisition; reservoir modeling; scaling reservoir properties; and managing uncertainty. Selected papers have been processed separately for inclusion in the Energy Science and Technology database.

  5. Data Locality via Coordinated Caching for Distributed Processing

    Science.gov (United States)

    Fischer, M.; Kuehn, E.; Giffels, M.; Jung, C.

    2016-10-01

    To enable data locality, we have developed an approach of adding coordinated caches to existing compute clusters. Since the data stored locally is volatile and selected dynamically, only a fraction of local storage space is required. Our approach allows the degree of data locality to be selected freely. It may be used in conjunction with large network bandwidths, providing only highly used data to reduce peak loads. Alternatively, local storage may be scaled up to perform data analysis even with low network bandwidth. To prove the applicability of our approach, we have developed a prototype implementing all required functionality. It integrates seamlessly into batch systems, requiring practically no adjustments by users. We have now been actively using this prototype on a test cluster for HEP analyses. Specifically, it has been integral to our jet energy calibration analyses for CMS during run 2. The system has proven to be easily usable, while providing substantial performance improvements. Since confirming the applicability for our use case, we have investigated the design in a more general way. Simulations show that many infrastructure setups can benefit from our approach. For example, it may enable us to dynamically provide data locality in opportunistic cloud resources. The experience we have gained from our prototype enables us to realistically assess the feasibility for general production use.
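As a rough illustration of the "only highly used data is cached locally" idea (a generic greedy placement sketch, not the prototype's actual policy; all names and numbers here are invented for the example):

```python
from collections import Counter

def select_for_cache(access_log, file_sizes, cache_capacity):
    """Greedy placement sketch: pin the files with the highest access
    count per byte until the volatile local storage is full."""
    counts = Counter(access_log)
    ranked = sorted(counts, key=lambda f: counts[f] / file_sizes[f], reverse=True)
    chosen, used = [], 0
    for f in ranked:
        if used + file_sizes[f] <= cache_capacity:
            chosen.append(f)
            used += file_sizes[f]
    return chosen

log = ["a", "a", "a", "b", "b", "c"]    # observed accesses
sizes = {"a": 4, "b": 1, "c": 10}       # file sizes (arbitrary units)
print(select_for_cache(log, sizes, cache_capacity=5))  # -> ['b', 'a']
```

Because only the hottest fraction of the dataset is pinned, a cache much smaller than the full dataset can absorb most of the read load, which is why a fraction of local storage suffices.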

  6. Reservoir resistivity characterization incorporating flow dynamics

    KAUST Repository

    Arango, Santiago

    2016-04-07

    Systems and methods for reservoir resistivity characterization are provided. In various aspects, an integrated framework for the estimation of Archie's parameters for a strongly heterogeneous reservoir, utilizing the dynamics of the reservoir, is provided. The framework can encompass a Bayesian estimation/inversion method for estimating the reservoir parameters, integrating production and time-lapse formation conductivity data to achieve a better understanding of the subsurface rock conductivity properties and hence improve water saturation imaging.

  7. Reservoir resistivity characterization incorporating flow dynamics

    KAUST Repository

    Arango, Santiago; Sun, Shuyu; Hoteit, Ibrahim; Katterbauer, Klemens

    2016-01-01

    Systems and methods for reservoir resistivity characterization are provided. In various aspects, an integrated framework for the estimation of Archie's parameters for a strongly heterogeneous reservoir, utilizing the dynamics of the reservoir, is provided. The framework can encompass a Bayesian estimation/inversion method for estimating the reservoir parameters, integrating production and time-lapse formation conductivity data to achieve a better understanding of the subsurface rock conductivity properties and hence improve water saturation imaging.

  8. Paragenetic evolution of reservoir facies, Middle Triassic Halfway Formation, PeeJay Field, northeastern British Columbia: controls on reservoir quality

    Energy Technology Data Exchange (ETDEWEB)

    Caplan, M. L. [Alberta Univ., Dept. of Earth and Atmospheric Sciences, Edmonton, AB (Canada); Moslow, T. F. [Ulster Petroleum Ltd., Calgary, AB (Canada)

    1998-09-01

    Because of the obvious importance of reservoir quality to reservoir performance, diagenetic controls on reservoir quality of Middle Triassic reservoir facies are investigated by comparing two reservoir lithofacies. The implications of porosity structure on the efficiency of primary and secondary hydrocarbon recovery are also assessed. Halfway reservoir facies are composed of bioclastic grainstones (lithofacies G) and litharenites/sublitharenites (lithofacies H), both of which are interpreted as tidal inlet fills. Although paragenetic evolution was similar for the two reservoir facies, subtle differences in reservoir quality are discernible. These are controlled by sedimentary structures, porosity type, grain constituents, and degree of cementation. Reservoir quality in lithofacies G is a function of connectivity of the pore network. In lithofacies H, secondary granular porosity creates a more homogeneous interconnected pore system, wide pore throats and low aspect ratios. The high porosity and low permeability values of the bioclastic grainstones are suspected to cause inefficient flushing of hydrocarbons during waterflooding. However, it is suggested that recovery may be enhanced by induced hydraulic fracturing and acidization of lower permeability calcareous cemented zones. 52 refs., 15 figs.

  9. Sensitivity Analysis of Methane Hydrate Reservoirs: Effects of Reservoir Parameters on Gas Productivity and Economics

    Science.gov (United States)

    Anderson, B. J.; Gaddipati, M.; Nyayapathi, L.

    2008-12-01

    This paper presents a parametric study on production rates of natural gas from gas hydrates by the method of depressurization, using CMG STARS. Seven factors/parameters were considered as perturbations from a base-case hydrate reservoir description based on Problem 7 of the International Methane Hydrate Reservoir Simulator Code Comparison Study led by the Department of Energy and the USGS. This reservoir is modeled after the inferred properties of the hydrate deposit at the Prudhoe Bay L-106 site. The included sensitivity variables were hydrate saturation, pressure (depth), temperature, bottom-hole pressure of the production well, free water saturation, intrinsic rock permeability, and porosity. A two-level (L=2) Plackett-Burman experimental design was used to study the relative effects of these factors. The measured variable was the discounted cumulative gas production. The discount rate chosen was 15%, resulting in the gas contribution to the net present value of a reservoir. Eight different designs were developed for conducting sensitivity analysis, and the effects of the parameters on the real and discounted production rates will be discussed. The breakeven price in various cases and the dependence of the breakeven price on the production parameters are given in the paper. As expected, initial reservoir temperature has the strongest positive effect on the productivity of a hydrate deposit and the bottom-hole pressure in the production well has the strongest negative dependence. Also resulting in positive correlations are the intrinsic permeability and the initial free water saturation of the formation. Negative effects were found for initial hydrate saturation (at saturations greater than 50% of the pore space) and the reservoir porosity. These negative effects are related to the available sensible heat of the reservoir, with decreasing productivity due to decreasing available sensible heat. Finally, we conclude that for the base case reservoir, the break-even price (BEP
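A two-level Plackett-Burman design screens seven factors in only eight runs. The sketch below builds the standard PB8 matrix and recovers main effects from a hypothetical linear response; the factor roles and coefficient values are illustrative stand-ins, not the study's results:

```python
import numpy as np

def plackett_burman_8():
    """8-run, 7-factor two-level Plackett-Burman design: seven cyclic
    shifts of the standard generator row, plus a row of all -1."""
    gen = np.array([1, 1, 1, -1, 1, -1, -1])
    rows = [np.roll(gen, i) for i in range(7)]
    rows.append(-np.ones(7, dtype=int))
    return np.array(rows)

X = plackett_burman_8()

# Hypothetical linear response: productivity pushed up by temperature
# (factor 0) and down by bottom-hole pressure (factor 3), as in the study.
true_coeffs = np.array([3.0, 0.5, 0.2, -2.5, 0.8, 0.1, -0.4])
y = 10.0 + X @ true_coeffs

# Main effect of factor j = mean(y | x_j = +1) - mean(y | x_j = -1);
# for an orthogonal +/-1 design this equals twice the linear coefficient.
effects = np.array([y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
                    for j in range(7)])
print(np.round(effects, 3))
```

Because the design columns are mutually orthogonal and balanced, each main effect is estimated independently of the others despite the tiny run count — the point of using Plackett-Burman for screening.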

  10. Static reservoir modeling of the Bahariya reservoirs for the oilfields development in South Umbarka area, Western Desert, Egypt

    Science.gov (United States)

    Abdel-Fattah, Mohamed I.; Metwalli, Farouk I.; Mesilhi, El Sayed I.

    2018-02-01

    3D static reservoir modeling of the Bahariya reservoirs using seismic and well data can be a relevant part of an overall strategy for the oilfields development in South Umbarka area (Western Desert, Egypt). The seismic data is used to build the 3D grid, including fault sticks for the fault modeling, and horizon interpretations and surfaces for horizon modeling. The 3D grid is the digital representation of the structural geology of the Bahariya Formation. Once we obtained a reasonably accurate representation, we filled the 3D grid with facies and petrophysical properties and simulated it, to gain a more precise understanding of reservoir property behavior. Sequential Indicator Simulation (SIS) and Sequential Gaussian Simulation (SGS) techniques are the stochastic algorithms used to spatially distribute discrete reservoir properties (facies) and continuous reservoir properties (shale volume, porosity, and water saturation), respectively, within the created 3D grid throughout property modeling. The structural model of the Bahariya Formation exhibits the trapping mechanism, which is a fault-assisted anticlinal closure trending NW-SE. This major fault breaks the reservoirs into two major fault blocks (North Block and South Block). Petrophysical models classified the Lower Bahariya reservoir, rather than the Upper Bahariya reservoir, as a moderate to good reservoir in terms of facies, with good porosity and permeability, low water saturation, and moderate net to gross. The Original Oil In Place (OOIP) values of the modeled Bahariya reservoirs show hydrocarbon accumulation in economic quantity, considering the high structural dips at the central part of the South Umbarka area. The power of the 3D static modeling technique has provided considerable insight into the future prediction of Bahariya reservoir performance and production behavior.

  11. Deriving Area-storage Curves of Global Reservoirs

    Science.gov (United States)

    Mu, M.; Tang, Q.

    2017-12-01

    Basic information including capacity, dam height, and largest water area on global reservoirs and dams is well documented in databases such as GRanD (Global Reservoirs and Dams) and ICOLD (International Commission on Large Dams). However, though playing a critical role in estimating reservoir storage variations from remote sensing or hydrological models, area-storage (or elevation-storage) curves of reservoirs are not publicly shared. In this paper, we combine Landsat surface water extent, the 1 arc-minute global relief model (ETOPO1), and the GRanD database to derive area-storage curves of global reservoirs whose area is larger than 1 km2 (more than 6,000 reservoirs are included). First, the coverage polygon of each reservoir in GRanD is extended to where water was detected by Landsat during 1985-2015. Second, the elevation of each pixel in the reservoir is extracted from the resampled 30-meter ETOPO1, and then the relative depth and the frequency of each depth value are calculated. Third, cumulative storage is calculated with increasing water area by every one percent of reservoir coverage area, and the uncalibrated area-storage curve is obtained. Finally, the area-storage curve is linearly calibrated by the ratio of calculated capacity over reported capacity in GRanD. The derived curves are compared with in-situ reservoir data collected in the Great Plains Region in the US, and the results show that in-situ records are well captured by the derived curves even for relatively small reservoirs (several square kilometers). The newly derived area-storage curves have the potential to be employed in global monitoring or modelling of reservoir storage and area variations.
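The curve-building steps — flood the pixels in one-percent area increments, integrate the water column over the flooded pixels, then linearly calibrate to the reported capacity — can be sketched as follows (a minimal reading of the procedure; function and parameter names are mine, not the authors'):

```python
import numpy as np

def area_storage_curve(bottom_elev_m, pixel_area_m2, reported_capacity_m3=None):
    """Area-storage curve from per-pixel bottom elevations inside the
    maximum observed water extent: raise the pool level so that 1%, 2%,
    ..., 100% of pixels are submerged, integrate the water column over
    the flooded pixels, then linearly calibrate to the reported capacity."""
    e = np.sort(np.asarray(bottom_elev_m, dtype=float))
    n = len(e)
    areas, storages = [], []
    for pct in range(1, 101):
        m = max(1, round(n * pct / 100))       # number of submerged pixels
        level = e[m - 1]                       # elevation of shallowest wet pixel
        areas.append(m * pixel_area_m2)
        storages.append(np.sum(level - e[:m]) * pixel_area_m2)
    areas, storages = np.array(areas), np.array(storages)
    if reported_capacity_m3 is not None and storages[-1] > 0:
        storages *= reported_capacity_m3 / storages[-1]   # linear calibration
    return areas, storages
```

After calibration the curve's endpoint matches the GRanD capacity by construction, while the DEM supplies the curve's shape between empty and full pool.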

  12. Economics of Developing Hot Stratigraphic Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Greg Mines; Hillary Hanson; Rick Allis; Joseph Moore

    2014-09-01

    Stratigraphic geothermal reservoirs at 3 – 4 km depth in high heat-flow basins are capable of sustaining 100 MW-scale power plants at about 10 c/kWh. This paper examines the impacts on the levelized cost of electricity (LCOE) of reservoir depth and temperature, reservoir productivity, and drillhole/casing options. For a reservoir at 3 km depth with a moderate productivity index by hydrothermal reservoir standards (about 50 L/s/MPa, 5.6 gpm/psi), an LCOE of 10c/kWh requires the reservoir to be at about 200°C. This is the upper temperature limit for pumps. The calculations assume standard hydrothermal drilling costs, with the production interval completed with a 7 inch liner in an 8.5 inch hole. If a reservoir at 4 km depth has excellent permeability characteristics with a productivity index of 100 L/s/MPa (11.3 gpm/psi), then the LCOE is about 11 c/kWh assuming the temperature decline rate with development is not excessive (< 1%/y, with first thermal breakthrough delayed by about 10 years). Completing wells with modest horizontal legs (e.g. several hundred meters) may be important for improving well productivity because of the naturally high, sub-horizontal permeability in this type of reservoir. Reducing the injector/producer well ratio may also be cost-effective if the injectors are drilled as larger holes.
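The LCOE arithmetic behind figures like these can be sketched with a standard discounted-cost formulation (a textbook sketch, not the paper's economic model; the capex, opex, and discount-rate values below are assumptions for illustration):

```python
def lcoe_cents_per_kwh(capex_usd, opex_usd_per_year, net_mw,
                       capacity_factor, discount_rate, lifetime_years):
    """Levelized cost of electricity: present value of all costs divided
    by present value of all generation."""
    annual_kwh = net_mw * 1000.0 * 8760.0 * capacity_factor
    pv_cost, pv_energy = capex_usd, 0.0
    for t in range(1, lifetime_years + 1):
        df = (1.0 + discount_rate) ** -t       # discount factor for year t
        pv_cost += opex_usd_per_year * df
        pv_energy += annual_kwh * df
    return 100.0 * pv_cost / pv_energy

# Illustrative 100 MW plant; all cost figures are assumed, not from the paper.
print(round(lcoe_cents_per_kwh(600e6, 20e6, 100.0, 0.90, 0.07, 30), 2))
```

Because drilling dominates capex for deep stratigraphic reservoirs, well productivity (fewer wells per MW) and hole/casing choices feed directly into the capex term and hence the c/kWh result.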

  13. Estimating Western U.S. Reservoir Sedimentation

    Science.gov (United States)

    Bensching, L.; Livneh, B.; Greimann, B. P.

    2017-12-01

    Reservoir sedimentation is a long-term problem for water management across the Western U.S. Observations of sedimentation are limited to reservoir surveys that are costly and infrequent, with many reservoirs having only two or fewer surveys. This work aims to apply a recently developed ensemble of sediment algorithms to estimate reservoir sedimentation over several western U.S. reservoirs. The sediment algorithms include empirical, conceptual, stochastic, and process-based approaches and are coupled with a hydrologic modeling framework. Preliminary results showed that the more complex, process-based algorithms performed better in predicting high sediment flux values and in a basin transferability experiment. However, more testing and validation are required to confirm sediment model skill. This work is carried out in partnership with the Bureau of Reclamation with the goal of evaluating the viability of reservoir sediment yield prediction across the western U.S. using a multi-algorithm approach. Simulations of streamflow and sediment fluxes are validated against observed discharges, as well as a Reservoir Sedimentation Information database that is being developed by the US Army Corps of Engineers. Specific goals of this research include (i) quantifying whether inter-algorithm differences consistently capture observational variability; (ii) identifying whether certain categories of models consistently produce the best results; and (iii) assessing the expected sedimentation life-span of several western U.S. reservoirs through long-term simulations.

  14. Encapsulated microsensors for reservoir interrogation

    Science.gov (United States)

    Scott, Eddie Elmer; Aines, Roger D.; Spadaccini, Christopher M.

    2016-03-08

    In one general embodiment, a system includes at least one microsensor configured to detect one or more conditions of a fluidic medium of a reservoir; and a receptacle, wherein the receptacle encapsulates the at least one microsensor. In another general embodiment, a method includes injecting the encapsulated at least one microsensor as recited above into a fluidic medium of a reservoir; and detecting one or more conditions of the fluidic medium of the reservoir.

  15. Remedial investigation/feasibility study report for lower Watts Bar Reservoir Operable Unit

    International Nuclear Information System (INIS)

    1994-08-01

    This document is the combined Remedial Investigation and Feasibility Study Report for the Lower Watts Bar Reservoir (LWBR) Operable Unit (OU). The LWBR is located in Roane, Rhea, and Meigs counties, Tennessee, and consists of Watts Bar Reservoir downstream of the Clinch River. This area has received hazardous substances released over a period of 50 years from the U.S. Department of Energy's Oak Ridge Reservation (ORR), a National Priorities List site established under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). As required by this law, the ORR and all off-site areas that have received contaminants, including the LWBR, must be investigated to determine the risk to human health and the environment resulting from these releases, the need for any remedial action to reduce these risks, and the remedial actions that are most feasible for implementation in this OU. Contaminants from the ORR are primarily transported to the LWBR via the Clinch River. Water-soluble contaminants released to ORR surface waters are rapidly diluted upon entering the Clinch River and then quickly transported downstream to the Tennessee River, where further dilution occurs. Almost the entire quantity of these diluted contaminants rapidly flows through the LWBR. In contrast, particle-associated contaminants tend to accumulate in the lower Clinch River and in areas of sediment deposition in the LWBR. Particle-associated contaminants that were released in peak quantities during the early years of ORR operations (e.g., mercury and 137Cs) are buried under as much as 80 cm of cleaner sediment in the LWBR. Certain contaminants, most notably polychlorinated biphenyls (PCBs), have accumulated in LWBR biota. The contamination of aquatic biota with PCBs is best documented for certain fish species and extends to reservoirs upstream of the ORR, indicating a contamination problem that is regional in scope and not specific to the ORR.

  16. Massachusetts reservoir simulation tool—User’s manual

    Science.gov (United States)

    Levin, Sara B.

    2016-10-06

    Introduction: The U.S. Geological Survey developed the Massachusetts Reservoir Simulation Tool to examine the effects of reservoirs on natural streamflows in Massachusetts by simulating the daily water balance of reservoirs. The simulation tool was developed to help environmental managers better manage water withdrawals from reservoirs and preserve downstream aquatic habitats.
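
    A daily reservoir water balance of the kind the tool simulates can be sketched as below. This is a minimal illustration, not the USGS tool itself; the capping rules and variable names are assumptions.

```python
# Minimal daily reservoir water-balance sketch (not the actual USGS tool).
# Each day: storage += inflow - withdrawal; anything above capacity spills.

def simulate(storage, capacity, inflows, withdrawals):
    """Return a list of daily (end-of-day storage, spill) pairs.

    storage, capacity in volume units; inflows/withdrawals are daily volumes.
    Withdrawal is capped at the water actually available that day.
    """
    out = []
    for q_in, q_wd in zip(inflows, withdrawals):
        available = storage + q_in
        storage = available - min(q_wd, available)   # cannot withdraw more than is there
        spill = max(0.0, storage - capacity)          # excess passes downstream
        storage -= spill
        out.append((storage, spill))
    return out
```

    A run such as `simulate(50.0, 100.0, [60.0, 10.0], [5.0, 5.0])` shows the reservoir filling to capacity and spilling the surplus on both days.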

  17. Nonlinear Filtering Effects of Reservoirs on Flood Frequency Curves at the Regional Scale: RESERVOIRS FILTER FLOOD FREQUENCY CURVES

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Wei; Li, Hong-Yi; Leung, Lai-Yung; Yigzaw, Wondmagegn Y.; Zhao, Jianshi; Lu, Hui; Deng, Zhiqun; Demissie, Yonas; Bloschl, Gunter

    2017-10-01

    Anthropogenic activities, e.g., reservoir operation, may alter the characteristics of the flood frequency curve (FFC) and challenge the basic assumption of stationarity used in flood frequency analysis. This paper presents a combined data-modeling analysis of the nonlinear filtering effects of reservoirs on FFCs over the contiguous United States. A dimensionless Reservoir Impact Index (RII), defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume, is used to quantify reservoir regulation effects. Analyses are performed for 388 river stations with an average record length of 50 years. The first two moments of the FFC, the mean annual maximum flood (MAF) and the coefficient of variation (CV), are calculated for the pre- and post-dam periods and compared to elucidate reservoir regulation effects as a function of RII. It is found that MAF generally decreases with increasing RII but stabilizes when RII exceeds a threshold value, and that CV increases with RII until a threshold value beyond which CV decreases with RII. The processes underlying this nonlinear threshold behavior of MAF and CV are investigated using three reservoir models with different levels of complexity. All models capture the nonlinear relationships of MAF and CV with RII, suggesting that the basic flood control function of reservoirs is key to these relationships. The relative roles of reservoir storage capacity, operation objectives, available storage prior to a flood event, and reservoir inflow pattern are systematically investigated. Our findings may help improve flood-risk assessment and mitigation in regulated river systems at the regional scale.
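
    The three metrics defined in the abstract (RII, MAF, CV) can be computed directly. The formulas follow the definitions above; the station values below are invented for illustration.

```python
# Metrics from the abstract: Reservoir Impact Index (RII), mean annual
# maximum flood (MAF), and coefficient of variation (CV) of annual maxima.
# The example flood series are made up, not from the 388-station data set.

def rii(upstream_storage_m3, annual_streamflow_volume_m3):
    """RII = total upstream storage capacity / annual streamflow volume."""
    return upstream_storage_m3 / annual_streamflow_volume_m3

def maf_cv(annual_max_floods):
    """Return (MAF, CV) of a series of annual maximum floods (m^3/s)."""
    n = len(annual_max_floods)
    maf = sum(annual_max_floods) / n
    var = sum((q - maf) ** 2 for q in annual_max_floods) / n
    return maf, var ** 0.5 / maf

pre_dam  = [900.0, 1100.0, 1000.0]   # annual maxima before regulation
post_dam = [500.0, 700.0, 600.0]     # annual maxima after regulation

maf_pre, cv_pre = maf_cv(pre_dam)
maf_post, cv_post = maf_cv(post_dam)
print(maf_pre, maf_post)   # regulation lowers MAF; CV rises at moderate RII
```

    Note that in this toy example the absolute spread is unchanged, so the lower post-dam mean raises CV, the pattern the paper reports for moderate RII.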

  18. Using reservoir engineering data to solve geological ambiguities : a case study of one of the Iranian carbonate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Kord, S. [National Iranian South Oil Co. (Iran, Islamic Republic of)]

    2006-07-01

    A fractured carbonate reservoir in southwest Iran was studied with reference to reserve estimation, risk analysis, material balance, and recovery factor. The 40 km long and 4 km wide reservoir consists of 2 parts with crest depths of 3780 and 3749 mss, respectively. The eastern part is smaller and more productive than the western part, which has high water saturation and no production at all. Economic production from the reservoir began in 1977; by 2004, cumulative production had reached 12.064 MMSTB. Of the 6 wells drilled, only 2 wells in the eastern part are productive. This study addressed the main uncertainty: whether the 2 parts of the reservoir are sealed from each other. The reservoir is under-saturated, but the current pressure is near the saturation pressure. The reservoir is divided into 4 zones: zones 1 and 2 are productive and consist mainly of carbonate rocks; zone 3 has thin beds of sand and shale; and zone 4 consists of layers of carbonate, shale, marl, and dolomite. Although there are no faults, mud loss suggests that the reservoir has hairline fractures. Oil in place and reserves were estimated for both parts based on calculated reservoir engineering parameters. Material balance calculations were then performed to analyze and simulate the reservoir. Communication between the 2 parts of the reservoir was examined on the basis of core analysis, rock type, fluid characterization, pressure analysis, water-oil contacts, production history, and petrophysical evaluations. The porosity was found to be the same in both parts, but the water saturation and net-to-gross ratios differ between the eastern and western parts. The petrophysical evaluation revealed that there is no communication between the two parts of the reservoir. 4 refs., 2 figs., 2 appendices.

  19. Carbon emission from global hydroelectric reservoirs revisited.

    Science.gov (United States)

    Li, Siyue; Zhang, Quanfa

    2014-12-01

    Substantial greenhouse gas (GHG) emissions from hydropower reservoirs have been of great concern recently, yet the significant carbon emissions from drawdown areas and reservoir downstream reaches (including spillways and turbines as well as river reaches below dams) have not been included in the global carbon budget. Here, we revisit GHG emission from hydropower reservoirs by considering reservoir surface area, the drawdown zone, and the reservoir downstream. Our estimates indicate around 301.3 Tg carbon dioxide (CO2)/year and 18.7 Tg methane (CH4)/year from global hydroelectric reservoirs, much higher than recent observations. The sum of drawdown and downstream emissions, which is generally overlooked, represents 42% of the CO2 and 67% of the CH4 total emissions from hydropower reservoirs. Accordingly, the global average emissions from hydropower are estimated at 92 g CO2/kWh and 5.7 g CH4/kWh. Nonetheless, global hydroelectricity could currently avoid approximately 2,351 Tg CO2eq/year relative to the fossil-fuel plant alternative. These findings represent a substantial revision of the carbon emission from global hydropower reservoirs.
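
    The per-kWh figures follow from the annual totals divided by global hydropower generation. The abstract does not state the generation figure, so the ~3,300 TWh/yr used below is an assumption chosen to be consistent with the reported intensities.

```python
# Back-of-envelope check of the per-kWh intensities above, assuming the
# abstract's annual totals and a global hydropower generation of ~3,300
# TWh/yr (assumed; not stated in the abstract).

total_co2_g = 301.3e12    # 301.3 Tg CO2/yr, in grams
total_ch4_g = 18.7e12     # 18.7 Tg CH4/yr, in grams
generation_kwh = 3.3e12   # ~3,300 TWh/yr, assumed

print(total_co2_g / generation_kwh)  # ~91 g CO2/kWh, near the reported 92
print(total_ch4_g / generation_kwh)  # ~5.7 g CH4/kWh, matching the report
```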

  20. Evaluation of heavy-oil and tar sands in Bourbon, Crawford, and Cherokee Counties, Kansas. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Ebanks, W.J. Jr.; James, G.W.; Livingston, N.D.

    1977-12-01

    The current national energy-resource situation has provided the incentive to investigate more fully deposits of heavy-oil-bearing sandstone in southeastern Kansas, as part of a larger, three-state study. The results of this study indicate that the heavy-oil resource in the three Kansas counties studied is smaller than earlier estimates suggested. A resource of 200 to 225 million barrels of oil in place is estimated to be present in areas of "known oil occurrence," as established by this study. The portion of this in-place resource that may be considered reserves, that is, recoverable under existing technology and economics, is zero. The resource-size estimates are severely downgraded from earlier estimates mainly because of the discontinuous nature of the potential reservoir sandstone bodies and because of the thinness and shaliness of some of these sandstones. The earlier impression of these heavy-oil reservoirs, at least in Kansas, as widespread, heavily oil-saturated, "blanket" sandstones unfortunately is not correct. There are areas, shown on maps, that may warrant further investigation because of locally good oil saturation, i.e., more than 400 barrels per acre-foot, in trends of sandstone thicker than 20 feet. It is concluded that there will be no widespread exploitation of subsurface heavy-oil sandstones within Bourbon, Crawford, and Cherokee Counties, Kansas. Smaller areas indicated here may warrant further drilling and investigation, but the potential size of the heavy-oil resource is severely downgraded from earlier estimates.

  1. Reservoir characteristics and control factors of Carboniferous volcanic gas reservoirs in the Dixi area of Junggar Basin, China

    Directory of Open Access Journals (Sweden)

    Ji'an Shi

    2017-02-01

    Full Text Available Field outcrop observation, drilling core description, thin-section analysis, SEM analysis, and geochemistry indicate that the Carboniferous volcanic-rock gas reservoir of the Dixi area is an authigenic gas reservoir in volcanic rock. The source rocks contact the volcanic reservoir directly or through faults, giving the accumulation a near-source character. The volcanic reservoir rocks consist mainly of acidic rhyolite and dacite, intermediate andesite, basic basalt, and volcanic breccia: (1) Acidic rhyolite and dacite reservoirs developed in the middle-lower part of the structure and suffered strong denudation; secondary pores formed during the weathering and tectonic-burial stages, but primary pores did not develop during early diagenesis. Average porosity is only 8% (maximum 13.5%), and oil and gas accumulation performance is poor. (2) Intermediate andesite and basic basalt reservoirs are mainly distributed near the crater, reflecting the scale of the volcanic eruption. Primary pores formed in the early diagenetic stage, secondary pores developed during weathering and erosion, and secondary fractures formed during tectonic burial. Average porosity is 9.2% (maximum 21.9%); these are among the high-quality reservoir types in the Dixi area. (3) The volcanic breccia reservoir shares diagenetic features with sedimentary rocks but has the mineral composition of volcanic rock; its rigid components preserve primary porosity against compaction during burial. At the same time, the brittleness of the volcanic breccia makes it fracture easily under stress, and internal fractures are well developed. Volcanic breccia developed in the structural high and suffered long-term leaching. The original pore-fracture combination also made

  2. An integrated study of the Grayburg/San Andres reservoir, Foster and south Cowden fields, Ector County, Texas. Quarterly report, January 1--March 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Trentham, R.C.; Weinbrandt, R.; Reeves, J.J.

    1996-06-17

    The principal objective of this research is to demonstrate in the field that 3-D seismic data can be used to help identify porosity zones, permeability barriers, and thief zones, and thereby improve waterflood design. Geologic and engineering data will be integrated with the geophysical data, resulting in a detailed reservoir characterization. Reservoir simulation will then be used to determine infill drilling potential and the optimum waterflood design for the project area. This design will be implemented and the success of the waterflood evaluated.

  3. Advanced reservoir characterization for improved oil recovery in a New Mexico Delaware basin project

    Energy Technology Data Exchange (ETDEWEB)

    Martin, F.D.; Kendall, R.P.; Whitney, E.M. [Dave Martin and Associates, Inc., Socorro, NM (United States)]; and others

    1997-08-01

    The Nash Draw Brushy Canyon Pool in Eddy County, New Mexico is a field demonstration site in the Department of Energy Class III program. The basic problem at the Nash Draw Pool is the low recovery typically observed in similar Delaware fields. By comparing a control area developed using standard infill drilling techniques with a pilot area developed using advanced reservoir characterization methods, the project aims to demonstrate that advanced technology can significantly improve oil recovery. During the first year of the project, four new producing wells were drilled to serve as data acquisition wells. Vertical seismic profiles and a 3-D seismic survey were acquired to assist in interwell correlations and facies prediction. Limited surface access at the Nash Draw Pool, caused by the proximity of underground potash mining and surface playa lakes, restricts development with conventional drilling. Combinations of vertical and horizontal wells with selective completions are being evaluated to optimize production performance. Based on the production response of similar Delaware fields, pressure maintenance is a likely requirement at the Nash Draw Pool. A detailed reservoir model of the pilot area was developed, and enhanced recovery options, including waterflooding and lean gas and carbon dioxide injection, are being evaluated.

  4. Application of integrated reservoir management and reservoir characterization to optimize infill drilling. Quarterly progress report, June 13, 1995--September 12, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Pande, P.K.

    1995-09-12

    At this stage of the reservoir characterization research, the main emphasis is on the geostatistics and reservoir simulation. Progress is reported on geological analysis, reservoir simulation, and reservoir management.

  5. Optimizing transformations of stencil operations for parallel object-oriented scientific frameworks on cache-based architectures

    Energy Technology Data Exchange (ETDEWEB)

    Bassetti, F.; Davis, K.; Quinlan, D.

    1998-12-31

    High-performance scientific computing relies increasingly on high-level large-scale object-oriented software frameworks to manage both algorithmic complexity and the complexities of parallelism: distributed data management, process management, inter-process communication, and load balancing. This encapsulation of data management, together with the prescribed semantics of a typical fundamental component of such object-oriented frameworks--a parallel or serial array-class library--provides an opportunity for increasingly sophisticated compile-time optimization techniques. This paper describes two optimizing transformations suitable for certain classes of numerical algorithms, one for reducing the cost of inter-processor communication, and one for improving cache utilization; demonstrates and analyzes the resulting performance gains; and indicates how these transformations are being automated.

  6. The Coupling Effect of Rainfall and Reservoir Water Level Decline on the Baijiabao Landslide in the Three Gorges Reservoir Area, China

    Directory of Open Access Journals (Sweden)

    Nenghao Zhao

    2017-01-01

    Full Text Available Rainfall and reservoir water level fluctuation are two of the main factors contributing to reservoir landslides. In China's Three Gorges Reservoir Area, however, the period when the reservoir water level fluctuates significantly coincides with abundant rainfall, which makes it difficult to distinguish which factor dominates landslide deformation. This study focuses on how rainfall and reservoir water level decline affect the seepage and displacement fields of the Baijiabao landslide, spatially and temporally, during drawdown of the reservoir water level in the Three Gorges Reservoir Area, thus exploring its movement mechanism. Monitoring data from the past 10 years were analyzed, and the correlation between rainfall, reservoir water level decline, and landslide displacement was clarified. Using numerical simulation, the deformation evolution mechanism of the landslide during drawdown was revealed under three conditions: rainfall alone, reservoir water level decline alone, and the coupling of the two. The results showed that the deformation of the Baijiabao landslide reflects the coupled effect of rainfall and reservoir water level decline, with the latter effect more pronounced.

  7. Temporal locality optimizations for stencil operations for parallel object-oriented scientific frameworks on cache-based architectures

    Energy Technology Data Exchange (ETDEWEB)

    Bassetti, F.; Davis, K.; Quinlan, D.

    1998-12-01

    High-performance scientific computing relies increasingly on high-level large-scale object-oriented software frameworks to manage both algorithmic complexity and the complexities of parallelism: distributed data management, process management, inter-process communication, and load balancing. This encapsulation of data management, together with the prescribed semantics of a typical fundamental component of such object-oriented frameworks--a parallel or serial array-class library--provides an opportunity for increasingly sophisticated compile-time optimization techniques. This paper describes a technique for introducing cache blocking suitable for certain classes of numerical algorithms, demonstrates and analyzes the resulting performance gains, and indicates how this optimization transformation is being automated.
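
    The cache-blocking transformation the paper automates can be illustrated on a 2-D Jacobi stencil. Plain Python is used here for readability (the paper's setting is compiled C++ array-class libraries, where tiling actually pays off), and the tile size is an assumption; the point is the loop structure, not the speed of this sketch.

```python
# Illustration of cache blocking (loop tiling) on a 2-D Jacobi stencil:
# the interior is visited tile by tile, so each tile's working set stays
# cache-resident across the four stencil reads of every point.

def jacobi_blocked(a, tile=64):
    """One Jacobi sweep over the interior of square grid `a`, tiled."""
    n = len(a)
    b = [row[:] for row in a]            # output grid, borders copied as-is
    for ii in range(1, n - 1, tile):     # loop over tiles...
        for jj in range(1, n - 1, tile):
            for i in range(ii, min(ii + tile, n - 1)):   # ...then points in a tile
                for j in range(jj, min(jj + tile, n - 1)):
                    b[i][j] = 0.25 * (a[i-1][j] + a[i+1][j]
                                      + a[i][j-1] + a[i][j+1])
    return b
```

    The transformation preserves results exactly: running with a tile larger than the grid reproduces the untiled sweep, so the two can be compared point for point.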

  8. Prospective study of ready-to-eat breakfast cereal consumption and cognitive decline among elderly men and women.

    Science.gov (United States)

    Wengreen, H; Nelson, C; Munger, R G; Corcoran, C

    2011-03-01

    To examine associations between frequency of ready-to-eat cereal (RTEC) consumption and cognitive function among elderly men and women of the Cache County Study on Memory Health and Aging in Utah. A population-based prospective cohort study established in Cache County, Utah in 1995. 3831 men and women > 65 years of age who were living in Cache County, Utah in 1995. Diet was assessed using a 142-item food frequency questionnaire at baseline. Cognitive function was assessed using an adapted version of the Modified Mini-Mental State examination (3MS) at baseline and at three subsequent interviews over 11 years. RTEC consumption was categorized as daily, weekly, or infrequent use. In multivariable models, more frequent RTEC consumption was not associated with a cognitive benefit. Those consuming RTEC weekly but less than daily scored higher on the baseline 3MS than those consuming RTEC more or less frequently (91.7 vs. 90.6 and 90.6, respectively; p < 0.001). This association was maintained across 11 years of observation: those consuming RTEC weekly but less than daily declined on average 3.96 points, compared with average declines of 5.13 and 4.57 points for those consuming cereal more or less frequently (p = 0.0009). Those consuming RTEC at least daily had poorer cognitive performance at baseline and over 11 years of follow-up than those who consumed cereal more or less frequently. RTEC is a nutrient-dense food but should not replace the consumption of other healthy foods in the diets of elderly people. Associations between RTEC consumption, dietary patterns, and cognitive function deserve further study.

  9. Evaluation of cache-memory sharing on the performance of multi-core architectures

    OpenAIRE

    Marco Antonio Zanata Alves

    2009-01-01

    In the current context of multi-core innovation, in which new integration technologies are providing a growing number of transistors per chip, the study of techniques for increasing data throughput is of great importance for current and future multi-core and many-core processors. With the continuing demand for computational performance, cache memories have been widely adopted in the various kinds of computer architecture designs. The processors currently available on the market...

  10. Dutchess County Resource Recovery Task Force report: Dutchess County Pyrolysis Program

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-07-01

    Dutchess County initiated development of a long-range master plan for solid waste management in 1971. The plan included development of a resource recovery facility to serve the municipalities in the County population center. Based on early recommendations, a pyrolysis facility employing Purox technology was to be implemented. A feasibility study, paid for with County funds, was completed in 1975; it provided siting recommendations, an estimate of available waste, and a preliminary facility design. For various reasons, the project was not developed. Under the Department of Energy grant, the County reassessed the feasibility of a resource recovery facility, with emphasis on confirming previous conclusions regarding the Purox technology, waste availability, energy recovery and sale, and siting of the plant. The conclusions reached in the new study were: a resource recovery facility is feasible for the County; sufficient waste for such a facility is available and subject to control; while Purox technology is feasible, it is not the most appropriate available technology for the County; mass burning with steam recovery is the most appropriate technology; and resource recovery, while presently more expensive than landfilling, represents the only cost-effective, energy-efficient, and environmentally sound way to handle the solid waste problem in the County.

  11. Mathematical and field analysis of longitudinal reservoir infill

    Science.gov (United States)

    Ke, W. T.; Capart, H.

    2016-12-01

    In reservoirs, severe problems are caused by infilled sediment deposits. In the long term, sediment accumulation reduces reservoir storage capacity and flood-control benefits; in the short term, the deposits affect water-supply and hydroelectric intakes. For reservoir management, it is important to understand the deposition process and then to predict sedimentation. To investigate the behavior of sediment deposits, we propose a simplified one-dimensional theory, derived from the Exner equation, to predict the longitudinal distribution of sedimentation in idealized reservoirs. The theory models the geomorphic action of reservoir infill for three scenarios: delta progradation, near-dam bottom deposition, and final infill. These yield three self-similar analytical solutions for the reservoir bed profiles under different boundary conditions, expressed in terms of the error function, the complementary error function, and the imaginary error function, respectively. The theory is also computed by a finite volume method to test the analytical solutions. The theoretical and numerical predictions are in good agreement with a one-dimensional small-scale laboratory experiment. As the theory is simple to apply, with analytical solutions and numerical computation, we propose some applications simulating the long-profile evolution of field reservoirs, focusing on the infill deposit volume and the resulting rise of the near-dam bottom elevation. The field reservoirs introduced here are Wushe Reservoir, Tsengwen Reservoir, and Mudan Reservoir in Taiwan, Lago Dos Bocas in Puerto Rico, and Sakuma Dam in Japan.
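
    A common reduction behind such error-function profiles (assumed here; the abstract does not give the equations) pairs the Exner equation with a sediment flux proportional to the local bed slope, yielding a linear diffusion equation for the bed elevation z(x,t):

```latex
% Exner equation (bed porosity \lambda_p, sediment flux q_s),
% closed with a slope-proportional flux law:
(1-\lambda_p)\,\frac{\partial z}{\partial t} = -\frac{\partial q_s}{\partial x},
\qquad q_s = -K\,\frac{\partial z}{\partial x}
\;\;\Longrightarrow\;\;
\frac{\partial z}{\partial t} = \nu\,\frac{\partial^2 z}{\partial x^2},
\qquad \nu = \frac{K}{1-\lambda_p}.

% One self-similar solution, for a fixed inlet elevation z_0 at x = 0:
z(x,t) = z_0\,\operatorname{erfc}\!\left(\frac{x}{2\sqrt{\nu t}}\right).
```

    Different boundary conditions at the delta front or the dam select the erf-, erfc-, or erfi-type profile; the erfc form shown is the kind of similarity solution associated with deposition advancing from a fixed upstream elevation.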

  12. Phytoplankton and water quality in a Mediterranean drinking-water reservoir (Marathonas Reservoir, Greece).

    Science.gov (United States)

    Katsiapi, Matina; Moustaka-Gouni, Maria; Michaloudi, Evangelia; Kormas, Konstantinos Ar

    2011-10-01

    Phytoplankton and water quality of the Marathonas drinking-water reservoir were examined for the first time. During the study period (July-September 2007), phytoplankton composition was indicative of eutrophic conditions, although phytoplankton biovolume was low (max. 2.7 mm³ l⁻¹). Phytoplankton was dominated by cyanobacteria and diatoms, whereas desmids and dinoflagellates contributed lower biovolumes. A changing flushing rate in the reservoir (up to 0.7% of the reservoir's water volume per day), driven by water withdrawal and occurring in pulses lasting 15-25 days, was associated with phytoplankton dynamics. Under flushing pulses: (1) biovolume was low, and (2) both 'good' quality species and the flushing-tolerant 'nuisance' cyanobacterium Microcystis aeruginosa dominated. According to the Water Framework Directive, the metrics of phytoplankton biovolume and percent cyanobacterial contribution indicated moderate ecological water quality. In addition, the total biovolume of cyanobacteria, as well as the dominance of the known toxin producer M. aeruginosa in the reservoir's phytoplankton, indicated a potential hazard for human health according to the World Health Organization.

  13. AUTOMATED TECHNIQUE FOR FLOW MEASUREMENTS FROM MARIOTTE RESERVOIRS.

    Science.gov (United States)

    Constantz, Jim; Murphy, Fred

    1987-01-01

    The mariotte reservoir supplies water at a constant hydraulic pressure by self-regulation of its internal gas pressure. Automated outflow measurements from mariotte reservoirs are generally difficult because of the reservoir's self-regulation mechanism. This paper describes an automated flow meter specifically designed for use with mariotte reservoirs. The flow meter monitors changes in the mariotte reservoir's gas pressure during outflow to determine changes in the reservoir's water level. The measurement is performed by attaching a pressure transducer to the top of a mariotte reservoir and monitoring gas pressure changes during outflow with a programmable data logger. The advantages of the new automated flow measurement technique include: (i) the ability to rapidly record a large range of fluxes without restricting outflow, and (ii) the ability to accurately average the pulsing flow that commonly occurs during outflow from a mariotte reservoir.
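
    The conversion the flow meter performs, from headspace pressure change to outflow, can be sketched as below. In a Mariotte reservoir the headspace gas pressure tracks the water level one-for-one in head units, so the level drop is the pressure change divided by rho*g. The function name and the numbers are illustrative, not from the paper.

```python
# Sketch: infer mean outflow from the change in headspace gas pressure,
# using dh = |dP| / (rho * g). All names and values are illustrative.

RHO_G = 1000.0 * 9.81  # rho * g for water, in Pa per metre of head

def outflow_rate(p0_pa, p1_pa, dt_s, area_m2):
    """Mean outflow (m^3/s) over an interval of dt_s seconds.

    p0_pa, p1_pa: headspace gas pressures at start and end of the interval
    area_m2:      cross-sectional area of the reservoir
    """
    dh = abs(p1_pa - p0_pa) / RHO_G   # water-level change, metres
    return area_m2 * dh / dt_s
```

    For example, a 981 Pa pressure change over 10 minutes in a 0.05 m² reservoir corresponds to a 0.1 m level drop and a mean outflow of about 8.3e-6 m³/s (0.5 l/min).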

  14. Monitoring Reservoirs Using MERIS And LANDSAT Fused Images : A Case Study Of Polyfitos Reservoir - West Macedonia - Greece

    Science.gov (United States)

    Stefouli, M.; Charou, E.; Vasileiou, E.; Stathopoulos, N.; Perrakis, A.

    2012-04-01

    Research and monitoring are essential to assess baseline conditions in reservoirs and their watersheds and to provide the information needed to guide decision-makers. Erosion and degradation of mountainous areas can lead to gradual aggradation of reservoirs, reducing their lifetime. Collected measurements and observations have to be communicated to the managers of the reservoirs so as to achieve a common, comprehensive management of a large watershed and reservoir system. Remote sensing can help here, as remotely sensed data are repeatedly and readily available to end users. Aliakmon is the longest river in Greece; its length is about 297 km, and its river basin covers 9,210 km2. The river flows from northwest Greece to Thermaikos Gulf. The riverbed is not natural throughout its entire course, because constructed dams restrict the water and create artificial lakes, such as Lake Polyfitos, that prevent flooding. This lake is used as a reservoir to cover irrigation water needs, and its water is used to produce energy at the hydroelectric plant of the Public Power Corporation (PPC). The catchment basin of the Polyfitos reservoir covers an area of 847.76 km2. Soil erosion and degradation are taking place in the mountainous watersheds of the streams feeding the Polyfitos reservoir; the annual volume of sediment reaching the reservoir has been estimated to be of the order of 244 m3. Geomatics-based techniques are used to process multiple data sets for the study area. A data inventory was formulated from acquired topographic maps, compiled geological and hydrogeological maps, and a digital elevation model of the area of interest based on satellite data and available maps. It also includes various hydro-meteorological data where available.
On the basis of available maps and satellite data, digital elevation models are used in order to delineate the basic sub-catchments of the Polyfytos basin as well as

  15. SEISMIC DETERMINATION OF RESERVOIR HETEROGENEITY: APPLICATION TO THE CHARACTERIZATION OF HEAVY OIL RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Matthias G. Imhof; James W. Castle

    2005-02-01

    The objective of the project was to examine how seismic and geologic data can be used to improve characterization of small-scale heterogeneity and its parameterization in reservoir models. The study focused on West Coalinga Field in California. The project initially attempted to build reservoir models from different geologic and geophysical data independently, using different tools, then to compare the results and ultimately integrate them all. We learned, however, that this strategy was impractical: the different data and tools need to be integrated from the beginning because they are all interrelated. This report describes a new approach to geostatistical modeling and presents an integration of geology and geophysics to explain the formation of the complex Coalinga reservoir.

  16. CO{sub 2} Huff-n-Puff process in a light oil shallow shelf carbonate reservoir. 1994 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Wehner, S.C.

    1995-05-01

    It is anticipated that this project will show that the CO{sub 2} Huff-n-Puff process in shallow shelf carbonates can be economically implemented to recover appreciable volumes of light oil. The goals of the project are to develop guidelines for cost-effective selection of candidate reservoirs and wells, along with estimates of recovery potential. The selected site for the demonstration project is the Central Vacuum Unit waterflood in Lea County, New Mexico. Work is nearing completion on the reservoir characterization components of the project. The near-term emphasis is to (1) provide an accurate distribution of original oil-in-place at the waterflood-pattern level, (2) evaluate past recovery efficiencies, (3) perform parametric simulations, and (4) forecast performance for a site-specific field demonstration of the proposed technology. Macro zonation now exists throughout the study area, and cross-sections are available. The oil-water contact has been defined. Laboratory capillary pressure data were used to define the initial water saturations within the pay horizon. The reservoir's porosity distribution has been enhanced with the assistance of geostatistical software: three-dimensional kriging created the spatial distributions of porosity at interwell locations. Artificial intelligence software was used to relate core permeability to core porosity, which in turn was applied to the 3-D geostatistical porosity gridding. An equation of state has been developed and refined for upcoming compositional simulation exercises. Options for local grid refinement in the model are under consideration. These tasks will be completed by mid-1995, prior to initiating the field demonstrations in the second budget period.

  17. An integrated GIS/remote sensing data base in North Cache soil conservation district, Utah: A pilot project for the Utah Department of Agriculture's RIMS (Resource Inventory and Monitoring System)

    Science.gov (United States)

    Wheeler, D. J.; Ridd, M. K.; Merola, J. A.

    1984-01-01

    A basic geographic information system (GIS) for the North Cache Soil Conservation District (SCD) was sought for selected resource problems. Since the resource management issues in the North Cache SCD are very complex, it is not feasible in the initial phase to generate all the physical, socioeconomic, and political baseline data needed for resolving all management issues. A selection of critical variables becomes essential. Thus, there are four specific objectives: (1) assess resource management needs and determine which resource factors are most fundamental for building a beginning data base; (2) evaluate the variety of data gathering and analysis techniques for the resource factors selected; (3) incorporate the resulting data into a useful and efficient digital data base; and (4) demonstrate the application of the data base to selected real-world resource management issues.

  18. Development of Reservoir Characterization Techniques and Production Models for Exploiting Naturally Fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Wiggins, Michael L.; Brown, Raymon L.; Civan, Faruk; Hughes, Richard G.

    2001-08-15

    Research continues on characterizing and modeling the behavior of naturally fractured reservoir systems. Work has progressed on developing techniques for estimating fracture properties from seismic and well log data, developing naturally fractured wellbore models, and developing a model to characterize the transfer of fluid from the matrix to the fracture system for use in the naturally fractured reservoir simulator.

  19. Chickamauga reservoir embayment study - 1990

    Energy Technology Data Exchange (ETDEWEB)

    Meinert, D.L.; Butkus, S.R.; McDonough, T.A.

    1992-12-01

    The objectives of this report are three-fold: (1) assess physical, chemical, and biological conditions in the major embayments of Chickamauga Reservoir; (2) compare water quality and biological conditions of embayments with main river locations; and (3) identify any water quality concerns in the study embayments that may warrant further investigation and/or management actions. Embayments are important areas of reservoirs to be considered when assessments are made to support water quality management plans. In general, embayments, because of their smaller size (water surface areas usually less than 1000 acres), shallower morphometry (average depth usually less than 10 feet), and longer detention times (frequently a month or more), exhibit more extreme responses to pollutant loadings and changes in land use than the main river region of the reservoir. Consequently, embayments are often at greater risk of water quality impairments (e.g. nutrient enrichment, filling and siltation, excessive growths of aquatic plants, algal blooms, low dissolved oxygen concentrations, bacteriological contamination, etc.). Much of the secondary beneficial use of reservoirs occurs in embayments (viz. marinas, recreation areas, parks and beaches, residential development, etc.). Typically embayments comprise less than 20 percent of the surface area of a reservoir, but they often receive 50 percent or more of the water-oriented recreational use of the reservoir. This intensive recreational use creates a potential for adverse use impacts if poor water quality and aquatic conditions exist in an embayment.

  20. Application of integrated reservoir management and reservoir characterization to optimize infill drilling, Class II

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Jack; Blasingame, Tom; Doublet, Louis; Kelkar, Mohan; Freeman, George; Callard, Jeff; Moore, David; Davies, David; Vessell, Richard; Pregger, Brian; Dixon, Bill; Bezant, Bryce

    2000-03-16

    The major purpose of this project was to demonstrate the use of cost effective reservoir characterization and management tools that will be helpful to both independent and major operators for the optimal development of heterogeneous, low permeability carbonate reservoirs such as the North Robertson (Clearfork) Unit.

  1. Clock generation and distribution for the 130-nm Itanium$^{R}$ 2 processor with 6-MB on-die L3 cache

    CERN Document Server

    Tam, S; Limaye, R D

    2004-01-01

    The clock generation and distribution system for the 130-nm Itanium 2 processor operates at 1.5 GHz with a skew of 24 ps. The Itanium 2 processor features 6 MB of on-die L3 cache and has a die size of 374 mm/sup 2/. Fuse-based clock de-skew enables post-silicon clock optimization to gain higher frequency. This paper describes the clock generation, global clock distribution, local clocking, and the clock skew optimization feature.

  2. Reservoir Identification: Parameter Characterization or Feature Classification

    Science.gov (United States)

    Cao, J.

    2017-12-01

    The ultimate goal of oil and gas exploration is to find oil or gas reservoirs with industrial mining value. Therefore, the core task of modern oil and gas exploration is to identify oil or gas reservoirs on seismic profiles. Traditionally, a reservoir is identified by seismic inversion of a series of physical parameters such as porosity, saturation, permeability, formation pressure, and so on. Due to the heterogeneity of the geological medium, the approximation of the inversion model, and the incompleteness and noisiness of the data, the inversion results are highly uncertain and must be calibrated or corrected with well data. In areas with few or no wells, reservoir identification based on seismic inversion is high-risk. Reservoir identification is essentially a classification issue. In the identification process, the underground rocks are divided into reservoirs with industrial mining value and host rocks without such value. In addition to the traditional classification by physical parameters, the classification may be achieved using one or a few comprehensive features. By introducing the concept of seismic-print, we have developed a new reservoir identification method based on seismic-print analysis. Furthermore, we explore the possibility of using deep learning to discover the seismic-print characteristics of oil and gas reservoirs. Preliminary experiments have shown that deep learning on seismic data can distinguish gas reservoirs from host rocks. The combination of seismic-print analysis and seismic deep learning is expected to be a more robust reservoir identification method. The work was supported by NSFC under grant No. 41430323 and No. U1562219, and the National Key Research and Development Program under Grant No. 2016YFC0601

  3. An environmental data base for all Hydro-Quebec reservoirs

    International Nuclear Information System (INIS)

    Demers, C.

    1988-01-01

    Hydro-Quebec has created two management positions specifically for reservoirs, namely Reservoir Ecology Advisor and Reservoir Management Advisor. To assist management decisions, a means was required of bringing together all existing environmental information for each reservoir operated by Hydro-Quebec, including storage reservoirs, auxiliary reservoirs and forebays. A relational database using Reflex software was developed on a network of Macintosh computers. The database contains five blocks of information: general information, and physical, physiochemical, biologic and socioeconomic characteristics for each reservoir. Data will be collected on over 100 sites, and the tool will form the basis for developing a medium-range study program on reservoir ecology. The program must take into account the physical, biological and socioeconomic aspects of the environment, as well as the concerns of management personnel operating the reservoirs, the local population, reservoir users, and various government departments. 2 figs

  4. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    Science.gov (United States)

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  5. Improving reservoir history matching of EM heated heavy oil reservoirs via cross-well seismic tomography

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim

    2014-01-01

    process. While becoming a promising technology for heavy oil recovery, its effect on overall reservoir production and fluid displacements are poorly understood. Reservoir history matching has become a vital tool for the oil & gas industry to increase

  6. Allegheny County Addressing Landmarks

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset contains address points which represent physical address locations assigned by the Allegheny County addressing authority. Data is updated by County...

  7. Allegheny County Address Points

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset contains address points which represent physical address locations assigned by the Allegheny County addressing authority. Data is updated by County...

  8. Ecological operation for Three Gorges Reservoir

    Directory of Open Access Journals (Sweden)

    Wen-xian Guo

    2011-06-01

    Full Text Available The traditional operation of the Three Gorges Reservoir has mainly focused on water for flood control, power generation, navigation, water supply, and recreation, and given less attention to the negative impacts of reservoir operation on the river ecosystem. In order to reduce the negative influence of reservoir operation, ecological operation of the reservoir should be studied with a focus on maintaining a healthy river ecosystem. This study considered ecological operation targets, including maintaining the river environmental flow and protecting the spawning and reproduction of the Chinese sturgeon and four major Chinese carps. Using flow data from 1900 to 2006 at the Yichang gauging station as the control station data for the Yangtze River, the minimal and optimal river environmental flows were analyzed, and eco-hydrological targets for the Chinese sturgeon and four major Chinese carps in the Yangtze River were calculated. This paper proposes a reservoir ecological operation model, which comprehensively considers flood control, power generation, navigation, and the ecological environment. Three typical periods, wet, normal, and dry years, were selected, and the particle swarm optimization algorithm was used to analyze the model. The results show that ecological operation modes have different effects on the economic benefit of the hydropower station, and the reservoir ecological operation model can simulate the flood pulse for the requirements of spawning of the Chinese sturgeon and four major Chinese carps. According to the results, by adopting a suitable re-operation scheme, the hydropower benefit of the reservoir will not decrease dramatically while the ecological demand is met. The results provide a reference for designing reasonable operation schemes for the Three Gorges Reservoir.
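
    The abstract states that the particle swarm optimization algorithm was used to analyze the reservoir operation model, without giving details. The sketch below is a generic PSO paired with a stand-in objective, the squared deviation of twelve monthly releases from a hypothetical ecological flow target; it illustrates the method only, not the paper's actual multi-objective operation model.

```python
import numpy as np

rng = np.random.default_rng(7)

def pso(objective, dim, lo=0.0, hi=1.0, n_particles=30, iters=300,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (minimization) over the box [lo, hi]^dim."""
    x = rng.uniform(lo, hi, (n_particles, dim))       # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()                                  # per-particle best positions
    pbest_f = np.array([objective(p) for p in x])     # per-particle best values
    g = pbest[np.argmin(pbest_f)].copy()              # global best position
    g_f = float(pbest_f.min())
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        if f.min() < g_f:
            g, g_f = x[np.argmin(f)].copy(), float(f.min())
    return g, g_f

# Stand-in objective: squared deviation of 12 monthly releases (normalized
# 0-1) from a hypothetical ecological flow target; purely illustrative.
eco_target = np.linspace(0.3, 0.8, 12)
best, best_f = pso(lambda r: float(np.sum((r - eco_target) ** 2)), dim=12)
```

    In the actual study the objective would also weigh flood control, power generation, and navigation terms rather than a single ecological deviation.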

  9. 76 FR 40354 - Environmental Impacts Statements; Notice of Availability

    Science.gov (United States)

    2011-07-08

    .... 20110216, Final EIS, FHWA, UT, Hyde Park/North Logan Corridor Project, Proposed 200 East Transportation Corridor between North Logan City and Hyde Park, Funding, Right-of-Way Acquisitions and US Army COE Section 404 Permit, Cache County, UT, Review Period Ends: 08/08/2011, Contact: Paul C. Ziman 801-955-3525...

  10. 77 FR 23498 - Notice of Intent To Repatriate Cultural Items: The Colorado College, Colorado Springs, CO

    Science.gov (United States)

    2012-04-19

    ... Taylor Museum and the Colorado Springs Fine Arts Center) and the Denver Museum of Nature & Science... Davis, Chief of Staff, President's Office, Colorado College, Armstrong Hall, Room 201, 14 E. Cache La... objects, as well as other cultural items were removed from Canyon de Chelly, Apache County, AZ, under the...

  11. Functional age as an indicator of reservoir senescence

    Science.gov (United States)

    Miranda, Leandro E.; Krogman, R. M.

    2015-01-01

    It has been conjectured that reservoirs differ in the rate at which they manifest senescence, but no attempt has been made to find an indicator of senescence that performs better than chronological age. We assembled an indicator of functional age by creating a multimetric scale consisting of 10 metrics descriptive of reservoir environments that were expected to change directionally with reservoir senescence. In a sample of 1,022 U.S. reservoirs, chronological age was not correlated with functional age. Functional age was directly related to percentage of cultivated land in the catchment and inversely related to reservoir depth. Moreover, aspects of reservoir fishing quality and fish population characteristics were related to functional age. A multimetric scale to indicate reservoir functional age presents the possibility for management intervention from multiple angles. If a reservoir is functionally aging at an accelerated rate, action may be taken to remedy the conditions contributing most to functional age. Intervention to reduce scores of selected metrics in the scale can potentially reduce the rate of senescence and increase the life expectancy of the reservoir. This leads to the intriguing implication that steps can be taken to reduce functional age and actually make the reservoir grow younger.
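
    The ten metrics and their scoring rules are not listed in the record. A multimetric index of this general kind can be sketched as follows; the metric names, ranges, and directions here are hypothetical, chosen only to echo the catchment-cultivation and depth relationships the abstract reports.

```python
def functional_age(observed, bounds, senescence_increases):
    """Combine metrics into a 0-100 functional-age index (a sketch, not the
    authors' published scale).

    observed: dict metric -> measured value
    bounds: dict metric -> (low, high) plausible range used for 0-1 scaling
    senescence_increases: dict metric -> True if larger values indicate an
        older (more senescent) reservoir, False if they indicate a younger one
    """
    scores = []
    for metric, value in observed.items():
        lo, hi = bounds[metric]
        s = min(max((value - lo) / (hi - lo), 0.0), 1.0)  # rescale to 0-1
        if not senescence_increases[metric]:
            s = 1.0 - s  # flip metrics that decrease with senescence
        scores.append(s)
    return 100.0 * sum(scores) / len(scores)
```

    A scale like this makes the abstract's management implication concrete: lowering any one metric's score lowers the overall index.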

  12. RECENT ADVANCES IN NATURALLY FRACTURED RESERVOIR MODELING

    OpenAIRE

    ORDOÑEZ, A; PEÑUELA, G; IDROBO, E. A; MEDINA, C. E

    2001-01-01

    Large amounts of oil reserves are contained in naturally fractured reservoirs. Most of these hydrocarbon volumes have been left behind because of the poor knowledge and/or description methodology of those reservoirs. This lack of knowledge has led to the nonexistence of good quantitative models for this complicated type of reservoir. The complexity of naturally fractured reservoirs causes the need for integration of all existing information at all scales (drilling, well logging, seismic, we...

  13. Allegheny County Air Quality

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Air quality data from Allegheny County Health Department monitors throughout the county. Air quality monitored data must be verified by qualified individuals before...

  14. Allegheny County Municipal Boundaries

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset demarcates the municipal boundaries in Allegheny County. Data was created to portray the boundaries of the 130 Municipalities in Allegheny County the...

  15. Geothermal reservoir engineering

    CERN Document Server

    Grant, Malcolm Alister

    2011-01-01

    As nations struggle to diversify and secure their power portfolios, geothermal energy, the essentially limitless heat emanating from the earth itself, is being harnessed at an unprecedented rate. For the last 25 years, engineers around the world tasked with taming this raw power have used Geothermal Reservoir Engineering as both a training manual and a professional reference. This long-awaited second edition of Geothermal Reservoir Engineering is a practical guide to the issues and tasks geothermal engineers encounter in the course of their daily jobs. The bo

  16. Upper Hiwassee River Basin reservoirs 1989 water quality assessment

    International Nuclear Information System (INIS)

    Fehring, J.P.

    1991-08-01

    The water in the Upper Hiwassee River Basin is slightly acidic and low in conductivity. The four major reservoirs in the Upper Hiwassee River Basin (Apalachia, Hiwassee, Chatuge, and Nottely) are not threatened by acidity, although Nottely Reservoir has more sulfates than the other reservoirs. Nottely also has the highest organic and nutrient concentrations of the four reservoirs. This results in Nottely having the poorest water clarity and the most algal productivity, although clarity as measured by color and secchi depths does not indicate any problem with most water use. However, chlorophyll concentrations indicate taste and odor problems would be likely if the upstream end of Nottely Reservoir were used for domestic water supply. Hiwassee Reservoir is clearer and has less organic and nutrient loading than either of the two upstream reservoirs. All four reservoirs have sufficient algal activity to produce supersaturated dissolved oxygen conditions and relatively high pH values at the surface. All four reservoirs are thermally stratified during the summer, and all but Apalachia have bottom waters depleted in oxygen. The very short residence time of Apalachia Reservoir, less than ten days as compared to over 100 days for the other three reservoirs, results in it being more riverine than the other three reservoirs. Hiwassee Reservoir actually develops three distinct water temperature strata due to the location of the turbine intake. The water quality of all of the reservoirs supports designated uses, but water quality complaints are being received regarding both Chatuge and Nottely Reservoirs and their tailwaters

  17. Allegheny County Council Districts

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset portrays the boundaries of the County Council Districts in Allegheny County. The dataset is based on municipal boundaries and City of Pittsburgh ward...

  18. Reservoir sedimentation; a literature survey

    NARCIS (Netherlands)

    Sloff, C.J.

    1991-01-01

    A survey of literature is made on reservoir sedimentation, one of the most threatening processes for world-wide reservoir performance. The sedimentation processes, their impacts, and their controlling factors are assessed from a hydraulic engineering point of view with special emphasis on

  19. Spatially pooled depth-dependent reservoir storage, elevation, and water-quality data for selected reservoirs in Texas, January 1965-January 2010

    Science.gov (United States)

    Burley, Thomas E.; Asquith, William H.; Brooks, Donald L.

    2011-01-01

    The U.S. Geological Survey (USGS), in cooperation with Texas Tech University, constructed a dataset of selected reservoir storage (daily and instantaneous values), reservoir elevation (daily and instantaneous values), and water-quality data from 59 reservoirs throughout Texas. The period of record for the data is as large as January 1965-January 2010. Data were acquired from existing databases, spreadsheets, delimited text files, and hard-copy reports. The goal was to obtain as much data as possible; therefore, no data acquisition restrictions specifying a particular time window were used. Primary data sources include the USGS National Water Information System, the Texas Commission on Environmental Quality Surface Water-Quality Management Information System, and the Texas Water Development Board monthly Texas Water Condition Reports. Additional water-quality data for six reservoirs were obtained from USGS Texas Annual Water Data Reports. Data were combined from the multiple sources to create as complete a set of properties and constituents as the disparate databases allowed. By devising a unique per-reservoir short name to represent all sites on a reservoir regardless of their source, all sampling sites at a reservoir were spatially pooled by reservoir and temporally combined by date. Reservoir selection was based on various criteria including the availability of water-quality properties and constituents that might affect the trophic status of the reservoir and could also be important for understanding possible effects of climate change in the future. Other considerations in the selection of reservoirs included the general reservoir-specific period of record, the availability of concurrent reservoir storage or elevation data to match with water-quality data, and the availability of sample depth measurements. Additional separate selection criteria included historic information pertaining to blooms of golden algae. Physical properties and constituents were water
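
    The pooling scheme described, a unique per-reservoir short name plus sampling date as the combining key, can be sketched as follows; the field names and source lists are illustrative, not those of the actual USGS dataset.

```python
from collections import defaultdict

def pool_records(*sources):
    """Pool records from several databases by (reservoir short name, date).

    Each record is a dict with 'reservoir' and 'date' keys plus measured
    properties; when the same field appears in several sources for the same
    key, the first source listed wins.
    """
    pooled = defaultdict(dict)
    for source in sources:
        for rec in source:
            key = (rec["reservoir"], rec["date"])
            for field, value in rec.items():
                if field not in ("reservoir", "date") and value is not None:
                    pooled[key].setdefault(field, value)
    return dict(pooled)

# Hypothetical snippets standing in for two of the disparate databases:
nwis = [{"reservoir": "LBJ", "date": "1999-07-01", "storage_af": 12000}]
swqmis = [{"reservoir": "LBJ", "date": "1999-07-01", "chlorophyll_ugL": 8.2},
          {"reservoir": "LBJ", "date": "1999-08-01", "chlorophyll_ugL": 9.0}]
pooled = pool_records(nwis, swqmis)
```

    Spatially pooling by the short name (rather than by individual site ID) is what lets storage, elevation, and water-quality samples from different agencies line up on the same date.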

  20. Integrating gravimetric and interferometric synthetic aperture radar data for enhancing reservoir history matching of carbonate gas and volatile oil reservoirs

    KAUST Repository

    Katterbauer, Klemens; Arango, Santiago; Sun, Shuyu; Hoteit, Ibrahim

    2016-01-01

    Reservoir history matching is assuming a critical role in understanding reservoir characteristics, tracking water fronts, and forecasting production. While production data have been incorporated for matching reservoir production levels

  1. Data Compression of Hydrocarbon Reservoir Simulation Grids

    KAUST Repository

    Chavez, Gustavo Ivan

    2015-05-28

    A dense volumetric grid coming from an oil/gas reservoir simulation output is translated into a compact representation that supports desired features such as interactive visualization, geometric continuity, color mapping and quad representation. A set of four control curves per layer results from processing the grid data, and a complete set of these 3-dimensional surfaces represents the complete volume data and can map reservoir properties of interest to analysts. The processing results yield a representation of reservoir simulation results which has reduced data storage requirements and permits quick interaction between reservoir analysts and the simulation data. The degree of reservoir grid compression can be selected according to the quality required, by adjusting different thresholds, such as approximation error and level of detail. The processing results are of potential benefit in applications such as interactive rendering, data compression, and in-situ visualization of large-scale oil/gas reservoir simulations.
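
    The record does not state which curve-approximation algorithm is used. As one way to illustrate the described trade-off between an approximation-error threshold and the degree of compression, the sketch below applies Ramer-Douglas-Peucker polyline simplification, which keeps only control points that deviate from a chord by more than a tolerance; it is an assumption, not the thesis' actual method.

```python
import numpy as np

def rdp(points, eps):
    """Ramer-Douglas-Peucker simplification of a 2-D polyline.

    Points farther than `eps` from the start-end chord are kept, so `eps`
    plays the role of the approximation-error threshold: larger values
    mean fewer retained points, i.e. stronger compression.
    """
    pts = np.asarray(points, dtype=float)
    if len(pts) < 3:
        return pts.tolist()
    start, end = pts[0], pts[-1]
    dx, dy = end - start
    norm = np.hypot(dx, dy)
    if norm == 0.0:
        d = np.linalg.norm(pts - start, axis=1)
    else:
        # perpendicular distance of each point to the start-end chord
        d = np.abs(dx * (start[1] - pts[:, 1]) - dy * (start[0] - pts[:, 0])) / norm
    i = int(np.argmax(d))
    if d[i] > eps:
        left = rdp(pts[: i + 1], eps)
        right = rdp(pts[i:], eps)
        return left[:-1] + right
    return [pts[0].tolist(), pts[-1].tolist()]
```

    Running this per control curve with a user-chosen `eps` reproduces the quality-versus-storage dial the abstract describes.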

  2. The Alphabet Soup of HIV Reservoir Markers.

    Science.gov (United States)

    Sharaf, Radwa R; Li, Jonathan Z

    2017-04-01

    Despite the success of antiretroviral therapy in suppressing HIV, life-long therapy is required to avoid HIV reactivation from long-lived viral reservoirs. Currently, there is intense interest in searching for therapeutic interventions that can purge the viral reservoir to achieve complete remission in HIV patients off antiretroviral therapy. The evaluation of such interventions relies on our ability to accurately and precisely measure the true size of the viral reservoir. In this review, we assess the most commonly used HIV reservoir assays, as a clear understanding of the strengths and weaknesses of each is vital for the accurate interpretation of results and for the development of improved assays. The quantification of intracellular or plasma HIV RNA or DNA levels remains the most commonly used tests for the characterization of the viral reservoir. While cost-effective and high-throughput, these assays are not able to differentiate between replication-competent or defective fractions or quantify the number of infected cells. Viral outgrowth assays provide a lower bound for the fraction of cells that can produce infectious virus, but these assays are laborious, expensive and substantially underestimate the potential reservoir of replication-competent provirus. Newer assays are now available that seek to overcome some of these problems, including full-length proviral sequencing, inducible HIV RNA assays, ultrasensitive p24 assays and murine adoptive transfer techniques. The development and evaluation of strategies for HIV remission rely upon our ability to accurately and precisely quantify the size of the remaining viral reservoir. At this time, all current HIV reservoir assays have drawbacks such that combinations of assays are generally needed to gain a more comprehensive view of the viral reservoir. The development of novel, rapid, high-throughput assays that can sensitively quantify the levels of the replication-competent HIV reservoir is still needed.

  3. Maqalika Reservoir: utilisation and sustainability of Maqalika Reservoir as a source of potable water supply for Maseru in Lesotho

    CSIR Research Space (South Africa)

    Letsie, M

    2008-07-01

    Full Text Available The storage of water in the Maqalika reservoir is gradually decreasing as sediment, carried by the natural catchment run-off, accumulates in the reservoir. Moreover, water pumped into the reservoir from the Caledon River (which is heavily sedimented...

  4. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    Science.gov (United States)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper working. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose to use an analytical model that is able to connect non-extensive entropy formalism, long-range dependencies, management of system resources, and queuing theory. The analytical results obtained are related to a practical experiment showing interesting and valuable results.
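
    The abstract does not reproduce the modified stretched-exponential model itself. As a hedged illustration only, the sketch below fits a plain stretched exponential S(t) = exp(-(t/tau)^beta), a common model for heavy-tailed response-time survival curves, by linearizing ln(-ln S) = beta*ln t - beta*ln tau and solving a least-squares problem; the synthetic data stand in for measured cache access times.

```python
import numpy as np

def fit_stretched_exponential(t, s):
    """Fit S(t) = exp(-(t/tau)**beta) to survival data by linearization.

    Taking ln(-ln S) turns the model into a straight line in ln t with
    slope beta and intercept -beta*ln(tau), solved by least squares.
    Requires 0 < S < 1 at every sample.
    """
    y = np.log(-np.log(s))
    X = np.column_stack([np.log(t), np.ones_like(t)])
    (beta, intercept), *_ = np.linalg.lstsq(X, y, rcond=None)
    tau = np.exp(-intercept / beta)
    return float(beta), float(tau)
```

    A fitted beta well below 1 is the usual signature of the long-range-dependent behaviour the paper associates with cache-limited systems.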

  5. An ecological response model for the Cache la Poudre River through Fort Collins

    Science.gov (United States)

    Shanahan, Jennifer; Baker, Daniel; Bledsoe, Brian P.; Poff, LeRoy; Merritt, David M.; Bestgen, Kevin R.; Auble, Gregor T.; Kondratieff, Boris C.; Stokes, John; Lorie, Mark; Sanderson, John

    2014-01-01

    The Poudre River Ecological Response Model (ERM) is a collaborative effort initiated by the City of Fort Collins and a team of nine river scientists to provide the City with a tool to improve its understanding of the past, present, and likely future conditions of the Cache la Poudre River ecosystem. The overall ecosystem condition is described through the measurement of key ecological indicators such as shape and character of the stream channel and banks, streamside plant communities and floodplain wetlands, aquatic vegetation and insects, and fishes, both coolwater trout and warmwater native species. The 13-mile-long study area of the Poudre River flows through Fort Collins, Colorado, and is located in an ecological transition zone between the upstream, cold-water, steep-gradient system in the Front Range of the Southern Rocky Mountains and the downstream, warm-water, low-gradient reach in the Colorado high plains.

  6. Application of Reservoir Characterization and Advanced Technology to Improve Recovery and Economics in a Lower Quality Shallow Shelf Carbonate Reservoir

    International Nuclear Information System (INIS)

    Taylor, Archie R.

    1996-01-01

    The Class 2 Project at West Welch was designed to demonstrate the use of advanced technologies to enhance the economics of improved oil recovery (IOR) projects in lower quality Shallow Shelf Carbonate (SSC) reservoirs, resulting in recovery of additional oil that would otherwise be left in the reservoir at project abandonment. Accurate reservoir description is critical to the effective evaluation and efficient design of IOR projects in the heterogeneous SSC reservoirs. Therefore, the majority of Budget Period 1 was devoted to reservoir characterization. Technologies being demonstrated include: (1) Advanced petrophysics; (2) Three-dimensional (3-D) seismic; (3) Cross-wellbore tomography; (4) Advanced reservoir simulation; (5) Carbon dioxide (CO{sub 2}) stimulation treatments; (6) Hydraulic fracturing design and monitoring; and (7) Mobility control agents

  7. Remedial investigation/feasibility study report for Lower Watts Bar Reservoir Operable Unit

    International Nuclear Information System (INIS)

    1995-03-01

    This document is the combined Remedial Investigation and Feasibility Study Report for the lower Watts Bar Reservoir (LWBR) Operable Unit (OU). The LWBR is located in Roane, Rhea, and Meigs counties, Tennessee, and consists of Watts Bar Reservoir downstream of the Clinch river. This area has received hazardous substances released over a period of 50 years from the US Department of Energy's Oak Ridge Reservation (ORR), a National Priority List site established under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). As required by this law, the ORR and all off-site areas that have received contaminants, including LWBR, must be investigated to determine the risk to human health and the environment resulting from these releases, the need for any remedial action to reduce these risks, and the remedial actions that are most feasible for implementation in this OU. Contaminants from the ORR are primarily transported to the LWBR via the Clinch River. There is little data regarding the quantities of most contaminants potentially released from the ORR to the Clinch River, particularly for the early years of ORR operations. Estimates of the quantities released during this period are available for most radionuclides and some inorganic contaminants, indicating that releases 30 to 50 years ago were much higher than today. Since the early 1970s, the release of potential contaminants has been monitored for compliance with environmental law and reported in the annual environmental monitoring reports for the ORR

  8. Remedial investigation/feasibility study report for Lower Watts Bar Reservoir Operable Unit

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This document is the combined Remedial Investigation and Feasibility Study Report for the lower Watts Bar Reservoir (LWBR) Operable Unit (OU). The LWBR is located in Roane, Rhea, and Meigs counties, Tennessee, and consists of Watts Bar Reservoir downstream of the Clinch river. This area has received hazardous substances released over a period of 50 years from the US Department of Energy's Oak Ridge Reservation (ORR), a National Priority List site established under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). As required by this law, the ORR and all off-site areas that have received contaminants, including LWBR, must be investigated to determine the risk to human health and the environment resulting from these releases, the need for any remedial action to reduce these risks, and the remedial actions that are most feasible for implementation in this OU. Contaminants from the ORR are primarily transported to the LWBR via the Clinch River. There is little data regarding the quantities of most contaminants potentially released from the ORR to the Clinch River, particularly for the early years of ORR operations. Estimates of the quantities released during this period are available for most radionuclides and some inorganic contaminants, indicating that releases 30 to 50 years ago were much higher than today. Since the early 1970s, the release of potential contaminants has been monitored for compliance with environmental law and reported in the annual environmental monitoring reports for the ORR.

  9. Naturally fractured reservoirs-yet an unsolved mystery

    International Nuclear Information System (INIS)

    Zahoor, M.K.

    2013-01-01

    Some of the world's most profitable reservoirs are believed to be naturally fractured reservoirs (NFRs). Effective evaluation, prediction, and planning of these reservoirs require early recognition of the role of natural fractures, followed by a comprehensive study of the factors affecting flow performance through those fractures. Because NFRs combine matrix and fracture media, their analysis differs from that of non-fractured reservoirs: the matrix acts as the storage medium, while most fluid flow takes place through the fracture network. Many authors have adopted different approaches to understand flow behavior in such reservoirs. This paper gives a broad review of previous work on naturally fractured reservoirs and proposes a new direction for NFR simulation studies. The role of capillary pressure in natural fractures has always been a key factor in accurate recovery estimation, and recovery in these reservoirs also depends on the grid-block shape used in NFR simulation. Some authors have studied these factors in combination with other rock properties to understand flow behavior in such reservoirs, but little emphasis has been placed on the effect on recovery estimates of varying only the fracture capillary pressure and grid-block shape. There is therefore a need to analyze NFR behavior under these conditions. (author)

  10. Geophysical monitoring in a hydrocarbon reservoir

    Science.gov (United States)

    Caffagni, Enrico; Bokelmann, Goetz

    2016-04-01

    Extraction of hydrocarbons from reservoirs demands ever-increasing technological effort, and geophysical monitoring is needed to better understand the phenomena occurring within the reservoir. Significant deformation processes occur when man-made stimulation is performed, in combination with effects of the existing natural conditions such as the in situ stress regime or pre-existing fracturing. Keeping track of such changes in the reservoir is important, on one hand for improving recovery of hydrocarbons, and on the other hand for assuring a safe and proper mode of operation. Monitoring becomes particularly important when hydraulic fracturing (HF) is used, especially in the form of the much-discussed "fracking". HF is a sophisticated technique that is widely applied in low-porosity geological formations to enhance the production of natural hydrocarbons. Similar HF techniques have long been applied in Europe in conventional reservoirs, and they will probably be intensified in the near future; this suggests an increasing demand for technological development, including updating and adapting the existing monitoring techniques in applied geophysics. We review currently available geophysical techniques for reservoir monitoring across the different fields of reservoir analysis. First, the properties of the hydrocarbon reservoir are identified; here we consider geophysical monitoring exclusively. The second step is to define the quantities, associated with those properties, that can be monitored. We then describe the geophysical monitoring techniques, from the oldest ones, in practical use for 40-50 years, to the most recent technological developments, in distinct groups according to their field of application in reservoir analysis. This work is performed as part of the FracRisk consortium (www.fracrisk.eu); this project, funded by the Horizon2020 research programme, aims at helping minimize the

  11. Non-Markovian reservoir-dependent squeezing

    International Nuclear Information System (INIS)

    Paavola, J

    2010-01-01

    The squeezing dynamics of a damped harmonic oscillator are studied for different types of environment without making the Markovian approximation. The squeezing dynamics of a coherent state depend on the reservoir spectrum in a unique way that can, in the weak coupling approximation, be analysed analytically. Comparison of squeezing dynamics for ohmic, sub-ohmic and super-ohmic environments is done, showing a clear connection between the squeezing-non-squeezing oscillations and reservoir structure. Understanding the effects occurring due to structured reservoirs is important both from a purely theoretical point of view and in connection with evolving experimental techniques and future quantum computing applications.

  12. Understanding the True Stimulated Reservoir Volume in Shale Reservoirs

    KAUST Repository

    Hussain, Maaruf; Saad, Bilal; Negara, Ardiansyah; Sun, Shuyu

    2017-01-01

    Successful exploitation of shale reservoirs largely depends on the effectiveness of hydraulic fracturing stimulation program. Favorable results have been attributed to intersection and reactivation of pre-existing fractures by hydraulically

  13. Effects of water-supply reservoirs on streamflow in Massachusetts

    Science.gov (United States)

    Levin, Sara B.

    2016-10-06

    State and local water-resource managers need modeling tools to help them manage and protect water-supply resources for both human consumption and ecological needs. The U.S. Geological Survey, in cooperation with the Massachusetts Department of Environmental Protection, has developed a decision-support tool to estimate the effects of reservoirs on natural streamflow. The Massachusetts Reservoir Simulation Tool is a model that simulates the daily water balance of a reservoir. The reservoir simulation tool provides estimates of daily outflows from reservoirs and compares the frequency, duration, and magnitude of the volume of outflows from reservoirs with estimates of the unaltered streamflow that would occur if no dam were present. This tool will help environmental managers understand the complex interactions and tradeoffs between water withdrawals, reservoir operational practices, and reservoir outflows needed for aquatic habitats. A sensitivity analysis of the daily water balance equation was performed to identify physical and operational features of reservoirs that could have the greatest effect on reservoir outflows. For the purpose of this report, uncontrolled releases of water (spills or spillage) over the reservoir spillway were considered to be a proxy for reservoir outflows directly below the dam. The ratio of average withdrawals to average inflows had the largest effect on spillage patterns, with the highest withdrawals leading to the lowest spillage. The size of the surface area relative to the drainage area of the reservoir also had an effect on spillage; reservoirs with large surface areas have high evaporation rates during the summer, which can contribute to frequent and long periods without spillage, even in the absence of water withdrawals. Other reservoir characteristics, such as variability of inflows, groundwater interactions, and seasonal demand patterns, had low to moderate effects on the frequency, duration, and magnitude of spillage. The
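The daily water balance simulated by such a tool can be sketched as a simple storage-routing loop. This is an illustrative sketch with hypothetical inflow, withdrawal, and evaporation series; it is not the Massachusetts Reservoir Simulation Tool itself:

```python
def simulate_reservoir(inflows, withdrawals, evap, capacity, storage0):
    """Daily reservoir water balance. Spillage (uncontrolled release over
    the spillway) occurs whenever storage would exceed usable capacity.
    All volumes are in consistent units."""
    storage, spills = storage0, []
    for q_in, q_out, e in zip(inflows, withdrawals, evap):
        storage = max(storage + q_in - q_out - e, 0.0)  # no negative storage
        spill = max(storage - capacity, 0.0)            # excess leaves over the spillway
        storage -= spill
        spills.append(spill)
    return spills

# Example: withdrawals close to inflows suppress spillage even near full pool.
spills = simulate_reservoir(
    inflows=[10, 12, 8, 15], withdrawals=[9, 9, 9, 9],
    evap=[1, 1, 1, 1], capacity=100, storage0=99)
```

The loop makes the report's headline result easy to see: raising the withdrawal series toward the inflow series drives the spill values toward zero.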

  14. Tenth workshop on geothermal reservoir engineering: proceedings

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-22

    The workshop contains presentations in the following areas: (1) reservoir engineering research; (2) field development; (3) vapor-dominated systems; (4) the Geysers thermal area; (5) well test analysis; (6) production engineering; (7) reservoir evaluation; (8) geochemistry and injection; (9) numerical simulation; and (10) reservoir physics. (ACR)

  15. Limno-reservoirs as a new landscape, environmental and touristic resource: Pareja Limno-reservoir as a case of study (Guadalajara, Spain)

    Science.gov (United States)

    Díaz-Carrión, I.; Sastre-Merlín, A.; Martínez-Pérez, S.; Molina-Navarro, E.; Bienes-Allas, R.

    2012-04-01

    A limno-reservoir is a hydraulic infrastructure whose main goal is to create a body of water with a constant level in the riverine zone of a reservoir, by building a dam that makes the limno-reservoir independent of the main body of water. This dam can be built on the main river supplying the reservoir or on any tributary flowing into it. Despite their novel conception and design, around a dozen are already operating in Spanish reservoirs. The infrastructure makes the new water body independent of the main reservoir's management, so water-level stability is its main distinctive characteristic. This enables environmental, sports, and cultural initiatives, which may be included in a touristic exploitation in a broad sense. An opinion poll was designed in 2009 and carried out at the Pareja Limno-reservoir (Entrepeñas reservoir area, Tajo River Basin, central Spain). The results showed that, for both Pareja inhabitants and occasional visitors, the limno-reservoir has become an important touristic resource, in demand mainly during the summer season. Leisure activities (especially swimming) have become the main brand of this novel hydraulic and environmental infrastructure, which plays a role as a corrective and/or compensatory action needed to mitigate the environmental impacts of large hydraulic constructions.

  16. Contrasting patterns of survival and dispersal in multiple habitats reveal an ecological trap in a food-caching bird.

    Science.gov (United States)

    Norris, D Ryan; Flockhart, D T Tyler; Strickland, Dan

    2013-11-01

    A comprehensive understanding of how natural and anthropogenic variation in habitat influences populations requires long-term information on how such variation affects survival and dispersal throughout the annual cycle. Gray jays Perisoreus canadensis are widespread boreal resident passerines that use cached food to survive over the winter and to begin breeding during the late winter. Using multistate capture-recapture analysis, we examined apparent survival and dispersal in relation to habitat quality in a gray jay population over 34 years (1977-2010). Prior evidence suggests that natural variation in habitat quality is driven by the proportion of conifers on territories because of their superior ability to preserve cached food. Although neither adults (>1 year) nor juveniles (<1 year) had higher apparent survival on high-conifer territories, both age classes were less likely to leave high-conifer territories and, when they did move, were more likely to disperse to high-conifer territories. In contrast, survival rates were lower on territories adjacent to a major highway than on territories that did not border the highway, but there was no evidence for directional dispersal towards or away from highway territories. Our results support the notion that natural variation in habitat quality is driven by the proportion of coniferous trees on territories and provide the first evidence that high-mortality highway habitats can act as an equal-preference ecological trap for birds. Reproductive success, as shown in a previous study, but not survival, is sensitive to natural variation in habitat quality, suggesting that gray jays, despite living in harsh winter conditions, likely favor allocating limited resources towards self-maintenance over reproduction.

  17. Integrating gravimetric and interferometric synthetic aperture radar data for enhancing reservoir history matching of carbonate gas and volatile oil reservoirs

    KAUST Repository

    Katterbauer, Klemens

    2016-08-25

    Reservoir history matching plays a critical role in understanding reservoir characteristics, tracking water fronts, and forecasting production. While production data have been incorporated for matching reservoir production levels and estimating critical reservoir parameters, the sparse spatial nature of this dataset limits the efficiency of the history-matching process. Recently, gravimetry techniques have advanced significantly, to the point of providing measurement accuracy in the microgal range, and consequently can be used to track gas displacement caused by water influx. While gravity measurements provide information on subsurface density changes, i.e., the composition of the reservoir, these data yield only marginal information about temporal displacements of oil and inflowing water. We propose to complement gravimetric data with interferometric synthetic aperture radar surface-deformation data to exploit the strong pressure-deformation relationship and enhance fluid-flow direction forecasts. We have developed an ensemble-Kalman-filter-based history-matching framework for gas, gas condensate, and volatile oil reservoirs, which synergizes time-lapse gravity and interferometric synthetic aperture radar data for improved reservoir management and reservoir forecasts. Based on a dual state-parameter estimation algorithm separating the estimation of static reservoir parameters from the dynamic reservoir parameters, our numerical experiments demonstrate that history matching gravity measurements allows monitoring of the density changes caused by oil-gas phase transition and water influx to determine saturation levels, whereas the interferometric synthetic aperture radar measurements help to improve forecasts of hydrocarbon production and water-displacement directions. The reservoir estimates resulting from the dual filtering scheme are on average 20%-40% better than those from the joint estimation scheme, but require about a 30% increase in
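The ensemble Kalman filter update underlying such a history-matching framework can be sketched generically. This is a textbook stochastic-EnKF analysis step on toy numbers, not the authors' dual state-parameter implementation; the linear observation operator and toy dimensions are assumptions for illustration:

```python
import numpy as np

def enkf_update(X, d, H, R):
    """Stochastic ensemble Kalman filter analysis step.
    X: (n_state, n_ens) forecast ensemble of states/parameters
    d: (n_obs,) observations (e.g. gravity- or InSAR-derived data)
    H: (n_obs, n_state) linear observation operator
    R: (n_obs, n_obs) observation-error covariance"""
    n_ens = X.shape[1]
    rng = np.random.default_rng(0)
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    C = A @ A.T / (n_ens - 1)                    # sample state covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R) # Kalman gain
    # Perturb observations so the analysis ensemble has correct spread.
    D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, n_ens).T
    return X + K @ (D - H @ X)                   # analysis ensemble

# Toy example: 50-member ensemble of a 2-component state, one accurate
# observation of the first component pulls that component toward 2.0.
X = np.random.default_rng(1).normal(0.0, 1.0, size=(2, 50))
H = np.array([[1.0, 0.0]])
Xa = enkf_update(X, d=np.array([2.0]), H=H, R=np.array([[0.01]]))
```

In a dual scheme as described in the abstract, an update like this would be applied separately to the static parameters and the dynamic states rather than to one joint vector.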

  18. Design Techniques and Reservoir Simulation

    Directory of Open Access Journals (Sweden)

    Ahad Fereidooni

    2012-11-01

    Full Text Available Enhanced oil recovery using nitrogen injection is a commonly applied method for pressure maintenance in conventional reservoirs. Numerical simulation can be used to predict reservoir performance during the injection process; however, a detailed simulation may take enormous computer processing time. In such cases, a simple statistical model may be a good approach to preliminary prediction of the process without any numerical simulation. In the current work, seven rock/fluid reservoir properties are considered as screening parameters, and those with the most considerable effect on the process are determined using a combination of experimental design techniques and reservoir simulations. The statistical significance of the main effects and interactions of the screening parameters is then analyzed using statistical inference approaches. Finally, the influential parameters are employed to create a simple statistical model that allows preliminary prediction of nitrogen injection, in terms of recovery factor, without resorting to numerical simulations.

  19. Stretch due to Penile Prosthesis Reservoir Migration

    Directory of Open Access Journals (Sweden)

    E. Baten

    2016-03-01

    Full Text Available A 43-year-old patient presented to the emergency department with stretch because the penile prosthesis could not be deflated, 4 years after a successful implant. A CT scan showed migration of the reservoir to the left rectus abdominis muscle. Refilling of the reservoir was inhibited by muscular compression, causing stretch. The reservoir was removed and replaced, after which the prosthesis functioned well again. Migration of the penile prosthesis reservoir is extremely rare but can cause several complications, such as stretch.

  20. Physical Model-Based Investigation of Reservoir Sedimentation Processes

    Directory of Open Access Journals (Sweden)

    Cheng-Chia Huang

    2018-03-01

    Full Text Available Sedimentation is a serious problem in the operation of reservoirs. In Taiwan, the situation became worse after the Chi-Chi Earthquake of 21 September 1999. The sediment trap efficiency in several regional reservoirs has increased sharply, adversely affecting water-supply operations. According to field records, the average annual sediment deposition observed in several regional reservoirs in Taiwan has increased. For instance, a typhoon event recorded in 2008 at the Wushe Reservoir, Taiwan, produced a 3 m sediment deposit upstream of the dam, and by 2010 the remaining storage capacity of the Wushe Reservoir had been reduced to 35.9%, or a volume of 53.79 million m3, for flood water detention. Research is urgently needed to understand sediment movement in the Wushe Reservoir. In this study, a scaled physical model was built to reproduce flood flow through the reservoir, investigate the long-term depositional pattern, and evaluate sediment trap efficiency. This allows us to estimate the residual life of the reservoir through a proposed modification of Brune's method, which, given its applicability to both the physical model and the observed data, can be used to predict the lifespan of Taiwan's reservoirs.
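The paper's own modification of Brune's method is not given in the abstract; as background, the classical method relates trap efficiency to the reservoir's capacity-inflow ratio. A minimal sketch using a common curve-fit of Brune's median curve (the fit form and coefficients, and the inflow figure below, are assumptions for illustration):

```python
def brune_trap_efficiency(capacity_m3, annual_inflow_m3):
    """Approximate trap efficiency (fraction of inflowing sediment
    retained) from Brune's median curve, using a Gill-style rational
    fit. Reasonable for capacity/inflow ratios of roughly 0.001-1."""
    ci = capacity_m3 / annual_inflow_m3
    return ci / (0.012 + 1.02 * ci)

# Wushe-like scenario with illustrative numbers only: remaining capacity
# of ~54 million m3 against an assumed 500 million m3 annual inflow.
te = brune_trap_efficiency(53.79e6, 500e6)  # ~0.88: most sediment is trapped
```

The curve shows why shrinking capacity is self-reinforcing only slowly: even at a tenth of the capacity-inflow ratio, a reservoir still traps most of its incoming sediment.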

  1. Reservoir Operating Rule Optimization for California's Sacramento Valley

    Directory of Open Access Journals (Sweden)

    Timothy Nelson

    2016-03-01

    Full Text Available doi: http://dx.doi.org/10.15447/sfews.2016v14iss1art6 Reservoir operating rules for water resource systems are typically developed by combining intuition, professional discussion, and simulation modeling. This paper describes a joint optimization–simulation approach to develop preliminary economically based operating rules for major reservoirs in California's Sacramento Valley, based on optimized results from CALVIN, a hydro-economic optimization model. We infer strategic operating rules from the optimization model results, including storage allocation rules to balance storage among multiple reservoirs, and reservoir release rules to determine monthly releases for individual reservoirs. Results show the potential utility of considering the previous year's water-availability type and various system and sub-system storage conditions, in addition to the normal considerations of local reservoir storage, season, and current inflows. We create a simple simulation to further refine and test the derived operating rules. Optimization model results offer particular insights for balancing the allocation of water storage among Shasta, Trinity, and Oroville reservoirs over drawdown and refill seasons, as well as some insights for release rules at major reservoirs in the Sacramento Valley. We also discuss the applicability and limitations of developing reservoir operation rules from optimization model results.

  2. Reflection Phenomena in Underground Pumped Storage Reservoirs

    Directory of Open Access Journals (Sweden)

    Elena Pummer

    2018-04-01

    Full Text Available Energy storage through hydropower leads to free-surface water waves in the connected reservoirs. The reason is the movement of water between reservoirs at different elevations, which is necessary for electrical energy storage. Currently, the expansion of renewable energies requires the development of fast and flexible energy-storage systems, of which classical pumped storage plants are the only technically proven and cost-effective technology, and the most used. Instead of classical pumped storage plants, where the reservoirs are located at the surface, underground pumped storage plants with subsurface reservoirs could be an alternative. They are independent of topography and require little surface area, which can be a great advantage for expanding energy storage where environmental issues, residents' concerns, or an unusable terrain surface arise. However, the reservoirs of underground pumped storage plants differ in design from classical ones for stability and space reasons, and their hydraulic design is essential to ensure satisfactory hydraulic performance. The paper presents a hybrid model study, defined here as a combination of physical and numerical modelling that uses the advantages and compensates for the disadvantages of each method. It analyzes waves in ventilated underground reservoir systems with a large length-to-height ratio, considering new operational aspects of energy-supply systems with a large share of renewable energies. The multifaceted and narrow design of the reservoirs leads to complex free-surface flows; for example, undular and breaking bores arise. The results show excessive wave heights caused by wave reflections at the impermeable reservoir boundaries. Knowledge of these reflections is therefore essential for a successful operational and constructive design of the reservoirs.

  3. Class III Mid-Term Project, "Increasing Heavy Oil Reserves in the Wilmington Oil Field Through Advanced Reservoir Characterization and Thermal Production Technologies"

    Energy Technology Data Exchange (ETDEWEB)

    Scott Hara

    2007-03-31

    The overall objective of this project was to increase heavy oil reserves in slope and basin clastic (SBC) reservoirs through the application of advanced reservoir characterization and thermal production technologies. The project involved improving thermal recovery techniques in the Tar Zone of Fault Blocks II-A and V (Tar II-A and Tar V) of the Wilmington Field in Los Angeles County, near Long Beach, California. A primary objective has been to transfer technology that can be applied in other heavy oil formations of the Wilmington Field and other SBC reservoirs, including those under waterflood. The first budget period addressed several producibility problems in the Tar II-A and Tar V thermal recovery operations that are common in SBC reservoirs. A few of the advanced technologies developed include a three-dimensional (3-D) deterministic geologic model, a 3-D deterministic thermal reservoir simulation model to aid in reservoir management and subsequent post-steamflood development work, and a detailed study on the geochemical interactions between the steam and the formation rocks and fluids. State of the art operational work included drilling and performing a pilot steam injection and production project via four new horizontal wells (2 producers and 2 injectors), implementing a hot water alternating steam (WAS) drive pilot in the existing steamflood area to improve thermal efficiency, installing a 2400-foot insulated, subsurface harbor channel crossing to supply steam to an island location, testing a novel alkaline steam completion technique to control well sanding problems, and starting on an advanced reservoir management system through computer-aided access to production and geologic data to integrate reservoir characterization, engineering, monitoring, and evaluation. 
The second budget period phase (BP2) continued to implement state-of-the-art operational work to optimize thermal recovery processes, improve well drilling and completion practices, and evaluate the

  4. 75 FR 25308 - Environmental Impact Statement: Winnebago County, IL and Rock County, WI

    Science.gov (United States)

    2010-05-07

    ... DEPARTMENT OF TRANSPORTATION Federal Highway Administration Environmental Impact Statement: Winnebago County, IL and Rock County, WI AGENCY: Federal Highway Administration (FHWA), DOT. ACTION: Notice... Nye School Road northwest of Beloit, Rock County, Wisconsin to the interchange of Rockton Road and I...

  5. Climate variability and sedimentation of a hydropower reservoir

    International Nuclear Information System (INIS)

    Riedel, M.

    2008-01-01

    As part of the relicensing of a large Hydroelectric Project in the central Appalachians, large-scale watershed and reservoir sedimentation models were developed to forecast potential sedimentation scenarios. The GIS-based watershed model was spatially explicit and calibrated to long-term observed data. Potential socioeconomic development scenarios were used to construct future watershed land-cover scenarios. Climatic variability and potential-change analyses were used to identify future climate regimes and shifts in precipitation and temperature patterns. Permutations of these development and climate changes were forecast over 50 years and used to develop sediment-yield regimes for the project reservoir. Extensive field work and reservoir surveys, including current and wave instrumentation, were used to characterize the project watershed, rivers, and reservoir hydrodynamics. A fully 3-dimensional hydrodynamic reservoir sedimentation model was developed for the project and calibrated to observed data. Hydrologic and sedimentation results from watershed forecasting provided boundary conditions for reservoir inputs. The calibrated reservoir model was then used to forecast changes in reservoir sedimentation and storage capacity under different future climate scenarios. Results indicated unique zones of advancing sediment deltas and temporary storage areas. Forecast changes in reservoir bathymetry and sedimentation patterns were also developed for the various climate-change scenarios. The warmer and wetter scenario produced sedimentation impacts similar to extensive development under no climate change. The results of these analyses are being used to develop collaborative watershed and soil conservation partnerships to reduce future soil losses and reservoir sedimentation from projected development. (author)

  6. ROE County Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — This polygon dataset shows the outlines of states, counties, and county equivalents (Louisiana parishes, Alaska boroughs, Puerto Rico municipalities, and U.S. Virgin...

  7. Application of Reservoir Characterization and Advanced Technology to Improve Recovery and Economics in a Lower Quality Shallow Shelf Carbonate Reservoir

    International Nuclear Information System (INIS)

    Hickman, Scott T.; Justice James L.; Taylor, Archie R.

    1999-01-01

    The Class 2 Project at West Welch was designed to demonstrate the use of advanced technologies to enhance the economics of improved oil recovery (IOR) projects in lower quality Shallow Shelf Carbonate (SSC) reservoirs, resulting in recovery of additional oil that would otherwise be left in the reservoir at project abandonment. Accurate reservoir description is critical to the effective evaluation and efficient design of IOR projects in the heterogeneous SSC reservoirs

  8. 33 CFR 110.77 - Amistad Reservoir, Tex.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Amistad Reservoir, Tex. 110.77... ANCHORAGE REGULATIONS Special Anchorage Areas § 110.77 Amistad Reservoir, Tex. (a) Diablo East, Tex. That portion of the Amistad Reservoir enclosed by a line connecting the following points, excluding a 300-foot...

  9. Method of extracting heat from dry geothermal reservoirs

    Science.gov (United States)

    Potter, R.M.; Robinson, E.S.; Smith, M.C.

    1974-01-22

    Hydraulic fracturing is used to interconnect two or more holes that penetrate a previously dry geothermal reservoir, and to produce within the reservoir a sufficiently large heat-transfer surface so that heat can be extracted from the reservoir at a usefully high rate by a fluid entering it through one hole and leaving it through another. Introduction of a fluid into the reservoir to remove heat from it and establishment of natural (unpumped) convective circulation through the reservoir to accomplish continuous heat removal are important and novel features of the method. (auth)
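The "usefully high rate" of heat extraction targeted by the method can be illustrated with a simple energy balance on the circulating fluid. This is a back-of-the-envelope sketch with assumed flow rate and temperatures, not values from the patent:

```python
def heat_extraction_rate_mw(mass_flow_kg_s, t_out_c, t_in_c, cp=4186.0):
    """Thermal power Q = m_dot * cp * (T_out - T_in), returned in MW.
    cp defaults to liquid water, ~4186 J/(kg K)."""
    return mass_flow_kg_s * cp * (t_out_c - t_in_c) / 1e6

# Assumed: water circulating at 50 kg/s enters the fracture at 30 C
# and returns from the hot reservoir at 180 C.
q = heat_extraction_rate_mw(50.0, 180.0, 30.0)  # ~31.4 MW thermal
```

The balance makes clear why the large fracture heat-transfer surface matters: it is what sustains a high outlet temperature, and hence a high Q, as heat is mined from the rock.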

  10. AirCache: A Crowd-Based Solution for Geoanchored Floating Data

    Directory of Open Access Journals (Sweden)

    Armir Bujari

    2016-01-01

    Full Text Available The Internet edge has evolved from a simple consumer of information and data to an eager producer feeding sensed data at a societal scale. The crowdsensing paradigm is a representative example that has the potential to revolutionize the way we acquire and consume data. Indeed, especially in the era of smartphones, the geographical and temporal scope of data is often local. For instance, users' queries are more and more frequently about a nearby object, event, person, location, and so forth. These queries could certainly be processed and answered locally, without the need to contact a remote server through the Internet. In this scenario, the data is supplied (sensed) by the users and, as a consequence, data lifetime is limited by human organizational factors (e.g., mobility). On this basis, data survivability in the Area of Interest (AoI) is crucial and, if not guaranteed, could undermine system deployment. Addressing this scenario, we contribute a novel protocol named AirCache, whose aim is to guarantee data availability in the AoI while at the same time reducing data-access costs at the network edges. We assess our proposal through a simulation analysis showing that our approach effectively fulfills its design objectives.

  11. 75 FR 49016 - County of Greenville, S.C.-Acquisition Exemption-Greenville County Economic Development Corporation

    Science.gov (United States)

    2010-08-12

    ... Greenville, S.C.--Acquisition Exemption--Greenville County Economic Development Corporation The County of... verified notice of exemption under 49 CFR 1150.31 to acquire from Greenville County Economic Development... System Act, 16 U.S.C. 1247(d). See Greenville County Economic Development Corporation--Abandonment and...

  12. Reservoir simulation with MUFITS code: Extension for double porosity reservoirs and flows in horizontal wells

    Science.gov (United States)

    Afanasyev, Andrey

    2017-04-01

    Numerical modelling of multiphase flows in porous medium is necessary in many applications concerning subsurface utilization. An incomplete list of those applications includes oil and gas fields exploration, underground carbon dioxide storage and geothermal energy production. The numerical simulations are conducted using complicated computer programs called reservoir simulators. A robust simulator should include a wide range of modelling options covering various exploration techniques, rock and fluid properties, and geological settings. In this work we present a recent development of new options in MUFITS code [1]. The first option concerns modelling of multiphase flows in double-porosity double-permeability reservoirs. We describe internal representation of reservoir models in MUFITS, which are constructed as a 3D graph of grid blocks, pipe segments, interfaces, etc. In case of double porosity reservoir, two linked nodes of the graph correspond to a grid cell. We simulate the 6th SPE comparative problem [2] and a five-spot geothermal production problem to validate the option. The second option concerns modelling of flows in porous medium coupled with flows in horizontal wells that are represented in the 3D graph as a sequence of pipe segments linked with pipe junctions. The well completions link the pipe segments with reservoir. The hydraulics in the wellbore, i.e. the frictional pressure drop, is calculated in accordance with Haaland's formula. We validate the option against the 7th SPE comparative problem [3]. We acknowledge financial support by the Russian Foundation for Basic Research (project No RFBR-15-31-20585). References [1] Afanasyev, A. MUFITS Reservoir Simulation Software (www.mufits.imec.msu.ru). [2] Firoozabadi A. et al. Sixth SPE Comparative Solution Project: Dual-Porosity Simulators // J. Petrol. Tech. 1990. V.42. N.6. P.710-715. [3] Nghiem L., et al. Seventh SPE Comparative Solution Project: Modelling of Horizontal Wells in Reservoir Simulation
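The Haaland formula cited above for the wellbore hydraulics is explicit and easy to implement. A generic sketch follows (not MUFITS code; the Darcy-Weisbach pressure-drop helper and the example numbers are illustrative additions):

```python
import math

def haaland_friction_factor(reynolds, rel_roughness):
    """Darcy friction factor from Haaland's explicit approximation:
    1/sqrt(f) = -1.8 * log10[(eps/D / 3.7)^1.11 + 6.9/Re]."""
    inv_sqrt_f = -1.8 * math.log10((rel_roughness / 3.7) ** 1.11
                                   + 6.9 / reynolds)
    return inv_sqrt_f ** -2

def pipe_pressure_drop_pa(f, length_m, diam_m, density, velocity):
    """Darcy-Weisbach frictional pressure drop along a pipe segment."""
    return f * (length_m / diam_m) * density * velocity ** 2 / 2.0

# Nearly smooth pipe segment in turbulent flow:
f = haaland_friction_factor(reynolds=1e5, rel_roughness=1e-4)  # ~0.018
dp = pipe_pressure_drop_pa(f, length_m=100.0, diam_m=0.1,
                           density=1000.0, velocity=2.0)       # ~36.5 kPa
```

Because it is explicit, the formula avoids the iteration the implicit Colebrook equation would require at every pipe segment of the wellbore graph.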

  13. Quantification of Libby Reservoir Water Levels Needed to Maintain or Enhance Reservoir Fisheries, 1988-1996 Methods and Data Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Dalbey, Steven Ray

    1998-03-01

    The Libby Reservoir study is part of the Northwest Power Planning Council's resident fish and wildlife program. The program was mandated by the Northwest Planning Act of 1980, and is responsible for mitigating for damages to fish and wildlife caused by hydroelectric development in the Columbia River Basin. The objective of Phase I of the project (1983 through 1987) was to maintain or enhance the Libby Reservoir fishery by quantifying seasonal water levels and developing ecologically sound operational guidelines. The objective of Phase II of the project (1988 through 1996) was to determine the biological effects of reservoir operations combined with biotic changes associated with an aging reservoir. This report summarizes the data collected from Libby Reservoir during 1988 through 1996.

  14. FEASIBILITY STUDY OF SEDIMENT FLUSHING FROM MOSUL RESERVOIR, IRAQ

    Directory of Open Access Journals (Sweden)

    Thair Mahmood Al-Taiee

    2015-02-01

    The feasibility of sediment flushing from Mosul reservoir, located in northern Iraq, was investigated. Several up-to-date criteria and indices for assessing the efficiency of sediment flushing from reservoirs, derived from analyses of data for many flushed reservoirs worldwide, were tested and applied to the present case study (Mosul reservoir). These criteria and indices depend mainly on the hydrological, hydraulic, and topographical properties of the reservoirs, in addition to the reservoir operation plan, and they give a good indication of the efficiency of the sediment flushing process. It was concluded that the main criteria for successful sediment flushing are approximately satisfied for Mosul reservoir, including the Sediment Balance Ratio (SBR), the Long Term Capacity Ratio (LTCR), the reservoir shape factor (W/L), and hydraulic conditions such as the ratios Qf/Qin and Vf/Vin. This indicates that sediment flushing in Mosul reservoir is probably feasible and may be applied in the future to maintain the water storage of the reservoir.
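    For context, the SBR compares the sediment mass that can be flushed annually with the mass deposited annually, and the LTCR is the fraction of original storage capacity sustainable under the flushing regime. A hypothetical Python sketch of this screening (the SBR > 1 and LTCR > 0.5 thresholds follow commonly cited flushing-feasibility criteria, and the numbers are illustrative, not this paper's Mosul data):

```python
def sediment_balance_ratio(mass_flushed_per_year: float,
                           mass_deposited_per_year: float) -> float:
    """SBR > 1 suggests flushing removes sediment at least as fast
    as it accumulates, so storage can be sustained long term."""
    return mass_flushed_per_year / mass_deposited_per_year

def long_term_capacity_ratio(sustainable_capacity: float,
                             original_capacity: float) -> float:
    """LTCR: fraction of original storage capacity that can be
    maintained indefinitely under the flushing regime."""
    return sustainable_capacity / original_capacity

# Illustrative inputs: 12 Mt/yr flushable vs. 9 Mt/yr deposited;
# 6.5 km3 sustainable capacity out of 11.1 km3 original capacity.
sbr = sediment_balance_ratio(12.0, 9.0)
ltcr = long_term_capacity_ratio(6.5, 11.1)
flushing_feasible = sbr > 1.0 and ltcr > 0.5
```

    Both indices are dimensionless ratios, which is why they transfer across reservoirs with very different sizes and hydrology.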

  15. Imaging fluid/solid interactions in hydrocarbon reservoir rocks.

    Science.gov (United States)

    Uwins, P J; Baker, J C; Mackinnon, I D

    1993-08-01

    The environmental scanning electron microscope (ESEM) has been used to image liquid hydrocarbons in sandstones and oil shales. Additionally, the fluid sensitivity of selected clay minerals in hydrocarbon reservoirs was assessed via three case studies: HCl acid sensitivity of authigenic chlorite in sandstone reservoirs, freshwater sensitivity of authigenic illite/smectite in sandstone reservoirs, and bleach sensitivity of a volcanic reservoir containing abundant secondary chlorite/corrensite. The results showed the suitability of using ESEM for imaging liquid hydrocarbon films in hydrocarbon reservoirs and the importance of simulating in situ fluid-rock interactions for hydrocarbon production programmes. In each case, results of the ESEM studies greatly enhanced prediction of reservoir/borehole reactions and, in some cases, contradicted conventional wisdom regarding the outcome of potential engineering solutions.

  16. The impact of hydraulic flow unit & reservoir quality index on pressure profile and productivity index in multi-segments reservoirs

    Directory of Open Access Journals (Sweden)

    Salam Al-Rbeawi

    2017-12-01

    The objective of this paper is to study the impact of hydraulic flow units and the reservoir quality index (RQI) on the pressure profile and productivity index of horizontal wells acting in finite reservoirs. Several mathematical models have been developed to investigate this impact. These models are built on the pressure distribution in porous media, depleted by a horizontal well, that consist of multiple hydraulic flow units with different reservoir quality indices. The porous media are assumed to be finite rectangular reservoirs having different configurations, and the wellbores may have different lengths. Several analytical models describing flow regimes have been derived in which hydraulic flow units and the reservoir quality index are included in addition to rock and fluid properties. The impact of these two parameters on reservoir performance has also been studied using the steady-state productivity index. It has been found that both pressure responses and flow regimes are highly affected by the existence of multiple hydraulic flow units in the porous media and by the change in reservoir quality index across these units. A positive change in the RQI could lead to a favorable change in both the pressure drop required for reservoir fluids to move toward the wellbore and, hence, the productivity index.
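    The reservoir quality index referred to here is conventionally defined (after Amaefule et al.) as RQI = 0.0314·√(k/φ), with permeability k in millidarcies and porosity φ as a fraction, giving RQI in micrometres; hydraulic flow units are then delineated by the flow zone indicator FZI = RQI / (φ/(1−φ)). A brief Python sketch under that conventional definition (not this paper's specific models):

```python
import math

def reservoir_quality_index(perm_md: float, porosity: float) -> float:
    """RQI [micrometres] = 0.0314 * sqrt(k / phi); k in mD, phi a fraction."""
    return 0.0314 * math.sqrt(perm_md / porosity)

def flow_zone_indicator(perm_md: float, porosity: float) -> float:
    """FZI = RQI / (phi / (1 - phi)); samples with similar FZI are
    commonly grouped into the same hydraulic flow unit."""
    phi_z = porosity / (1.0 - porosity)  # normalized porosity
    return reservoir_quality_index(perm_md, porosity) / phi_z

# Example: a 100 mD sand at 20% porosity
rqi = reservoir_quality_index(100.0, 0.20)
fzi = flow_zone_indicator(100.0, 0.20)
```

    Grid blocks sharing a similar FZI would be assigned to the same flow unit, which is the grouping the paper's pressure-profile models take as input.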

  17. Reservoir Models for Gas Hydrate Numerical Simulation

    Science.gov (United States)

    Boswell, R.

    2016-12-01

    Scientific and industrial drilling programs are now providing detailed information on gas hydrate systems that will increasingly be the subject of field experiments. The need to carefully plan these programs requires reliable prediction of reservoir response to hydrate dissociation. Currently, a major emphasis in gas hydrate modeling is the integration of thermodynamic/hydrologic phenomena with the geomechanical response of both the reservoir and the bounding strata. Also critical to the ultimate success of these efforts, however, is the appropriate development of input geologic models, including several emerging issues: (1) reservoir heterogeneity; (2) understanding of the initial petrophysical characteristics of the system (reservoirs and seals), the dynamic evolution of those characteristics during active dissociation, and the interdependency of petrophysical parameters; and (3) the nature of reservoir boundaries. Heterogeneity is a ubiquitous aspect of every natural reservoir, and appropriate characterization is vital; heterogeneity is not, however, random. Vertical variation can be evaluated with core and well log data, although core data are often compromised by incomplete recovery. Well logs also present interpretation challenges, particularly where reservoirs are thinly bedded, owing to limitations in vertical resolution. This imprecision extends to any petrophysical measurements derived from log data. Lateral extrapolation of log data is also complex and should be supported by geologic mapping. Key petrophysical parameters include porosity, permeability in its many aspects, and water saturation. Field data collected to date suggest that the degree of hydrate saturation is strongly dependent upon reservoir quality, and that the ratio of free to bound water in the remaining pore space is likely also controlled by reservoir quality. Further, those parameters will also evolve during dissociation, and not necessarily in a simple

  18. Dredged Material Management Plan and Environmental Impact Statement. McNary Reservoir and Lower Snake River Reservoirs. Appendix C: Economic Analysis

    National Research Council Canada - National Science Library

    2002-01-01

    ...; for management of dredged material from these reservoirs; and for maintenance of flow conveyance capacity at the most upstream extent of the Lower Granite reservoir for the remaining economic life of the dam and reservoir project (to year 2074...

  19. 49 CFR 236.792 - Reservoir, equalizing.

    Science.gov (United States)

    2010-10-01

    ... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of the equalizing piston chamber of the automatic brake valve, to provide uniform service reductions in brake pipe...

  20. Taos County Roads

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — Vector line shapefile under the stewardship of the Taos County Planning Department depicting roads in Taos County, New Mexico. Originally under the Emergency...