WorldWideScience

Sample records for large-scale geologic processes

  1. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO2 region leading to large-scale convective mixing that can be a significant driver for CO2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large and small-scale effects are essential to predict the role of these processes on the long-term storage security of CO2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO2 emissions, and explore the sensitivity of CO2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO2 storage sites. © 2012 Elsevier Ltd.
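
    To make the vertically integrated approach concrete, below is a minimal sketch of a sharp-interface, vertical-equilibrium mass balance of the general kind such models solve; the notation is generic and is not taken from the paper.

```latex
% Sharp-interface vertical-equilibrium balance for the mobile CO2 plume
% (a sketch in generic notation, not the paper's exact formulation):
%   phi - porosity, h(x,y,t) - plume thickness, s_wr - residual brine saturation
\begin{equation}
\phi\,(1 - s_{wr})\,\frac{\partial h}{\partial t}
  + \nabla_{\parallel}\cdot\mathbf{F}(h)
  = Q_{\mathrm{well}} - q_{\mathrm{diss}}
\end{equation}
% F(h): vertically integrated Darcy flux of the plume (buoyancy- and
% pressure-driven); q_diss: sink term standing in for convective-mixing
% dissolution, i.e., the kind of subgrid process the paper upscales.
```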

  2. Approaches to large scale unsaturated flow in heterogeneous, stratified, and fractured geologic media

    International Nuclear Information System (INIS)

    Ababou, R.

    1991-08-01

    This report develops a broad review and assessment of quantitative modeling approaches and data requirements for large-scale subsurface flow at radioactive waste geologic repositories. The data review includes discussions of controlled field experiments, existing contamination sites, and site-specific hydrogeologic conditions at Yucca Mountain. Local-scale constitutive models for the unsaturated hydrodynamic properties of geologic media are analyzed, with particular emphasis on the effect of structural characteristics of the medium. The report further reviews and analyzes large-scale hydrogeologic spatial variability from aquifer data, unsaturated soil data, and fracture network data gathered from the literature. Finally, various modeling strategies for large-scale flow simulations are assessed, including direct high-resolution simulation and coarse-scale simulation based on auxiliary hydrodynamic models such as the single equivalent continuum and the dual-porosity continuum. The roles of anisotropy, fracturing, and broad-band spatial variability are emphasized. 252 refs

  3. Geologic implications of large-scale trends in well-log response, northern Green River Basin, Wyoming

    International Nuclear Information System (INIS)

    Prensky, S.E.

    1986-01-01

    Well-log response in lower Tertiary and Upper Cretaceous rocks in the northern Green River basin, Wyoming, is examined. Digitally recorded well-log data for selected wells located throughout the basin were processed by computer and displayed as highly compressed depth-scale plots for examining large-scale geologic trends. Stratigraphic units formed under similar depositional conditions are distinguishable by differing patterns on these plots. In particular, a strong lithologic contrast between Tertiary and underlying Upper Cretaceous non-marine clastic rocks is revealed and correlated through the study area. Laboratory analysis combined with gamma-ray spectrometry log data shows that potassium feldspars in the arkosic Tertiary sandstones cause the contrast. The nature and extent of overpressuring have also been examined. Data shifts on shale-conductivity and shale acoustic transit-time plots, previously ascribed to changes in pore pressure, correspond to stratigraphic changes and do not necessarily coincide with changes in pore pressure as indicated by drilling-mud weights. Gulf Coast well-log techniques for detecting overpressuring are therefore unreliable and ineffectual in this basin, which has experienced significantly different geologic depositional and tectonic conditions

  4. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though they no longer occur with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  5. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-based …

  6. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include the temporal behavior of seismic and tectonic moment rates; the shape of the earthquake magnitude distribution; the upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; the value of crustal rigidity; and the relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for estimating large earthquake rates. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes.
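
    The tectonic-seismic moment-rate equivalence invoked above can be made concrete with a back-of-the-envelope calculation. A hedged sketch follows; all numbers (rigidity, fault dimensions, slip rate, recurrence) are illustrative assumptions, not values from the abstract.

```python
# Hedged sketch: tectonic moment rate for a single fault, Mdot_0 = mu * A * s,
# and the moment magnitude implied if that moment were released in one
# characteristic earthquake every T years. All numbers are illustrative.
import math

mu = 3.0e10            # crustal rigidity, Pa (a commonly assumed value)
area = 100e3 * 15e3    # fault area: 100 km long x 15 km deep, in m^2
slip_rate = 0.02       # 20 mm/yr expressed in m/yr

moment_rate = mu * area * slip_rate         # N*m accumulated per year
T = 250.0                                   # assumed recurrence interval, years
M0 = moment_rate * T                        # moment of one characteristic event

Mw = (2.0 / 3.0) * (math.log10(M0) - 9.05)  # Hanks-Kanamori moment magnitude
print(f"M0 = {M0:.2e} N*m  ->  Mw = {Mw:.1f}")   # ~2.3e20 N*m -> Mw ~ 7.5
```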

  7. Field Geology/Processes

    Science.gov (United States)

    Allen, Carlton; Jakes, Petr; Jaumann, Ralf; Marshall, John; Moses, Stewart; Ryder, Graham; Saunders, Stephen; Singer, Robert

    1996-01-01

    The field geology/process group examined the basic operations of a terrestrial field geologist and the manner in which these operations could be transferred to a planetary lander. Four basic requirements for robotic field geology were determined: geologic content; surface vision; mobility; and manipulation. Geologic content requires a combination of orbital and descent imaging. Surface vision requirements include range, resolution, stereo, and multispectral imaging. The minimum mobility for useful field geology depends on the scale of orbital imagery. Manipulation requirements include exposing unweathered surfaces, screening samples, and bringing samples in contact with analytical instruments. To support these requirements, several advanced capabilities for future development are recommended. Capabilities include near-infrared reflectance spectroscopy, hyper-spectral imaging, multispectral microscopy, artificial intelligence in support of imaging, x ray diffraction, x ray fluorescence, and rock chipping.

  8. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve on the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data, and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
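
    As a rough illustration of the dimension-reduction-plus-clustering step described above, the hedged sketch below uses plain (unsupervised) kernel PCA in place of the supervised variant the authors employ; all shapes, parameter values, and the synthetic data are hypothetical stand-ins for reanalysis fields.

```python
# Hedged sketch: reduce daily vertically integrated moisture-flux fields to a
# few kernel-PCA components, then cluster them to separate circulation regimes.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

# X: (n_days, n_gridpoints) flattened moisture-flux anomaly fields
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2000))        # placeholder for reanalysis data

kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
Z = kpca.fit_transform(X)                   # low-dimensional representation

clusters = KMeans(n_clusters=4, n_init=10).fit_predict(Z)
# Flood events can then be composited by cluster to contrast large-scale
# moisture-transport patterns against locally driven (high-CAPE) cases.
print(np.bincount(clusters))
```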

  9. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
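
    The vertex-centric ("think like a vertex") abstraction the book is organized around can be sketched in a few lines. Below is a hedged Python emulation of Pregel-style supersteps computing BFS depths; it mirrors the abstraction only, not Giraph's actual Java API.

```python
# Minimal emulation of the vertex-centric model: in each superstep, every
# active vertex processes its incoming messages and may send messages along
# its out-edges; a barrier separates supersteps. Shown: BFS depth labeling.
def bfs_supersteps(adj, source):
    depth = {v: None for v in adj}
    messages = {source: [0]}                 # superstep 0: source receives depth 0
    while messages:
        next_messages = {}
        for v, msgs in messages.items():     # the "compute()" of each active vertex
            d = min(msgs)
            if depth[v] is None or d < depth[v]:
                depth[v] = d
                for u in adj[v]:             # send depth+1 to neighbors
                    next_messages.setdefault(u, []).append(d + 1)
        messages = next_messages             # barrier between supersteps
    return depth

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_supersteps(graph, "a"))            # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```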

  10. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  11. Decoupling processes and scales of shoreline morphodynamics

    Science.gov (United States)

    Hapke, Cheryl J.; Plant, Nathaniel G.; Henderson, Rachel E.; Schwab, William C.; Nelson, Timothy R.

    2016-01-01

    Behavior of coastal systems on time scales ranging from single storm events to years and decades is controlled by both small-scale sediment transport processes and large-scale geologic, oceanographic, and morphologic processes. Improved understanding of coastal behavior at multiple time scales is required for refining models that predict potential erosion hazards and for coastal management planning and decision-making. Here we investigate the primary controls on shoreline response along a geologically variable barrier island on time scales resolving extreme storms and decadal variations over a period of nearly one century. An empirical orthogonal function analysis is applied to a time series of shoreline positions at Fire Island, NY to identify patterns of shoreline variance along the length of the island. We establish that there are separable patterns of shoreline behavior that represent response to oceanographic forcing as well as patterns that are not explained by this forcing. The dominant shoreline behavior occurs over large length scales in the form of alternating episodes of shoreline retreat and advance, presumably in response to storm cycles. Two secondary responses emerge: a long-term response that correlates with known geologic variations of the island, and another that reflects geomorphic patterns at medium length scales. Our study also includes the response to Hurricane Sandy and a period of post-storm recovery. It was expected that the impacts from Hurricane Sandy would disrupt long-term trends and spatial patterns. We found that the response to Sandy at Fire Island is not notable or distinguishable from several other large storms of the prior decade.
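
    The empirical orthogonal function decomposition used above amounts to a singular value decomposition of the demeaned survey matrix. A minimal sketch on synthetic data follows; shapes and values are placeholders, not the Fire Island dataset.

```python
# Hedged sketch of an EOF analysis: decompose a time series of shoreline
# positions into spatial modes (EOFs) and temporal amplitudes via SVD.
import numpy as np

# Y: (n_surveys, n_alongshore_points) shoreline positions
rng = np.random.default_rng(1)
Y = rng.standard_normal((30, 200)).cumsum(axis=0)   # placeholder surveys

A = Y - Y.mean(axis=0)                  # anomalies about the mean shoreline
U, s, Vt = np.linalg.svd(A, full_matrices=False)

eofs = Vt                               # rows: alongshore patterns of variance
amplitudes = U * s                      # columns: temporal evolution of each mode
explained = s**2 / (s**2).sum()         # fraction of variance per mode
print("variance explained by first 3 modes:", explained[:3].round(3))
```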

  12. The Geological Grading Scale: Every Million Points Counts!

    Science.gov (United States)

    Stegman, D. R.; Cooper, C. M.

    2006-12-01

    The concept of geological time, ranging from thousands to billions of years, is naturally quite difficult for students to grasp initially, as it is much longer than the timescales over which they experience everyday life. Moreover, universities operate on a few key timescales (hourly lectures, weekly assignments, mid-term examinations) to which students' maximum attention is focused, largely driven by graded assessment. The geological grading scale exploits the overwhelming interest students have in grades as an opportunity to instill familiarity with geological time. With the geological grading scale, the number of possible points/marks/grades available in the course is scaled to 4.5 billion points --- collapsing the entirety of Earth history into one semester. Alternatively, geological time can be compressed into each assignment, with scores for weekly homeworks worth not 100 points each, but 4.5 billion! Homeworks left incomplete with questions unanswered lose hundreds of millions of points - equivalent to missing the Paleozoic era. The expected quality of presentation for problem sets can be established with great impact in the first week by docking assignments an insignificant amount of points for handing in messy work; though likely more points than they've lost in their entire schooling history combined. Use this grading scale and your students will gradually begin to appreciate exactly how much time represents a geological blink of the eye.
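
    The conversion underlying the scheme is simple proportional rescaling. The hedged sketch below assumes a hypothetical 1000-point conventional course total; the abstract does not specify one.

```python
# Hedged sketch: map a conventional course score onto a 4.5-billion-point
# (age-of-Earth) scale, and express a deduction as a span of geologic time.
EARTH_AGE_POINTS = 4.5e9        # total points = years of Earth history
COURSE_TOTAL = 1000             # hypothetical conventional points in the course

def to_geologic_points(score):
    return score / COURSE_TOTAL * EARTH_AGE_POINTS

# A 20-point deduction on the conventional scale...
deduction = to_geologic_points(20)
print(f"{deduction:.2e} points lost")   # 9.00e+07 -> 90 million "years",
# roughly the time elapsed since the mid-Cretaceous.
```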

  13. Multi-scale interactions of geological processes during mineralization: cascade dynamics model and multifractal simulation

    Directory of Open Access Journals (Sweden)

    L. Yao

    2011-03-01

    Relations between mineralization and certain geological processes are established mostly by geologists' knowledge of field observations. However, these relations are descriptive, and a quantitative model of how certain geological processes strengthen or hinder mineralization is not clear; that is to say, the mechanism of the interactions between mineralization and the geological framework has not been thoroughly studied. The dynamics behind these interactions are key to understanding the fractal or multifractal formations caused by mineralization, among which singularities arise due to anomalous concentration of metals in narrow space. From a statistical point of view, we think that cascade dynamics play an important role in mineralization and that studying them can reveal the nature of the various interactions throughout the process. We have constructed a multiplicative cascade model to simulate these dynamics. The probabilities of mineral deposit occurrences are used to represent direct results of mineralization. Multifractal simulation of probabilities of mineral potential based on our model is exemplified by a case study dealing with hydrothermal gold deposits in southern Nova Scotia, Canada. The extent of the impacts of certain geological processes on gold mineralization is related to the scale of the cascade process, especially to the maximum cascade division number n_max. Our research helps to understand how singularity occurs during mineralization, a question which has remained unanswered up to now, and the simulation may provide a more accurate distribution of mineral deposit occurrences that can be used to improve the results of the weights-of-evidence model in mapping mineral potential.
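
    To illustrate the kind of multiplicative cascade invoked above, here is a hedged sketch of a generic 1-D binomial (de Wijs-style) cascade; the paper's actual model, parameters, and dimensionality differ.

```python
# Hedged sketch: a binomial multiplicative cascade. At each level every cell
# splits its mass unevenly (fractions p and 1-p) between two children,
# producing a multifractal field with rare high-intensity singularities.
import numpy as np

def binomial_cascade(n_max, p=0.7, seed=0):
    rng = np.random.default_rng(seed)
    field = np.ones(1)                       # start from uniform unit density
    for _ in range(n_max):                   # each level doubles the resolution
        left = np.where(rng.random(field.size) < 0.5, p, 1 - p)
        # children densities: parent * 2p and parent * 2(1-p); mean is preserved
        field = np.repeat(field, 2) * np.stack([left, 1 - left], axis=1).ravel() * 2
    return field

field = binomial_cascade(n_max=10)
print(field.size, field.max() / field.mean())   # 1024 cells, strong peaks
```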

  14. On the Geologic Time Scale

    NARCIS (Netherlands)

    Gradstein, F.M.; Ogg, J.G.; Hilgen, F.J.

    2012-01-01

    This report summarizes the international divisions and ages in the Geologic Time Scale, published in 2012 (GTS2012). Since 2004, when GTS2004 was detailed, major developments have taken place that directly bear on, and have considerable impact on, the intricate science of geologic time scaling. Precambrian …

  15. Identification of low order models for large scale processes

    NARCIS (Netherlands)

    Wattamwar, S.K.

    2010-01-01

    Many industrial chemical processes are complex, multi-phase, and large scale in nature. These processes are characterized by various nonlinear physicochemical effects and fluid flows. Such processes often show coexistence of fast and slow dynamics during their time evolution. The increasing demand …

  16. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

    This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. Because the geologic processes responsible for formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling

  17. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported, as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at the Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: reducing tritium removal effectiveness; energy recovery; improving the tolerance of impurities; and use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  18. Characterization of Pliocene and Miocene Formations in the Wilmington Graben, Offshore Los Angeles, for Large-Scale Geologic Storage of CO₂

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, Michael [Geomechanics Technologies, Incorporated, Monrovia, CA (United States)

    2014-12-08

    Geomechanics Technologies has completed a detailed characterization study of the Wilmington Graben offshore Southern California area for large-scale CO₂ storage. This effort has included: an evaluation of existing wells in both State and Federal waters, field acquisition of about 175 km (109 mi) of new seismic data, new well drilling, and development of integrated 3D geologic, geomechanics, and fluid-flow models for the area. The geologic analysis indicates that more than 796 MMt of storage capacity is available within the Pliocene and Miocene formations in the Graben for midrange geologic estimates (P50). Geomechanical analyses indicate that injection can be conducted without significant risk of surface deformation, induced stresses, or fault activation. Numerical analysis of fluid migration indicates that injection into the Pliocene Formation at depths of 1525 m (5000 ft) would lead to undesirable vertical migration of the CO₂ plume. Recent well drilling, however, indicates that deeper sand is present at depths exceeding 2135 m (7000 ft), which could be viable for large-volume storage. For vertical containment, injection would need to be limited to about 250,000 metric tons per year per well, would need to be placed at depths greater than 2135 m (7000 ft), and would need to be placed in new wells located at least 1 mile from any existing offset wells. As a practical matter, this would likely limit storage operations in the Wilmington Graben to about 1 million tons per year or less. A quantitative risk analysis for the Wilmington Graben indicates that such large-scale CO₂ storage in the area would represent higher risk than other similar-size projects in the US and overseas.

  19. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
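
    As a rough illustration of the static case described above, the hedged sketch below uses an ordinary least-squares KPI model with a 3-sigma residual threshold; it is a generic stand-in for, not a reproduction of, the authors' formulation. All data are synthetic.

```python
# Hedged sketch: KPI-based fault detection via a least-squares KPI model.
# Fit KPI ~ process variables on fault-free data, then flag new samples whose
# KPI residual exceeds a threshold learned from the training residuals.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 10))        # process variables (fault-free)
theta_true = rng.standard_normal(10)
y_train = X_train @ theta_true + 0.1 * rng.standard_normal(500)   # KPI values

theta, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
resid = y_train - X_train @ theta
threshold = 3.0 * resid.std()                   # simple 3-sigma detection limit

def kpi_fault(x_new, y_new):
    return abs(y_new - x_new @ theta) > threshold

x = rng.standard_normal(10)
print(kpi_fault(x, x @ theta_true + 2.0))        # large KPI deviation -> True
```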

  20. Large-scale methanol plants. [Based on Japanese-developed process

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made of how to produce methanol economically; methanol is expected to be a growth product for use as a pollution-free energy carrier and as a chemical feedstock. The study centered on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems attending the expansion of scale. The results of this study have already been adopted in actual plants with good results, and large-scale methanol plants are about to be realized.

  1. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    Science.gov (United States)

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly a half-million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software …

  2. Pore-scale studies of multiphase flow and reaction involving CO2 sequestration in geologic formations

    Science.gov (United States)

    Kang, Q.; Wang, M.; Lichtner, P. C.

    2008-12-01

    In geologic CO2 sequestration, pore-scale interfacial phenomena ultimately govern the key processes of fluid mobility, chemical transport, adsorption, and reaction. However, spatial heterogeneity at the pore scale cannot be resolved at the continuum scale, where averaging occurs over length scales much larger than typical pore sizes. Natural porous media, such as sedimentary rocks and other geological media encountered in subsurface formations, are inherently heterogeneous. This pore-scale heterogeneity can produce variabilities in flow, transport, and reaction processes that take place within a porous medium, and can result in spatial variations in fluid velocity, aqueous concentrations, and reaction rates. Consequently, the unresolved spatial heterogeneity at the pore scale may be important for reactive transport modeling at the larger scale. In addition, current continuum models of surface complexation reactions ignore a fundamental property of physical systems, namely conservation of charge. Therefore, to better understand multiphase flow and reaction involving CO2 sequestration in geologic formations, it is necessary to quantitatively investigate the influence of pore-scale heterogeneity on the emergent behavior at the field scale. We have applied the lattice Boltzmann method to simulating the injection of CO2-saturated brine or supercritical CO2 into geological formations at the pore scale. Multiple pore-scale processes, including advection, diffusion, homogeneous reactions among multiple aqueous species, heterogeneous reactions between the aqueous solution and minerals, ion exchange and surface complexation, as well as changes in solid and pore geometry are all taken into account. The rich pore-scale information will provide a basis for upscaling to the continuum scale.
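
    To give a flavor of the method, the hedged sketch below strips the lattice Boltzmann approach down to its simplest relevant setting: passive solute diffusion on a D2Q5 lattice with BGK collision. The actual study couples multiphase flow, chemistry, and evolving geometry; lattice weights follow one common D2Q5 convention.

```python
# Hedged sketch: lattice Boltzmann (D2Q5, BGK) for pure solute diffusion.
# With zero advection this scheme recovers diffusivity D = cs2 * (tau - 0.5).
import numpy as np

nx = ny = 64
tau = 1.0                                    # relaxation time
w = np.array([1/3, 1/6, 1/6, 1/6, 1/6])      # D2Q5 weights (one common choice)
e = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]])   # lattice velocities
cs2 = 1/3                                    # lattice sound speed squared

C = np.zeros((nx, ny)); C[nx//2, ny//2] = 1.0   # initial solute pulse
f = w[:, None, None] * C[None, :, :]            # start at equilibrium

for _ in range(200):
    feq = w[:, None, None] * C[None, :, :]      # zero-velocity equilibrium
    f += (feq - f) / tau                        # BGK collision step
    for i, (ex, ey) in enumerate(e):            # streaming step (periodic)
        f[i] = np.roll(np.roll(f[i], ex, axis=0), ey, axis=1)
    C = f.sum(axis=0)                           # macroscopic concentration

print(C.sum(), C.max())    # mass conserved (~1.0); peak lowered by diffusion
```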

  3. Constructing large scale SCI-based processing systems by switch elements

    International Nuclear Information System (INIS)

    Wu, B.; Kristiansen, E.; Skaali, B.; Bogaerts, A.; Divia, R.; Mueller, H.

    1993-05-01

    The goal of this paper is to study some of the design criteria for the switch elements used to form the interconnection of large-scale SCI-based processing systems. The approved IEEE standard 1596 makes it possible to couple up to 64K nodes together. In order to connect thousands of nodes to construct large-scale SCI-based processing systems, one has to interconnect these nodes by switch elements to form different topologies. A summary of the requirements and key points of interconnection networks and switches is presented. Two models of the SCI switch elements are proposed. The authors investigate, by simulation, several examples of systems constructed from 4-switches, and the results are analyzed. Some issues and enhancements are discussed to provide the ideas behind a switch design that can improve performance and reduce latency. 29 refs., 11 figs., 3 tabs

  4. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
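
    The linked automation described above is essentially function composition over the five numbered steps. A hedged Python sketch follows; the function bodies are placeholders standing in for the authors' algorithms, not reproductions of them.

```python
# Hedged sketch: chain the five analysis stages so each recording flows
# through all steps without manual intervention.
import numpy as np

def isolate_intervals(rec):  return rec                      # (1) align/segment
def band_filter(rec):        return rec                      # (2) band waveforms
def sort_spikes(rec):        return {"spikes": rec}          # (3) spike-sorting
def quantify(spk):           return {"n": len(spk["spikes"])}     # (4) counts
def power_spectrum(rec):     return np.abs(np.fft.rfft(rec))**2   # (5) PSD

def run_pipeline(recording):
    seg = isolate_intervals(recording)
    filt = band_filter(seg)
    spikes = sort_spikes(filt)
    return quantify(spikes), power_spectrum(filt)

stats, psd = run_pipeline(np.random.randn(1024))
print(stats, psd.shape)
```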

  5. Visual analysis of inter-process communication for large-scale parallel computing.

    Science.gov (United States)

    Muelder, Chris; Gygi, Francois; Ma, Kwan-Liu

    2009-01-01

    In serial computation, program profiling is often helpful for optimization of key sections of code. When moving to parallel computation, not only does the code execution need to be considered but also communication between the different processes, which can induce delays that are detrimental to performance. As the number of processes increases, so does the impact of the communication delays on performance. For large-scale parallel applications, it is critical to understand how the communication impacts performance in order to make the code more efficient. There are several tools available for visualizing program execution and communications on parallel systems. These tools generally provide either views which statistically summarize the entire program execution or process-centric views. However, process-centric visualizations do not scale well as the number of processes gets very large. In particular, the most common representation of parallel processes is a Gantt chart with a row for each process. As the number of processes increases, these charts can become difficult to work with and can even exceed screen resolution. We propose a new visualization approach that affords more scalability and then demonstrate it on systems running with up to 16,384 processes.

  6. Geophysical mapping of complex glaciogenic large-scale structures

    DEFF Research Database (Denmark)

    Høyer, Anne-Sophie

    2013-01-01

    This thesis presents the main results of a four-year PhD study concerning the use of geophysical data in geological mapping. The study is related to the Geocenter project, “KOMPLEKS”, which focuses on the mapping of complex, large-scale geological structures. The study area is approximately 100 km² … data types and co-interpret them in order to improve our geological understanding. However, in order to perform this successfully, methodological considerations are necessary. For instance, a structure indicated by a reflection in the seismic data is not always apparent in the resistivity data … information) can be collected. The geophysical data are used together with geological analyses from boreholes and pits to interpret the geological history of the hill-island. The geophysical data reveal that the glaciotectonic structures truncate at the surface. The directions of the structures were mapped …

  7. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the relative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  8. Hydromechanical coupling in geologic processes

    Science.gov (United States)

    Neuzil, C.E.

    2003-01-01

    Earth's porous crust and the fluids within it are intimately linked through their mechanical effects on each other. This paper presents an overview of such "hydromechanical" coupling and examines current understanding of its role in geologic processes. An outline of the theory of hydromechanics and rheological models for geologic deformation is included to place various analytical approaches in proper context and to provide an introduction to this broad topic for nonspecialists. Effects of hydromechanical coupling are ubiquitous in geology, and can be local and short-lived or regional and very long-lived. Phenomena such as deposition and erosion, tectonism, seismicity, earth tides, and barometric loading produce strains that tend to alter fluid pressure. Resulting pressure perturbations can be dramatic, and many so-called "anomalous" pressures appear to have been created in this manner. The effects of fluid pressure on crustal mechanics are also profound. Geologic media deform and fail largely in response to effective stress, or total stress minus fluid pressure. As a result, fluid pressures control compaction, decompaction, and other types of deformation, as well as jointing, shear failure, and shear slippage, including events that generate earthquakes. By controlling deformation and failure, fluid pressures also regulate states of stress in the upper crust. Advances in the last 80 years, including theories of consolidation, transient groundwater flow, and poroelasticity, have been synthesized into a reasonably complete conceptual framework for understanding and describing hydromechanical coupling. Full coupling in two or three dimensions is described using force balance equations for deformation coupled with a mass conservation equation for fluid flow. Fully coupled analyses allow hypothesis testing and conceptual model development. However, rigorous application of full coupling is often difficult because (1) the rheological behavior of geologic media is complex …

  9. Mapping geological structures in bedrock via large-scale direct current resistivity and time-domain induced polarization tomography

    DEFF Research Database (Denmark)

    Rossi, Matteo; Olsson, Per-Ivar; Johansson, Sara

    2017-01-01

    An investigation of geological conditions is always a key point for planning infrastructure constructions. Bedrock surface and rock quality must be estimated carefully in the designing process of infrastructures. A large direct-current resistivity and time-domain induced-polarization survey has … , there are northwest-trending Permian dolerite dykes that are less deformed. Four 2D direct-current resistivity and time-domain induced-polarization profiles of about 1 km length have been carefully pre-processed to retrieve time-domain induced-polarization responses and inverted to obtain the direct-current resistivity distribution of the subsoil and the phase of the complex conductivity using a constant-phase-angle model. The joint interpretation of electrical resistivity and induced-polarization models leads to a better understanding of complex three-dimensional subsoil geometries. The results have been …

  10. Health benefits of geologic materials and geologic processes

    Science.gov (United States)

    Finkelman, R.B.

    2006-01-01

    The reemerging field of Medical Geology is concerned with the impacts of geologic materials and geologic processes on animal and human health. Most medical geology research has been focused on health problems caused by excess or deficiency of trace elements, exposure to ambient dust, and on other geologically related health problems or health problems for which geoscience tools, techniques, or databases could be applied. Little, if any, attention has been focused on the beneficial health effects of rocks, minerals, and geologic processes. These beneficial effects may have been recognized as long as two million years ago and include emotional, mental, and physical health benefits. Some of the earliest known medicines were derived from rocks and minerals. For thousands of years various clays have been used as an antidote for poisons. "Terra sigillata," still in use today, may have been the first patented medicine. Many trace elements, rocks, and minerals are used today in a wide variety of pharmaceuticals and health care products. There is also a segment of society that believes in the curative and preventative properties of crystals (talismans and amulets). Metals and trace elements are being used in some of today's most sophisticated medical applications. Other recent examples of beneficial effects of geologic materials and processes include epidemiological studies in Japan that have identified a wide range of health problems (such as muscle and joint pain, hemorrhoids, burns, gout, etc.) that may be treated by one or more of nine chemically distinct types of hot springs, and a study in China indicating that residential coal combustion may be mobilizing sufficient iodine to prevent iodine deficiency disease. © 2006 MDPI. All rights reserved.

  11. Microarray Data Processing Techniques for Genome-Scale Network Inference from Large Public Repositories.

    Science.gov (United States)

    Chockalingam, Sriram; Aluru, Maneesha; Aluru, Srinivas

    2016-09-19

    Pre-processing of microarray data is a well-studied problem. Furthermore, all popular platforms come with their own recommended best practices for differential analysis of genes. However, for genome-scale network inference using microarray data collected from large public repositories, these methods filter out a considerable number of genes. This is primarily due to the effects of aggregating a diverse array of experiments with different technical and biological scenarios. Here we introduce a pre-processing pipeline suitable for inferring genome-scale gene networks from large microarray datasets. We show that partitioning of the available microarray datasets according to biological relevance into tissue- and process-specific categories significantly extends the limits of downstream network construction. We demonstrate the effectiveness of our pre-processing pipeline by inferring genome-scale networks for the model plant Arabidopsis thaliana using two different construction methods and a collection of 11,760 Affymetrix ATH1 microarray chips. Our pre-processing pipeline and the datasets used in this paper are made available at http://alurulab.cc.gatech.edu/microarray-pp.
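
    The partitioning step described above can be as simple as grouping chip metadata by annotated category before per-partition pre-processing. A hedged sketch follows; the column names and categories are hypothetical, not those of the authors' pipeline.

```python
# Hedged sketch: group chips by biological category before network inference,
# so genes filtered out when pooling everything survive within biologically
# coherent partitions.
import pandas as pd

meta = pd.DataFrame({
    "chip_id":  ["c1", "c2", "c3", "c4", "c5"],
    "category": ["root", "root", "leaf", "flower", "leaf"],
})

partitions = {cat: grp["chip_id"].tolist()
              for cat, grp in meta.groupby("category")}
# Each partition is then pre-processed and fed to network construction
# separately, instead of pooling all chips at once.
print(partitions)
```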

  12. Preliminary discussion on possible genesis of crustal rotation, its impact on geotectonic evolution and its relation to large-scale metallogeny in Hunan province and adjacent regions

    International Nuclear Information System (INIS)

    Shu Xiaojing

    2005-01-01

    Hunan province and adjacent regions show ring-form distribution features both in surface geologic structure and in the geophysical field. Such features might result from the rotational movement of the earth's crust and exert a serious impact on the geotectonic evolution and large-scale metallogeny of Hunan province and adjacent regions. This paper presents a preliminary discussion of the possible genesis of such rotational movement, as well as the associated series of geologic processes and their relation to large-scale metallogeny in Hunan province and adjacent regions. (authors)

  13. Digital geologic map in the scale 1:50 000

    International Nuclear Information System (INIS)

    Kacer, S.; Antalik, M.

    2005-01-01

    In this presentation the authors describe the preparation of a new digital geologic map of the Slovak Republic. The map is being prepared by the State Geological Institute of Dionyz Stur as part of the project Geological Information System GeoIS. One of the basic geologic information layers, which will be accessible on the web site, will be the digital geologic map of the Slovak Republic at the scale 1:50 000

  14. Large-scale membrane transfer process: its application to single-crystal-silicon continuous membrane deformable mirror

    International Nuclear Information System (INIS)

    Wu, Tong; Sasaki, Takashi; Hane, Kazuhiro; Akiyama, Masayuki

    2013-01-01

    This paper describes a large-scale membrane transfer process developed for the construction of large-scale membrane devices via the transfer of continuous single-crystal-silicon membranes from one substrate to another. This technique is applied to fabricating a large-stroke deformable mirror. A bimorph spring array is used to generate a large air gap between the mirror membrane and the electrode. A 1.9 mm × 1.9 mm × 2 µm single-crystal-silicon membrane is successfully transferred to the electrode substrate by Au–Si eutectic bonding and a subsequent all-dry release process. This process provides an effective approach for transferring a free-standing, large, continuous single-crystal-silicon membrane onto a flexible suspension spring array with a large air gap. (paper)

  15. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.; Nordbotten, Jan M.; Celia, Michael A.

    2012-01-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act …

  16. South Louisiana Enhanced Oil Recovery/Sequestration R&D Project Small Scale Field Tests of Geologic Reservoir Classes for Geologic Storage

    Energy Technology Data Exchange (ETDEWEB)

    Hite, Roger [Blackhorse Energy LLC, Houston, TX (United States)

    2016-10-01

    The project site is located in Livingston Parish, Louisiana, approximately 26 miles due east of Baton Rouge. This project proposed to evaluate an early Eocene-aged Wilcox oil reservoir for permanent storage of CO2. Blackhorse Energy, LLC planned to conduct a parallel CO2 oil recovery project in the First Wilcox Sand. The primary focus of this project was to examine and prove the suitability of South Louisiana geologic formations for large-scale geologic sequestration of CO2 in association with enhanced oil recovery applications. This was to be accomplished through the focused demonstration of small-scale, permanent storage of CO2 in the First Wilcox Sand. The project was terminated at the request of Blackhorse Energy LLC on October 22, 2014.

  17. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full-physics data products. We will explore optimization approaches to getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment based on market forces. We will present how we enabled fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.
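
    One common pattern for tolerating spot-market reclamation is to make each work item idempotent and simply re-queue it on interruption, so market-driven terminations cost retries rather than lost products. Below is a hedged, generic sketch of that pattern; no HySDS or AWS APIs are implied, and the exception is a stand-in for a real reclamation signal.

```python
# Hedged sketch: idempotent work items plus re-queue-on-interruption, the
# basic ingredient of fault-tolerant processing on preemptible instances.
import queue

def run_with_retries(jobs, attempt, max_tries=3):
    q = queue.Queue()
    for j in jobs:
        q.put((j, 0))
    done = {}
    while not q.empty():
        job, tries = q.get()
        try:
            done[job] = attempt(job)       # process one granule, say
        except InterruptedError:           # stand-in for spot reclamation
            if tries + 1 < max_tries:
                q.put((job, tries + 1))    # re-queue; job must be idempotent
    return done

results = run_with_retries(["granule-001", "granule-002"],
                           attempt=lambda j: f"processed {j}")
print(results)
```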

  18. Geological hazard monitoring system in Georgia

    Science.gov (United States)

    Gaprindashvili, George

    2017-04-01

    Georgia belongs to one of the world's most complex mountainous regions in terms of the scale and frequency of geological processes and the damage they cause to population, farmland, and infrastructure facilities. Geological hazards (landslides, debris flows/mudflows, rockfalls, erosion, etc.) affect many populated areas, agricultural fields, roads, oil and gas pipes, high-voltage electric power transmission towers, hydraulic structures, and tourist complexes. Landslides occur in almost all geomorphological zones, resulting in wide differentiation in failure types and mechanisms and in the size-frequency distribution. In Georgia, geological hazards are triggered by: (1) activation of highly intense earthquakes; (2) meteorological events provoking disaster processes against the background of global climatic change; and (3) large-scale human impact on the environment. The prediction and monitoring of geological hazards is a very wide theme, which involves researchers from different fields. Geological hazard monitoring is essential to prevent and mitigate these hazards. In past years, several monitoring systems, such as ground-based geodetic techniques and a debris-flow early warning system (EWS), were installed in Georgia on highly sensitive landslide and debris-flow areas. This work presents a description of the geological hazard monitoring system in Georgia.

  19. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the objective of sustainable soil and water management. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow the identification of research needs in the interdisciplinary domain of modelling and monitoring, and improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.

  20. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction: In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing …

  1. Vertical equilibrium with sub-scale analytical methods for geological CO2 sequestration

    KAUST Repository

    Gasda, S. E.

    2009-04-23

    Large-scale implementation of geological CO2 sequestration requires quantification of risk and leakage potential. One potentially important leakage pathway for the injected CO2 involves existing oil and gas wells. Wells are particularly important in North America, where more than a century of drilling has created millions of oil and gas wells. Models of CO2 injection and leakage will involve large uncertainties in parameters associated with wells, and therefore a probabilistic framework is required. These models must be able to capture both the large-scale CO2 plume associated with the injection and the small-scale leakage problem associated with localized flow along wells. Within a typical simulation domain, many hundreds of wells may exist. One effective modeling strategy combines both numerical and analytical models with a specific set of simplifying assumptions to produce an efficient numerical-analytical hybrid model. The model solves a set of governing equations derived by vertical averaging with assumptions of a macroscopic sharp interface and vertical equilibrium. These equations are solved numerically on a relatively coarse grid, with an analytical model embedded to solve for wellbore flow occurring at the sub-gridblock scale. This vertical equilibrium with sub-scale analytical method (VESA) combines the flexibility of a numerical method, allowing for heterogeneous and geologically complex systems, with the efficiency and accuracy of an analytical method, thereby eliminating expensive grid refinement for sub-scale features. Through a series of benchmark problems, we show that VESA compares well with traditional numerical simulations and to a semi-analytical model which applies to appropriately simple systems. We believe that the VESA model provides the necessary accuracy and efficiency for applications of risk analysis in many CO2 sequestration problems. © 2009 Springer Science+Business Media B.V.

  2. Large-scale groundwater modeling using global datasets: a test case for the Rhine-Meuse basin

    NARCIS (Netherlands)

    Sutanudjaja, E.H.; Beek, L.P.H. van; Jong, S.M. de; Geer, F.C. van; Bierkens, M.F.P.

    2011-01-01

    The current generation of large-scale hydrological models does not include a groundwater flow component. Large-scale groundwater models, involving aquifers and basins of multiple countries, are still rare mainly due to a lack of hydro-geological data which are usually only available in

  3. Large-scale groundwater modeling using global datasets: A test case for the Rhine-Meuse basin

    NARCIS (Netherlands)

    Sutanudjaja, E.H.; Beek, L.P.H. van; Jong, S.M. de; Geer, F.C. van; Bierkens, M.F.P.

    2011-01-01

    The current generation of large-scale hydrological models does not include a groundwater flow component. Large-scale groundwater models, involving aquifers and basins of multiple countries, are still rare mainly due to a lack of hydro-geological data which are usually only available in developed

  4. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian. Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10^7 ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  5. Database for volcanic processes and geology of Augustine Volcano, Alaska

    Science.gov (United States)

    McIntire, Jacqueline; Ramsey, David W.; Thoms, Evan; Waitt, Richard B.; Beget, James E.

    2012-01-01

    Augustine Island (volcano) in lower Cook Inlet, Alaska, has erupted repeatedly in late-Holocene and historical times. Eruptions typically beget high-energy volcanic processes. Most notable are bouldery debris avalanches containing immense angular clasts shed from summit domes. Coarse deposits of these avalanches form much of Augustine's lower flanks. This geologic map at 1:25,000 scale depicts these deposits and the processes that formed them.

  6. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  7. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  8. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, with no prior research in related fields at home or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partly Industry 3.0) pattern of "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation a great number of questions need to be settled on the management and technology sides, such as the evolution of the workshop structure, the development of intelligent equipment, and changes in the business model; together these amount to a reformation of the whole workshop. Process simulation in this project verifies the general layout and process flow of the large-scale ship plane segmentation intelligent workshop and analyzes the workshop's working efficiency, which is significant for the next step of the transformation of the plane segmentation intelligent workshop.

  9. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by an affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
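
    A minimal sketch of the two decomposition steps described above, using scikit-learn's AffinityPropagation for the output clustering and CCA for the input screening. The synthetic data, the use of correlation profiles as clustering features, and the weight cutoff are illustrative assumptions, not details from the paper.

        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.cross_decomposition import CCA

        # Hypothetical historical plant data: rows are samples, columns are variables.
        rng = np.random.default_rng(0)
        Y = rng.normal(size=(500, 8))    # key controlled (output) variables
        X = rng.normal(size=(500, 20))   # candidate input variables

        # Step 1: cluster the controlled variables into subsystems, here by
        # clustering each variable's correlation profile with all the others.
        labels = AffinityPropagation(random_state=0).fit_predict(np.corrcoef(Y.T))

        # Step 2: screen inputs for each subsystem by canonical correlation
        # between all candidate inputs and that subsystem's outputs.
        for sub in np.unique(labels):
            Y_sub = Y[:, labels == sub]
            cca = CCA(n_components=1).fit(X, Y_sub)
            weights = np.abs(cca.x_weights_[:, 0])
            chosen = np.flatnonzero(weights > 0.3 * weights.max())  # illustrative cutoff
            print(f"subsystem {sub}: outputs {np.flatnonzero(labels == sub)}, inputs {chosen}")

    The online stage would then fit one model per subsystem on the selected inputs and renew it recursively, block-wise, as new samples arrive.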

  10. Ontology-aided annotation, visualization and generalization of geological time-scale information from online geological map services

    NARCIS (Netherlands)

    Ma, X.; Carranza, E.J.M.; Wu, C.; Meer, F.D. van der

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using an RDF (Resource Description Framework) model to represent

  11. Ontology-aided annotation, visualization and generalization of geological time scale information from online geological map services

    NARCIS (Netherlands)

    Ma, Marshal; Ma, X.; Carranza, E.J.M; Wu, C.; van der Meer, F.D.

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using a Resource Description Framework model to represent the

  12. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedentedly large amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP, and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like WorldView-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, COSMO-SkyMed, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation in temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. Excellent compromises between accuracy and scalability are obtained in all applications.
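
    The random Fourier feature strategy mentioned above replaces an RBF kernel with an explicit finite feature map, so kernel regression on millions of samples reduces to a linear solve of size D. A minimal sketch on synthetic data (the lengthscale, feature count and noise level are illustrative choices, not values from the study):

        import numpy as np

        rng = np.random.default_rng(1)
        n, d, D = 20_000, 8, 512        # samples, input dims, random features
        lengthscale, noise = 1.0, 0.1   # illustrative hyperparameters

        # Synthetic regression data standing in for a retrieval problem.
        X = rng.normal(size=(n, d))
        y = np.sin(X.sum(axis=1)) + noise * rng.normal(size=n)

        # Random Fourier features approximating an RBF kernel (Rahimi & Recht):
        # k(x, x') ~= z(x) . z(x'), with z(x) = sqrt(2/D) cos(W^T x + b).
        W = rng.normal(scale=1.0 / lengthscale, size=(d, D))
        b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

        # Regularized linear regression in feature space ~ approximate GP mean.
        w = np.linalg.solve(Z.T @ Z + noise**2 * np.eye(D), Z.T @ y)

        Z_new = np.sqrt(2.0 / D) * np.cos(X[:5] @ W + b)
        print("predictions:", Z_new @ w)
        print("targets:    ", y[:5])

    Training cost drops from O(n^3) for an exact GP to O(n D^2 + D^3), which is what makes millions of instances tractable.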

  13. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  14. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 x 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz, and the corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cultures reached a maximum permittivity value; however, only the infected cultures' permittivity profiles reached a second maximum. This effect was correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process gives sufficient time for an additional 18 runs a year, corresponding to extra production of ~2 x 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
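
    A back-of-the-envelope check of the closing figures, using only the numbers quoted above:

        yield_per_run = 1e15      # particles per 40-l run, as reported
        extra_runs = 18           # additional yearly runs enabled by harvesting at 48 hpi
        print(f"{extra_runs * yield_per_run:.1e}")  # 1.8e+16, i.e. ~2 x 10^16 particles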

  15. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale "integral" migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  16. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo

    2014-04-01

    The flux performance of different hydrophobic microporous flat sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench scale (high δT) and large scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m2·h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench scale (high δT) operating conditions.
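
    For context, DCMD flux is conventionally modelled as proportional to the water vapour pressure difference across the membrane (a standard background relation, not a result of this study):

        J = B_m [ p_v(T_{f,m}) - p_v(T_{p,m}) ]

    where B_m is the membrane distillation coefficient and p_v(T) the vapour pressure at the feed- and permeate-side membrane surface temperatures. Because p_v(T) grows roughly exponentially with temperature, flux drops sharply as δT is reduced, consistent with the eight-fold difference between the bench-scale (high δT) and module (low δT) results reported here.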

  17. Performance evaluation of the DCMD desalination process under bench scale and large scale module operating conditions

    KAUST Repository

    Francis, Lijo; Ghaffour, NorEddine; Alsaadi, Ahmad Salem; Nunes, Suzana Pereira; Amy, Gary L.

    2014-01-01

    The flux performance of different hydrophobic microporous flat sheet commercial membranes made of polytetrafluoroethylene (PTFE) and polypropylene (PP) was tested for Red Sea water desalination using the direct contact membrane distillation (DCMD) process, under bench scale (high δT) and large scale module (low δT) operating conditions. Membranes were characterized for their surface morphology, water contact angle, thickness, porosity, pore size and pore size distribution. The DCMD process performance was optimized using a locally designed and fabricated module aiming to maximize the flux at different levels of operating parameters, mainly feed water and coolant inlet temperatures at different temperature differences across the membrane (δT). A water vapor flux of 88.8 kg/m2·h was obtained using a PTFE membrane at high δT (60°C). In addition, the flux performance was compared to the first generation of a new locally synthesized and fabricated membrane made of a different class of polymer under the same conditions. A total salt rejection of 99.99% and boron rejection of 99.41% were achieved under extreme operating conditions. On the other hand, a detailed water characterization revealed that low molecular weight non-ionic molecules (ppb level) were transported with the water vapor molecules through the membrane structure. The membrane which provided the highest flux was then tested under large scale module operating conditions. The average flux of the latter study (low δT) was found to be eight times lower than that of the bench scale (high δT) operating conditions.

  18. Modeling of large-scale oxy-fuel combustion processes

    DEFF Research Database (Denmark)

    Yin, Chungen

    2012-01-01

    Numerous studies have been conducted on implementing oxy-fuel combustion with flue gas recycle in conventional utility boilers as an effective effort toward carbon capture and storage. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing..., among which radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the non-gray-gas effects in modeling of large-scale oxy-fuel combustion processes. Oxy-fuel combustion of natural gas in a 609 MW utility boiler is numerically studied, in which... calculation of the oxy-fuel WSGGM remarkably over-predicts the radiative heat transfer to the furnace walls and under-predicts the gas temperature at the furnace exit plane, which also results in higher incomplete combustion in the gray calculation. Moreover, the gray and non-gray calculations of the same...
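
    For orientation, a WSGGM expresses total gas emissivity as a weighted sum of a few gray gases (the standard model form, stated here as background):

        \varepsilon(T, pL) = \sum_{i=0}^{N} a_i(T) \left[ 1 - e^{-k_i\,pL} \right]

    where the k_i are gray-gas absorption coefficients, p the partial pressure of the participating species (H2O and CO2, whose ratio differs strongly between air- and oxy-fuel firing), L the path length, and a_i(T) temperature-dependent weights. A "gray" calculation collapses this sum to a single effective absorption coefficient, which is the simplification the abstract reports as over-predicting wall heat transfer under oxy-fuel conditions.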

  19. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  20. Applicability of vector processing to large-scale nuclear codes

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.

    1982-03-01

    To meet the growing trend of computational requirements in JAERI, introduction of a high-speed computer with vector processing capability (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for the pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) the present feature of computational load in JAERI is analyzed by compiling computer utilization statistics; 2) vector processing efficiency is estimated for the ten most heavily used nuclear codes by analyzing their dynamic behavior when run on a scalar machine; 3) vector processing efficiency is measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1; 4) the effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking into account the characteristics of JAERI jobs. Problems of vector processors are also discussed from the viewpoints of code performance and ease of use. (author)

  1. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost and large-scale production, has made these materials attractive in solar cell research... The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed... Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. Synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically

  2. Large-scale functional networks connect differently for processing words and symbol strings.

    Science.gov (United States)

    Liljeström, Mia; Vartiainen, Johanna; Kujala, Jan; Salmelin, Riitta

    2018-01-01

    Reconfigurations of synchronized large-scale networks are thought to be central neural mechanisms that support cognition and behavior in the human brain. Magnetoencephalography (MEG) recordings together with recent advances in network analysis now allow for sub-second snapshots of such networks. In the present study, we compared frequency-resolved functional connectivity patterns underlying reading of single words and visual recognition of symbol strings. Word reading emphasized coherence in a left-lateralized network with nodes in classical perisylvian language regions, whereas symbol processing recruited a bilateral network, including connections between frontal and parietal regions previously associated with spatial attention and visual working memory. Our results illustrate the flexible nature of functional networks, whereby processing of different form categories, written words vs. symbol strings, leads to the formation of large-scale functional networks that operate at distinct oscillatory frequencies and incorporate task-relevant regions. These results suggest that category-specific processing should be viewed not so much as a local process but as a distributed neural process implemented in signature networks. For words, increased coherence was detected particularly in the alpha (8-13 Hz) and high gamma (60-90 Hz) frequency bands, whereas increased coherence for symbol strings was observed in the high beta (21-29 Hz) and low gamma (30-45 Hz) frequency range. These findings attest to the role of coherence in specific frequency bands as a general mechanism for integrating stimulus-dependent information across brain regions.

  3. Analogue scale modelling of extensional tectonic processes using a large state-of-the-art centrifuge

    Science.gov (United States)

    Park, Heon-Joon; Lee, Changyeol

    2017-04-01

    Analogue scale modelling of extensional tectonic processes such as rifting and basin opening has been conducted extensively. Among the controlling factors, the gravitational acceleration (g) acting on the scale models was treated as a constant (Earth's gravity) in most analogue model studies, and only a few studies considered larger gravitational accelerations by using a centrifuge (an apparatus generating a large centrifugal force by rotating the model at high speed). Although analogue models using a centrifuge allow large scale-down and accelerated deformation driven by density differences, such as salt diapirism, the possible model size is mostly limited to about 10 cm. A state-of-the-art centrifuge installed at the KOCED Geotechnical Centrifuge Testing Center, Korea Advanced Institute of Science and Technology (KAIST), allows a large scale-model surface area of up to 70 by 70 cm under a maximum capacity of 240 g-tons. Using this centrifuge, we will conduct analogue scale modelling of extensional tectonic processes such as the opening of back-arc basins. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (grant number 2014R1A6A3A04056405).
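
    The reason a centrifuge permits such a large scale-down is the usual similarity argument for gravity-driven deformation (stated here as background, not taken from the abstract): self-weight stress scales as \sigma \sim \rho g L, so a model whose lengths are reduced by a factor N recovers prototype stresses when spun at N times Earth gravity,

        \sigma_{model} = \rho\,(N g)\,(L/N) = \rho\,g\,L = \sigma_{prototype},

    which is the trade-off between model size and attainable g-level that the 240 g-ton capacity quantifies.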

  4. A Proactive Complex Event Processing Method for Large-Scale Transportation Internet of Things

    OpenAIRE

    Wang, Yongheng; Cao, Kening

    2014-01-01

    The Internet of Things (IoT) provides a new way to improve the transportation system. The key issue is how to process the numerous events generated by IoT. In this paper, a proactive complex event processing method is proposed for large-scale transportation IoT. Based on a multilayered adaptive dynamic Bayesian model, a Bayesian network structure learning algorithm using search-and-score is proposed to support accurate predictive analytics. A parallel Markov decision processes model is design...

  5. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, and a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, by using any conventional optimization algorithm. In the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear chemical stirred tank reactor (CSTR), where its solution is also compared with the ones obtained using the centralized approach. The simulation results show the efficiency and the capability of the proposed hierarchical approach, in finding the optimal solution, over the centralized method.
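
    A minimal sketch of the two-level idea, with two toy quadratic sub-problems coupled through a shared resource and a coordinator that applies a gradient-type update driven by the coordination error. All numbers are illustrative; the paper's first-level problems are optimal control problems for a CSTR, not these one-liners.

        import numpy as np

        # Two toy first-level sub-problems: each picks u_i to minimize
        # (u_i - t_i)^2 + lam * u_i given the coordinator's price lam.
        targets = np.array([3.0, -1.0])

        def solve_subsystem(i, lam):
            # Closed form here: d/du [(u - t)^2 + lam * u] = 0  =>  u = t - lam / 2.
            return targets[i] - lam / 2.0

        # Second level: enforce the coupling constraint u_0 + u_1 = 1 by updating
        # the price with a gradient-type step on the coordination error.
        lam, step = 0.0, 0.5
        for _ in range(50):
            u = np.array([solve_subsystem(i, lam) for i in range(2)])
            lam += step * (u.sum() - 1.0)   # coordination-error update

        print("inputs:", u, "price:", lam, "residual:", u.sum() - 1.0)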

  6. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large-scale irradiation is discussed in terms of the share of processing cost in the final value of the improved product. Another factor taken into account is the saturation of the market with the new product. In successful projects, the share of irradiation cost is low and the demand for the better product is covered. The limited availability of sources makes it difficult to reach even modest market saturation with all food subjected to correct radiation treatment. Implementing food preservation requires a decided selection of those kinds of food which comply with all conditions, i.e., acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favors the possibility of using low-energy electron beams. The conditions for successful processing are best fulfilled in the group of dry foods, expensive spices in particular. (author)

  7. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large-scale irradiation is discussed in terms of the share of processing cost in the final value of the improved product. Another factor taken into account is the saturation of the market with the new product. In successful projects, the share of irradiation cost is low and the demand for the better product is covered. The limited availability of sources makes it difficult to reach even modest market saturation with all food subjected to correct radiation treatment. Implementing food preservation requires a decided selection of those kinds of food which comply with all conditions, i.e., acceptance by regulatory bodies, real improvement of quality, and economy. The last condition favors the possibility of using low-energy electron beams. The conditions for successful processing are best fulfilled in the group of dry foods, expensive spices in particular.

  8. 2005 dossier: clay. Tome: phenomenological evolution of the geologic disposal

    International Nuclear Information System (INIS)

    2005-01-01

    This document reviews the status of the research carried out by the French national agency for radioactive waste management (ANDRA) on the phenomenological processes taking place in an argilite-type geologic disposal facility for high-level and long-lived (HLLL) radioactive wastes. Content: 1 - introduction: goal, input data, time and space scales, long-time forecasting of the phenomenological evolution; 2 - the Meuse/Haute-Marne site, the HLLL wastes and the disposal concepts: impact of the repository architecture; 3 - initial state of the geologic environment prior to the building up of the repository: general framework, geologic formations, tectonics and fractures, surface environment, geologic synthesis; 4 - phenomenological processes: storage-related processes, geodynamics-related processes, time scales of processes and of radionuclides migration, independence and evolution similarities of the repository and of the geologic environment; 5 - heat loads: heat transfers between containers and geologic formations, spatial organization of the thermal load, for C-type wastes and spent fuels, for B-type wastes, synthesis of the repository thermal load; 6 - flows and liquid solution and gas transfers: hydraulic behaviour of surrounding Jurassic formations (Tithonian, Kimmeridgian, Callovian, Oxfordian); 7 - chemical phenomena: chemical evolution of ventilated facilities (alveoles, galleries, boreholes), chemical evolution of B-type waste alveoles and of gallery and borehole sealing after closure, far field chemical evolution of Callovo-Oxfordian argilites and of other surrounding formations; 8 - mechanical evolution of the disposal and of the surrounding geologic environment: creation of an initial excavated damaged zone (EDZ), mechanical evolution of ventilated galleries, alveoles and sealing before and after closure, large-scale mechanical evolution; 9 - geodynamical evolution of the Callovo-Oxfordian and other surrounding formations and of the surface environment: internal

  9. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Background: Large-scale protein structure alignment, an indispensable tool in structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings: We present ppsAlign, a parallel protein structure Alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign can take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions: ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues of protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs.

  10. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, and installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, a state of the art of the currently available electrolysis modules was made, and a review of the large-scale electrolysis plants installed around the world was also carried out. The main projects related to large-scale electrolysis were listed, and the economics of large-scale electrolysers were discussed. The influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)

  11. Operational experience with large scale biogas production at the Promest manure processing plant in Helmond, the Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands, a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig breeding farms. The Dutch government has set a three-way policy to reduce this excess of manure: 1. conversion of animal fodder into a product with fewer and better ingestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large-scale processing plants. The first large-scale plant for the processing of liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to prove at short notice whether large-scale manure processing might contribute to the solution of the manure surplus problem in The Netherlands. This steering committee is a partnership of the national and provincial governments and the agricultural industry. (au)

  12. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. They determine the quality of the products and thus, to a large extent, the performance and reliability of the manufactured products. The dielectric electroactive polymer (DEAP

  13. Proposals of geological sites for L/ILW and HLW repositories. Geological background. Text volume

    International Nuclear Information System (INIS)

    2008-01-01

    On April 2008, the Swiss Federal Council approved the conceptual part of the Sectoral Plan for Deep Geological Repositories. The Plan sets out the details of the site selection procedure for geological repositories for low- and intermediate-level waste (L/ILW) and high-level waste (HLW). It specifies that selection of geological siting regions and sites for repositories in Switzerland will be conducted in three stages, the first one (the subject of this report) being the definition of geological siting regions within which the repository projects will be elaborated in more detail in the later stages of the Sectoral Plan. The geoscientific background is based on the one hand on an evaluation of the geological investigations previously carried out by Nagra on deep geological disposal of HLW and L/ILW in Switzerland (investigation programmes in the crystalline basement and Opalinus Clay in Northern Switzerland, investigations of L/ILW sites in the Alps, research in rock laboratories in crystalline rock and clay); on the other hand, new geoscientific studies have also been carried out in connection with the site selection process. Formulation of the siting proposals is conducted in five steps: A) In a first step, the waste inventory is allocated to the L/ILW and HLW repositories; B) The second step involves defining the barrier and safety concepts for the two repositories. With a view to evaluating the geological siting possibilities, quantitative and qualitative guidelines and requirements on the geology are derived on the basis of these concepts. These relate to the time period to be considered, the space requirements for the repository, the properties of the host rock (depth, thickness, lateral extent, hydraulic conductivity), long-term stability, reliability of geological findings and engineering suitability; C) In the third step, the large-scale geological-tectonic situation is assessed and large-scale areas that remain under consideration are defined. For the L

  14. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

    The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities, at the expense of high spatial and temporal resolution of computing. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity for developing a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework for simulating the ductile-fracture process zone in microstructural detail. An experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been linked to multiscale microstructures in a realistic 3D numerical model in such a direct manner. (author)

  15. Ways forward in quantifying data uncertainty in geological databases

    Science.gov (United States)

    Kint, Lars; Chademenos, Vasileios; De Mol, Robin; Kapel, Michel; Lagring, Ruth; Stafleu, Jan; van Heteren, Sytze; Van Lancker, Vera

    2017-04-01

    Issues of compatibility of geological data resulting from the merging of many different data sources and time periods may jeopardize the harmonization of data products. Important progress has been made thanks to increasing data standardization, e.g., at a European scale through the SeaDataNet and Geo-Seas data management infrastructures. Common geological data standards are unambiguously defined, avoiding semantic overlap in geological data and associated metadata. Quality flagging is also applied increasingly, though ways of further propagating this information into data products are still in their infancy. For the Belgian and southern Netherlands part of the North Sea, databases are now rigorously re-analyzed with a view to quantifying quality flags in terms of uncertainty to be propagated through a 3D voxel model of the subsurface (https://odnature.naturalsciences.be/tiles/). An approach is worked out to consistently account for differences in positioning, sampling gear, analysis procedures and vintage. The flag scaling is used in the interpolation process of geological data, but will also be used when visualizing the suitability of geological resources in a decision support system. Expert knowledge is systematically revisited so as to avoid totally inappropriate use of the flag scaling process. The quality flagging is also important when communicating results to end-users. Therefore, an open data policy in combination with several processing tools will be at the heart of a new Belgian geological data portal, a platform for knowledge building (KB) and knowledge management (KM) serving marine geoscience, the policy community and the public at large.

  16. Scale determinants of fiscal investment in geological exploration: evidence from China.

    Science.gov (United States)

    Lu, Linna; Lei, Yalin

    2013-01-01

    With the continued growth in demand for mineral resources and China's efforts to increase investment in geological prospecting, fiscal investment in geological exploration has become a research hotspot. This paper examines the yearly relationship among fiscal investment in geological exploration in the current term, that in the last term, and the prices of mining rights over the period 1999-2009. Hines and Catephores' investment acceleration model is applied to describe the scale determinants of fiscal investment in geological exploration, which are the value-added of mining rights, the value of mining rights, and fiscal investment in the last term. The results indicate that when the value-added of mining rights, the value of mining rights, or fiscal investment in the last term moves by 1 unit, fiscal investment in the current term moves by 0.381, 1.094 or 0.907 units, respectively. In order to determine the scale of fiscal investment in geological exploration for the current year, the Chinese government should take into account fiscal investment in geological exploration for the last year and the capital stock of previous investments. In practice, combining government fiscal investment in geological exploration with its performance evaluation can create a virtuous circle of capital management.
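
    One plausible reading of the reported coefficients, written as a linear investment-acceleration equation (the functional form is our assumption; only the coefficients come from the abstract):

        I_t = 0.381\,\Delta V_t + 1.094\,V_t + 0.907\,I_{t-1} + \text{const},

    where I_t is fiscal investment in geological exploration in year t, \Delta V_t the value-added of mining rights, and V_t the value of mining rights.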

  17. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  18. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  19. Large-scale calculations of the beta-decay rates and r-process nucleosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Borzov, I N; Goriely, S [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Pearson, J M [Inst. d'Astronomie et d'Astrophysique, Univ. Libre de Bruxelles, Campus Plaine, Bruxelles (Belgium); Lab. de Physique Nucleaire, Univ. de Montreal, Montreal (Canada)]

    1998-06-01

    An approximation to a self-consistent model of the ground state and β-decay properties of neutron-rich nuclei is outlined. The structure of the β-strength functions in stable and short-lived nuclei is discussed. The results of large-scale calculations of the β-decay rates for spherical and slightly deformed nuclides of relevance to the r-process are analysed and compared with the results of existing global calculations and recent experimental data. (orig.)

  20. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied
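
    For reference, the control parameter varied across these simulations is the Rayleigh number, in its standard definition:

        Ra = \frac{g\,\alpha\,\Delta T\,H^3}{\nu\,\kappa},

    with g gravity, \alpha the thermal expansion coefficient, \Delta T the imposed temperature difference, H the layer depth, \nu the kinematic viscosity and \kappa the thermal diffusivity; the aspect ratio is the horizontal extent divided by H.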

  1. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Key points of operational activities in large-scale, geographically spread software development projects are discussed, and the required structure of QA processes in such projects is examined. Up-to-date methods of integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies (sequential, agile, and PRINCE2-based) are reviewed, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in large, geographically spread hybrid software development projects are reviewed, and recommendations are given for tackling those challenges. Conclusions are drawn about choosing the best methodology and applying it to a particular project.

  2. Hydrologic test plans for large-scale, multiple-well tests in support of site characterization at Hanford, Washington

    International Nuclear Information System (INIS)

    Rogers, P.M.; Stone, R.; Lu, A.H.

    1985-01-01

    The Basalt Waste Isolation Project is preparing plans for tests and has begun work on some tests that will provide the data necessary for the hydrogeologic characterization of a site located on a United States government reservation at Hanford, Washington. This site is being considered for the Nation's first geologic repository of high level nuclear waste. Hydrogeologic characterization of this site requires several lines of investigation which include: surface-based small-scale tests, testing performed at depth from an exploratory shaft, geochemistry investigations, regional studies, and site-specific investigations using large-scale, multiple-well hydraulic tests. The large-scale multiple-well tests are planned for several locations in and around the site. These tests are being designed to provide estimates of hydraulic parameter values of the geologic media, chemical properties of the groundwater, and hydrogeologic boundary conditions at a scale appropriate for evaluating repository performance with respect to potential radionuclide transport

  3. Constructing a large-scale 3D Geologic Model for Analysis of the Non-Proliferation Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Wagoner, J; Myers, S

    2008-04-09

    We have constructed a regional 3D geologic model of the southern Great Basin, in support of a seismic wave propagation investigation of the 1993 Nonproliferation Experiment (NPE) at the Nevada Test Site (NTS). The model is centered on the NPE and spans longitude -119.5° to -112.6° and latitude 34.5° to 39.8°; the depth ranges from the topographic surface to 150 km below sea level. The model includes the southern half of Nevada, as well as parts of eastern California, western Utah, and a portion of northwestern Arizona. The upper crust is constrained by both geologic and geophysical studies, while the lower crust and upper mantle are constrained by geophysical studies. The mapped upper crustal geologic units are Quaternary basin fill, Tertiary deposits, pre-Tertiary deposits, intrusive rocks of all ages, and calderas. The lower crust and upper mantle are parameterized with 5 layers, including the Moho. Detailed geologic data, including surface maps, borehole data, and geophysical surveys, were used to define the geology at the NTS. Digital geologic outcrop data were available for both Nevada and Arizona, whereas geologic maps for California and Utah were scanned and hand-digitized. Published gravity data (2 km spacing) were used to determine the thickness of the Cenozoic deposits and thus estimate the depth of the basins. The free surface is based on a 10 m lateral resolution DEM at the NTS and a 90 m lateral resolution DEM elsewhere. Variations in crustal thickness are based on receiver function analysis and a framework compilation of reflection/refraction studies. We used Earthvision (Dynamic Graphics, Inc.) to integrate the geologic and geophysical information into a model of x,y,z,p nodes, where p is a unique integer index value representing the geologic unit. For seismic studies, the geologic units are mapped to specific seismic velocities. The gross geophysical structure of the crust and upper mantle is taken from regional surface
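
    A minimal sketch of the final parameterization step described above, mapping each (x, y, z, p) node's integer unit index p to seismic velocities through a lookup table. The unit categories follow the abstract; the indices and velocity values are invented placeholders, not the study's values:

        import numpy as np

        # Hypothetical lookup: geologic-unit index p -> (Vp, Vs) in km/s.
        UNIT_VELOCITIES = {
            1: (2.0, 0.9),   # Quaternary basin fill (placeholder values)
            2: (3.5, 1.9),   # Tertiary deposits
            3: (5.5, 3.2),   # pre-Tertiary deposits
            4: (6.0, 3.5),   # intrusive rocks
        }

        # A tiny synthetic sample of (x, y, z, p) nodes.
        nodes = np.array([[0.0, 0.0, -1.0, 1],
                          [5.0, 0.0, -3.0, 2],
                          [5.0, 5.0, -9.0, 4]])

        p = nodes[:, 3].astype(int)
        vp = np.array([UNIT_VELOCITIES[i][0] for i in p])
        vs = np.array([UNIT_VELOCITIES[i][1] for i in p])
        print(np.column_stack([nodes[:, :3], vp, vs]))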

  4. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given; the results obtained so far in this domain are presented. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large scale channel, the data processing method is analog. - To become independent of the problems generated by analog processing of the fluctuation signal, a digital data processing method is tested and its validity demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr]

  5. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  6. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  7. Geologic context of large karst springs and caves in the Ozark National Scenic Riverways, Missouri

    Science.gov (United States)

    Weary, David J.; Orndorff, Randall C.

    2016-01-01

    The ONSR is a karst park, containing many springs and caves. The "jewels" of the park are large springs, several of first magnitude, that contribute significantly to the flow and water quality of the Current River and its tributaries. Completion of 1:24,000-scale geologic mapping of the park and surrounding river basin, along with synthesis of published hydrologic data, allows us to examine the spatial relationships between the springs and the geologic framework to develop a conceptual model for the genesis of these springs. Based on their similarity to mapped spring conduits, many of the caves in the ONSR are fossil conduit segments. Therefore, geologic control on the evolution of the springs also applies to speleogenesis in this part of the southern Missouri Ozarks. Large springs occur in the ONSR area because: (1) the Ozark aquifer, from which they rise, is chiefly dolomite affected by solution via various processes over a long time period; (2) Paleozoic hypogenic fluid migration through these rocks exploited and enhanced flow paths; (3) a consistent and low regional dip of the rocks off the Salem Plateau (less than 2° to the southeast) allows integration of flow into large groundwater basins with a few discrete outlets; (4) the springs are located where the rivers have cut down into structural highs, allowing access to water from stratigraphic units deeper in the aquifer, thus allowing development of springsheds with volumetrically larger storage than smaller springs higher in the section; and (5) quartz sandstone and bedded chert in the carbonate stratigraphic succession that are locally to regionally continuous serve as aquitards that locally confine groundwater up dip of the springs, creating artesian conditions. This subhorizontal partitioning of the Ozark aquifer allows contributing areas for different springs to overlap, as evidenced by dye traces that cross adjacent groundwater basin boundaries, and possibly contributes to alternate flow routes

  8. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process-scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large-scale operation. The process conditions identified are finally translated into microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. The geological processes time scale of the Ingozersky block TTG complex (Kola Peninsula)

    Science.gov (United States)

    Nitkina, Elena

    2013-04-01

    The Ingozersky block, located in the Tersky Terrane of the Kola Peninsula, is composed of Archean gneisses and granitoids [1; 5; 8]. The Archaean basement complexes on the regional geological maps have been called tonalite-trondhjemite-gneiss (TTG) complexes [6]. In previous studies [1; 3; 4; 5; 7] the following rock types were established within the Ingozersky block: biotite, biotite-amphibole and amphibole-biotite gneisses, granites, granodiorites and pegmatites [2]. In the rocks of the complex, the following sequence of endogenous processes is observed (based on [5]): stage 1 - formation of the biotite gneisses; stage 2 - intrusion of dikes of basic rocks; stage 3 - deformation and foliation; stage 4 - emplacement of granite bodies and migmatization; stage 5 - emplacement of large pegmatite bodies; stage 6 - formation of various thin pegmatite and granite veins, with and without garnet; stage 7 - quartz veins. Previous U-Pb isotopic dating was done for samples of biotite gneisses, amphibole-biotite gneisses and biotite-amphibole gneisses. The Sm-Nd TDM model ages are 3613 Ma for the biotite gneisses, 2596 Ma for the amphibole-biotite gneisses and 3493 Ma for the biotite-amphibole gneisses. U-Pb ages obtained for the metamorphic processes in the TTG complex are: 2697±9 Ma for the biotite gneiss, 2725±2 and 2667±7 Ma for the amphibole-biotite gneisses, and 2727±5 Ma for the biotite-amphibole gneisses. The age of about 3149±46 Ma, defined for the biotite gneisses by single zircon dating, corresponds to the time of formation of the gneiss protolith. The purpose of these studies is to establish the age of granite and pegmatite emplacement and to derive a time scale of geological processes in the Ingozersky block. Preliminary U-Pb isotopic dating of zircon and other accessory minerals yields 2615±8 Ma for granites, 2549±30 Ma for migmatites and 1644±7 Ma for veined granites. As a result of the isotope U-Pb dating of the different Ingozersky TTG...

  10. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  11. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  12. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  13. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  14. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    Science.gov (United States)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large-scale gravity data sets have been collected and employed to enhance gravity problem-solving abilities in tectonic studies in China. Aiming at large-scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors have added different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation time, instability and other shortcomings, 3-D physical property inversion has not yet been widely applied to large-scale data. In order to achieve efficient and precise 3-D interpretation of geological and ore bodies and obtain their subsurface distribution, there is an urgent need for a fast and efficient inversion method for large-scale gravity data. As a relatively new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding only a small amount of computer memory. The method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to a linear relationship, and then carry out a forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with the real data, and
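
    The iterative loop sketched in the abstract (image equivalent excess masses, convert them linearly to density contrasts, forward-model, compare with the observations) can be outlined as follows. The correlation and forward operators here are simplified dense-matrix placeholders, not the authors' GPU implementation.

```python
import numpy as np

def correlation_image(obs, kernel):
    """Correlate observed gravity with unit-cell responses (placeholder)."""
    return kernel.T @ obs / np.linalg.norm(kernel, axis=0) ** 2

def forward_gravity(density, kernel):
    """Forward-calculate gravity from cell density contrasts."""
    return kernel @ density

def adaptive_correlation_imaging(obs, kernel, scale=0.1, n_iter=100, tol=1e-6):
    """Sketch of the adaptive loop described in the abstract: image
    equivalent excess masses, convert them linearly to density contrasts,
    forward-model, and compare with the observations. All operators are
    simplified placeholders, not the authors' GPU implementation."""
    residual = obs.copy()
    density = np.zeros(kernel.shape[1])
    for _ in range(n_iter):
        masses = correlation_image(residual, kernel)  # equivalent masses
        density += scale * masses                     # linear conversion
        residual = obs - forward_gravity(density, kernel)
        if np.linalg.norm(residual) / np.linalg.norm(obs) < tol:
            break
    return density, residual

# Tiny demo with a random sensitivity kernel (n_obs x n_cells):
rng = np.random.default_rng(0)
K = rng.random((50, 200))
true = np.zeros(200); true[90:110] = 0.3          # hypothetical dense body
d_obs = K @ true
model, res = adaptive_correlation_imaging(d_obs, K)
print(np.linalg.norm(res) / np.linalg.norm(d_obs))  # relative data misfit
```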

  15. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed up the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise in processing SAR datasets into derived time-series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also
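
    As one hedged example of the interferogram quality screening question raised above, a scene-level screen might combine mean coherence with the fraction of valid pixels; the thresholds below are illustrative, not ARIA's production criteria.

```python
import numpy as np

def screen_interferogram(coherence, min_mean=0.35, min_valid_frac=0.5):
    """Toy quality screen for an unwrapped interferogram: flag scenes whose
    coherence is low or mostly invalid. Thresholds are illustrative, not
    ARIA's production criteria."""
    valid = np.isfinite(coherence)
    frac_valid = valid.mean()
    mean_coh = coherence[valid].mean() if valid.any() else 0.0
    return mean_coh >= min_mean and frac_valid >= min_valid_frac

coh = np.random.rand(512, 512)   # synthetic coherence map in [0, 1)
print(screen_interferogram(coh))
```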

  16. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  17. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using the large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  18. Lunar and Planetary Geology

    Science.gov (United States)

    Basilevsky, Alexander T.

    2018-05-01

    Lunar and planetary geology can be described using examples such as the geology of Earth (as the reference case) and the geologies of the Earth's satellite the Moon; the planets Mercury, Mars and Venus; the satellite of Saturn Enceladus; the small stony asteroid Eros; and the nucleus of the comet 67P Churyumov-Gerasimenko. Each body considered is illustrated by its global view, with information given as to its position in the solar system, size, surface, environment including gravity acceleration and properties of its atmosphere if present, typical landforms and the processes forming them, materials composing these landforms, information on the internal structure of the body, stages of its geologic evolution in the form of a stratigraphic scale, and estimates of the absolute ages of the stratigraphic units. Information about one body may be applied to another body, and this, in particular, has led to the discovery of the heavy "meteoritic" bombardment in the early history of the solar system, which should also have significantly affected Earth. It has been shown that volcanism and large-scale tectonics may be driven not only by an internal source of energy, in the form of radiogenic decay of potassium, uranium and thorium, but also by an external source, in the form of gravitational tugging caused by the attraction of neighboring bodies. The knowledge gained by lunar and planetary geology is important for planning and managing space missions and for the practical exploration of other bodies of the solar system and establishing manned outposts on them.

  19. A Group Simulation of the Development of the Geologic Time Scale.

    Science.gov (United States)

    Bennington, J. Bret

    2000-01-01

    Explains how to demonstrate to students that the relative dating of rock layers is redundant. Uses two column diagrams to simulate stratigraphic sequences from two different geological time scales and asks students to complete the time scale. (YDS)

  20. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus the problem of fault detection and isolation for these processes is more concerned with the root cause and fault propagation before applying quantitative methods in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are reviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed at the end.
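
    Of the data-driven methods listed, lagged cross-correlation is the simplest to sketch: the sign of the lag that maximizes the cross-correlation between two process variables suggests which one leads. The sketch below is a bare-bones illustration without the significance testing a real diagnosis would need.

```python
import numpy as np

def lagged_xcorr_direction(x, y, max_lag=50):
    """Estimate directionality between two process signals from the lag
    of maximum cross-correlation (one of the data-driven methods surveyed
    in the paper). A positive best lag means x leads y. Minimal sketch;
    a real diagnosis would add significance testing."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = range(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag >= 0:
            c = np.corrcoef(x[:len(x) - lag], y[lag:])[0, 1]
        else:
            c = np.corrcoef(x[-lag:], y[:lag])[0, 1]
        corrs.append(c)
    best = int(np.argmax(np.abs(corrs)))
    return list(lags)[best], corrs[best]

t = np.arange(2000)
x = np.sin(0.02 * t) + 0.1 * np.random.randn(t.size)
y = np.roll(x, 7) + 0.1 * np.random.randn(t.size)  # y lags x by 7 samples
print(lagged_xcorr_direction(x, y))                # best lag should be ~+7
```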

  1. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  2. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the "Large-scale Structure of the Universe" symposium are considered at a popular level. Described are the cell structure of galaxy distribution in the Universe and the principles of mathematical modelling of galaxy distribution. Images of cell structures obtained after computer processing are given. Three hypotheses are discussed - vortical, entropic and adiabatic - suggesting different processes for the origin of galaxies and galaxy clusters. A considerable advantage of the adiabatic hypothesis is recognized. Relict radiation is considered as a means of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the turbulence properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and of the interactions within galaxy clusters and with the inter-galaxy medium, is recognized as a notable contribution to the development of theoretical and observational cosmology

  3. Native American Students' Understanding of Geologic Time Scale: 4th-8th Grade Ojibwe Students' Understanding of Earth's Geologic History

    Science.gov (United States)

    Nam, Younkyeong; Karahan, Engin; Roehrig, Gillian

    2016-01-01

    Geologic time scale is a very important concept for understanding long-term earth system events such as climate change. This study examines forty-three 4th-8th grade Native American--particularly Ojibwe tribe--students' understanding of relative ordering and absolute time of Earth's significant geological and biological events. This study also…

  4. Geological data integration techniques

    International Nuclear Information System (INIS)

    1988-09-01

    The objectives of this Technical Committee are to bring together current knowledge on geological data handling and analysis technologies, as developed in the mineral and petroleum industries for geological, geophysical, geochemical and remote sensing data, that can be applied to uranium exploration and resource appraisal. The recommendation for work on this topic was first made at the meeting of the NEA-IAEA Joint Group of Experts on R and D in Uranium Exploration Techniques (Paris, May 1984). In their report, processing of integrated data sets was considered to be extremely important in view of the very extensive data sets built up in recent years by large uranium reconnaissance programmes. With the development of large, multidisciplinary data sets which include geochemical, geophysical, geological and remote sensing data, the ability of the geologist to easily interpret large volumes of information has been largely the result of developments in the field of computer science in the past decade. Advances in data management systems, image processing software, the size and speed of computer systems and significantly reduced processing costs have made large data set integration and analysis practical and affordable. The combined signatures which can be obtained from the different types of data significantly enhance the geologist's ability to interpret fundamental geological properties, thereby improving the chances of finding a significant ore body. This volume is the product of one of a number of activities related to uranium geology and exploration during the past few years with the intent of bringing new technologies and exploration techniques to the IAEA Member States

  5. Some Examples of Residence-Time Distribution Studies in Large-Scale Chemical Processes by Using Radiotracer Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R. M.; Johnson, P.; Whiston, J. [Imperial Chemical Industries Ltd., Billingham, Co., Durham (United Kingdom)

    1967-06-15

    The application of radiotracers to determine flow patterns in chemical processes is discussed with particular reference to the derivation of design data from model reactors for translation to large-scale units, the study of operating efficiency and design attainment in established plant, and the rapid identification of various types of process malfunction. The requirements governing the selection of tracers for various types of media are considered and an example is given of the testing of the behaviour of a typical tracer before use in a particular large-scale process operating at 250 atm and 200°C. Information which may be derived from flow patterns is discussed, including the determination of mixing parameters, gas hold-up in gas/liquid reactions and the detection of channelling and stagnant regions. Practical results and their interpretation are given in relation to an olefin hydroformylation reaction system, a process for the conversion of propylene to isopropanol, a moving bed catalyst system for the isomerization of xylenes and a three-stage gas-liquid reaction system. The use of mean residence-time data for the detection of leakage between reaction vessels and a heat interchanger system is given as an example of the identification of process malfunction. (author)

  6. The geological map of Canelones Department, scale 1:100.000

    International Nuclear Information System (INIS)

    Spoturno, J.; Oyhantcabal, P.; Goso, C.; Aubet, N.; Cazaux, S.; Huelmo, S.; Morales, E.; Loureiro, J.

    2004-01-01

    The geological map of Canelones Department (Uruguay), scale 1:100.000, is presented. This map shows the distribution of the Proterozoic, Mesozoic and Cenozoic lithological units. A stratigraphic division of this region is included

  7. The geological map of Montevideo Department scale 1:50.000

    International Nuclear Information System (INIS)

    Spoturno, J.; Oyhantcabal, P.; Goso, C.; Aubet, N.; Cazaux, S.; Huelmo, S.; Morales, E.; Loureiro, J.

    2004-01-01

    The geological map of Montevideo Department (Uruguay), scale 1:50.000, is presented. This map shows the distribution of the Proterozoic, Mesozoic and Cenozoic lithological units. A stratigraphic division of this region is included

  8. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts that play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before a stable state is reached in which Heaps' law still holds while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
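
    Both scalings have simple empirical forms (Zipf: frequency ∝ rank^(−α); Heaps: number of distinct items ∝ t^β), so their exponents can be estimated from an event sequence by log-log regression, as in the sketch below. This is a plain least-squares illustration, not the authors' analysis pipeline.

```python
import numpy as np
from collections import Counter

def zipf_heaps_exponents(events):
    """Estimate the Zipf exponent (log-log slope of frequency vs. rank)
    and the Heaps exponent (log-log slope of distinct count vs. time)
    from a sequence of discrete events, e.g. locations hit by an epidemic.
    Plain least-squares illustration, not the authors' analysis."""
    # Zipf: sorted frequencies against rank.
    freqs = np.array(sorted(Counter(events).values(), reverse=True), float)
    ranks = np.arange(1, len(freqs) + 1)
    alpha = -np.polyfit(np.log(ranks), np.log(freqs), 1)[0]
    # Heaps: number of distinct items seen after t events.
    seen, distinct = set(), []
    for e in events:
        seen.add(e)
        distinct.append(len(seen))
    t = np.arange(1, len(events) + 1)
    beta = np.polyfit(np.log(t), np.log(distinct), 1)[0]
    return alpha, beta

events = np.random.zipf(2.0, 5000)   # synthetic heavy-tailed sequence
print(zipf_heaps_exponents(events))
```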

  9. Modeling of a Large-Scale High Temperature Regenerative Sulfur Removal Process

    DEFF Research Database (Denmark)

    Konttinen, Jukka T.; Johnsson, Jan Erik

    1999-01-01

Regenerable mixed metal oxide sorbents are prime candidates for the removal of hydrogen sulfide from hot gasifier gas in the simplified integrated gasification combined cycle (IGCC) process. As part of the regenerative sulfur removal process development, reactor models are needed for scale-up. Steady-state kinetic reactor models are needed for reactor sizing, and dynamic models can be used for process control design and operator training. The regenerative sulfur removal process studied in this paper consists of two side-by-side fluidized bed reactors operating at temperatures of 400... The reactor model does not account for bed hydrodynamics. The pilot-scale test run results, obtained in test runs of the sulfur removal process with real coal gasifier gas, have been used for parameter estimation. The validity of the reactor model for commercial-scale design applications is discussed.

  10. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

One idea driving this development is that large-scale foodservice, such as hospital food service, should adopt a buy-organic policy due to its large buying volume. But whereas implementation of organic foods has developed quite unproblematically in smaller institutions such as kindergartens and nurseries, the introduction of organic foods into large-scale foodservice, such as that taking place in hospitals and larger homes for the elderly, has proven to be quite difficult. The very complex planning, procurement and processing procedures used in such facilities are among the reasons for this. Against this background an evaluation...

  11. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  12. Large-scale groundwater modeling using global datasets: a test case for the Rhine-Meuse basin

    Directory of Open Access Journals (Sweden)

    E. H. Sutanudjaja

    2011-09-01

    Full Text Available The current generation of large-scale hydrological models does not include a groundwater flow component. Large-scale groundwater models, involving aquifers and basins of multiple countries, are still rare, mainly due to a lack of hydro-geological data, which are usually only available in developed countries. In this study, we propose a novel approach to construct large-scale groundwater models by using global datasets that are readily available. As the test-bed, we use the combined Rhine-Meuse basin, which contains groundwater head data used to verify the model output. We start by building a distributed land surface model (30 arc-second resolution to estimate groundwater recharge and river discharge. Subsequently, a MODFLOW transient groundwater model is built and forced by the recharge and surface water levels calculated by the land surface model. Results are promising despite the fact that we still use an offline procedure to couple the land surface and MODFLOW groundwater models (i.e. the simulations of both models are separately performed. The simulated river discharges compare well to the observations. Moreover, based on our sensitivity analysis, in which we run several groundwater model scenarios with various hydro-geological parameter settings, we observe that the model can reproduce the observed groundwater head time series reasonably well. However, we note that there are still some limitations in the current approach, specifically because the offline-coupling technique simplifies the dynamic feedbacks between surface water levels and groundwater heads, and between soil moisture states and groundwater heads. Also the current sensitivity analysis ignores the uncertainty of the land surface model output. Despite these limitations, we argue that the results of the current model show promise for large-scale groundwater modeling practices, including for data-poor environments and at the global scale.
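
    The offline (one-way) coupling procedure described can be caricatured as follows: a land surface step produces recharge and river stage, which then force a groundwater store, with no feedback to soil moisture. Everything here is a toy stand-in, not the PCR-GLOBWB or MODFLOW interfaces.

```python
import random

def run_offline_coupling(daily_rain, s_yield=0.15, k_drain=0.05):
    """One-way (offline) coupling as described in the abstract: a 'land
    surface model' supplies recharge and river stage, which then force a
    'groundwater model', with no feedback to soil moisture. Both models
    are toy stand-ins, not PCR-GLOBWB or MODFLOW."""
    head, heads = 10.0, []
    for rain in daily_rain:
        # Land surface step: crude recharge (mm/d) and river stage (m).
        recharge = max(0.0, 0.3 * rain - 0.5)
        stage = 8.0 + 0.01 * rain
        # Groundwater step: linear reservoir forced by recharge and stage.
        head += recharge / 1000.0 / s_yield      # recharge raises the head
        head -= k_drain * (head - stage)         # drainage toward the river
        heads.append(head)
        # No feedback: soil moisture never sees the updated head.
    return heads

heads = run_offline_coupling([random.uniform(0.0, 20.0) for _ in range(120)])
print(f"final groundwater head: {heads[-1]:.2f} m")
```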

  13. Major geological events and uranium metallogenesis in South-west China

    International Nuclear Information System (INIS)

    Zhang Chengjiang; Xu Zhengqi; Ni Shijun; Chen Youliang

    2012-01-01

    Uranium is widely distributed in South-west China, with all deposit types represented, though not on a large scale. South-west China is located at the junction of several large tectonic elements, and each tectonic movement has had different effects on different regions. To clarify the correlation between the major geological events in South-west China and uranium metallogenesis, comprehensive research and field investigation were carried out in addition to collecting a large body of material. Through analysis of the major geological events in South-west China, the evolution of those events is basically clarified and the events closely related to uranium mineralization are determined. It is found that there are several ore-forming geologic events in the geological history of South-west China; almost every major tectonic movement cycle is accompanied by uranium metallogenesis, from the Jinning Movement to the Chengjiang, Hercynian, Indosinian, Yanshan and Himalayan movements. Although every major tectonic cycle is accompanied by uranium mineralization, three major geological events are most obviously related to uranium metallogenesis: the Rodinian supercontinent breakup event in the Jinning-Chengjiang Period, the Yanshan Movement and the Himalayan Movement. The first of these represents the process of uranium pre-enrichment and provides the source of uranium, while the Yanshan and Himalayan movements are the important mineralization processes, mainly hydrothermal superimposed mineralization. (authors)

  14. How semantics can inform the geological mapping process and support intelligent queries

    Science.gov (United States)

    Lombardo, Vincenzo; Piana, Fabrizio; Mimmo, Dario

    2017-04-01

    The geologic mapping process requires the organization of data according to general knowledge about the objects, namely the geologic units, and to the objectives of a graphic representation of such objects in a map, following an established model of geotectonic evolution. Semantics can greatly help such a process in two respects: on the one hand, the provision of a terminological base to name and classify the objects of the map; on the other, the implementation of a machine-readable encoding of the geologic knowledge base, which supports the application of reasoning mechanisms and the derivation of novel properties and relations among the objects of the map. The OntoGeonous initiative has built a terminological base of geological knowledge in a machine-readable format, following the Semantic Web tenets and the Linked Data paradigm. The major knowledge sources of the OntoGeonous initiative are the GeoScience Markup Language schemata and vocabularies (through its latest version, GeoSciML 4, 2015, published by the IUGS CGI Commission) and the INSPIRE "Data Specification on Geology" directives (an operative simplification of GeoSciML, published by the INSPIRE Thematic Working Group Geology of the European Commission). The Linked Data paradigm has been exploited by linking (without replicating, to avoid inconsistencies) already existing machine-readable encodings for some specific domains, such as the lithology domain (vocabulary Simple Lithology) and the geochronologic time scale (ontology "gts"). Finally, for the upper-level knowledge shared across several geologic domains, we have resorted to the NASA SWEET ontology. The OntoGeonous initiative has also produced a wiki that explains how the geologic knowledge has been encoded from shared geoscience vocabularies (https://www.di.unito.it/wikigeo/). In particular, the sections dedicated to axiomatization will support the construction of an appropriate database schema that can then be filled with the objects of the map. This contribution will discuss...
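
    A minimal Linked Data sketch in this spirit, using the Python rdflib package, types a map object as a geologic unit and links it to a shared lithology vocabulary rather than copying the term. The GEO ontology namespace is a placeholder, and the CGI vocabulary URI should be checked against the published source before reuse.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Minimal Linked Data sketch in the spirit of OntoGeonous: a map object is
# typed as a geologic unit and linked to a shared lithology vocabulary
# instead of copying the term. GEO is a placeholder namespace; the CGI
# vocabulary URI should be verified against the published source.
EX = Namespace("http://example.org/map/")
GEO = Namespace("http://example.org/ontogeonous/")
CGI = Namespace("http://resource.geosciml.org/classifier/cgi/lithology/")

g = Graph()
unit = EX["unit_042"]
g.add((unit, RDF.type, GEO["GeologicUnit"]))
g.add((unit, RDFS.label, Literal("granite of map unit 42", lang="en")))
g.add((unit, GEO["hasLithology"], CGI["granite"]))  # link, don't replicate
print(g.serialize(format="turtle"))
```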

  15. Forest landscape models, a tool for understanding the effect of the large-scale and long-term landscape processes

    Science.gov (United States)

    Hong S. He; Robert E. Keane; Louis R. Iverson

    2008-01-01

    Forest landscape models have become important tools for understanding large-scale and long-term landscape (spatial) processes such as climate change, fire, windthrow, seed dispersal, insect outbreak, disease propagation, forest harvest, and fuel treatment, because controlled field experiments designed to study the effects of these processes are often not possible (...

  16. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    Energy Technology Data Exchange (ETDEWEB)

    Crater, Jason [Gemomatica, Inc., San Diego, CA (United States); Galleher, Connor [Gemomatica, Inc., San Diego, CA (United States); Lievense, Jeff [Gemomatica, Inc., San Diego, CA (United States)

    2017-05-12

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred-tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
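
    A toy version of the recommended compartment-model approach is sketched below: the bubble column is split into well-mixed compartments in series, each with gas-liquid oxygen transfer (kLa) and a black-box Monod-type uptake standing in for the production microbe. All parameter values are illustrative, not Genomatica's or NREL's.

```python
import numpy as np

def bubble_column_compartments(n=10, klas=None, hours=2.0, dt=1e-3):
    """Toy compartment model of an aerobic bubble column: n well-mixed
    compartments in series, each with gas-liquid O2 transfer (kLa) and a
    black-box Monod-type uptake term standing in for the microbe model.
    Parameter values are illustrative, not Genomatica's or NREL's."""
    klas = np.full(n, 120.0) if klas is None else klas   # 1/h, per compartment
    c_sat, q_max, k_s = 7.5e-3, 0.35, 1e-4   # g/L, g O2/L/h, g/L
    c = np.zeros(n)                          # dissolved O2 per compartment
    liquid_exchange = 50.0                   # 1/h, circulation between cells
    for _ in range(int(hours / dt)):
        otr = klas * (c_sat - c)                       # oxygen transfer rate
        our = q_max * c / (k_s + c)                    # black-box uptake
        mix = liquid_exchange * (np.roll(c, 1) - c)    # crude circulation
        c += dt * (otr - our + mix)
    return c

print(bubble_column_compartments())   # dissolved O2 profile along the column
```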

  17. Curbing variations in packaging process through Six Sigma way in a large-scale food-processing industry

    Science.gov (United States)

    Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar

    2015-03-01

    Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of the Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variations in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit by application of the Six Sigma methodology.
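
    In the Measure and Analyze phases of DMAIC, pouch-weight variation of this kind is commonly quantified with process capability indices; the sketch below computes Cp and Cpk from sample weights against hypothetical specification limits (the data and limits are not the plant's actual figures).

```python
from statistics import mean, stdev

def process_capability(weights, lsl, usl):
    """Cp and Cpk for a filling process, as used in the Measure/Analyze
    phases of DMAIC. Spec limits and sample data below are hypothetical,
    not the plant's actual figures."""
    mu, sigma = mean(weights), stdev(weights)
    cp = (usl - lsl) / (6 * sigma)                # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # actual, penalizes off-center
    return cp, cpk

pouches = [500.4, 499.1, 501.2, 498.7, 500.9, 499.8, 500.3, 501.0]
cp, cpk = process_capability(pouches, lsl=495.0, usl=505.0)
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")
```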

  18. Displays for promotion of public understanding of geological repository concept and the spatial scale

    International Nuclear Information System (INIS)

    Shobu, Nobuhiro; Kashiwazaki, Hiroshi

    2003-05-01

    Japan Nuclear Cycle Development Institute (JNC) receives a few thousand short-term visitors to the Geological Isolation Basic Research Facility of Tokai Works every year. From the viewpoint of promoting visitors' understanding and smooth communication between researchers and visitors, the explanation of technical information on geological disposal should be carried out by more easily understandable methods, in addition to the conventional tour of the engineering-scale test facility (ENTRY). This paper reports the background information and the appearance of displays installed at ENTRY to promote public understanding of the geological repository concept and its spatial scale. They have been used in practice as one of the explanation tools to support visitors' understanding. (author)

  19. Uranium ore deposits: geology and processing implications

    International Nuclear Information System (INIS)

    Belyk, C.L.

    2010-01-01

    There are fifteen accepted types of uranium ore deposits and at least forty subtypes readily identified around the world. Each deposit type has a unique set of geological characteristics which may also result in unique processing implications. Primary uranium production in the past decade has predominantly come from only a few of these deposit types including: unconformity, sandstone, calcrete, intrusive, breccia complex and volcanic ones. Processing implications can vary widely between and within the different geological models. Some key characteristics of uranium deposits that may have processing implications include: ore grade, uranium and gangue mineralogy, ore hardness, porosity, uranium mineral morphology and carbon content. Processing difficulties may occur as a result of one or more of these characteristics. In order to meet future uranium demand, it is imperative that innovative processing approaches and new technological advances be developed in order that many of the marginally economic traditional and uneconomic non-traditional uranium ore deposits can be exploited. (author)

  20. From cellulose to kerogen: molecular simulation of a geological process.

    Science.gov (United States)

    Atmani, Lea; Bichara, Christophe; Pellenq, Roland J-M; Van Damme, Henri; van Duin, Adri C T; Raza, Zamaan; Truflandier, Lionel A; Obliger, Amaël; Kralert, Paul G; Ulm, Franz J; Leyssale, Jean-Marc

    2017-12-01

    The process by which organic matter decomposes deep underground to form petroleum and its underlying kerogen matrix has so far remained a no man's land to theoreticians, largely because of the geological (million-year) timescale associated with the process. Using reactive molecular dynamics and an accelerated simulation framework, the replica exchange molecular dynamics method, we simulate the full transformation of cellulose into kerogen and its associated fluid phase under prevailing geological conditions. We observe in sequence the fragmentation of the cellulose crystal and production of water, the development of an unsaturated aliphatic macromolecular phase and its aromatization. The composition of the solid residue along the maturation pathway strictly follows what is observed for natural type III kerogen and for artificially matured samples under confined conditions. After expulsion of the fluid phase, the obtained microporous kerogen possesses the structure, texture, density, porosity and stiffness observed for mature type III kerogen and a microporous carbon obtained by saccharose pyrolysis at low temperature. As expected for this variety of precursor, the main resulting hydrocarbon is methane. The present work thus demonstrates that molecular simulations can now be used to assess, almost quantitatively, such complex chemical processes as petrogenesis in fossil reservoirs and, more generally, the possible conversion of any natural product into bio-sourced materials and/or fuel.
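
    The accelerating method named, replica exchange molecular dynamics, periodically attempts to swap configurations between replicas run at neighbouring temperatures using a Metropolis criterion. The sketch below implements that standard textbook acceptance rule; it is not the authors' simulation code.

```python
import math
import random

K_B = 0.0019872041  # Boltzmann constant, kcal/(mol*K)

def remd_swap_accepted(e_i, e_j, t_i, t_j):
    """Metropolis acceptance rule for exchanging neighbouring replicas in
    replica exchange MD (the acceleration used in the paper, in its
    standard textbook form): p = min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)])."""
    delta = (1.0 / (K_B * t_i) - 1.0 / (K_B * t_j)) * (e_i - e_j)
    return delta >= 0 or random.random() < math.exp(delta)

# Hypothetical potential energies (kcal/mol) of replicas at 300 K and 340 K:
print(remd_swap_accepted(-1052.0, -1047.5, 300.0, 340.0))
```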

  1. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radioactivity and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study has proposed a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology covers only part of the overall design procedure and still has various weaknesses. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operating restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system

  2. Quantifying geological processes on Mars - Results of the high resolution stereo camera (HRSC) on Mars express

    NARCIS (Netherlands)

    Jaumann, R.; Tirsch, D.; Hauber, E.; Ansan, V.; Di Achille, G.; Erkeling, G.; Fueten, F.; Head, J.; Kleinhans, M. G.; Mangold, N.; Michael, G. G.; Neukum, G.; Pacifici, A.; Platz, T.; Pondrelli, M.; Raack, J.; Reiss, D.; Williams, D. A.; Adeli, S.; Baratoux, D.; De Villiers, G.; Foing, B.; Gupta, S.; Gwinner, K.; Hiesinger, H.; Hoffmann, H.; Deit, L. Le; Marinangeli, L.; Matz, K. D.; Mertens, V.; Muller, J. P.; Pasckert, J. H.; Roatsch, T.; Rossi, A. P.; Scholten, F.; Sowe, M.; Voigt, J.; Warner, N.

    2015-01-01

    This review summarizes the use of High Resolution Stereo Camera (HRSC) data as an instrumental tool and its application in the analysis of geological processes and landforms on Mars during the last 10 years of operation. High-resolution digital elevation models on a local to regional scale

  3. Geologic Map of the Thaumasia Region, Mars

    Science.gov (United States)

    Dohm, Janes M.; Tanaka, Kenneth L.; Hare, Trent M.

    2001-01-01

    The geology of the Thaumasia region (fig. 1, sheet 3) includes a wide array of rock materials, depositional and erosional landforms, and tectonic structures. The region is dominated by the Thaumasia plateau, which includes central high lava plains ringed by highly deformed highlands; the plateau may comprise the ancestral center of Tharsis tectonism (Frey, 1979; Plescia and Saunders, 1982). The extensive structural deformation of the map region, which is without parallel on Mars in both complexity and diversity, occurred largely throughout the Noachian and Hesperian periods (Tanaka and Davis, 1988; Scott and Dohm, 1990a). The deformation produced small and large extensional and contractional structures (fig. 2, sheet 3) that resulted from stresses related to the formation of Tharsis (Frey, 1979; Wise and others, 1979; Plescia and Saunders, 1982; Banerdt and others, 1982, 1992; Watters and Maxwell, 1986; Tanaka and Davis, 1988; Francis, 1988; Watters, 1993; Schultz and Tanaka, 1994), from magmatic-driven uplifts, such as at Syria Planum (Tanaka and Davis, 1988; Dohm and others, 1998; Dohm and Tanaka, 1999) and central Valles Marineris (Dohm and others, 1998, Dohm and Tanaka, 1999), and from the Argyre impact (Wilhelms, 1973; Scott and Tanaka, 1986). In addition, volcanic, eolian, and fluvial processes have highly modified older surfaces in the map region. Local volcanic and tectonic activity often accompanied episodes of valley formation. Our mapping depicts and describes the diverse terrains and complex geologic history of this unique ancient tectonic region of Mars. The geologic (sheet 1), paleotectonic (sheet 2), and paleoerosional (sheet 3) maps of the Thaumasia region were compiled on a Viking 1:5,000,000-scale digital photomosaic base. The base is a combination of four quadrangles: the southeast part of Phoenicis Lacus (MC–17), most of the southern half of Coprates (MC–18), a large part of Thaumasia (MC–25), and the northwest margin of Argyre (MC–26

  4. Research on the evolution model and deformation mechanisms of Baishuihe landslide based on analyzing geologic process of slope

    Science.gov (United States)

    Zhang, S.; Tang, H.; Cai, Y.; Tan, Q.

    2016-12-01

    A landslide is a result of both internal and external geologic agents, and the internal ones always have a significant influence on the susceptibility of geologic bodies to the external ones. However, current research focuses more on the impacts of external factors, such as precipitation and reservoir water, than on those of geologic processes. Baishuihe landslide, located on the south bank of the Yangtze River 56 km upstream from the Three Gorges Project, was taken as the study subject, with in-situ investigation and exploration carried out as the first step. After spatial analysis using the 3D model of topography built in ArcGIS (Fig.1), the geologic characteristics of the slope lying within a certain range near the Baishuihe landslide on the same bank were investigated for further insight into the geologic processes of the slope, with the help of the geological map and the structure outline map. Baishuihe landslide developed on the north limb of the Baifuping anticline, a dip slope on the southwest margin of the Zigui basin. The eastern and western boundaries are both ridges, and in the middle a distinct slide depression is in the process of deforming. The evolutionary process of Baishuihe landslide includes the three steps below. 1) The emergence of the Baifuping anticline led to interbedded dislocation, tension cracks and joint fractures in the bedrock. 2) Weathering continuously weakened the strength of soft interlayers in the Shazhenxi Formation (T3s). 3) A rock slide caused by neotectonics happened on a large scale along the weak layers and joint planes, forming the initial Baishuihe landslide. Although the landslide has undergone reconstruction for a long time, it can still be divided clearly into two parts, namely a) the rock landslide at the back half (south) and b) the debris landslide at the front half (north). a) The deformation mechanism for the rock landslide is believed to be the deterioration in strength of weak bedding planes due to precipitation, together with the free face caused by human activities or river incision. b

  5. 3D Geological Model for "LUSI" - a Deep Geothermal System

    Science.gov (United States)

    Sohrabi, Reza; Jansen, Gunnar; Mazzini, Adriano; Galvan, Boris; Miller, Stephen A.

    2016-04-01

    Geothermal applications require the correct simulation of flow and heat transport processes in porous media, and many of these media, like deep volcanic hydrothermal systems, host a certain degree of fracturing. This work aims to understand the heat and fluid transport within a newborn sedimentary-hosted geothermal system, termed Lusi, that began erupting in 2006 in East Java, Indonesia. Our goal is to develop conceptual and numerical models capable of simulating multiphase flow within large-scale fractured reservoirs such as the Lusi region, with fractures of arbitrary size, orientation and shape. Additionally, these models can also address a number of other applications, including Enhanced Geothermal Systems (EGS), CO2 sequestration (Carbon Capture and Storage, CCS), and nuclear waste isolation. Fractured systems are ubiquitous, with a wide range of lengths and scales, making it difficult to develop a general model that can easily handle this complexity. We are developing a flexible continuum approach with an efficient, accurate numerical simulator based on an appropriate 3D geological model representing the structure of the deep geothermal reservoir. Using previous studies, borehole information and seismic data obtained in the framework of the Lusi Lab project (ERC grant n°308126), we present here the first 3D geological model of Lusi. This model is calculated using implicit 3D potential-field or multi-potential-field methods, depending on the geological context and complexity. The method is based on a geological pile containing the geological history of the area and the relationships between geological bodies, allowing automatic computation of intersections and volume reconstruction. Based on the 3D geological model, we developed a new mesh algorithm to create hexahedral octree meshes that transfer the structural geological information into 3D numerical simulations to quantify Thermal-Hydraulic-Mechanical-Chemical (THMC) physical processes.
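
    The meshing step mentioned can be illustrated with a minimal recursive octree refinement: a cubic cell is split into eight children wherever the sampled geological model is not uniform, yielding small hexahedra along unit boundaries. The sampling rule and stopping test below are simplifications for illustration.

```python
import numpy as np

def build_octree(origin, size, geology, depth=0, max_depth=5):
    """Recursive octree refinement sketch in the spirit of the meshing step
    described: a cubic cell is split into 8 children wherever the sampled
    geological model is not uniform. `geology` maps a point to a unit id;
    the corner-sampling rule and stopping test are simplifications."""
    corners = [origin + size * np.array([i, j, k])
               for i in (0, 1) for j in (0, 1) for k in (0, 1)]
    units = {geology(p) for p in corners}
    if len(units) == 1 or depth == max_depth:
        # Leaf hexahedron (mixed cells at max depth keep an arbitrary unit).
        return [(origin, size, units.pop())]
    half = size / 2.0
    cells = []
    for i in (0, 1):
        for j in (0, 1):
            for k in (0, 1):
                child = origin + half * np.array([i, j, k])
                cells += build_octree(child, half, geology, depth + 1, max_depth)
    return cells

# Toy "geological model": unit 1 below a dipping surface, unit 0 above.
geology = lambda p: int(p[2] < 0.4 + 0.2 * p[0])
mesh = build_octree(np.array([0.0, 0.0, 0.0]), 1.0, geology)
print(len(mesh), "leaf cells")
```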

  6. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  7. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which explains the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to the SSD case.
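    The shell-to-shell diagnostics above rest on binning Fourier modes into spherical wavenumber shells. The sketch below shows that bookkeeping for the kinetic energy spectrum of a periodic velocity field; it is illustrative only, not the authors' MHD solver.

    ```python
    # Shell-binned spectral energy for a periodic velocity field (illustrative).
    # The same k-shell bookkeeping underlies shell-to-shell transfer diagnostics.
    import numpy as np

    def shell_spectrum(u, v, w):
        """Return E(k) summed over spherical shells k - 0.5 <= |k| < k + 0.5."""
        n = u.shape[0]
        uh, vh, wh = (np.fft.fftn(f) / f.size for f in (u, v, w))
        k1 = np.fft.fftfreq(n, d=1.0 / n)            # integer wavenumbers
        kx, ky, kz = np.meshgrid(k1, k1, k1, indexing="ij")
        kmag = np.sqrt(kx**2 + ky**2 + kz**2)
        e_density = 0.5 * (abs(uh)**2 + abs(vh)**2 + abs(wh)**2)
        shells = np.rint(kmag).astype(int)           # nearest-integer shell index
        return np.bincount(shells.ravel(), weights=e_density.ravel())

    rng = np.random.default_rng(0)
    u, v, w = (rng.standard_normal((32, 32, 32)) for _ in range(3))
    E = shell_spectrum(u, v, w)
    print(E[:8])
    ```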

  8. Geological-geotechnical investigation for large horizontal directional drilling

    Energy Technology Data Exchange (ETDEWEB)

    Martins, Pedro R.R.; Rocha, Ronaldo; Avesani Neto, Jose Orlando; Placido, Rafael R.; Ignatius, Scandar G.; Galli, Vicente Luiz [Instituto de Pesquisas Tecnologicas do Estado de Sao Paulo (IPT), Sao Paulo, SP (Brazil); Amaral, Claudio S. [Centro de Pesquisa Leopoldo A. Miguez de Melo (CENPES/PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The use of Horizontal Directional Drilling (HDD) for large diameter (OD>20 inches) pipeline installation started in the second half of the seventies. Since then the method has become the preferred alternative for situations in which an underground pipeline is necessary but there are concerns about digging trenches. Crossings of roadways, water bodies and environmentally sensitive areas are typical examples of its application. The technical and economic feasibility of HDD depends significantly on the properties of the materials that will be drilled. Lack of information about these materials can lead to several problems, such as schedule delays, cost overruns, pipeline damage, unforeseen environmental impacts and even the failure of the entire operation. Ground investigation campaigns for HDD should define a consistent geological-geotechnical model, which must include determination of behaviour parameters for the soil and rock masses that will be drilled. An investigation in three stages is therefore proposed: review of available geological-geotechnical information, site reconnaissance, and field survey. (author)

  9. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  10. Structure and dating errors in the geologic time scale and periodicity in mass extinctions

    Science.gov (United States)

    Stothers, Richard B.

    1989-01-01

    Structure in the geologic time scale reflects a partly paleontological origin. As a result, ages of Cenozoic and Mesozoic stage boundaries exhibit a weak 28-Myr periodicity that is similar to the strong 26-Myr periodicity detected in mass extinctions of marine life by Raup and Sepkoski. Radiometric dating errors in the geologic time scale, to which the mass extinctions are stratigraphically tied, do not necessarily lessen the likelihood of a significant periodicity in mass extinctions, but do spread the acceptable values of the period over the range 25-27 Myr for the Harland et al. time scale or 25-30 Myr for the DNAG time scale. If the Odin time scale is adopted, acceptable periods fall between 24 and 33 Myr, but are not robust against dating errors. Some indirect evidence from independently-dated flood-basalt volcanic horizons tends to favor the Odin time scale.
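    One simple way to make a periodicity test of this kind concrete is epoch folding with a Rayleigh-style statistic: fold event ages at a trial period and measure how tightly the phases cluster. The sketch below uses invented event ages purely for illustration; it is not the statistical procedure of the cited study.

    ```python
    # Rayleigh-style test of a trial period against event ages (illustrative).
    import numpy as np

    def rayleigh_power(ages_myr, period_myr):
        """Mean resultant length of event phases for a trial period (0..1)."""
        phases = 2.0 * np.pi * (np.asarray(ages_myr) % period_myr) / period_myr
        return abs(np.exp(1j * phases).mean())

    # Hypothetical event ages (Myr before present), for illustration only.
    ages = [11.0, 38.0, 65.0, 91.0, 113.0, 144.0, 176.0, 193.0, 216.0, 245.0]
    periods = np.arange(20.0, 40.0, 0.5)
    scores = [rayleigh_power(ages, p) for p in periods]
    print("best-scoring trial period:", periods[int(np.argmax(scores))], "Myr")
    ```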

  11. Volcanogenic Uranium Deposits: Geology, Geochemical Processes, and Criteria for Resource Assessment

    Science.gov (United States)

    Nash, J. Thomas

    2010-01-01

    Felsic volcanic rocks have long been considered a primary source of uranium for many kinds of uranium deposits, but volcanogenic uranium deposits themselves have generally not been important resources. Until the past few years, resource summaries for the United States or the world generally included volcanogenic deposits in the broad category of 'other deposits' because they comprised less than 0.5 percent of past production or estimated resources. Exploration in the United States from the 1940s through 1982 discovered hundreds of prospects in volcanic rocks, of which fewer than 20 had some recorded production. Intensive exploration in the late 1970s found some large deposits, but low grades (less than about 0.10 percent U3O8) discouraged economic development. A few deposits in the world, drilled in the 1980s and 1990s, are now known to contain large resources (>20,000 tonnes U3O8). However, research on ore-forming processes and exploration for volcanogenic deposits have lagged behind other kinds of uranium deposits and have not utilized advances in the understanding of the geology, geochemistry, and paleohydrology of ore deposits in general and epithermal deposits in particular. This review outlines new ways to explore and assess for volcanogenic deposits, using new concepts of convection, fluid mixing, and high heat flow to mobilize uranium from volcanic source rocks and form deposits that are postulated to be large. Much can also be learned from studies of epithermal metal deposits, such as the important roles of extensional tectonics, bimodal volcanism, and fracture-flow systems related to resurgent calderas. Regional resource assessment is helped by genetic concepts, but hampered by limited information on frontier areas and undiscovered districts. Diagnostic data used to define ore deposit genesis, such as stable isotopic data, are rarely available for frontier areas. A volcanic environment classification, with three classes (proximal, distal, and pre-volcanic structures

  12. Dazai super-large uranium-bearing germanium deposit in western Yunnan region metallogenic geological conditions and prospect

    International Nuclear Information System (INIS)

    Han Yanrong; Yuan Qingbang; Li Yonghua; Zhang Ling; Dai Jiemin

    1995-05-01

    The Dazai super-large uranium-bearing germanium deposit is located in the Bangmai fault basin, western Yunnan, China. The basin basement is migmatitic granite, and the cover is the Miocene coal-bearing clastic Bangmai Formation. The basin developed through three stages: formation of the faulted rhombic basin, synsedimentary structural development, and uplift and denudation. Synsedimentary faults controlled the distribution of sedimentary formations and lithofacies, as well as the uranium and germanium mineralization. Germanium ore bodies occur mainly in the master lignite bed of the lower rhythmite, with germanium-hosting lignite as the main ore type. Germanium occurs in the vitrinite of the lignite in the form of metal-organic complexes. The metallogenic geological conditions of the deposit are as follows: the regional setting is an uplift zone with migmatitic granite, a fault basin and a geothermal anomaly; the rich, thick ore bodies are controlled by synsedimentary faults; the peat-bog facies favoured accumulation of the ore-forming elements; and the unconformity between the overlying cover and the underlying basement served as a channel for mineralizing fluids. The deposit formed through multiple superimposed episodes, with sedimentation and diagenesis as the major processes, in two mineralization stages. Four prospective areas have been forecast, and two deposits have accordingly been discovered. Technical-economic evaluation shows that the deposit is shallowly buried, high grade, large in scale, and easy to mine and smelt. (9 figs.)

  13. The effects of large scale processing on caesium leaching from cemented simulant sodium nitrate waste

    International Nuclear Information System (INIS)

    Lee, D.J.; Brown, D.J.

    1982-01-01

    The effects of large scale processing on the properties of cemented simulant sodium nitrate waste have been investigated. Leach tests have been performed on full-size drums, cores and laboratory samples of cement formulations containing Ordinary Portland Cement (OPC), Sulphate Resisting Portland Cement (SRPC) and a blended cement (90% ground granulated blast furnace slag/10% OPC). In addition, the development of the cement hydration exotherms with time and the temperature distribution in 220 dm³ samples have been followed. (author)

  14. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  15. Modelling aggregation on the large scale and regularity on the small scale in spatial point pattern datasets

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper

    We consider a dependent thinning of a regular point process with the aim of obtaining aggregation on the large scale and regularity on the small scale in the resulting target point process of retained points. Various parametric models for the underlying processes are suggested and the properties...
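    A minimal sketch of the construction described above: generate a regular (hard-core) parent pattern, then apply a spatially varying thinning so the retained points aggregate on large scales. The hard-core rule and retention surface below are illustrative stand-ins, not the paper's parametric models.

    ```python
    # Dependent thinning sketch: regular parent pattern, large-scale aggregation
    # after thinning (illustrative; not the paper's specific parametric models).
    import numpy as np

    rng = np.random.default_rng(1)

    # 1) Regular pattern: Poisson proposals kept if no earlier point is within r.
    pts = rng.uniform(0, 1, size=(2000, 2))
    keep, r = [], 0.02
    for p in pts:
        if all(np.hypot(*(p - q)) >= r for q in keep):
            keep.append(p)
    regular = np.array(keep)

    # 2) Thinning field: a smooth probability surface creates large-scale clumps.
    def retention_prob(xy):
        x, y = xy[:, 0], xy[:, 1]
        return 0.5 + 0.5 * np.sin(2 * np.pi * x) * np.sin(2 * np.pi * y)

    retained = regular[rng.uniform(size=len(regular)) < retention_prob(regular)]
    print(len(regular), "regular points ->", len(retained), "after thinning")
    ```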

  16. Large scale laboratory diffusion experiments in clay rocks

    International Nuclear Information System (INIS)

    Garcia-Gutierrez, M.; Missana, T.; Mingarro, M.; Martin, P.L.; Cormenzana, J.L.

    2005-01-01

    Full text of publication follows: Clay formations are potential host rocks for high-level radioactive waste repositories. In clay materials radionuclide diffusion is the main transport mechanism. Thus, understanding the diffusion processes and determining diffusion parameters under conditions as similar as possible to the real ones are critical for the performance assessment of a deep geological repository. Diffusion coefficients are mainly measured in the laboratory using small samples, after preparation to fit the diffusion cell. In addition, a few field tests are usually performed to confirm laboratory results and analyse scale effects. In field or 'in situ' tests the experimental set-up usually includes the injection of a tracer diluted in reconstituted formation water into a packed-off section of a borehole. Both experimental systems may produce artefacts in the determination of diffusion coefficients. In the laboratory, sample preparation can generate structural changes, particularly if the consolidated clay has a layered fabric; in field tests the introduction of water could modify the properties of the saturated clay in the first few centimeters, just where radionuclide diffusion is expected to take place. In this work, a large-scale laboratory diffusion experiment is proposed, using a large cylindrical sample of consolidated clay, which can overcome the above-mentioned problems. The tracers used were mixed with clay obtained by drilling a central hole, re-compacted into the hole at approximately the same density as the consolidated block and finally sealed. Neither additional treatment of the sample nor external monitoring is needed. After the experimental time needed for diffusion to take place (estimated by scoping calculations), the block was sampled to obtain a 3D distribution of the tracer concentration and the results were modelled. An additional advantage of the proposed configuration is that it could be used in 'in situ

  17. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  18. Geologic processes and sedimentary system on Mars

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, A S

    1988-01-01

    The subject is covered under the following headings: (1) morphology and processes at the Martian surface (impact craters, water and ice, landslides, aeolian processes, volcanism, chemical weathering); (2) the sedimentary system (Martian geologic documentation, sedimentary balance, regolith, pyroclastics, erosion phenomena, deposition and loss of sediments); as well as (3) summary and final remarks. 72 refs.

  19. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with larger networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  20. Vertical equilibrium with sub-scale analytical methods for geological CO2 sequestration

    KAUST Repository

    Gasda, S. E.; Nordbotten, J. M.; Celia, M. A.

    2009-01-01

    The vertical equilibrium with sub-scale analytical method (VESA) combines the flexibility of a numerical method, allowing for heterogeneous and geologically complex systems, with the efficiency and accuracy of an analytical method, thereby eliminating expensive grid

  1. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by the labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. Initially, an advanced hybrid-cloud computing science data system was leveraged for large-scale processing, and machine learning approaches were added for automated analysis of various quality metrics. Machine learning-based feature training, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics and improvements to the production quality of geodetic data products.
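    As a rough illustration of the cross-validated QA modeling described above, the sketch below trains a classifier on synthetic per-interferogram quality metrics using scikit-learn. The features and labels are invented stand-ins, not ARIA's pipeline.

    ```python
    # Cross-validated QA classifier sketch (feature names are hypothetical;
    # this shows the spirit of ML-based SAR quality screening, not ARIA's code).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Toy stand-ins for per-interferogram metrics, e.g. [mean coherence, ...].
    X = rng.uniform(size=(500, 3))
    y = (X[:, 0] > 0.4).astype(int)          # pretend label: usable vs not

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print("CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
    ```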

  2. Intermediate Scale Laboratory Testing to Understand Mechanisms of Capillary and Dissolution Trapping during Injection and Post-Injection of CO2 in Heterogeneous Geological Formations

    Energy Technology Data Exchange (ETDEWEB)

    Illangasekare, Tissa [Colorado School of Mines, Golden, CO (United States); Trevisan, Luca [Colorado School of Mines, Golden, CO (United States); Agartan, Elif [Colorado School of Mines, Golden, CO (United States); Mori, Hiroko [Colorado School of Mines, Golden, CO (United States); Vargas-Johnson, Javier [Colorado School of Mines, Golden, CO (United States); Gonzalez-Nicolas, Ana [Colorado School of Mines, Golden, CO (United States); Cihan, Abdullah [Colorado School of Mines, Golden, CO (United States); Birkholzer, Jens [Colorado School of Mines, Golden, CO (United States); Zhou, Quanlin [Colorado School of Mines, Golden, CO (United States)

    2015-03-31

    Carbon Capture and Storage (CCS) represents a technology aimed at reducing the atmospheric loading of CO2 from power plants and heavy industries by injecting it into deep geological formations, such as saline aquifers. A number of trapping mechanisms contribute to effective and secure storage of the injected CO2 in supercritical fluid phase (scCO2) in the formation over the long term. The primary trapping mechanisms are structural, residual, dissolution and mineralization. Knowledge gaps exist in how the heterogeneity of the formation, manifested at all scales from the pore to the site scale, affects trapping and the parameterization of the contributing mechanisms in models. An experimental and modeling study was conducted to fill these knowledge gaps. Experimental investigation of fundamental processes and mechanisms in field settings is not possible, as it is not feasible to fully characterize the geologic heterogeneity at all relevant scales or to gather data on migration, trapping and dissolution of scCO2. Laboratory experiments using scCO2 under ambient conditions are also not feasible, as it is technically challenging and cost-prohibitive to develop large, two- or three-dimensional test systems with controlled high pressures to keep the scCO2 as a liquid. Hence, an innovative approach was developed that used surrogate fluids in place of scCO2 and formation brine in multi-scale synthetic-aquifer test systems ranging from centimeter to meter scale. New modeling algorithms were developed to capture the processes controlled by the formation heterogeneity, and they were tested using the data from the laboratory test systems. The results and findings are expected to contribute toward better conceptual models, future improvements to DOE numerical codes, more accurate assessment of storage capacities, and optimized placement strategies. This report presents the experimental and modeling methods

  3. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed of which kind of attitude is appropriate, from an ethical point of view, when dealing with large-scale changes like these. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  4. Sediment Transport Capacity of Turbidity Currents: from Microscale to Geological Scale.

    Science.gov (United States)

    Eggenhuisen, J. T.; Tilston, M.; Cartigny, M.; Pohl, F.; de Leeuw, J.; van der Grind, G. J.

    2016-12-01

    A big question in sedimentology concerns the magnitude of the fluxes of sediment particles, solute matter and dissolved gases from shallow marine waters to deep basins by turbidity current flow. Here we establish the sediment transport capacity of turbidity current flow on three levels. The most elementary level is set by the maximum amount of sediment that can be contained at the base of turbidity currents without causing complete extinction of boundary layer turbulence. The second level concerns the capacity in a vertical column within turbidity currents. The third level involves the amount of sediment that can be transported in turbidite systems on geological timescales. The capacity parameter Γ compares turbulent forces near the boundary of a turbulent suspension to gravity and buoyancy forces acting on suspended particles. The condition Γ>1 coincides with complete suppression of coherent boundary layer turbulence in Direct Numerical Simulations of sediment-laden turbulent flow. Γ=1 coincides with the upper limit of suspended particle concentrations observed in flume and field measurements. Γ is grain-size independent, yet the capacity of the full vertical structure of turbidity currents becomes grain-size dependent. This is due to the appearance of grain-size-dependent vertical motions within turbulence as a primary control on the shape of the vertical concentration profile. We illustrate this dependence with experiments and theory and conclude that capacity depends on the competence of the prevailing turbulence to suspend particle sizes. The concepts of capacity and competence are thus entangled. Finally, the capacity of the turbidity current flow structure is coupled to geological constraints on recurrence times, channel and lobe life cycles, and allogenic forcing on system activity to arrive at a system-scale sediment transport capacity. We demonstrate a simple model that uses the fundamental process insight described above to estimate geological sediment budgets from

  5. Statistical geological discrete fracture network model. Forsmark modelling stage 2.2

    Energy Technology Data Exchange (ETDEWEB)

    Fox, Aaron; La Pointe, Paul [Golder Associates Inc (United States); Simeonov, Assen [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hermanson, Jan; Oehman, Johan [Golder Associates AB, Stockholm (Sweden)

    2007-11-15

    . These models describe fracture intensity and size as a single range from borehole to outcrop scale; and - the combined outcrop scale and tectonic fault models (OSM+TFM), where separate distributions for size and intensity describe the fractures observed at outcrop scale (largely joints) and the features observed at regional scales (lineaments that are largely faults or deformation zones). Fracture intensity and fracture size are not rigidly coupled. The stochastic intensity model is built using power laws and combines fracture intensity data from outcrops (P21) and boreholes (P10) to simultaneously match both data sets. Intensity statistics are presented for each fracture set in each domain, and the spatial variation of intensity is described as a function of lithology or as a gamma distribution where possible. This report also describes the sources of uncertainty in the methodologies, data, and analyses used to build the version 2.2 geological DFN, and offers insight into the potential magnitudes of their effects on downstream models. The outputs of the geological DFN modeling process are recommended parameters or statistical distributions describing fracture set orientations, radius sizes, volumetric intensities, spatial correlations and models, and other parameters necessary to build stochastic models (lithology and scaling corrections, termination matrices)
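    To make the power-law intensity modeling concrete, the sketch below samples fracture radii from a truncated power law by inverse-CDF sampling and computes a volumetric intensity P32 (fracture area per unit rock volume). The parameter values are illustrative, not the fitted Forsmark values.

    ```python
    # Truncated power-law fracture radii and P32 intensity (illustrative values,
    # not the fitted Forsmark parameters).
    import numpy as np

    def sample_radii(n, r_min, r_max, kr, rng):
        """Inverse-CDF sampling of a power law f(r) ~ r^(-kr) on [r_min, r_max]."""
        u = rng.uniform(size=n)
        a = 1.0 - kr
        return (u * (r_max**a - r_min**a) + r_min**a) ** (1.0 / a)

    rng = np.random.default_rng(2)
    radii = sample_radii(10_000, r_min=0.5, r_max=100.0, kr=2.6, rng=rng)
    volume = 1.0e6                            # m^3 of rock mass
    p32 = np.pi * (radii**2).sum() / volume   # disc area per unit volume, 1/m
    print("P32 = %.3f m^2/m^3" % p32)
    ```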

  6. Statistical geological discrete fracture network model. Forsmark modelling stage 2.2

    International Nuclear Information System (INIS)

    Fox, Aaron; La Pointe, Paul; Simeonov, Assen; Hermanson, Jan; Oehman, Johan

    2007-11-01

    . These models describe fracture intensity and size as a single range from borehole to outcrop scale; and - the combined outcrop scale and tectonic fault models (OSM+TFM), where separate distributions for size and intensity describe the fractures observed at outcrop scale (largely joints) and the features observed at regional scales (lineaments that are largely faults or deformation zones). Fracture intensity and fracture size are not rigidly coupled. The stochastic intensity model is built using power laws and combines fracture intensity data from outcrops (P21) and boreholes (P10) to simultaneously match both data sets. Intensity statistics are presented for each fracture set in each domain, and the spatial variation of intensity is described as a function of lithology or as a gamma distribution where possible. This report also describes the sources of uncertainty in the methodologies, data, and analyses used to build the version 2.2 geological DFN, and offers insight into the potential magnitudes of their effects on downstream models. The outputs of the geological DFN modeling process are recommended parameters or statistical distributions describing fracture set orientations, radius sizes, volumetric intensities, spatial correlations and models, and other parameters necessary to build stochastic models (lithology and scaling corrections, termination matrices)

  7. The large-scale process of microbial carbonate precipitation for nickel remediation from an industrial soil.

    Science.gov (United States)

    Zhu, Xuejiao; Li, Weila; Zhan, Lu; Huang, Minsheng; Zhang, Qiuzhuo; Achal, Varenyam

    2016-12-01

    Microbial carbonate precipitation is known as an efficient process for the remediation of heavy metals from contaminated soils. In the present study, a urease-positive bacterial isolate, identified as Bacillus cereus NS4 through 16S rDNA sequencing, was utilized on a large scale to remove nickel from industrial soil contaminated by the battery industry. The soil was highly contaminated, with an initial total nickel concentration of approximately 900 mg kg⁻¹. The soluble-exchangeable fraction was reduced to 38 mg kg⁻¹ after treatment. The primary objective of metal stabilization was achieved by reducing bioavailability through immobilizing the nickel in the urease-driven carbonate precipitation. The nickel removal in the soils was due to the transformation of nickel from mobile species into stable biominerals, identified under XRD as calcite, vaterite, aragonite and nickelous carbonate. It was proven that during precipitation of calcite, Ni²⁺, with an ionic radius close to that of Ca²⁺, was incorporated into the CaCO₃ crystal. The biominerals were also characterized using SEM-EDS to observe the crystal shape and Raman and FTIR spectroscopy to identify the bonding responsible for Ni immobilization during bioremediation. The electronic structure and chemical-state information of the detected elements during the MICP bioremediation process was studied by XPS. This is the first study in which microbial carbonate precipitation was used for the large-scale remediation of metal-contaminated industrial soil. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
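    The core data structure is easy to sketch on the CPU: bucket pathline segments by the pixel they project to, so later filtering and color-coding never touch the raw data. The projection and attributes below are toy stand-ins; the thesis builds the equivalent structure on the GPU.

    ```python
    # Per-pixel segment lists, CPU sketch (the thesis builds these on the GPU;
    # here the data structure itself is shown with plain Python dictionaries).
    from collections import defaultdict

    WIDTH, HEIGHT = 64, 48

    def project(x, y):
        """Toy orthographic projection of unit-square coords to pixel indices."""
        return min(int(x * WIDTH), WIDTH - 1), min(int(y * HEIGHT), HEIGHT - 1)

    # pixel -> list of (pathline_id, depth, attribute) entries
    pixel_lists = defaultdict(list)

    segments = [  # (pathline_id, x, y, depth, velocity_magnitude)
        (0, 0.10, 0.20, 0.3, 1.5),
        (0, 0.11, 0.21, 0.3, 1.6),
        (1, 0.10, 0.20, 0.7, 0.4),
    ]
    for pid, x, y, depth, vmag in segments:
        pixel_lists[project(x, y)].append((pid, depth, vmag))

    # View-dependent filtering, e.g. keep only fast segments at a pixel:
    px = project(0.10, 0.20)
    fast = [s for s in pixel_lists[px] if s[2] > 1.0]
    print(px, fast)
    ```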

  9. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  10. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated demonstrating a power penalty improvement up to 1.5 dB.

  11. The geological thought process: A help in developing business instincts

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, S.A. [Dean Witter Reynolds, New York, NY (United States)

    1995-09-01

    Since the beginning of modern-day geology it has been understood that the present is the key to the past. However, when attempting to apply current geological models one discovers that there are no exact look-alikes. Thus, the geological discipline inherently accepts modifications, omissions, and relatively large margins of error compared with engineering. Geologists are comfortable in a world of non-unique solutions. Experience with numerous geological settings is therefore critical in selecting the most reasonable geological interpretations, often by using a composite of specific models. A dynamic geologist's lifetime of experience and geologic instinct cannot simply be replaced by a book-smart young upstart. Petroleum corporations accept geologic risk and manage it by drilling numerous wells in various geological provinces. Oil corporations have attempted to quantify and manage risk by using Monte Carlo simulations, thus invoking a formal discipline of risk. The acceptance of risk results in an asset allocation approach to investing. Asset allocators attempt to reduce volatility and risk, inherently understanding that in any specific time interval anything can happen. Dollar cost averaging significantly reduces market risk over time; however, it requires discipline and commitment. The single most important ingredient of a successful investing plan is to assign a reasonable holding period. Historically, a majority of the investment community has demanded instant gratification, causing unneeded anxiety and failure. As in geology, nothing can replace experience.

  12. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on avoiding redundancy for users working on the same task. While this improves the effectiveness of the user work process, the underlying query processing engine is typically considered a "black box" and left unchanged. Research in multiple query processing, on the other hand, ignores the application ... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems.

  13. Geologic Reconnaissance and Lithologic Identification by Remote Sensing

    Science.gov (United States)

    The use of remote sensing in geologic reconnaissance for purposes of tunnel site selection was studied further, and a test case was undertaken to evaluate this geological application. Airborne multispectral scanning (MSS) data were obtained in May 1972 over a region between Spearfish and Rapid City, South Dakota. With major effort directed toward the analysis of these data, the following geologic features were discriminated: (1) exposed rock areas, (2) five separate rock groups, (3) large-scale structures. This discrimination was accomplished by ratioing multispectral channels.

  14. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional-scale saltwater intrusion in order to analyse and quantify the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent, and predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale ... parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic

  15. Complex geologic characterization of the repository environment

    Energy Technology Data Exchange (ETDEWEB)

    Harper, T R [British Petroleum Research Center, Sunberry, England; Szymanski, J S

    1982-01-01

    The present basis for characterizing geological environments is identified in this paper, and the additional requirements imposed by the need to isolate high-level waste safely are discussed. Solutions to these additional requirements are proposed. The time scale of concern and the apparent complexity of the required multidisciplinary approach are identified. It is proposed that an increased use of the geologic record, together with a recognition that all geologic processes operate within an interdependent system, be a key feature in geologic characterization of deep repositories.

  16. Drawing 1/100,000 scale geological map of Mt. Hakkoda geothermal district

    Energy Technology Data Exchange (ETDEWEB)

    Muraoka, Hirobumi; Takakura, Shin' ichi

    1987-10-01

    A geological map of the Mt. Hakkoda geothermal district was made, covering the main volcanoes formed after the Pliocene. Geothermal fluids, terrestrial heat sources and their storage structures were studied with reference to the geological map. Aerial and satellite photographs were used to study faults, folds, and dikes. From the results, the stratigraphic order of the layers, the development of the volcanoes, and the evolution of the magma were summarized in the report. (5 figs, 4 tabs, 101 refs)

  17. Parallelizing Gene Expression Programming Algorithm in Enabling Large-Scale Classification

    Directory of Open Access Journals (Sweden)

    Lixiong Xu

    2017-01-01

    Full Text Available As one of the most effective function mining algorithms, Gene Expression Programming (GEP) has been widely used in classification, pattern recognition, prediction, and other research fields. Through self-evolution, GEP is able to mine an optimal function for dealing with complicated tasks. However, in big data research, GEP suffers from low efficiency due to its long mining times. To improve the efficiency of GEP in big data research, especially for processing large-scale classification tasks, this paper presents a parallelized GEP algorithm using the MapReduce computing model. The experimental results show that the presented algorithm is scalable and efficient for processing large-scale classification tasks.
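    The parallelization pattern is straightforward to sketch: the map step evaluates a candidate expression's error on separate data partitions, and the reduce step aggregates the partial sums into a fitness value. Below, Python multiprocessing stands in for the paper's Hadoop MapReduce, and the candidate is a toy function rather than a decoded GEP chromosome.

    ```python
    # Map/reduce-style parallel fitness evaluation (multiprocessing stands in
    # for the paper's Hadoop MapReduce; the GEP individual is a toy function).
    from multiprocessing import Pool
    import numpy as np

    def candidate(x):
        return 2.0 * x + 1.0          # stand-in for a decoded GEP expression

    def map_partition(part):
        """Partial squared error of the candidate on one data partition."""
        xs, ys = part
        return float(((candidate(xs) - ys) ** 2).sum()), len(xs)

    def fitness(partitions):
        with Pool() as pool:
            partials = pool.map(map_partition, partitions)  # map step
        sse = sum(p[0] for p in partials)                   # reduce step
        n = sum(p[1] for p in partials)
        return sse / n

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        x = rng.uniform(-1, 1, 100_000)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.1, x.size)
        parts = [(x[i::4], y[i::4]) for i in range(4)]      # 4 partitions
        print("MSE:", fitness(parts))
    ```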

  18. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping Earth-system processes such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation for approaching European hydrology, both with respect to observed large-scale patterns and with regard to the ability of models to capture them. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale
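    A dipole such as the north-south pattern described above is commonly extracted as the leading empirical orthogonal function (EOF) of standardized runoff anomalies. The sketch below computes it via SVD on synthetic station data, where opposite signs in the first EOF's loadings mark the two opposing centres; it is illustrative, not the thesis's analysis.

    ```python
    # Leading EOF of runoff anomalies via SVD (synthetic data; a north-south
    # dipole appears as opposite signs in the first EOF's station loadings).
    import numpy as np

    rng = np.random.default_rng(4)
    years, stations = 60, 10
    dipole = np.r_[np.ones(5), -np.ones(5)]           # north vs south stations
    signal = rng.standard_normal(years)[:, None] * dipole[None, :]
    runoff = signal + 0.5 * rng.standard_normal((years, stations))

    anom = (runoff - runoff.mean(0)) / runoff.std(0)  # standardize per station
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / (s**2).sum()
    print("EOF1 explains %.0f%% of variance" % (100 * var_frac[0]))
    print("EOF1 loadings:", np.round(Vt[0], 2))       # opposite signs = dipole
    ```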

  19. Hanford Site Guidelines for Preparation and Presentation of Geologic Information

    Energy Technology Data Exchange (ETDEWEB)

    Lanigan, David C.; Last, George V.; Bjornstad, Bruce N.; Thorne, Paul D.; Webber, William D.

    2010-04-30

    A complex geology lies beneath the Hanford Site of southeastern Washington State. Within this geology is a challenging large-scale environmental cleanup project. Geologic and contaminant transport information generated by several U.S. Department of Energy contractors must be documented in geologic graphics clearly, consistently, and accurately. These graphics must then be disseminated in formats readily acceptable by general graphics and document producing software applications. The guidelines presented in this document are intended to facilitate consistent, defensible, geologic graphics and digital data/graphics sharing among the various Hanford Site agencies and contractors.

  20. Sculpting Mountains: Interactive Terrain Modeling Based on Subsurface Geology.

    Science.gov (United States)

    Cordonnier, Guillaume; Cani, Marie-Paule; Benes, Bedrich; Braun, Jean; Galin, Eric

    2018-05-01

    Most mountain ranges are formed by the compression and folding of colliding tectonic plates. Subduction of one plate causes large-scale asymmetry while their layered composition (or stratigraphy) explains the multi-scale folded strata observed on real terrains. We introduce a novel interactive modeling technique to generate visually plausible, large scale terrains that capture these phenomena. Our method draws on both geological knowledge for consistency and on sculpting systems for user interaction. The user is provided hands-on control on the shape and motion of tectonic plates, represented using a new geologically-inspired model for the Earth crust. The model captures their volume preserving and complex folding behaviors under collision, causing mountains to grow. It generates a volumetric uplift map representing the growth rate of subsurface layers. Erosion and uplift movement are jointly simulated to generate the terrain. The stratigraphy allows us to render folded strata on eroded cliffs. We validated the usability of our sculpting interface through a user study, and compare the visual consistency of the earth crust model with geological simulation results and real terrains.
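    The joint simulation of uplift and erosion can be sketched as a coupled time-stepping loop on a height grid. In the sketch below, a diffusion-style term stands in for the paper's stream-power erosion with flow routing, and the uplift map is a simple band; the point is only the per-step coupling of the two processes.

    ```python
    # Joint uplift + erosion time stepping on a height grid (a minimal sketch of
    # the coupling idea; the paper routes flow and applies stream-power erosion).
    import numpy as np

    n, dt, steps = 128, 1.0, 200
    height = np.zeros((n, n))
    uplift = np.zeros((n, n))
    uplift[n // 4: 3 * n // 4, :] = 1e-3            # uplifting band (collision zone)
    k_erode = 0.05                                   # diffusion-style erosion rate

    for _ in range(steps):
        # Erosion as Laplacian smoothing (stands in for stream-power erosion).
        lap = (np.roll(height, 1, 0) + np.roll(height, -1, 0)
               + np.roll(height, 1, 1) + np.roll(height, -1, 1) - 4 * height)
        height += dt * (uplift + k_erode * lap)

    print("max relief after %d steps: %.3f" % (steps, height.max() - height.min()))
    ```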

  1. Tectonic and climatic considerations for deep geological disposal of radioactive waste: A UK perspective

    International Nuclear Information System (INIS)

    McEvoy, F.M.; Schofield, D.I.; Shaw, R.P.; Norris, S.

    2016-01-01

    Identifying and evaluating the factors that might impact on the long-term integrity of a deep Geological Disposal Facility (GDF) and its surrounding geological and surface environment is central to developing a safety case for underground disposal of radioactive waste. The geological environment should be relatively stable and its behaviour adequately predictable so that scientifically sound evaluations of the long-term radiological safety of a GDF can be made. In considering this, it is necessary to take into account natural processes that could affect a GDF or modify its geological environment up to 1 million years into the future. Key processes considered in this paper include those which result from plate tectonics, such as seismicity and volcanism, as well as climate-related processes, such as erosion, uplift and the effects of glaciation. Understanding the inherent variability of process rates, critical thresholds and likely potential influence of unpredictable perturbations represent significant challenges to predicting the natural environment. From a plate-tectonic perspective, a one million year time frame represents a very short segment of geological time and is largely below the current resolution of observation of past processes. Similarly, predicting climate system evolution on such time-scales, particularly beyond 200 ka AP is highly uncertain, relying on estimating the extremes within which climate and related processes may vary with reasonable confidence. The paper highlights some of the challenges facing a deep geological disposal program in the UK to review understanding of the natural changes that may affect siting and design of a GDF. - Highlights: • Natural processes are key to developing a safety case for geological disposal. • Key factors include plate tectonic and climate-mediated processes. • Process variability is a challenge to predicting the natural environment. • We highlight the challenges for geological disposal programs using

  2. Tectonic and climatic considerations for deep geological disposal of radioactive waste: A UK perspective

    Energy Technology Data Exchange (ETDEWEB)

    McEvoy, F.M., E-mail: fmcevoy@bgs.ac.uk [British Geological Survey, Keyworth, Nottingham NG12 5GG (United Kingdom); Schofield, D.I. [British Geological Survey, Tongwynlais, CF15 7NE (United Kingdom); Shaw, R.P. [British Geological Survey, Keyworth, Nottingham NG12 5GG (United Kingdom); Norris, S. [Radioactive Waste Management Limited, B587, Curie Avenue, Harwell, Didcot OX11 0RH (United Kingdom)

    2016-11-15

    Identifying and evaluating the factors that might impact on the long-term integrity of a deep Geological Disposal Facility (GDF) and its surrounding geological and surface environment is central to developing a safety case for underground disposal of radioactive waste. The geological environment should be relatively stable and its behaviour adequately predictable so that scientifically sound evaluations of the long-term radiological safety of a GDF can be made. In considering this, it is necessary to take into account natural processes that could affect a GDF or modify its geological environment up to 1 million years into the future. Key processes considered in this paper include those which result from plate tectonics, such as seismicity and volcanism, as well as climate-related processes, such as erosion, uplift and the effects of glaciation. Understanding the inherent variability of process rates, critical thresholds and likely potential influence of unpredictable perturbations represent significant challenges to predicting the natural environment. From a plate-tectonic perspective, a one million year time frame represents a very short segment of geological time and is largely below the current resolution of observation of past processes. Similarly, predicting climate system evolution on such time-scales, particularly beyond 200 ka AP is highly uncertain, relying on estimating the extremes within which climate and related processes may vary with reasonable confidence. The paper highlights some of the challenges facing a deep geological disposal program in the UK to review understanding of the natural changes that may affect siting and design of a GDF. - Highlights: • Natural processes are key to developing a safety case for geological disposal. • Key factors include plate tectonic and climate-mediated processes. • Process variability is a challenge to predicting the natural environment. • We highlight the challenges for geological disposal programs using

  3. Advances in planetary geology

    International Nuclear Information System (INIS)

    1987-06-01

    The surface of Mars displays a broad range of channel and valley features. There is as great a range in morphology as in scale. Some of the features of Martian geography are examined. Geomorphic mapping, crater counts on selected surfaces, and a detailed study of drainage basins are used to trace the geologic evolution of the Margaritifer Sinus Quadrangle. The layered deposits in the Valles Marineris are described in detail and the geologic processes that could have led to their formation are analyzed

  4. GIS-project: geodynamic globe for global monitoring of geological processes

    Science.gov (United States)

    Ryakhovsky, V.; Rundquist, D.; Gatinsky, Yu.; Chesalova, E.

    2003-04-01

    A multilayer geodynamic globe at the scale 1:10,000,000 was created at the end of the nineties in the GIS Center of the Vernadsky Museum. A special software and hardware complex, with a set of multi-purpose object-oriented databases, was developed for its visualization. The globe includes separate thematic covers represented by digital sets of spatial geological, geochemical, and geophysical information (maps, schemes, profiles, stratigraphic columns, organized databases, etc.). At present the largest databases included in the globe program concern petrochemical and isotopic data on magmatic rocks of the World Ocean and the large and superlarge mineral deposits. Software by the Environmental Systems Research Institute (ESRI), USA, as well as the ArcScan vectorizer, was used for digitizing covers and adapting databases (ARC/INFO 7.0, 8.0). All layers of the geoinformational project were obtained by scanning separate objects and transferring them to real geographic coordinates in an equidistant conic projection. The covers were then projected onto plane geographic coordinates in degrees. Attributive databases were formed for each thematic layer, and in the last stage all covers were combined into a single information system. Separate digital covers represent mathematical descriptions of geological objects and the relations between them, such as the Earth's altimetry, active fault systems, seismicity, etc. Principles of cartographic generalization were taken into consideration during cover compilation, with projection and coordinate systems matched precisely to the given scale. The globe allows us to form, in an interactive regime, mutually coordinated object-oriented databases and the thematic covers directly connected with them. They can cover the whole Earth and near-Earth space, as well as the best known parts of the divergent and convergent boundaries of the lithospheric plates. Such covers and time series

  5. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees, to assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  6. Geological disposal of radioactive wastes: national commitment, local and regional involvement

    International Nuclear Information System (INIS)

    2013-07-01

    Long-term radioactive waste management, including geological disposal, involves the construction of a limited number of facilities and it is therefore a national challenge with a strong local/regional dimension. Public information, consultation and/or participation in environmental or technological decision-making are today's best practice and must take place at the different geographical and political scales. Large-scale technology projects are much more likely to be accepted when stakeholders have been involved in making them possible and have developed a sense of interest in or responsibility for them. In this way, national commitment, and local and regional involvement are two essential dimensions of the complex task of securing continued societal agreement for the deep geological disposal of radioactive wastes. Long-term radioactive waste management, including geological disposal, is a national challenge with a strong local/regional dimension. The national policy frameworks increasingly support participatory, flexible and accountable processes. Radioactive waste management institutions are evolving away from a technocratic stance, demonstrating constructive interest in learning and adapting to societal requirements. Empowerment of the local and regional actors has been growing steadily in the last decade. Regional and local players tend to take an active role concerning the siting and implementation of geological repositories. National commitment and local/regional involvement go hand-in-hand in supporting sustainable decisions for the geological disposal of radioactive waste

  7. Reducing Plug and Process Loads for a Large Scale, Low Energy Office Building: NREL's Research Support Facility; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Lobato, C.; Pless, S.; Sheppy, M.; Torcellini, P.

    2011-02-01

    This paper documents the design and operational plug and process load energy efficiency measures needed to allow a large-scale office building to reach ultra-high-efficiency building goals. The appendices of this document contain a wealth of documentation pertaining to plug and process load design in the RSF, including a list of the equipment selected for use.

  8. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates the scientific, policy and political spheres. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background guarantees which are characteristic of their sociological environment. Thus large-scale research can neither deal with the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, on three points, the thesis that this is a new form of institutionalization of science. These are: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.)

  9. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating the provision of large-scale topographic base maps by the Geospatial Information

  10. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  11. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  12. Hydrometeorological variability on a large French catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, or provide future scenarios of water resources. To better understand hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow in the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) at a monthly time-step. This approach
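
    A minimal sketch of the scale-wise idea described above, under simplifying assumptions (a single stand-in predictor series rather than the SLP fields the authors use, and synthetic data): decompose predictor and predictand with a discrete wavelet transform (PyWavelets), fit a linear link per scale, and reconstruct.

```python
# Minimal sketch of wavelet-based downscaling: decompose two monthly series
# into dyadic scales, fit one linear link per scale, and reconstruct the
# predictand. Illustrative only; not the authors' implementation.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 512                                       # months
slp_index = rng.standard_normal(n)            # stand-in large-scale predictor
streamflow = 0.6 * slp_index + rng.standard_normal(n)  # stand-in predictand

levels = 4
coeffs_x = pywt.wavedec(slp_index, "db4", level=levels)
coeffs_y = pywt.wavedec(streamflow, "db4", level=levels)

# Fit one regression slope per scale (approximation + each detail level).
fitted = []
for cx, cy in zip(coeffs_x, coeffs_y):
    slope = np.dot(cx, cy) / np.dot(cx, cx)   # least squares through origin
    fitted.append(slope * cx)

reconstruction = pywt.waverec(fitted, "db4")[:n]
corr = np.corrcoef(reconstruction, streamflow)[0, 1]
print(f"correlation of scale-wise reconstruction: {corr:.2f}")
```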

  13. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  14. Inverse problem to constrain the controlling parameters of large-scale heat transport processes: The Tiberias Basin example

    Science.gov (United States)

    Goretzki, Nora; Inbar, Nimrod; Siebert, Christian; Möller, Peter; Rosenthal, Eliyahu; Schneider, Michael; Magri, Fabien

    2015-04-01

    Salty and thermal springs exist along the lakeshore of the Sea of Galilee, which covers most of the Tiberias Basin (TB) in the northern Jordan–Dead Sea Transform, Israel/Jordan. As it is the only freshwater reservoir of the entire area, it is important to study the salinisation processes that pollute the lake. Simulations of thermohaline flow along a 35 km NW-SE profile show that meteoric and relic brines are flushed by the regional flow from the surrounding heights and by thermally induced groundwater flow within the faults (Magri et al., 2015). Several trial-and-error model runs were necessary to calibrate the hydraulic conductivity of both the faults and the major aquifers in order to fit temperature logs and spring salinity. It turned out that the hydraulic conductivity of the faults ranges between 30 and 140 m/yr, whereas the hydraulic conductivity of the Upper Cenomanian aquifer is as high as 200 m/yr. However, large-scale transport processes also depend on other physical parameters such as thermal conductivity, porosity and the fluid thermal expansion coefficient, which are hardly known. Here, inverse problems (IP) are solved along the NW-SE profile to better constrain the physical parameters (a) hydraulic conductivity, (b) thermal conductivity and (c) thermal expansion coefficient. The PEST code (Doherty, 2010) is applied via the graphical interface FePEST in FEFLOW (Diersch, 2014). The results show that both the thermal and hydraulic conductivities are consistent with the values determined by the trial-and-error calibrations. Besides being an automatic approach that speeds up the calibration process, the IP covers a wide range of parameter values, providing additional solutions not found with the trial-and-error method. Our study shows that geothermal systems like the TB are more comprehensively understood when inverse models are applied to constrain coupled fluid flow processes over large spatial scales. References Diersch, H.-J.G., 2014. FEFLOW Finite
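
    The inverse-problem logic is independent of the PEST/FEFLOW tooling. A toy version with an invented one-parameter forward model and SciPy shows the structure; all numbers are illustrative and not from the Tiberias Basin model.

```python
# Conceptual sketch of the inverse-problem idea (not PEST/FEFLOW): estimate
# a fault's hydraulic conductivity by minimizing the misfit between observed
# temperatures and a toy forward model. All numbers are illustrative.
import numpy as np
from scipy.optimize import least_squares

depths = np.linspace(100.0, 1000.0, 10)          # observation depths (m)

def forward_temperature(log10_K, depths):
    """Toy forward model: higher fault conductivity -> stronger advective
    perturbation of a 30 K/km conductive background gradient."""
    K = 10.0 ** log10_K
    background = 15.0 + 0.030 * depths
    perturbation = 5.0 * np.tanh(K / 100.0) * np.exp(-depths / 800.0)
    return background + perturbation

true_logK = np.log10(80.0)                        # "unknown" truth, m/yr
observed = forward_temperature(true_logK, depths) \
    + np.random.default_rng(1).normal(0.0, 0.2, depths.size)

def residuals(params):
    return forward_temperature(params[0], depths) - observed

fit = least_squares(residuals, x0=[1.0], bounds=([-1.0], [3.0]))
print(f"estimated K = {10**fit.x[0]:.1f} m/yr (true: 80.0)")
```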

  15. PROCESS FOR LICENSE APPLICATION DEVELOPMENT FOR THE GEOLOGIC REPOSITORY

    International Nuclear Information System (INIS)

    DOUGLAS M. FRANKS AND NORMAN C. HENDERSON

    1997-01-01

    The Department of Energy (DOE), specifically the Office of Civilian Radioactive Waste Management (OCRWM), has been charged by the U.S. Congress, through the Nuclear Waste Policy Act (NWPA), with the responsibility for obtaining a license to develop a geologic repository. The NRC is the licensing authority for geologic disposal, and its regulations pertinent to construction authorization and license application are specified in 10 CFR Part 60, Disposal of High-Level Radioactive Wastes in Geologic Repositories, §60.21ff and §60.31ff. This paper discusses the process the Yucca Mountain Site Characterization Project (YMP) will use to identify and apply regulatory and industry guidance to development of the license application (LA) for a geologic repository at Yucca Mountain, Nevada. This guidance will be implemented by the ''Technical Guidance Document for Preparation of the License Application'' (TGD), currently in development.

  16. Process for license application development for the geologic repository

    International Nuclear Information System (INIS)

    Franks, D.M.; Henderson, N.C.

    1998-01-01

    The Department of Energy (DOE), specifically the Office of Civilian Radioactive Waste Management (OCRWM), has been charged by the US Congress, through the Nuclear Waste Policy Act (NWPA), with the responsibility for obtaining a license to develop a geologic repository. The NRC is the licensing authority for geologic disposal, and its regulations pertinent to construction authorization and license application are specified in 10 CFR Part 60, Disposal of High-Level Radioactive Wastes in Geologic Repositories, section 60.21ff and section 60.31ff. This paper discusses the process the Yucca Mountain Site Characterization Project (YMP) will use to identify and apply regulatory and industry guidance to development of the license application (LA) for a geologic repository at Yucca Mountain, Nevada. This guidance will be implemented by the Technical Guidance Document for Preparation of the License Application (TGD), currently in development.

  17. Wave Propagation in Jointed Geologic Media

    Energy Technology Data Exchange (ETDEWEB)

    Antoun, T

    2009-12-17

    Predictive modeling capabilities for wave propagation in jointed geologic media remain a modern-day scientific frontier. In part this is due to a lack of comprehensive understanding of the complex physical processes associated with the transient response of geologic material, and in part it is due to numerical challenges that prohibit accurate representation of the heterogeneities that influence the material response. Constitutive models whose properties are determined from laboratory experiments on intact samples have been shown to over-predict the free-field environment in large-scale field experiments. Current methodologies for deriving in situ properties from laboratory-measured properties are based on empirical equations derived for static geomechanical applications involving loads of lower intensity and much longer durations than those encountered in applications of interest involving wave propagation. These methodologies are not validated for dynamic applications, and they do not account for anisotropic behavior stemming from directional effects associated with the orientation of joint sets in realistic geologies. Recent advances in modeling capabilities coupled with modern high performance computing platforms enable physics-based simulations of jointed geologic media with unprecedented detail, offering a prospect for significant advances in the state of the art. This report provides a brief overview of these modern computational approaches, discusses their advantages and limitations, and attempts to formulate an integrated framework leading to the development of predictive modeling capabilities for wave propagation in jointed and fractured geologic materials.

  18. A Comparison of Geodetic and Geologic Rates Prior to Large Strike-Slip Earthquakes: A Diversity of Earthquake-Cycle Behaviors?

    Science.gov (United States)

    Dolan, James F.; Meade, Brendan J.

    2017-12-01

    Comparison of preevent geodetic and geologic rates in three large-magnitude (Mw = 7.6-7.9) strike-slip earthquakes reveals a wide range of behaviors. Specifically, geodetic rates of 26-28 mm/yr for the North Anatolian fault along the 1999 MW = 7.6 Izmit rupture are ˜40% faster than Holocene geologic rates. In contrast, geodetic rates of ˜6-8 mm/yr along the Denali fault prior to the 2002 MW = 7.9 Denali earthquake are only approximately half as fast as the latest Pleistocene-Holocene geologic rate of ˜12 mm/yr. In the third example where a sufficiently long pre-earthquake geodetic time series exists, the geodetic and geologic rates along the 2001 MW = 7.8 Kokoxili rupture on the Kunlun fault are approximately equal at ˜11 mm/yr. These results are not readily explicable with extant earthquake-cycle modeling, suggesting that they may instead be due to some combination of regional kinematic fault interactions, temporal variations in the strength of lithospheric-scale shear zones, and/or variations in local relative plate motion rate. Whatever the exact causes of these variable behaviors, these observations indicate that either the ratio of geodetic to geologic rates before an earthquake may not be diagnostic of the time to the next earthquake, as predicted by many rheologically based geodynamic models of earthquake-cycle behavior, or different behaviors characterize different fault systems in a manner that is not yet understood or predictable.

  19. Innovation Processes in Large-Scale Public Foodservice-Case Findings from the Implementation of Organic Foods in a Danish County

    DEFF Research Database (Denmark)

    Mikkelsen, Bent Egberg; Nielsen, Thorkild; Kristensen, Niels Heine

    2005-01-01

    was carried out of the change process related to the implementation of organic foods in large-scale foodservice facilities in Greater Copenhagen County, in order to study the effects of such a change. Based on the findings, a set of guidelines has been developed for the successful implementation of organic foods...

  20. Evidence for Enhanced Matrix Diffusion in Geological Environment

    Science.gov (United States)

    Sato, Kiminori; Fujimoto, Koichiro; Nakata, Masataka; Shikazono, Naotatsu

    2013-01-01

    Molecular diffusion in the rock matrix, called matrix diffusion, has been appreciated as a static process for elemental migration in the geological environment and has been acknowledged in the context of geological disposal of radioactive waste. However, incomprehensible enhancement of matrix diffusion has been reported at a number of field test sites. Here, the matrix diffusion of saline water at Horonobe, Hokkaido, Japan is highlighted by directly probing angstrom-scale pores on a field scale of up to 1 km by positron–positronium annihilation spectroscopy. The first application of positron–positronium annihilation spectroscopy to field-scale geophysical research reveals the slight variation of angstrom-scale pores influenced by saline water diffusion with complete accuracy. We found widely interconnected 3 Å pores, which offer the pathway for saline water diffusion with a highly enhanced effective matrix diffusion coefficient of 4 × 10⁻⁶ cm² s⁻¹. The present findings provide unambiguous evidence that angstrom-scale pores enhance effective matrix diffusion on a field scale in the geological environment.

  1. Outstanding diversity of heritage features in large geological bodies: The Gachsaran Formation in southwest Iran

    Science.gov (United States)

    Habibi, Tahereh; Ruban, Dmitry A.

    2017-09-01

    The ideas of geological heritage and geological diversity have become very popular in modern science. They are usually applied to geological domains or countries, provinces, districts, etc. Additionally, it appears sensible to assess the heritage value of geological bodies. The review of the available knowledge and the field investigation of the Gachsaran Formation (lower Miocene) in southwest Iran permit its features and the relevant phenomena to be assigned to as many as 10 geological heritage types, namely stratigraphical, sedimentary, palaeontological, palaeogeographical, geomorphological, hydrogeological, engineering, structural, economical, and geohistorical types. The outstanding diversity of the features of this formation determines its high heritage value and its national rank. The geological heritage of the Gachsaran Formation is important to scientists, educators, and tourists. The Papoon and Abolhaiat sections of this formation are potential geological heritage sites, although they do not represent all of the above-mentioned types. The large territory where the Gachsaran Formation outcrops has significant geoconservation and geotourism potential, and further inventory of geosites in this territory is necessary. Similar studies of geological bodies in North Africa and the Middle East can facilitate better understanding of the geological heritage of this vast territory.

  2. Importance of regional species pools and functional traits in colonization processes: predicting re-colonization after large-scale destruction of ecosystems

    NARCIS (Netherlands)

    Kirmer, A.; Tischew, S.; Ozinga, W.A.; Lampe, von M.; Baasch, A.; Groenendael, van J.M.

    2008-01-01

    Large-scale destruction of ecosystems caused by surface mining provides an opportunity for the study of colonization processes starting with primary succession. Surprisingly, over several decades and without any restoration measures, most of these sites spontaneously developed into valuable biotope

  3. Formation and fate of marine snow: small-scale processes with large-scale implications

    Directory of Open Access Journals (Sweden)

    Thomas Kiørboe

    2001-12-01

    Marine snow aggregates are believed to be the main vehicles for vertical material transport in the ocean. However, aggregates are also sites of elevated heterotrophic activity, which may rather cause enhanced retention of aggregated material in the upper ocean. Small-scale biological-physical interactions govern the formation and fate of marine snow. Aggregates may form by physical coagulation: fluid motion causes collisions between small primary particles (e.g. phytoplankton) that may then stick together to form aggregates with enhanced sinking velocities. Bacteria may subsequently solubilise and remineralise aggregated particles. Because the solubilisation rate exceeds the remineralisation rate, organic solutes leak out of sinking aggregates. The leaking solutes spread by diffusion and advection and form a chemical trail in the wake of the sinking aggregate that may guide small zooplankters to the aggregate. Also, suspended bacteria may enjoy the elevated concentration of organic solutes in the plume. I explore these small-scale formation and degradation processes by means of models, experiments and field observations. The larger-scale implications for the structure and functioning of pelagic food chains of export vs. retention of material will be discussed.
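
    The physical coagulation picture sketched in this abstract is conventionally formalized with the Smoluchowski coagulation equation; a textbook form (not quoted from the paper) is:

```latex
% Smoluchowski coagulation equation: rate of change of the number
% concentration n(v) of aggregates of volume v, where \beta(u, w) is the
% collision kernel (set by fluid shear and differential settling) and
% \alpha is the stickiness (probability that a collision leads to
% attachment).
\frac{\partial n(v,t)}{\partial t}
  = \frac{\alpha}{2} \int_0^{v} \beta(u,\, v-u)\, n(u,t)\, n(v-u,t)\, du
  - \alpha\, n(v,t) \int_0^{\infty} \beta(u,v)\, n(u,t)\, du
```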

  4. Large-scale self-assembled zirconium phosphate smectic layers via a simple spray-coating process

    Science.gov (United States)

    Wong, Minhao; Ishige, Ryohei; White, Kevin L.; Li, Peng; Kim, Daehak; Krishnamoorti, Ramanan; Gunther, Robert; Higuchi, Takeshi; Jinnai, Hiroshi; Takahara, Atsushi; Nishimura, Riichi; Sue, Hung-Jue

    2014-04-01

    The large-scale assembly of asymmetric colloidal particles is used in creating high-performance fibres. A similar concept is extended to the manufacturing of thin films of self-assembled two-dimensional crystal-type materials with enhanced and tunable properties. Here we present a spray-coating method to manufacture thin, flexible and transparent epoxy films containing zirconium phosphate nanoplatelets self-assembled into a lamellar arrangement aligned parallel to the substrate. The self-assembled mesophase of zirconium phosphate nanoplatelets is stabilized by epoxy pre-polymer and exhibits rheology favourable towards large-scale manufacturing. The thermally cured film forms a mechanically robust coating and shows excellent gas barrier properties at both low- and high humidity levels as a result of the highly aligned and overlapping arrangement of nanoplatelets. This work shows that the large-scale ordering of high aspect ratio nanoplatelets is easier to achieve than previously thought and may have implications in the technological applications for similar materials.

  5. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recent ones are data centers in cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security and performance optimization, which are a significant burden for the network administrator. This research report studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and the newer Gossip bas...

  6. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
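
    As a concrete template for the model class described (a dissipative eddy-viscosity part plus a nondissipative nonlinear part), one common mixed form is shown below; this is a generic illustration, not necessarily the authors' exact formulation.

```latex
% Mixed subgrid-scale stress model: dissipative eddy-viscosity part plus a
% nondissipative nonlinear part built from the filtered strain-rate tensor
% \bar{S}_{ij} and rate-of-rotation tensor \bar{\Omega}_{ij}; \Delta is the
% filter width and c_N a model constant.
\tau_{ij}^{\mathrm{mod}}
  = -2\,\nu_e\,\bar{S}_{ij}
  + c_N\,\Delta^2 \left( \bar{S}_{ik}\,\bar{\Omega}_{kj}
                        - \bar{\Omega}_{ik}\,\bar{S}_{kj} \right)
```

    The commutator-like term built from $\bar{S}$ and $\bar{\Omega}$ is traceless and its contraction with $\bar{S}_{ij}$ vanishes, so it produces no subgrid dissipation, which is what makes it a candidate for representing nondissipative transport processes such as those induced by rotation.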

  7. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
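
    A generic dual-decomposition sketch (not the paper's exact algorithm) shows the structure: an aggregator iterates on a price signal while each unit solves its own small local problem, so no unit ever needs the full centralized model. All numbers are invented for illustration.

```python
# Minimal dual-decomposition sketch: the aggregator adjusts a price (dual
# variable) while each unit solves a small local problem; iterating drives
# total flexible consumption toward the imbalance target.
import numpy as np

target = 10.0                      # power imbalance to absorb (MW)
a = np.array([0.5, 1.0, 2.0])      # per-unit quadratic cost coefficients
lo, hi = 0.0, 6.0                  # per-unit power limits (MW)

price = 0.0
for _ in range(200):
    # Each unit minimizes a_i * p_i**2 - price * p_i locally.
    p = np.clip(price / (2.0 * a), lo, hi)
    surplus = target - p.sum()     # aggregator measures the residual
    price += 0.2 * surplus         # subgradient step on the dual variable

print("unit allocations:", np.round(p, 2), "total:", round(p.sum(), 2))
```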

  8. Processing of space images and geologic interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Yudin, V S

    1981-01-01

    Using data for standard sections, a correlation was established between natural formations in geologic/geophysical dimensions and the form they take in the imaging. With computer processing, important data can be derived from the image. Use of the above correlations has allowed to make a number of preliminary classifications of tectonic structures, and to determine certain ongoing processes in the given section. The derived data may be used for search of useful minerals.

  9. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Mesoscale circulation processes generated by landscape discontinuities (e.g., sea breezes) are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large-scale, mesoscale, and turbulent-scale components, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit mass is used, defined as $\tilde{E} = \tfrac{1}{2}\langle u_i'^2 \rangle$, where $u_i'$ represents the three Cartesian components of a mesoscale circulation, the angle brackets $\langle\,\cdot\,\rangle$ denote the grid-scale horizontal averaging operator of the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for $\tilde{E}$, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of $\tilde{E}$. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes

  10. Towards large-scale production of solution-processed organic tandem modules based on ternary composites: Design of the intermediate layer, device optimization and laser based module processing

    DEFF Research Database (Denmark)

    Li, Ning; Kubis, Peter; Forberich, Karen

    2014-01-01

    on commercially available materials, which enhances the absorption of poly(3-hexylthiophene) (P3HT) and as a result increases the PCE of P3HT-based large-scale OPV devices; 3. laser-based module processing, which provides excellent processing resolution and as a result can bring the power conversion efficiency (PCE) of mass-produced organic photovoltaic (OPV) devices close to the highest PCE values achieved for lab-scale solar cells through a significant increase in the geometrical fill factor. We believe that the combination of the above-mentioned concepts provides a clear roadmap to push OPV towards...

  11. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  12. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  13. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process, and (2) poor performance when training on large-scale datasets. In order to solve these problems, we propose Discrete AdaBoost RVM (DAB-RVM), which, as a first step, incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the sufficient training samples of large-scale datasets and propose all-features-boosting RVM (AFB-RVM), which modifies the way of obtaining weak classifiers. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.
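
    Since no RVM implementation ships with the common Python libraries, the sketch below substitutes a decision stump for the RVM weak learner to show the Discrete AdaBoost re-weighting loop that DAB-RVM builds on; the Spark distribution layer is omitted.

```python
# Discrete AdaBoost loop with a decision stump standing in for an RVM weak
# learner (scikit-learn provides no RVM). Illustrative of the ensemble
# scheme only, not of the paper's Spark implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
y = 2 * y - 1                               # labels in {-1, +1}
w = np.full(len(y), 1.0 / len(y))           # sample weights

learners, alphas = [], []
for _ in range(25):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)   # learner weight
    w *= np.exp(-alpha * y * pred)          # up-weight misclassified samples
    w /= w.sum()
    learners.append(stump)
    alphas.append(alpha)

score = sum(a * c.predict(X) for a, c in zip(alphas, learners))
print("training accuracy:", (np.sign(score) == y).mean())
```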

  14. Brine flow in heated geologic salt.

    Energy Technology Data Exchange (ETDEWEB)

    Kuhlman, Kristopher L.; Malama, Bwalya

    2013-03-01

    This report is a summary of the physical processes, primary governing equations, solution approaches, and historic testing related to brine migration in geologic salt. Although most information presented in this report is not new, we synthesize a large amount of material scattered across dozens of laboratory reports, journal papers, conference proceedings, and textbooks. We present a mathematical description of the governing brine flow mechanisms in geologic salt. We outline the general coupled thermal, multi-phase hydrologic, and mechanical processes. We derive the governing equations for these processes, which can be used to predict brine flow. These equations are valid under a wide variety of conditions applicable to radioactive waste disposal in rooms and boreholes excavated into geologic salt.

  15. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
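
    The pipeline stages named above (feature representation, indexing, searching) can be reduced to a few lines; the sketch below uses random features and brute-force cosine search as stand-ins for the learned representations and approximate nearest-neighbor indexes real systems employ.

```python
# Skeleton of a retrieval pipeline: feature representation -> indexing ->
# searching, with brute-force cosine similarity over random features.
import numpy as np

rng = np.random.default_rng(42)
db_features = rng.standard_normal((100_000, 128))   # "indexed" image features
db_features /= np.linalg.norm(db_features, axis=1, keepdims=True)

def search(query_vec, k=5):
    q = query_vec / np.linalg.norm(query_vec)
    scores = db_features @ q            # cosine similarity against the index
    top = np.argpartition(-scores, k)[:k]
    return top[np.argsort(-scores[top])]

query = rng.standard_normal(128)        # features of the query image
print("top-5 most similar image ids:", search(query))
```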

  16. Tectonic and climatic considerations for deep geological disposal of radioactive waste: A UK perspective.

    Science.gov (United States)

    McEvoy, F M; Schofield, D I; Shaw, R P; Norris, S

    2016-11-15

    Identifying and evaluating the factors that might impact the long-term integrity of a deep Geological Disposal Facility (GDF) and its surrounding geological and surface environment is central to developing a safety case for underground disposal of radioactive waste. The geological environment should be relatively stable and its behaviour adequately predictable so that scientifically sound evaluations of the long-term radiological safety of a GDF can be made. In considering this, it is necessary to take into account natural processes that could affect a GDF or modify its geological environment up to 1 million years into the future. Key processes considered in this paper include those which result from plate tectonics, such as seismicity and volcanism, as well as climate-related processes, such as erosion, uplift and the effects of glaciation. Understanding the inherent variability of process rates, critical thresholds and the likely influence of unpredictable perturbations represents a significant challenge to predicting the natural environment. From a plate-tectonic perspective, a one-million-year time frame represents a very short segment of geological time and is largely below the current resolution of observation of past processes. Similarly, predicting climate system evolution on such time-scales, particularly beyond 200 ka AP, is highly uncertain, relying on estimating the extremes within which climate and related processes may vary with reasonable confidence. The paper highlights some of the challenges facing a deep geological disposal programme in the UK in reviewing understanding of the natural changes that may affect the siting and design of a GDF. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  17. Scale up risk of developing oil shale processing units

    International Nuclear Information System (INIS)

    Oepik, I.

    1991-01-01

    The experiences in oil shale processing in three large countries, China, the U.S.A. and the U.S.S.R., have demonstrated that the relative scale-up risk of developing oil shale processing units is related to the scale-up factor. Against the background of large programmes for developing the oil shale industry branch, i.e. the $30 billion investments in Colorado and Utah or the 50 million t/year oil shale processing in Estonia and the Leningrad Region planned in the late seventies, the absolute scope of the scale-up risk of developing single retorting plants seems to be justified. But under conditions of low crude oil prices, when the large-scale development of the oil shale processing industry is stopped, the absolute scope of the scale-up risk is to be divided between a small number of units. Therefore, it is reasonable to build new commercial oil shale processing plants with a minimum scale-up risk. For example, in Estonia a new oil shale processing plant with gas combustion retorts, projected to start in the early nineties, will be equipped with four units of 1500 t/day enriched oil shale throughput each, designed with scale-up factor M=1.5 and with a minimum scale-up risk of only r=2.5-4.5%. The oil shale retorting unit for the PAMA plant in Israel [1] is planned to be developed in three steps, also with minimum scale-up risk: feasibility studies in Colorado with Israel's shale at the Paraho 250 t/day retort and other tests, a demonstration retort of 700 t/day and M=2.8 in Israel, and commercial retorts in the early nineties with a capacity of about 1000 t/day and M=1.4. The scale-up risk of the PAMA project, r=2-4%, is approximately the same as that in Estonia. Knowledge of the scope of the scale-up risk of developing oil shale processing retorts assists in the calculation of production costs when erecting new units. (author). 9 refs., 2 tabs

  18. Geologic Time.

    Science.gov (United States)

    Newman, William L.

    One of a series of general interest publications on science topics, the booklet provides those interested in geologic time with an introduction to the subject. Separate sections discuss the relative time scale, major divisions in geologic time, index fossils used as guides for telling the age of rocks, the atomic scale, and the age of the earth.…

  19. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

    The recent tendency of radioactive waste treatment and disposal both in China and abroad is reviewed. The feasibility of the large volume casting cementation process for treating and disposing of the intermediate level radioactive waste from a spent fuel reprocessing plant in shallow land is assessed on the basis of analyses of the experimental results (such as formulation studies, measurements of the properties of the solidified radioactive waste, etc.). It can be concluded that the large volume casting cementation process is a promising, safe and economic process. It is feasible to dispose of the intermediate level radioactive waste from a reprocessing plant if the chosen disposal site has reasonable geological and geographical conditions and some additional effective protection measures are taken.

  20. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    Science.gov (United States)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus programme, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. With SAR data becoming ubiquitous, the technological and scientific challenge is focused on maximizing the exploitation of this huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps, to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performances has been performed to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the big advantage of

  1. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
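
    The "display-aware" principle described above can be illustrated with a back-of-the-envelope level-of-detail calculation; this is a simplification under a small-angle assumption, not code from the course.

```python
# Back-of-the-envelope illustration of display-aware multi-resolution:
# pick the mipmap/LOD level at which one data sample projects to roughly
# one screen pixel, so work scales with visible pixels, not data size.
import math

def required_lod(voxel_size_mm, distance_mm, fov_deg, screen_px):
    """Coarsest LOD (0 = full resolution) whose projected voxel >= 1 px."""
    px_angle = math.radians(fov_deg) / screen_px      # angle per pixel
    voxel_angle = voxel_size_mm / distance_mm         # small-angle approx.
    if voxel_angle >= px_angle:
        return 0
    return int(math.log2(px_angle / voxel_angle))

# A 0.5 mm voxel viewed from 2 m on a 1920-px-wide, 60-degree view:
print(required_lod(0.5, 2000.0, 60.0, 1920))          # -> 1 (coarser level)
```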

  2. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
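
    The bitmap-index idea behind FastBit can be imitated in a few lines of NumPy, with boolean arrays standing in for compressed bitmaps; this is a conceptual stand-in, not the FastQuery API.

```python
# Conceptual stand-in for bitmap-index querying: precomputed per-bin
# bitmaps let a range query be answered by OR-ing bitmaps instead of
# scanning raw values.
import numpy as np

rng = np.random.default_rng(7)
energy = rng.exponential(scale=1.0, size=100_000)     # particle property

bin_edges = np.linspace(0.0, 10.0, 101)
bin_of = np.digitize(energy, bin_edges)               # "index build" step
bitmaps = {b: bin_of == b for b in range(1, 102)}     # one bitmap per bin

def range_query(lo_bin, hi_bin):
    hits = np.zeros(energy.size, dtype=bool)
    for b in range(lo_bin, hi_bin + 1):               # OR the bin bitmaps
        hits |= bitmaps[b]
    return np.nonzero(hits)[0]

ids = range_query(50, 101)      # e.g. "find the energetic particles"
print(len(ids), "matching records")
```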

  3. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the fault-free operation of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article gives a comparative analysis of the factors which determine the reliability of integrated circuits, an analysis of already existing methods, and a model for evaluating the fault-free operation of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.

  4. Geological site selection studies in Precambrian crystalline rocks in Finland

    International Nuclear Information System (INIS)

    Vuorela, P.

    1988-01-01

    General geological investigations made since 1977 have determined the Finnish crystalline bedrock to be suitable for the final disposal of spent nuclear fuel. Regional investigations have been based mainly on already existing geological studies. Special attention has been paid to the geological setting of Finland: the Baltic Shield is stiff and stable and situated far outside the zones of volcanic and seismic activity. Present-day crustal movements in Finland are related to the land uplift process. Movements and possible faults in the bedrock follow fracture zones which divide the bedrock into mosaic-like blocks. Compared to small-scale geological maps, the bedrock blocks are often indicated as large granite rock formations which are less broken than the surrounding rocks, though the age of the granite formations is at least 1500 million years. The large bedrock blocks (20-300 km²) are divided into smaller units by fractures of different magnitudes, and these smaller bedrock units (5-20 km²) have been selected for further site selection investigations. At the first stage of investigations, 327 suitable regional bedrock blocks were identified on the basis of Landsat-1 winter and summer mosaics of Finland. After two years of investigations, 134 investigation areas were selected inside 61 bedrock blocks and classified into four priority classes, the first three of which were recommended for further investigations. The geological criteria used in the classification indicated clear differences between classes one and three; however, all classified areas are situated in large, rather homogeneous bedrock blocks, and more exact three-dimensional suitability errors may not be observed until deep boreholes have been made

  5. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    underestimation of wet-to-dry-season droughts and snow-related droughts. Furthermore, almost no composite droughts were simulated for slowly responding areas, while many multi-year drought events were expected in these systems.

    We conclude that most drought propagation processes are reasonably well reproduced by the ensemble mean of large-scale models in contrasting catchments in Europe. Challenges, however, remain in catchments with cold and semi-arid climates and catchments with large storage in aquifers or lakes. This leads to a high uncertainty in hydrological drought simulation at large scales. Improvement of drought simulation in large-scale models should focus on a better representation of hydrological processes that are important for drought development, such as evapotranspiration, snow accumulation and melt, and especially storage. Besides the more explicit inclusion of storage in large-scale models, the parametrisation of storage processes also requires attention, for example through a global-scale dataset on aquifer characteristics, improved large-scale datasets on other land characteristics (e.g. soils, land cover), and calibration/evaluation of the models against observations of storage (e.g. in snow, groundwater).

  6. Large Deviations for Two-Time-Scale Diffusions, with Delays

    International Nuclear Information System (INIS)

    Kushner, Harold J.

    2010-01-01

    We consider the problem of large deviations for a two-time-scale reflected diffusion process, possibly with delays in the dynamical terms. The Dupuis-Ellis weak convergence approach is used. It is perhaps the most intuitive and simplest for the problems of concern. The results have applications to the problem of approximating optimal controls for two-time-scale systems via use of the averaged equation.

  7. Large-Scale Consumption and Zero-Waste Recycling Method of Red Mud in Steel Making Process

    Directory of Open Access Journals (Sweden)

    Guoshan Ning

    2018-03-01

    To release the environmental pressure from the massive discharge of bauxite residue (red mud), a novel recycling method for red mud in the steel making process was investigated through high-temperature experiments and thermodynamic analysis. The results showed that after reduction roasting of the carbon-bearing red mud pellets at 1100–1200 °C for 12–20 min, metallic pellets were obtained with a metallization ratio of ≥88%. Then, the separation of slag and iron was achieved from the metallic pellets at 1550 °C, after composition adjustment targeting the primary crystal region of the 12CaO·7Al2O3 phase. After iron removal and composition adjustment, the smelting-separation slag had good smelting performance and desulfurization capability, which meets the demand for desulfurization flux in the steel making process. The pig iron quality meets the requirements of a high-quality raw material for steel making. By virtue of the huge scale and output of the steel industry, a large-scale consumption and zero-waste recycling method for red mud is proposed, which comprises roasting of the carbon-bearing red mud pellets in a rotary hearth furnace and smelting separation in an electric arc furnace after composition adjustment.

  8. Environmental impact assessments and geological repositories: A model process

    International Nuclear Information System (INIS)

    Webster, S.

    2000-01-01

    In a recent study carried out for the European Commission, the scope and application of environmental impact assessment (EIA) legislation and current EIA practice in European Union Member States and applicant countries of Central and Eastern Europe was investigated, specifically in relation to the geological disposal of radioactive waste. This paper reports the study's investigations into a model approach to EIA in the context of geological repositories, including the role of the assessment in the overall decision processes and public involvement. (author)

  9. Large-scale ground motion simulation using GPGPU

    Science.gov (United States)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations using the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of different simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumptions of the source models for future earthquakes. To overcome the problem of restricted computational resources, we introduced the use of GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation that has traditionally been conducted by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the functionality for GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (a parameter generation tool) and postprocessor tools (a filter tool, a visualization tool, and so on). The computational model is decomposed in the two horizontal directions and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First we performed a strong-scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times by using 4 and 16 GPUs, respectively. Next, we examined a weak-scaling test where the model sizes (numbers of grid points) are increased in proportion to the degree of parallelism (number of GPUs). The result showed almost perfect linearity up to a simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number
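
    Converting the quoted speed-ups into strong-scaling efficiencies (under the assumption that the single-GPU run is the baseline) makes the scaling behavior explicit:

```python
# Strong-scaling efficiency implied by the speed-ups quoted above,
# assuming a single-GPU baseline.
def efficiency(speedup, n_gpus):
    return speedup / n_gpus

for speedup, n in [(3.2, 4), (7.3, 16)]:
    print(f"{n:>2} GPUs: speed-up {speedup:4.1f}x -> "
          f"efficiency {efficiency(speedup, n):.0%}")
# 4 GPUs: 80%; 16 GPUs: 46% -- a typical strong-scaling fall-off as the
# per-GPU workload shrinks and communication starts to dominate.
```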

  10. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness in the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
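
    The scaling exponent underlying the analysis comes from standard Detrended Fluctuation Analysis; a minimal implementation (illustrative, not the authors' code) is sketched below.

```python
# Minimal Detrended Fluctuation Analysis (DFA): the local scaling-exponent
# variations used in the paper build on this basic procedure, here applied
# to a generic numeric sequence (e.g. a GC-content profile).
import numpy as np

def dfa_exponent(x, scales):
    y = np.cumsum(x - np.mean(x))             # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = 0.0
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            f2 += np.mean((seg - trend) ** 2)
        flucts.append(np.sqrt(f2 / n_win))
    # Scaling exponent = slope of log F(s) vs. log s.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(3)
noise = rng.standard_normal(2 ** 14)
scales = [16, 32, 64, 128, 256, 512]
print(f"alpha (white noise) = {dfa_exponent(noise, scales):.2f}")  # ~0.5
```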

  11. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Based on experience in operating and developing a large-scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application to large-scale models and data bases

  12. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  13. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments: at large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.

  14. Cell therapy-processing economics: small-scale microfactories as a stepping stone toward large-scale macrofactories.

    Science.gov (United States)

    Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A

    2018-03-01

    Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet the emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights a strong pressure to automate processes and to spread the quality-control cost burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as an exemplar on which to base manufacturing strategy.

  15. Large-scale landslide triggering mechanisms in Debre Sina area, Central Ethiopian Highlands at the western Afar rift margin

    Science.gov (United States)

    Kiros, T.; Wohnlich, S.; Hussien, B.

    2017-12-01

    The Central Highlands of Ethiopia have repeatedly experienced large-scale landslide events. The Debre Sina area is one of the most landslide-prone areas along the western Afar rift margin of Ethiopia and is frequently affected by large-scale, deep-seated landslides. Despite this, urban and rural development is currently taking place in almost all of the constricted valleys as well as on the imposing cliffs. Understanding the major triggering factors and failure mechanisms in the Debre Sina area and its surroundings is therefore of critical importance. In the present study, we investigate the landslides in the area using geological and topographic analysis, structural settings, geophysical investigation (seismic refraction), rainfall data and seismicity. Furthermore, petrographic and X-ray diffraction (XRD) analyses are conducted to characterize the mineral composition of the parent rock and its weathering products. The topographic analysis revealed that slopes of 10°-40°, at elevations of 1,800-2,500 m and with east and southeast aspects, are highly prone to landsliding. The seismic refraction survey identified four main layers of geomaterials containing a subsurface landslide anomaly: clay, loosely cemented colluvial sediments and highly weathered agglomerates (1000-1500 m/s) at 7-15 m depth; highly to moderately fractured porphyritic basalt, ignimbrite, rhyolite/trachyte and volcanic ash (1500-2500 m/s) at 10-30 m; moderately to slightly fractured ignimbrite, rhyolite/trachyte and basalt (2500-3500 m/s) at 30-50 m; and very strong, massive, fresh bedrock (>3500 m/s) from about 45 m depth. The large-scale, deep-seated landslide problem in the study area appears to be caused by heavy rainfall, complex geology and rugged topography, together with the presence of geological structures oriented parallel to the rift-margin N-S faults (NNE-SSW trending) of the central Ethiopian highlands and coinciding with the head scarp of the slides and

  16. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  17. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager carries a heavy workload, much time must be spent on management and maintenance, and the nodes easily fall into disorder. With thousands of nodes housed in large rooms, administrators can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? This article introduces ELFms for large-scale cluster systems and proposes a way to realize automatic management of such systems. (authors)

  18. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large-scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large-scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of the formation of large-scale structure. (orig.)

  19. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

    Almost all of the solar collectors installed in Taiwan, China are used to produce hot water for homeowners (residential systems), in which the collector area is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as having a collector area over 100 m²) were installed. They are used for rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of those large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to obtain feedback from the individual users. It was found that lack of experience in system design and maintenance is the key obstacle to reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  20. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  1. Preliminary Geological Survey on the Proposed Sites for the New Research Reactor

    International Nuclear Information System (INIS)

    Lim, In Cheol; Ha, J. J.; Oh, K. B.

    2010-12-01

    · Performing the preliminary geological survey on the proposed sites for the new research reactor through a technical service · Ordering the technical service from the Geological Society of Korea · Contents of the geological survey - confirmation of active faults - confirmation of large-scale fracture zones or weak zones - confirmation of unsuitable conditions related to groundwater - confirmation of historical seismicity and instrumental earthquake data · Synthesized analysis and a report meeting · Results of the geological survey - confirmation of the geological characteristics of the sites and derivation of the requirements for a future precise geological survey

  2. Micrometer-scale magnetic imaging of geological samples using a quantum diamond microscope

    Science.gov (United States)

    Glenn, D. R.; Fu, R. R.; Kehayias, P.; Le Sage, D.; Lima, E. A.; Weiss, B. P.; Walsworth, R. L.

    2017-08-01

    Remanent magnetization in geological samples may record the past intensity and direction of planetary magnetic fields. Traditionally, this magnetization is analyzed through measurements of the net magnetic moment of bulk millimeter- to centimeter-sized samples. However, geological samples are often mineralogically and texturally heterogeneous at submillimeter scales, with only a fraction of the ferromagnetic grains carrying the remanent magnetization of interest. Therefore, characterizing this magnetization in such cases requires a technique capable of imaging magnetic fields at fine spatial scales and with high sensitivity. To address this challenge, we developed a new instrument, based on nitrogen-vacancy centers in diamond, which enables direct imaging of magnetic fields due to both remanent and induced magnetization, as well as optical imaging, of room-temperature geological samples with spatial resolution approaching the optical diffraction limit. We describe the operating principles of this device, which we call the quantum diamond microscope (QDM), and report its optimized image-area-normalized magnetic field sensitivity (20 µT·µm/Hz^(1/2)), spatial resolution (5 µm), and field of view (4 mm), as well as trade-offs between these parameters. We also perform an absolute magnetic field calibration for the device in different modes of operation, including three-axis (vector) and single-axis (projective) magnetic field imaging. Finally, we use the QDM to obtain magnetic images of several terrestrial and meteoritic rock samples, demonstrating its ability to resolve spatially distinct populations of ferromagnetic carriers.

  3. Karhunen-Loève (PCA) based detection of multiple oscillations in multiple measurement signals from large-scale process plants

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Wickerhauser, M.V.

    2007-01-01

    In the perspective of optimizing the control and operation of large-scale process plants, it is important to detect and locate oscillations in the plants. This paper presents a scheme for detecting and localizing multiple oscillations in multiple measurements from such a large-scale power plant. The scheme is based on a Karhunen-Loève analysis of the data from the plant. The proposed scheme is subsequently tested on two sets of data: a set of synthetic data and a set of data from a coal-fired power plant. In both cases the scheme detects the beginning of the oscillation within only a few samples. In addition, the oscillation localization has shown its potential by localizing the oscillations in both data sets.
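
    A schematic of the underlying idea, assuming the plant data arrive as a (samples x sensors) matrix: project onto the leading Karhunen-Loève (principal) components and flag score series whose spectrum is dominated by a single peak. This is a simplified stand-in for the paper's scheme, and all names and thresholds are illustrative.

        import numpy as np

        def detect_oscillations(data, fs, n_components=3, peak_ratio=10.0):
            # data: (n_samples, n_sensors); fs: sampling frequency in Hz.
            x = data - data.mean(axis=0)
            # Karhunen-Loeve / PCA basis from the SVD of the centred data
            u, s, _ = np.linalg.svd(x, full_matrices=False)
            scores = u[:, :n_components] * s[:n_components]
            hits = []
            for k in range(n_components):
                spec = np.abs(np.fft.rfft(scores[:, k])) ** 2
                spec[0] = 0.0                       # ignore the DC bin
                peak = int(np.argmax(spec))
                if spec[peak] > peak_ratio * np.median(spec):
                    freqs = np.fft.rfftfreq(len(scores), d=1.0 / fs)
                    hits.append((freqs[peak], k))   # (frequency, component)
            return hits

        t = np.arange(0, 60, 0.1)
        data = np.random.default_rng(1).normal(size=(len(t), 8))
        data[:, 2] += 2.0 * np.sin(2 * np.pi * 0.5 * t)  # 0.5 Hz oscillation
        print(detect_oscillations(data, fs=10.0))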

  4. Large-scale Lurgi plant would be uneconomic: study group

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-21

    The Gas Council and the National Coal Board agreed that the building of a large-scale Lurgi plant on the basis of the study is not at present acceptable on economic grounds. The committee considered that new processes based on naphtha offered more economic sources of base- and peak-load production. Tables listing data provided in the contractors' design studies and a summary of the contractors' process designs are included.

  5. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large-scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large-scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues

  6. Review of uranium in Australia: its geology, exploration and world significance

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, J; Gaskell, J L; Spaargaren, F A; Butler, R D; Francis, T; Ross, J

    1973-01-01

    The aim of this report is to review and classify all known Australian uranium occurrences, to compare them with worldwide deposits and, on this basis, to derive conclusions on the uranium potential of various Australian geological environments. In an introductory section the properties, uses, chemistry, mineralogy and processing of uranium are summarised. An outline of modern prospecting techniques applicable in different geological environments is also presented. Foreign uranium deposits are classified and briefly discussed. World supply and demand to the year 2000 are analysed and the importance of Australia as a major uranium producer is considered. Uranium occurrences and deposits in all States are described in detail, and potentially uraniferous geological environments are reviewed. A large-scale map is presented which delineates these environments and indicates the areas considered to be the most prospective. Conclusions are drawn and recommendations made concerning the selection of areas which are considered to hold the most promise for the discovery of further uranium deposits.

  7. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  8. Large-scale melting and impact mixing on early-formed asteroids

    DEFF Research Database (Denmark)

    Greenwood, Richard; Barrat, J.-A.; Scott, Edward Robert Dalton

    Large-scale melting of asteroids and planetesimals is now known to have taken place extremely early in solar system history [1]. The first-generation bodies produced by this process would have been subject to rapid collisional reprocessing, leading in most cases to fragmentation and/or accretion ... the relationship between the different groups of achondrites [3, 4]. Here we present new oxygen isotope evidence concerning the role of large-scale melting and subsequent impact mixing in the evolution of three important achondrite groups: the main-group pallasites, mesosiderites and HEDs.

  9. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Full Text Available Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS was gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  10. Survey of high-voltage pulse technology suitable for large-scale plasma source ion implantation processes

    International Nuclear Information System (INIS)

    Reass, W.A.

    1994-01-01

    Many new plasma process ideas are finding their way from the research lab to the manufacturing plant floor. These require high-voltage (HV) pulse power equipment, which must be optimized for the application, system efficiency, and reliability. Although no single HV pulse technology is suitable for all plasma processes, various classes of high-voltage pulsers may offer greater versatility and economy to the manufacturer. Technology developed for existing radar and particle accelerator modulator power systems can be utilized to develop a modern large-scale plasma source ion implantation (PSII) system. HV pulse networks can be broadly divided into two classes: those that generate the voltage directly, and those that use some type of pulse-forming network and step-up transformer. This article examines these HV pulse technologies and discusses their applicability to the specific PSII process. Typical systems reviewed include high-power solid state, hard-tube systems such as crossed-field "hollow beam" switch tubes and planar tetrodes, and "soft"-tube systems with crossatrons and thyratrons. Results are tabulated and suggestions provided for a particular PSII process

  11. Synthetic geology - Exploring the "what if?" in geology

    Science.gov (United States)

    Klump, J. F.; Robertson, J.

    2015-12-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Synthetic geology does not attempt to model the real world in terms of geological processes with all their uncertainties; rather, it offers an artificial geological data source with fully known properties. On the basis of this artificial geology, we can simulate geological sampling by established or future technologies to study the resulting dataset. Conducting these experiments in silico removes the constraints of testing in the field or in production, and provides us with a known ground truth against which the steps in a data analysis and integration workflow can be validated. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the combination of both, and it enables us to test many "what if?" questions, both in geology and in data engineering. What would we be able to see if we could obtain data at higher resolution? How would real-time data analysis change sampling strategies? Can our data infrastructure handle many new real-time data streams? What feature engineering can be deduced for machine-learning approaches? By providing a 'data sandbox' able to scale to realistic geological scenarios, we hope to start answering some of these questions.

  12. Digitizing rocks: Standardizing the process of geologic description with workstations

    International Nuclear Information System (INIS)

    Saunders, M.R.; Shields, J.A.; Taylor, M.R.

    1995-01-01

    In the drive to squeeze the most value from every dollar spent on exploration and development, increasing use is being made of stored data through methods that rely on the completeness and accuracy of the database for their usefulness. Although many types of engineering data are available to this process, geologic data, especially those collected at a sufficiently detailed level to show reservoir heterogeneity, are often unavailable to later workers in any useful form. Traditionally, most wellsite geologic data are recorded on worksheets or in notebooks, from which summary data are often transferred to computers. The only changes in recent years have been that computer-drafted lithology logs have superseded hand-drawn logs; in some exceptions, some of the plotting data may be held in a simple database. These descriptions and analyses, gathered at considerable cost and capable of showing significant petrological detail, are not available to the whole field-development process. The authors set out to tackle these problems of limited usefulness and developed a system that would deliver quality geologic data deep into the field of play in a form that is easy to select and is integrated with existing models

  13. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with stronger activity of the small scales on the low-speed side of the mixing layer, and with reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  14. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    Science.gov (United States)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with an accumulated amount of more than 2,900 mm within a continuous 3-day event, occurred in southern Taiwan during Typhoon Morakot in August 2009 and has been recognized as a serious natural hazard. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. According to the satellite image processing and monitoring project conducted by the Soil & Water Conservation Bureau after Typhoon Morakot, more than 10,904 landslide sites with a total sliding area of 18,113 ha were identified. Field investigations of all landslide areas were also executed in this research, on the basis of disaster type, scale and location in relation to the topographic conditions, colluvial soil characteristics, bedrock formation and geological structure after the Morakot hazard. The mechanism, characteristics and behavior of these large-scale landslides combined with debris-flow disasters are analyzed and investigated to work out the interaction of the factors mentioned above and to identify the disaster extent of rainfall-induced landslides during the period of this study. In order to reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy for the hazard mitigation system should be set up as soon as possible, taking into consideration slope-land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis and disaster risk assessment. As a result, this 3-year research has focused on field investigation using GPS/GIS/RS integration, studies of the mechanism and behavior of rainfall-induced landslide occurrence, and the establishment of a disaster database and hazard mitigation system. This project has become an important issue of serious concern to the government and the people of Taiwan. Hopefully, the results of this research can be used as guidance for disaster prevention and

  15. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  16. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and to minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming, without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the antenna-selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computational complexity of the conventional exhaustive search method increases significantly when large-scale antenna arrays are employed in the system. This is particularly useful for antenna selection in large-scale MIMO communication systems.
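
    The complexity gap is easy to make concrete: exhaustively selecting k of N transmit antennas scores C(N, k) subsets, while a greedy heuristic scores only O(N·k) candidates. A sketch under an i.i.d. Rayleigh channel assumption follows; this is a generic greedy capacity-based selector, not the paper's joint selection-and-beamforming method.

        import numpy as np
        from math import comb

        def capacity(h, snr):
            # log2 det(I + snr/n_t * H^H H) for the selected columns of H
            n_t = h.shape[1]
            g = h.conj().T @ h
            return np.log2(np.linalg.det(np.eye(n_t) + (snr / n_t) * g).real)

        def greedy_select(h, k, snr):
            # pick k transmit antennas (columns of h) one at a time
            chosen, remaining = [], list(range(h.shape[1]))
            for _ in range(k):
                best = max(remaining,
                           key=lambda j: capacity(h[:, chosen + [j]], snr))
                chosen.append(best)
                remaining.remove(best)
            return chosen

        rng = np.random.default_rng(2)
        n_r, n_t, k, snr = 4, 32, 4, 10.0
        h = (rng.normal(size=(n_r, n_t))
             + 1j * rng.normal(size=(n_r, n_t))) / np.sqrt(2)
        sel = greedy_select(h, k, snr)
        print(sel, capacity(h[:, sel], snr))
        print(comb(n_t, k))   # 35960 subsets for an exhaustive search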

  17. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    Science.gov (United States)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

    Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize α-Si3N4 nanowires at very large scale (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas-precursor supersaturation and of the liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field-emission scanning electron microscopy, transmission electron microscopy and room-temperature photoluminescence measurements. The yield of the products relates not only to the reaction temperature (thermodynamic condition) but also to the distribution of gas precursors (kinetic condition). As revealed in this research, by controlling the gas diffusion process the yield of the nanowire products could be greatly improved. The experimental results indicate that supersaturation is the dominant factor in the as-designed system, rather than the catalyst. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products would have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of the α-Si3N4 shows a blue shift, which could be valuable for future applications in blue-green emitting devices. There is no doubt that large-scale products are the basis of these applications.

  18. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. Conformity was first observed among galaxies within the same halos (“one-halo conformity”). One-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities even though they do not share common halos ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity: they have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and now happen to reside within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity, as the strong tides suppress star formation in these galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  19. Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA

    Science.gov (United States)

    Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.

    2011-01-01

    Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales, yet a risk management program aims to maximize the chance of project success by assessing, monitoring, and minimizing all risks in a consistent framework. The 150,000 km² Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any of these values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation. Formally, industry defines risk (more precisely, risk criticality) as the product L*S, the Likelihood multiplied

  20. Innovative Techniques for Large-Scale Collection, Processing, and Storage of Eelgrass (Zostera marina) Seeds

    National Research Council Canada - National Science Library

    Orth, Robert J; Marion, Scott R

    2007-01-01

    .... Although methods for hand-collecting, processing and storing eelgrass seeds have advanced to match the scale of collections, the number of seeds collected has limited the scale of restoration efforts...

  1. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    Full Text Available This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable --- with respect to parameters, disturbances, and states --- while some may not. The state estimation process is then carried out for the observable subsystems, and the optimal number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway, which is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on the available measurements.
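
    One building block of such an analysis can be sketched directly. A necessary condition for structural observability is that every state can reach some measured output through the system's influence graph; the minimal check below, assuming the structure is given as sparsity patterns, tests only this output-reachability condition and ignores the accompanying rank condition, so it is not the paper's full workflow. All names are illustrative.

        import networkx as nx

        def states_not_output_connected(a_pattern, measured_states):
            # a_pattern: set of (i, j) pairs meaning state j influences
            # state i (the sparsity pattern of the state-transition map).
            # measured_states: states that appear directly in an output.
            g = nx.DiGraph()
            for i, j in a_pattern:
                g.add_edge(j, i)              # influence flows j -> i
            measured = set(measured_states)
            dead = []
            for node in g.nodes:
                if node in measured:
                    continue
                if not measured & nx.descendants(g, node):
                    dead.append(node)         # cannot reach any output
            return dead

        # chain x0 -> x1 -> x2 (x2 measured), plus x3 cut off from outputs
        a = {(1, 0), (2, 1), (3, 3)}
        print(states_not_output_connected(a, measured_states={2}))  # [3]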

  2. Key scientific challenges in geological disposal of high level radioactive waste

    International Nuclear Information System (INIS)

    Wang Ju

    2007-01-01

    The geological disposal of high-level radioactive waste is a challenging task facing the scientific and technical world. This paper introduces the latest progress of high-level radioactive waste disposal programs in the world and discusses the following key scientific challenges: (1) precise prediction of the evolution of a repository site; (2) characteristics of the deep geological environment; (3) behaviour of deep rock mass, groundwater and engineering materials under coupled conditions (intermediate to high temperature, geostress, hydraulic, chemical, biological and radiation processes, etc.); (4) geochemical behaviour of transuranic radionuclides at low concentration and their migration with groundwater; and (5) safety assessment of the disposal system. Several large-scale research projects and several hot topics related to high-level waste disposal are also introduced. (authors)

  3. Planetary Geologic Mapping Handbook - 2009

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete

  4. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity-matrix construction and the affinity propagation algorithm. The shared-memory architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
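
    The similarity-matrix stage parallelizes naturally because each row depends only on the input data. A minimal shared-memory sketch using Python's multiprocessing module follows; the paper's own implementation and its distributed affinity-propagation stage are more involved, and all names here are illustrative.

        import numpy as np
        from multiprocessing import Pool

        DATA = None   # set once per worker by the pool initializer

        def _init(data):
            global DATA
            DATA = data

        def _row(i):
            # negative squared Euclidean distance: the usual AP similarity
            d = DATA - DATA[i]
            return -np.sum(d * d, axis=1)

        def similarity_matrix(data, processes=4):
            with Pool(processes, initializer=_init, initargs=(data,)) as pool:
                rows = pool.map(_row, range(len(data)))
            return np.vstack(rows)

        if __name__ == "__main__":
            x = np.random.default_rng(3).normal(size=(2000, 16))
            print(similarity_matrix(x).shape)   # (2000, 2000)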

  5. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe-vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like a real vehicle. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  6. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  7. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue for large-scale flood simulations intended for real-time response in disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers because of the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based, high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transfer between the GPU and the CPU (Central Processing Unit) with minimum overhead, and both computation and data were then offloaded from the CPU to the GPU, exploiting the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has a bright application prospect for dynamic inundation risk identification and disaster assessment.
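
    The per-cell flux evaluation is the part such a GPU port parallelizes. A much-reduced sketch of the numerical core follows, using a 1-D structured grid and a Lax-Friedrichs flux instead of the paper's 2-D unstructured Godunov scheme; all values are illustrative.

        import numpy as np

        G = 9.81   # gravitational acceleration

        def flux(h, hu):
            # physical flux of the 1-D shallow water equations
            u = hu / np.maximum(h, 1e-8)
            return np.array([hu, hu * u + 0.5 * G * h * h])

        def sw_step(h, hu, dt, dx):
            # one Lax-Friedrichs finite-volume step
            q = np.array([h, hu])
            f = flux(h, hu)
            f_iface = (0.5 * (f[:, :-1] + f[:, 1:])
                       - 0.5 * dx / dt * (q[:, 1:] - q[:, :-1]))
            q_new = q.copy()
            q_new[:, 1:-1] -= dt / dx * (f_iface[:, 1:] - f_iface[:, :-1])
            return q_new[0], q_new[1]

        # dam-break initial condition
        n = 400
        h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
        hu = np.zeros(n)
        for _ in range(100):
            h, hu = sw_step(h, hu, dt=0.001, dx=0.05)
        print(h.min(), h.max())

    On a GPU, each interface flux and each cell update is an independent thread's work, which is why the scheme scales; the data-management point in the abstract amounts to keeping the state and flux arrays resident on the device between steps instead of copying them back and forth.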

  8. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  9. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  10. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer, and close to unity, if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to the large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small-scale) and from low-pass-filtered (large-scale) velocity fields tends to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
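
    The analysis described here reduces to correlating a low-pass-filtered (large-scale) field, or its gradient, with a local measure of small-scale activity. A 1-D schematic follows, assuming a Gaussian filter stands in for the scale separation; it illustrates the procedure, not the physical result.

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        rng = np.random.default_rng(4)
        u = rng.normal(size=100000)               # stand-in velocity signal
        u_large = gaussian_filter1d(u, sigma=50)  # large scales (low-pass)
        du_large = np.gradient(u_large)           # large-scale gradient
        small = u - u_large                       # small-scale residual
        # local small-scale activity: windowed RMS of the residual
        activity = np.sqrt(gaussian_filter1d(small ** 2, sigma=50))

        def corr(a, b):
            a = a - a.mean()
            b = b - b.mean()
            return float(np.sum(a * b)
                         / np.sqrt(np.sum(a * a) * np.sum(b * b)))

        # correlate the activity with the large-scale field and its gradient
        print(corr(u_large, activity), corr(du_large, activity))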

  11. Large temporal scale and capacity subsurface bulk energy storage with CO2

    Science.gov (United States)

    Saar, M. O.; Fleming, M. R.; Adams, B. M.; Ogland-Hand, J.; Nelson, E. S.; Randolph, J.; Sioshansi, R.; Kuehn, T. H.; Buscheck, T. A.; Bielicki, J. M.

    2017-12-01

    Decarbonizing energy systems by increasing the penetration of variable renewable energy (VRE) technologies requires efficient and short- to long-term energy storage. Very large amounts of energy can be stored in the subsurface as heat and/or pressure energy in order to provide both short- and long-term (seasonal) storage, depending on the implementation. This energy storage approach can be quite efficient, especially where geothermal energy is naturally added to the system. Here, we present subsurface heat and/or pressure energy storage with supercritical carbon dioxide (CO2) and discuss the system's efficiency, deployment options, as well as its advantages and disadvantages, compared to several other energy storage options. CO2-based subsurface bulk energy storage has the potential to be particularly efficient and large-scale, both temporally (i.e., seasonal) and spatially. The latter refers to the amount of energy that can be stored underground, using CO2, at a geologically conducive location, potentially enabling storing excess power from a substantial portion of the power grid. The implication is that it would be possible to employ centralized energy storage for (a substantial part of) the power grid, where the geology enables CO2-based bulk subsurface energy storage, whereas the VRE technologies (solar, wind) are located on that same power grid, where (solar, wind) conditions are ideal. However, this may require reinforcing the power grid's transmission lines in certain parts of the grid to enable high-load power transmission from/to a few locations.

  12. Planetary Geologic Mapping Handbook - 2010. Appendix

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by

  13. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package 'ATLAS' has been developed for handling large-scale matrices. The package comprises four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)
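
    A minimal sketch of the two core numerical tasks such a package bundles, with modern NumPy/SciPy standing in for the original Fortran subroutines (this illustrates the tasks, not the ATLAS API itself):

    ```python
    # The two core tasks an ATLAS-like package bundles: a linear system
    # A x = b and a general eigenvalue problem A v = lambda B v.
    import numpy as np
    from scipy.linalg import eig

    rng = np.random.default_rng(0)
    n = 500                            # stands in for plasma-fluid sized systems
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    x = np.linalg.solve(A, b)          # linear simultaneous equations
    residual = np.linalg.norm(A @ x - b)

    B = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    eigvals, _ = eig(A, B)             # general eigenvalue problem A v = lambda B v

    print(f"||Ax - b|| = {residual:.2e}, largest |lambda| = {abs(eigvals).max():.3f}")
    ```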

  14. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  15. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project, a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  16. VOSGES, a long and rich geologic history

    Science.gov (United States)

    Dominique, Carteaux; Cyrille, Delangle; Sophie, Demangel

    2015-04-01

    The study of geology in science classes is often too theoretical and abstract for pupils. How can teachers make the link between samples of rocks observed in a practical class and the geologic story of the region? There is nothing better than outdoor education to establish a relationship between the rock observed at macroscopic and microscopic scale in the classroom and the outcrop and landscape scales in the field: all of them are the result of a fascinating geologic history. Our pupils are lucky enough to live at the heart of a modest mountain massif with a very rich geologic story: the Vosges massif in the east of France. During two field trips we show the students the following tectonic processes: accretion, at the scale of the landscape, with the Rhine Graben (tectonic and volcanic markers); obduction, observed in the ophiolites of the Thalhorn massif (peridotite, gabbro and deep-marine sedimentary rocks); collision, illustrated by numerous sites such as the schists of Steige, the phyllites of Villé and the gneisses of Climont; and subduction, captured by studying outcrops of magmatic rocks within the continental crust (andesite, diorite, granodiorite). At each stop we ask the students to place a hand sample within its more global context, so the theory becomes reality. A study of thin sections of the rocks observed in the field concludes these trips, so that various scales of understanding are covered. The long and rich geologic history of the Vosges can be reconstructed over hundreds of millions of years, giving another dimension to the living environment of our pupils.

  17. On BLM scale fixing in exclusive processes

    International Nuclear Information System (INIS)

    Anikin, I.V.; Pire, B.; Szymanowski, L.; Teryaev, O.V.; Wallon, S.

    2005-01-01

    We discuss the BLM scale fixing procedure in exclusive electroproduction processes in the Bjorken regime with rather large x_B. We show that in the case of vector meson production, dominated here by quark exchange, the usual way of applying the BLM method fails due to singularities in the equations fixing the BLM scale. We argue that the BLM scale should be extracted from the squared amplitudes, which are directly related to observables. (orig.)

  18. On BLM scale fixing in exclusive processes

    Energy Technology Data Exchange (ETDEWEB)

    Anikin, I.V. [JINR, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Universite Paris-Sud, LPT, Orsay (France); Pire, B. [Ecole Polytechnique, CPHT, Palaiseau (France); Szymanowski, L. [Soltan Institute for Nuclear Studies, Warsaw (Poland); Univ. de Liege, Inst. de Physique, Liege (Belgium); Teryaev, O.V. [JINR, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Wallon, S. [Universite Paris-Sud, LPT, Orsay (France)

    2005-07-01

    We discuss the BLM scale fixing procedure in exclusive electroproduction processes in the Bjorken regime with rather large x_B. We show that in the case of vector meson production, dominated here by quark exchange, the usual way of applying the BLM method fails due to singularities in the equations fixing the BLM scale. We argue that the BLM scale should be extracted from the squared amplitudes, which are directly related to observables. (orig.)

  19. Episodic events in long-term geological processes: A new classification and its applications

    Directory of Open Access Journals (Sweden)

    Dmitry A. Ruban

    2018-03-01

    Full Text Available Long-term geological processes are usually described with curves reflecting continuous changes in characteristic parameters through geological history, and such curves can be employed directly for the recognition of episodic (relatively long-term) events linked to these changes. Episodic events can be classified into several categories according to their scale (ordinary and anomalous events), "shape" (positive, negative, and neutral events), and relation to long-term trend change (successive, interruptive, facilitative, stabilizing, transformative, increasing, and decreasing events). Many types of these events can be defined depending on the combination of the above-mentioned patterns; spatial rank, duration, and origin can also be considered in describing them. The proposed classification can be applied to events in real long-term geological processes, including global sea-level changes, biodiversity dynamics, changes in the number of lithospheric plates, and palaeoclimate changes. Several case examples prove the usefulness of the classification. It is established that the Early Valanginian (Early Cretaceous) eustatic lowstand (the lowest position of sea level in the entire Cretaceous) was a negative but ordinary and merely interruptive event. In another case, it becomes clear that only the end-Ordovician and Permian/Triassic mass extinctions transformed the trends of biodiversity dynamics (from increase to decrease and from decrease to increase, respectively), and only the Cretaceous/Paleogene mass extinction was a truly anomalous event on the Phanerozoic biodiversity curve. New palaeontological data are employed to reconstruct the diversity dynamics of brachiopods in Germany (without the Alps) and the Swiss Jura Mountains. Further interpretation of both diversity curves implies that the Early Toarcian mass extinction affected the regional brachiopod faunas strongly, but this event was only decreasing
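
    The typology above lends itself to simple rule-based tagging of a parameter curve. The sketch below is an illustrative reading of the classification, not the paper's formal definitions; the z-score threshold and function names are invented:

    ```python
    # Tag local extrema on a long-term parameter curve by "shape"
    # (positive/negative) and "scale" (ordinary/anomalous).
    import numpy as np

    def classify_events(t, y, z_anomalous=2.0):
        events = []
        dy = np.diff(y)
        for i in range(1, len(y) - 1):
            if dy[i - 1] * dy[i] < 0:                  # local extremum
                shape = "positive" if dy[i - 1] > 0 else "negative"
                z = (y[i] - y.mean()) / y.std()
                scale = "anomalous" if abs(z) >= z_anomalous else "ordinary"
                events.append((t[i], shape, scale))
        return events

    t = np.linspace(0, 100, 401)                       # e.g. Myr
    y = np.sin(t / 8.0)                                # smooth long-term signal
    y[200] -= 3.0                                      # one anomalous negative excursion
    for when, shape, scale in classify_events(t, y):
        print(f"t = {when:6.1f}: {shape}, {scale}")
    ```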

  20. Thermal oxidation of nuclear graphite: A large scale waste treatment option

    Science.gov (United States)

    Jones, Abbie N.; Marsden, Barry J.

    2017-01-01

    This study investigated the laboratory-scale thermal oxidation of nuclear graphite as a proof-of-concept for the treatment and decommissioning of reactor cores on a larger industrial scale. If shown to be effective, this technology could have promising international significance, with a considerable impact on the nuclear waste management problem currently facing many countries worldwide. Thermal treatment of such graphite waste is seen as advantageous since it decouples waste treatment from the need for an operational Geological Disposal Facility (GDF). Particulate samples of Magnox Reactor Pile Grade-A (PGA) graphite were oxidised in both air and 60% O2 over the temperature range 400–1200°C. Oxidation rates were found to increase with temperature, with a particular rise between 700–800°C, suggesting a change in oxidation mechanism. A second increase in oxidation rate was observed between 1000–1200°C and was found to correspond to a large increase in the CO/CO2 ratio, as confirmed through gas analysis. Increasing the oxidant flow rate gave a linear increase in oxidation rate up to a certain point, and maximum rates of 23.3 and 69.6 mg/min for air and 60% O2, respectively, were achieved at a flow of 250 ml/min and a temperature of 1000°C. These promising results show that large-scale thermal treatment could be a potential option for the decommissioning of graphite cores, although the design of the plant would need careful consideration in order to achieve optimum efficiency and throughput. PMID:28793326
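
    Rate-versus-temperature data of this kind are conventionally analysed with an Arrhenius plot; a change of slope in ln(rate) versus 1/T, such as the one between 700–800°C, signals a change in the rate-limiting mechanism. A minimal sketch, with rate values invented for illustration rather than taken from the study:

    ```python
    # Arrhenius analysis of oxidation rates within one mechanistic regime.
    import numpy as np
    from scipy.stats import linregress

    T_C = np.array([400, 500, 600, 700])         # degC, single-regime window
    rate = np.array([0.02, 0.15, 0.80, 3.1])     # mg/min, illustrative only

    T_K = T_C + 273.15
    # ln k = ln A - Ea/(R T), so the slope of ln(rate) vs 1/T is -Ea/R
    slope, intercept, *_ = linregress(1.0 / T_K, np.log(rate))
    Ea = -slope * 8.314                          # J/mol
    print(f"Apparent activation energy ~ {Ea / 1000:.0f} kJ/mol")
    ```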

  1. Thermal oxidation of nuclear graphite: A large scale waste treatment option.

    Directory of Open Access Journals (Sweden)

    Alex Theodosiou

    Full Text Available This study investigated the laboratory-scale thermal oxidation of nuclear graphite as a proof-of-concept for the treatment and decommissioning of reactor cores on a larger industrial scale. If shown to be effective, this technology could have promising international significance, with a considerable impact on the nuclear waste management problem currently facing many countries worldwide. Thermal treatment of such graphite waste is seen as advantageous since it decouples waste treatment from the need for an operational Geological Disposal Facility (GDF). Particulate samples of Magnox Reactor Pile Grade-A (PGA) graphite were oxidised in both air and 60% O2 over the temperature range 400–1200°C. Oxidation rates were found to increase with temperature, with a particular rise between 700–800°C, suggesting a change in oxidation mechanism. A second increase in oxidation rate was observed between 1000–1200°C and was found to correspond to a large increase in the CO/CO2 ratio, as confirmed through gas analysis. Increasing the oxidant flow rate gave a linear increase in oxidation rate up to a certain point, and maximum rates of 23.3 and 69.6 mg/min for air and 60% O2, respectively, were achieved at a flow of 250 ml/min and a temperature of 1000°C. These promising results show that large-scale thermal treatment could be a potential option for the decommissioning of graphite cores, although the design of the plant would need careful consideration in order to achieve optimum efficiency and throughput.

  2. Integrated path towards geological storage

    International Nuclear Information System (INIS)

    Bouchard, R.; Delaytermoz, A.

    2004-01-01

    Among solutions to contribute to CO2 emissions mitigation, sequestration is a promising path that presents the main advantage of being able to cope with the large volumes at stake given the growing energy demand. Of particular importance, geological storage has widely been seen as an effective solution for large CO2 sources like power plants or refineries. Many R and D projects have been initiated, whereby research institutes, government agencies and end-users achieve an effective collaboration. So far, progress has been made towards reinjection of CO2, in understanding and then predicting the phenomena and fluid dynamics inside the geological target, while monitoring the expansion of the CO2 bubble in the case of demonstration projects. A question arises, however, when talking about sequestration, namely the time scale to be taken into account. Time is indeed of the essence, which points out the need to understand leakage as well as trapping mechanisms. It is therefore of prime importance to be able to predict the fate of the injected fluids, in an accurate manner and over a relevant period of time. On the grounds of geology, four items are involved in geological storage reliability: the matrix itself, which is the recipient of the injected fluids; the seal, that is, the mechanistic trap preventing the injected fluids from flowing upward and escaping; the lower part of the concerned structure, usually an aquifer, which can be a migration pathway for dissolved fluids; and the man-made injection hole, the well, whose characteristics should be as good as those of the geological formation itself. These issues call for specific competencies such as reservoir engineering, geology and hydrodynamics, mineral chemistry, geomechanics, and well engineering. These competencies, even if put to use to a large extent in the oil industry, have never been connected with the reliability of geological storage as the ultimate goal. This paper aims at providing an introduction to these

  3. Nitrate reduction in geologically heterogeneous catchments — A framework for assessing the scale of predictive capability of hydrological models

    International Nuclear Information System (INIS)

    Refsgaard, Jens Christian; Auken, Esben; Bamberg, Charlotte A.; Christensen, Britt S.B.; Clausen, Thomas; Dalgaard, Esben; Effersø, Flemming; Ernstsen, Vibeke; Gertz, Flemming; Hansen, Anne Lausten; He, Xin; Jacobsen, Brian H.; Jensen, Karsten Høgh; Jørgensen, Flemming; Jørgensen, Lisbeth Flindt; Koch, Julian; Nilsson, Bertel; Petersen, Christian; De Schepper, Guillaume; Schamper, Cyril

    2014-01-01

    In order to fulfil the requirements of the EU Water Framework Directive, nitrate load from agricultural areas to surface water in Denmark needs to be reduced by about 40%. The regulations imposed until now have been uniform, i.e. the same restrictions for all areas independent of the subsurface conditions. Studies have shown that, on a national basis, about 2/3 of the nitrate leaching from the root zone is reduced naturally, through denitrification, in the subsurface before reaching the streams. It is therefore more cost-effective to identify robust areas, where nitrate leaching through the root zone is reduced in the saturated zone before reaching the streams, and vulnerable areas, where no subsurface reduction takes place, and then impose regulations/restrictions only on the vulnerable areas. Distributed hydrological models can make predictions at grid scale, i.e. at much smaller scales than the entire catchment. However, as distributed models often do not include local-scale hydrogeological heterogeneities, they are typically not able to make accurate predictions at scales smaller than those at which they are calibrated. We present a framework for assessing nitrate reduction in the subsurface and for assessing at which spatial scales modelling tools have predictive capabilities. A new instrument has been developed for airborne geophysical measurements, Mini-SkyTEM, dedicated to identifying geological structures and heterogeneities with lateral and vertical resolutions of 30–50 m and 2 m, respectively, in the upper 30 m. The geological heterogeneity and uncertainty are further analysed by use of the geostatistical software TProGS by generating stochastic geological realisations that are soft-conditioned against the geophysical data. Finally, the flow paths within the catchment are simulated by use of the MIKE SHE hydrological modelling system for each of the geological models generated by TProGS, and the prediction uncertainty is characterised by the variance between the
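
    A toy version of the framework's core loop: generate an ensemble of geological realizations, "simulate" a per-cell prediction for each, and check how the ensemble variance shrinks as predictions are aggregated to coarser scales. The smooth random field below stands in for a TProGS realisation and the per-cell "reduction fraction" for a MIKE SHE simulation; all names and numbers are illustrative:

    ```python
    # Ensemble variance vs aggregation scale for stochastic realizations.
    import numpy as np

    rng = np.random.default_rng(42)
    n_real, n = 50, 128                            # realizations, grid cells per side

    fields = []
    for _ in range(n_real):
        # spatially correlated "reduction fraction" field in [0, 1]
        noise = rng.standard_normal((n, n))
        k = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(k, k)
        damp = 1.0 / (1.0 + 400.0 * (kx**2 + ky**2))   # smooth = long correlation
        f = np.real(np.fft.ifft2(np.fft.fft2(noise) * damp))
        fields.append((f - f.min()) / (f.max() - f.min()))
    fields = np.array(fields)

    for block in (1, 4, 16, 64):                   # aggregation scale in cells
        coarse = fields.reshape(n_real, n // block, block,
                                n // block, block).mean(axis=(2, 4))
        print(f"block {block:3d}: mean ensemble std = {coarse.std(axis=0).mean():.3f}")
    ```

    The ensemble spread falls as the aggregation block grows, which is the sense in which a model can have predictive capability at a coarse scale while being uncertain cell by cell.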

  4. Nitrate reduction in geologically heterogeneous catchments — A framework for assessing the scale of predictive capability of hydrological models

    Energy Technology Data Exchange (ETDEWEB)

    Refsgaard, Jens Christian, E-mail: jcr@geus.dk [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Auken, Esben [Department of Earth Sciences, Aarhus University (Denmark); Bamberg, Charlotte A. [City of Aarhus (Denmark); Christensen, Britt S.B. [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Clausen, Thomas [DHI, Hørsholm (Denmark); Dalgaard, Esben [Department of Earth Sciences, Aarhus University (Denmark); Effersø, Flemming [SkyTEM Aps, Beder (Denmark); Ernstsen, Vibeke [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Gertz, Flemming [Knowledge Center for Agriculture, Skejby (Denmark); Hansen, Anne Lausten [Department of Geosciences and Natural Resource Management, University of Copenhagen (Denmark); He, Xin [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Jacobsen, Brian H. [Department of Food and Resource Economics, University of Copenhagen (Denmark); Jensen, Karsten Høgh [Department of Geosciences and Natural Resource Management, University of Copenhagen (Denmark); Jørgensen, Flemming; Jørgensen, Lisbeth Flindt [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Koch, Julian [Department of Geosciences and Natural Resource Management, University of Copenhagen (Denmark); Nilsson, Bertel [Geological Survey of Denmark and Greenland (GEUS) (Denmark); Petersen, Christian [City of Odder (Denmark); De Schepper, Guillaume [Université Laval, Québec (Canada); Schamper, Cyril [Department of Earth Sciences, Aarhus University (Denmark); and others

    2014-01-01

    In order to fulfil the requirements of the EU Water Framework Directive, nitrate load from agricultural areas to surface water in Denmark needs to be reduced by about 40%. The regulations imposed until now have been uniform, i.e. the same restrictions for all areas independent of the subsurface conditions. Studies have shown that, on a national basis, about 2/3 of the nitrate leaching from the root zone is reduced naturally, through denitrification, in the subsurface before reaching the streams. It is therefore more cost-effective to identify robust areas, where nitrate leaching through the root zone is reduced in the saturated zone before reaching the streams, and vulnerable areas, where no subsurface reduction takes place, and then impose regulations/restrictions only on the vulnerable areas. Distributed hydrological models can make predictions at grid scale, i.e. at much smaller scales than the entire catchment. However, as distributed models often do not include local-scale hydrogeological heterogeneities, they are typically not able to make accurate predictions at scales smaller than those at which they are calibrated. We present a framework for assessing nitrate reduction in the subsurface and for assessing at which spatial scales modelling tools have predictive capabilities. A new instrument has been developed for airborne geophysical measurements, Mini-SkyTEM, dedicated to identifying geological structures and heterogeneities with lateral and vertical resolutions of 30–50 m and 2 m, respectively, in the upper 30 m. The geological heterogeneity and uncertainty are further analysed by use of the geostatistical software TProGS by generating stochastic geological realisations that are soft-conditioned against the geophysical data. Finally, the flow paths within the catchment are simulated by use of the MIKE SHE hydrological modelling system for each of the geological models generated by TProGS, and the prediction uncertainty is characterised by the variance between the

  5. A SKOS-based multilingual thesaurus of geological time scale for interopability of online geological maps

    NARCIS (Netherlands)

    Ma, X.; Carranza, E.J.M.; Wu, C.; Meer, F.D. van der; Liu, G.

    2011-01-01

    The usefulness of online geological maps is hindered by linguistic barriers. Multilingual geoscience thesauri alleviate linguistic barriers of geological maps. However, the benefits of multilingual geoscience thesauri for online geological maps are less studied. In this regard, we developed a

  6. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and improve the simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a
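
    The kind of per-processing-element statistic such instrumentation gathers can be reduced to a simple commit-efficiency counter. ROSS itself is a C framework; the structure and names below are invented for illustration:

    ```python
    # Per-PE rollback statistics of the kind an optimistic PDES engine reports.
    from dataclasses import dataclass

    @dataclass
    class PEStats:
        processed: int = 0      # events executed optimistically
        rolled_back: int = 0    # events undone after an out-of-order arrival

        def commit_efficiency(self) -> float:
            committed = self.processed - self.rolled_back
            return committed / self.processed if self.processed else 1.0

    pes = [PEStats(processed=10_000, rolled_back=r) for r in (150, 2_400, 800)]
    for i, pe in enumerate(pes):
        print(f"PE {i}: efficiency = {pe.commit_efficiency():.1%}")
    # A low-efficiency PE is a tuning target: fewer rollbacks -> better runtime.
    ```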

  7. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  8. A state geological survey commitment to environmental geology - the Texas Bureau of Economic Geology

    International Nuclear Information System (INIS)

    Wermund, E.G.

    1990-01-01

    In several Texas environmental laws, the Bureau of Economic Geology is designated as a planning participant and review agency in the process of fulfilling environmental laws. Two examples are legislation on the reclamation of surface mines and the regulation of processing low-level radioactive wastes. The Bureau is also the principal geological reviewer of all Environmental Assessments and Environmental Impact Statements which the Office of the Governor circulates for state review on all major developmental activities in Texas. The BEG continues its strong interest in environmental geology. In February 1988, it recommitted its Land Resources Laboratory, initiated in 1974, toward fulfilling the needs of state, county, and city governments for consultation and research on environmental geologic problems. An editorial from another state geological survey would resemble the above description of Texas work in environmental geology. State geological surveys have led federal agencies into many developments of environmental geology, complemented federal efforts in their evolution, and continued a strong commitment to the maintenance of a quality environment through innovative geologic studies

  9. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    Science.gov (United States)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan. Kuang-Jung Tsai 1, Jie-Lun Chiang 2, Ming-Hsi Lee 2, Yie-Ruey Chen 1. 1Department of Land Management and Development, Chang Jung Christian University, Tainan, Taiwan. 2Department of Soil and Water Conservation, National Pingtung University of Science and Technology, Pingtung, Taiwan. ABSTRACT: An accumulated rainfall of more than 2,900 mm within three consecutive days was brought by Typhoon Morakot in August 2009. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized based on their disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events that occurred in southern Taiwan. Characteristics and mechanisms of large-scale landslides are collected on the basis of field investigation technology integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a strategy of slope-land conservation and a critical rainfall database should be established and implemented as soon as possible. Meanwhile, establishing the critical rainfall value used for predicting large-scale landslides induced by heavy rainfall has become an important issue, of serious concern to the government and to all people living in Taiwan. The mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under conditions of extreme climate change during the past 10 years are recognized as required issues by this
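
    The rainfall frequency analysis called for above is commonly done by fitting an extreme-value distribution to annual rainfall maxima and reading off return-period values. A minimal sketch, with an invented annual-maximum series rather than Taiwanese gauge data:

    ```python
    # Gumbel (extreme-value) fit to annual maximum rainfall; estimate the
    # T-year event that a critical-rainfall threshold could be set against.
    import numpy as np
    from scipy.stats import gumbel_r

    annual_max_mm = np.array([620, 810, 540, 1350, 760, 980, 2900,   # one Morakot-like outlier
                              700, 880, 1100, 650, 930, 1500, 720])
    loc, scale = gumbel_r.fit(annual_max_mm)

    for T in (10, 50, 100):
        x_T = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
        print(f"{T:3d}-year 3-day rainfall ~ {x_T:6.0f} mm")
    ```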

  10. The First USGS Global Geologic Map of Europa

    Science.gov (United States)

    Leonard, E. J.; Patthoff, D. A.; Senske, D.; Collins, G. C.

    2017-12-01

    Understanding the global-scale geology of Europa is paramount to gaining insight into the potential habitability of this icy world. To this end, work is ongoing to complete a global geological map at the scale of 1:15 million that incorporates data at all resolutions collected by the Voyager and Galileo missions. The results of this work will aid the Europa Clipper mission, now in formulation, by providing a framework for collaborative and synergistic science investigations. To understand global geologic and tectonic relations, a total of 10 geologic units have been defined. These include: Low Albedo Ridge Material (lam)—low albedo material that irregularly surrounds large (>20 km) ridge structures; Ridged plains (pr)—distributed over all latitudes and characterized by subparallel to cross-cutting ridges and troughs visible at high resolution; Band material (b)—linear to curvilinear zones with a distinct, abrupt albedo change from the surrounding region; Crater material (c), Continuous Crater Ejecta (ce) and Discontinuous Crater Ejecta (dce)—features associated with impact craters, including the site of the impact, crater material, and the fall-out debris, respectively; Low Albedo Chaos (chl), Mottled Albedo Chaos (chm) and High Albedo Chaos (chh)—disrupted terrain with a relatively uniform low albedo, patchy/variegated albedo, and uniform high albedo appearance, respectively; and Knobby Chaos (chk)—disrupted terrain with rough and blocky texture occurring in the high latitudes. In addition to the geologic units, our mapping also includes structural features—Ridges, Cycloids, Undifferentiated Linea, Crater Rims, Depression Margins, Dome Margins and Troughs. We also introduce a point feature (at the global scale), Microchaos, to denote small patches of chaos material. The completed map will constrain the distribution of different Europa terrains and provide a general stratigraphic framework to assess the geologic history of Europa from the regional to the global scale. Here, we

  11. Large-scale production of UO2 kernels by sol–gel process at INET

    International Nuclear Information System (INIS)

    Hao, Shaochang; Ma, Jingtao; Zhao, Xingyu; Wang, Yang; Zhou, Xiangwen; Deng, Changsheng

    2014-01-01

    In order to supply fuel elements (300,000 elements per year) for the Chinese pebble-bed modular high temperature gas-cooled reactor (HTR-PM), it is necessary to scale up the production of UO2 kernels to 3–6 kgU per batch. The sol–gel process for the preparation of UO2 kernels has been improved and optimized at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, PR China, and a complete facility was designed and constructed based on the process. This report briefly describes the main steps of the process, the key equipment and the production capacity of each step. Six batches of kernels for scale-up verification and four batches of kernels for fuel elements for in-pile irradiation tests have been successfully produced. The quality of the produced kernels meets the design requirements. The production capacity of the process reaches 3–6 kgU per batch

  12. Forensic geoscience: applications of geology, geomorphology and geophysics to criminal investigations

    Science.gov (United States)

    Ruffell, Alastair; McKinley, Jennifer

    2005-03-01

    One hundred years ago Georg Popp became the first scientist to present in court a case where the geological makeup of soils was used to secure a criminal conviction. Subsequently there have been significant advances in the theory and practice of forensic geoscience, many of them following the seminal publication of "Forensic Geology" by Murray and Tedrow [Murray, R., Tedrow, J.C.F. 1975 (republished 1986). Forensic Geology: Earth Sciences and Criminal Investigation. Rutgers University Press, New York, 240 pp.]. Our review places this historical development in the modern context of how the allied disciplines of geology (mineralogy, sedimentology, microscopy), geophysics, soil science, microbiology, anthropology and geomorphology have been used as tools to aid forensic (domestic, serious, terrorist and international) crime investigations. The latter half of this paper uses the concept of scales of investigation, from large-scale landforms through to microscopic particles, as a method of categorising the large number of geoscience applications to criminal investigation. Forensic geoscience has traditionally used established non-forensic techniques: 100 years after Popp's seminal work, research into forensic geoscience is beginning to lead, as opposed to follow, other scientific disciplines.

  13. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done so far by SINTEF Energy Research shows that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  14. Osmotic generation of 'anomalous' fluid pressures in geological environments

    Science.gov (United States)

    Neuzil, C.E.

    2000-01-01

    Osmotic pressures are generated by differences in the chemical potential of a solution across a membrane. But whether osmosis can have a significant effect on the pressure of fluids in geological environments has been controversial, because the membrane properties of geological media are poorly understood. 'Anomalous' pressures - large departures from hydrostatic pressure that are not explicable in terms of topographic or fluid-density effects - are widely found in geological settings, and are commonly considered to result from processes that alter the pore or fluid volume, which in turn implies crustal changes happening at a rate too slow to observe directly. Yet if osmosis can explain some anomalies, there is no need to invoke such dynamic geological processes in those cases. Here I report results of a nine-year in situ measurement of fluid pressures and solute concentrations in shale that are consistent with the generation of large (up to 20 MPa) osmotic-pressure anomalies which could persist for tens of millions of years. Osmotic pressures of this magnitude and duration can explain many of the pressure anomalies observed in geological settings. They require, however, small shale porosity and large contrasts in the amount of dissolved solids in the pore waters - criteria that may help to distinguish between osmotic and crustal-dynamic origins of anomalous pressures.
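
    As an order-of-magnitude check on the 20 MPa figure, the ideal van 't Hoff relation, assuming a perfectly semipermeable shale (which real geological membranes are not), gives the required solute contrast:

    ```latex
    % Ideal van 't Hoff estimate of the concentration contrast needed for a
    % ~20 MPa osmotic anomaly (1:1 salt, i = 2; illustrative only):
    \[
      \pi = i\,\Delta c\,R\,T
      \quad\Longrightarrow\quad
      \Delta c = \frac{\pi}{i\,R\,T}
               = \frac{20\times10^{6}\,\mathrm{Pa}}
                      {2 \times 8.314\,\mathrm{J\,mol^{-1}\,K^{-1}} \times 298\,\mathrm{K}}
               \approx 4\times10^{3}\,\mathrm{mol\,m^{-3}} = 4\,\mathrm{mol\,L^{-1}}.
    \]
    % So a ~20 MPa anomaly indeed demands a large contrast in dissolved solids,
    % of order 4 M, consistent with the criteria stated in the abstract.
    ```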

  15. Nanoscale Chemical Processes Affecting Storage Capacities and Seals during Geologic CO2 Sequestration.

    Science.gov (United States)

    Jun, Young-Shin; Zhang, Lijie; Min, Yujia; Li, Qingyun

    2017-07-18

    Geologic CO2 sequestration (GCS) is a promising strategy to mitigate anthropogenic CO2 emission to the atmosphere. Suitable geologic storage sites should have a porous reservoir rock zone where injected CO2 can displace brine and be stored in pores, and an impermeable zone on top of the reservoir rocks to hinder upward movement of buoyant CO2. The injection wells (steel casings encased in concrete) pass through these geologic zones and lead CO2 to the desired zones. In subsurface environments, CO2 is reactive as both a supercritical (sc) phase and aqueous (aq) species. Its nanoscale chemical reactions with geomedia and wellbores are closely related to the safety and efficiency of CO2 storage. For example, the injection pressure is determined by the wettability and permeability of geomedia, which can be sensitive to nanoscale mineral-fluid interactions; the sealing safety of the injection sites is affected by the opening and closing of fractures in caprocks and the alteration of wellbore integrity caused by nanoscale chemical reactions; and the time scale for CO2 mineralization is also largely dependent on the chemical reactivities of the reservoir rocks. Therefore, nanoscale chemical processes can influence the hydrogeological and mechanical properties of geomedia, such as their wettability, permeability, mechanical strength, and fracturing. This Account reviews our group's work on nanoscale chemical reactions and their qualitative impacts on seal integrity and storage capacity at GCS sites from four points of view. First, studies on dissolution of feldspar, an important reservoir rock constituent, and subsequent secondary mineral precipitation are discussed, focusing on the effects of feldspar crystallography, cations, and sulfate anions. Second, interfacial reactions between caprock and brine are introduced using model clay minerals, with focuses on the effects of water chemistries (salinity and organic ligands) and water content on mineral dissolution and

  16. Large-scale machine learning and evaluation platform for real-time traffic surveillance

    Science.gov (United States)

    Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel

    2016-09-01

    In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging, and such datasets typically have limited diversity; they do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for automatic traffic measurement and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground-truth data through machine learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2, and about 78% for 19/20, of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
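
    A miniature version of the training pipeline described above, using scikit-image's Haar-like features and scikit-learn's AdaBoost in place of the authors' distributed cloud framework; the patches and labels are random stand-ins for annotated vehicle/background samples:

    ```python
    # Haar-like features from integral images, boosted with AdaBoost.
    import numpy as np
    from skimage.feature import haar_like_feature
    from skimage.transform import integral_image
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    patches = rng.random((200, 12, 12))      # stand-in grayscale patches
    labels = rng.integers(0, 2, 200)         # stand-in vehicle/background labels

    def haar_features(img):
        ii = integral_image(img)
        # one family of Haar-like features over the whole patch
        return haar_like_feature(ii, 0, 0, ii.shape[0], ii.shape[1],
                                 feature_type='type-2-x')

    X = np.array([haar_features(p) for p in patches])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    clf = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")  # ~chance on random labels
    ```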

  17. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  18. Two-scale large deviations for chemical reaction kinetics through second quantization path integral

    International Nuclear Information System (INIS)

    Li, Tiejun; Lin, Feng

    2016-01-01

    Motivated by the study of rare events for a typical genetic switching model in systems biology, in this paper we aim to establish general two-scale large deviation principles for chemical reaction systems. We build a formal approach to explicitly obtain the large deviation rate functionals for the considered two-scale processes, based upon the second quantization path integral technique. We obtain three important types of large deviation results when the underlying two timescales are in three different regimes. This is realized by singular perturbation analysis of the rate functionals obtained by the path integral. We find that the three regimes possess the same deterministic mean-field limit but completely different chemical Langevin approximations. The obtained results are natural extensions of the classical large volume limit for chemical reactions. We also discuss the implications for single-molecule Michaelis–Menten kinetics. Our framework and results can be applied to understand general multi-scale systems including diffusion processes. (paper)
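
    For orientation, the classical single-timescale large-deviation rate functional for a reaction network with propensities a_j(x) and stoichiometric vectors nu_j takes the Hamiltonian form below; the work described above generalizes functionals of this type to two timescales. The notation here is generic, not the authors' exact statement:

    ```latex
    % Classical (single-scale) Freidlin-Wentzell-type rate functional for a
    % chemical reaction network; generic form, not the paper's exact result.
    \[
      I_T[x] = \int_0^T L\big(x(t), \dot{x}(t)\big)\,dt,
      \qquad
      L(x,\dot{x}) = \sup_{\theta}\Big\{ \langle\theta,\dot{x}\rangle
        - \sum_j a_j(x)\,\big(e^{\langle\theta,\nu_j\rangle}-1\big) \Big\}.
    \]
    % L is the Legendre transform of H(x,theta) = sum_j a_j(x)(e^{<theta,nu_j>}-1);
    % its zero-cost path recovers the mean-field limit dx/dt = sum_j nu_j a_j(x).
    ```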

  19. Geologic Mapping Investigations of Alba Mons, Mars

    Science.gov (United States)

    Crown, D. A.; Berman, D. C.; Scheidt, S. P.; Hauber, E.

    2018-06-01

    Geologic mapping of the summit region and western flank of Alba Mons at 1:1M-scale is revealing sequences of volcanic, tectonic, impact, and degradation processes that have formed and modified the northernmost of the Tharsis volcanoes.

  20. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai, Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organizational readiness.

  1. Extending SME to Handle Large-Scale Cognitive Modeling.

    Science.gov (United States)

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality, a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n^2 log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these techniques enable SME to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.
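
    A schematic of the greedy-merging idea, abstracted away from SME's actual structural-consistency tests: real SME enforces one-to-one mapping and parallel connectivity over structured representations, whereas here consistency is reduced to non-conflicting bindings, and all names are invented:

    ```python
    # Greedy merging sketch: fold match hypotheses, highest score first, into a
    # growing interpretation whenever they remain consistent with it.
    def greedy_merge(match_hypotheses):
        """match_hypotheses: list of (score, {base_item: target_item}) pairs."""
        interpretation, total = {}, 0.0
        for score, bindings in sorted(match_hypotheses, key=lambda m: -m[0]):
            same_base_ok = all(interpretation.get(b, t) == t
                               for b, t in bindings.items())
            one_to_one_ok = all(v != t or k == b              # no two bases -> one target
                                for b, t in bindings.items()
                                for k, v in interpretation.items())
            if same_base_ok and one_to_one_ok:
                interpretation.update(bindings)
                total += score
        return total, interpretation

    hyps = [(0.9, {"sun": "nucleus"}), (0.8, {"planet": "electron"}),
            (0.5, {"sun": "electron"})]                       # conflicts with the first
    print(greedy_merge(hyps))   # -> (1.7, {'sun': 'nucleus', 'planet': 'electron'})
    ```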

  2. Effect Of Up-Scaling On The Study Of The Steel/Bentonite Interface In A Deep Geological Repository

    International Nuclear Information System (INIS)

    Torres Alvarez, Elena; Turrero, Maria Jesus; Martin, Pedro Luis; Escribano, Alicia

    2008-01-01

    Deep geological disposal is the most widely accepted management option for high-level nuclear wastes. The multi-barrier system for the isolation of high-level radioactive waste includes the concept of spent fuel encapsulated in carbon steel canisters. Corrosion phenomena affect the integrity of the canister and can modify the chemical environment at the interface or in the bentonite pore water. The experimental studies conducted by CIEMAT focus on the interaction of iron canister corrosion products with the bentonite system and are based on a series of short- and medium-term experiments conceived at different scales, from conventional laboratory experiments and experiments in cylindrical cells to specifically designed 3D mock-up experiments, the so-called 'GAME' (Geochemical Mock-up Experiments) scale. The results obtained from this up-scaling could be a useful tool for understanding the key processes at the steel/bentonite interface and for later modelling work. (authors)

  3. Effect Of Up-Scaling On The Study Of The Steel/Bentonite Interface In A Deep Geological Repository

    Energy Technology Data Exchange (ETDEWEB)

    Torres Alvarez, Elena; Turrero, Maria Jesus; Martin, Pedro Luis; Escribano, Alicia [CIEMAT, Avda. Complutense 22, 28040, Madrid (Spain)

    2008-07-01

    Deep geological disposal is the most widely accepted management option for high-level nuclear wastes. The multi-barrier system for the isolation of high-level radioactive waste includes the concept of spent fuel encapsulated in carbon steel canisters. Corrosion phenomena affect the integrity of the canister and can modify the chemical environment at the interface or in the bentonite pore water. The experimental studies conducted by CIEMAT focus on the interaction of iron canister corrosion products with the bentonite system and are based on a series of short- and medium-term experiments conceived at different scales, from conventional laboratory experiments and experiments in cylindrical cells to specifically designed 3D mock-up experiments, the so-called 'GAME' (Geochemical Mock-up Experiments) scale. The results obtained from this up-scaling could be a useful tool for understanding the key processes at the steel/bentonite interface and for later modelling work. (authors)

  4. Geological exploration of Angola from Sumbe to Namibe: A review at the frontier between geology, natural resources and the history of geology

    Science.gov (United States)

    Masse, Pierre; Laurent, Olivier

    2016-01-01

    This paper provides a review of the geological exploration of the Angolan coast (from Sumbe to Namibe), from the pioneers' first geological descriptions and mining inventories to the most recent publications supported by the oil industry. We focus our attention on the following periods: 1875-1890 (mainly Paul Choffat's work), 1910-1949 (first maps at country scale), 1949-1974 (detailed mapping of the Kwanza-Namibe coastal series), and 1975-2000, with the publication of the last version of the Angola geological map at 1:1 million scale and the progressive completion of previous works. Since 2000, there has been a renewal of geological fieldwork publications on the area, mainly due to the work of university teams. This review paper thus stands at the frontier between geology, natural resources and the history of geology. It shows how geological knowledge has progressed over time, fueled by both economic and scientific motivations.

  5. Methods for large-scale international studies on ICT in education

    NARCIS (Netherlands)

    Pelgrum, W.J.; Plomp, T.; Voogt, Joke; Knezek, G.A.

    2008-01-01

    International comparative assessment is a research method applied for describing and analyzing educational processes and outcomes. Such assessments are used to 'describe the status quo' in educational systems from an international comparative perspective. This chapter reviews different large-scale international studies.

  6. Geological research for public outreach and education in Lithuania

    Science.gov (United States)

    Skridlaite, Grazina; Guobyte, Rimante

    2013-04-01

    Successful IYPE activities and the implementation of a Geoheritage Day in Lithuania increased public awareness of geology. A series of projects introducing geology to the general public and youth, supported by EU funds and local communities, was initiated. Researchers from the scientific and applied geology institutions of Lithuania participated in these projects and provided the geological data. In one case, the Lithuanian Survey of Protected Areas supported the installation of a series of geological exhibitions in several regional and national parks. An animation demonstrating glacial processes was chosen for most of these because the Lithuanian surface is largely covered with sedimentary deposits of the Nemunas (Weichselian) glaciation. Researchers from the Lithuanian Geological Survey used the mapping results to demonstrate real glacial processes for every chosen area. In another case, 3D models showing the underground structures of different localities were based on detailed geological maps and profiles obtained for each area. In the case of the Sartai regional park, the results of previous geological research projects made it possible to create a movie depicting the ca. 2 Ga geological evolution of the region. The movie starts with the accretion of volcanic island arcs onto the earlier continental margin at ca. 2 Ga and deciphers later Precambrian tectonic and magmatic events. The reconstruction is based on numerous scientific articles and the interpretation of geophysical data. Later Paleozoic activity and subsequent erosion sculptured the surface, which was covered by several ice sheets in the Quaternary. For educational purposes, a collection of minerals and rocks at the Forestry Institute was used to create an exhibition called "Cycle of geological processes". Forestry scientists and their students are able to study the interactions of geodiversity and biodiversity and to understand ancient and modern geological processes leading to soil formation. An aging

  7. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results of testing the material's resistance to non-ductile fracture are described. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests, the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  8. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skill than small growing operations. Second, financial assets are needed to start a large grow-site: housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skill and knowledge than small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values; starting up large-scale production implies having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  9. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into, and challenges of, distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  10. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability, and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Facing similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  11. Environmental Responses to Carbon Mitigation through Geological Storage

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, Alfred [Montana State Univ., Bozeman, MT (United States); Bromenshenk, Jerry [Montana State Univ., Bozeman, MT (United States)

    2013-08-30

    In summary, this DOE EPSCoR project contributes to the study of carbon mitigation through geological storage. Both deep and shallow subsurface research needs are being addressed through research directed at an improved understanding of the environmental responses associated with large-scale injection of CO2 into geologic formations. The research plan has two interrelated objectives. Objective 1: Determine the influence of CO2-related fluid injection on pore structure, material properties, and microbial activity in rock cores from potential geological carbon sequestration sites. Objective 2: Determine the effects of CO2 leakage on shallow subsurface ecosystems (microbial and plant) using field experiments at an outdoor field testing facility.

  12. History matching of large scale fractures to production data; Calage de la geometrie des reseaux de fractures aux donnees hydrodynamiques de production d'un champ petrolier

    Energy Technology Data Exchange (ETDEWEB)

    Jenni, S.

    2005-01-01

    Object-based models are very helpful for representing complex geological media such as fractured reservoirs. For building realistic fracture networks, these models have to be constrained to both static (seismic, geomechanics, geology) and dynamic data (well tests and production history). In this report we present a procedure for the calibration of large-scale fracture networks to production history. The history matching procedure includes realistic geological modeling and a parameterization method that is coherent with the geological model and allows efficient optimization. Fluid flow modeling is based on a dual-medium approach. The calibration procedure was applied to a semi-synthetic case based on a real fractured reservoir, and calibration to water-cut data was performed. (author)
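
    History matching of this kind typically minimizes a weighted least-squares mismatch between simulated and observed production data. A generic form for water-cut calibration (not necessarily the report's exact formulation) is:

    ```latex
    % Generic history-matching objective: theta parameterizes the large-scale
    % fracture network, and the dual-medium flow simulator maps it to a
    % simulated water-cut f_w^sim at each well w and observation time t_i.
    \[
      J(\theta) \;=\; \sum_{w}\sum_{i}
        \frac{\big( f_w^{\mathrm{sim}}(t_i;\theta) - f_w^{\mathrm{obs}}(t_i) \big)^2}
             {\sigma_{w,i}^{2}},
    \]
    % with sigma_{w,i} the measurement uncertainty; an optimizer updates theta
    % until simulated and observed water-cut histories agree.
    ```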

  13. Development of Best Practices for Large-scale Data Management Infrastructure

    NARCIS (Netherlands)

    S. Stadtmüller; H.F. Mühleisen (Hannes); C. Bizer; M.L. Kersten (Martin); J.A. de Rijke (Arjen); F.E. Groffen (Fabian); Y. Zhang (Ying); G. Ladwig; A. Harth; M Trampus

    2012-01-01

    The amount of available data for processing is constantly increasing and becomes more diverse. We collect our experiences on deploying large-scale data management tools on local-area clusters or cloud infrastructures and provide guidance on the use of these computing and storage resources.

  14. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such molecules, which are sometimes inherently unstable, it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process, based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with a 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential, in particular, for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  15. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    From physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  16. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite at a production rate exceeding 500 g/h with an energy consumption of about 10 Wh/g. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39 × 10⁻⁴ Ω·m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 liters was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.

  17. Large-scale dynamic compaction demonstration using WIPP salt: Fielding and preliminary results

    International Nuclear Information System (INIS)

    Ahrens, E.H.; Hansen, F.D.

    1995-10-01

    Reconsolidation of crushed rock salt is a phenomenon of great interest to programs studying isolation of hazardous materials in natural salt geologic settings. Of particular interest is the potential for disaggregated salt to be restored to nearly an impermeable state. For example, reconsolidated crushed salt is proposed as a major shaft seal component for the Waste Isolation Pilot Plant (WIPP) Project. The concept for a permanent shaft seal component of the WIPP repository is to densely compact crushed salt in the four shafts; an effective seal will then be developed as the surrounding salt creeps into the shafts, further consolidating the crushed salt. Fundamental information on placement density and permeability is required to ensure attainment of the design function. The work reported here is the first large-scale compaction demonstration to provide information on initial salt properties applicable to design, construction, and performance expectations. The shaft seals must function for 10,000 years. Over this period a crushed salt mass will become less permeable as it is compressed by creep closure of salt surrounding the shaft. These facts preclude the possibility of conducting a full-scale, real-time field test. Because permanent seals taking advantage of salt reconsolidation have never been constructed, performance measurements have not been made on an appropriately large scale. An understanding of potential construction methods, achievable initial density and permeability, and performance of reconsolidated salt over time is required for seal design and performance assessment. This report discusses fielding and operations of a nearly full-scale dynamic compaction of mine-run WIPP salt, and presents preliminary density and in situ (in place) gas permeability results

  18. County digital geologic mapping. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Hess, R.H.; Johnson, G.L.; dePolo, C.M.

    1995-12-31

    The purpose of this project is to create quality county-wide digital 1:250,000-scale geologic maps from the existing published 1:250,000-scale Geologic and Mineral Resource Bulletins of the Nevada Bureau of Mines and Geology (NBMG). An additional data set based on current NBMG research, Major and Significant Quaternary and Suspected Quaternary Faults of Nevada, at 1:250,000 scale, has also been included.

  19. County digital geologic mapping. Final report

    International Nuclear Information System (INIS)

    Hess, R.H.; Johnson, G.L.; dePolo, C.M.

    1995-01-01

    The purpose of this project is to create quality county-wide digital 1:250,000-scale geologic maps from the existing published 1:250,000-scale Geologic and Mineral Resource Bulletins of the Nevada Bureau of Mines and Geology (NBMG). An additional data set based on current NBMG research, Major and Significant Quaternary and Suspected Quaternary Faults of Nevada, at 1:250,000 scale, has also been included

  20. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods owing to their instantaneous, localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground because of the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of the CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity regarding the detection of large-scale gravity waves.
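
    As a rough illustration of the assemble-and-filter idea, the Python sketch below mosaics pre-projected all-sky frames onto a common geographic grid and applies a Gaussian low-pass filter. It is a hedged sketch only: the function names, the NaN convention for pixels outside each field of view, and the sigma heuristic are assumptions, not the authors' algorithm.

        import numpy as np
        from scipy import ndimage

        def assemble(frames):
            """Mosaic pre-projected airglow frames (NaN outside each field of view)."""
            stack = np.stack(frames)
            count = np.sum(~np.isnan(stack), axis=0)
            mosaic = np.nansum(stack, axis=0) / np.maximum(count, 1)  # average overlaps
            mosaic[count == 0] = np.nan  # pixels seen by no imager stay empty
            return mosaic

        def lowpass(mosaic, cutoff_km, grid_km):
            """Gaussian low-pass: waves much shorter than cutoff_km are damped."""
            sigma_px = cutoff_km / (2.0 * np.pi * grid_km)  # heuristic conversion
            filled = np.where(np.isnan(mosaic), np.nanmean(mosaic), mosaic)
            return ndimage.gaussian_filter(filled, sigma=sigma_px)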

  1. ``Large''- vs Small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051 (1998) and Phys. Rev. Fluids 2, 062601 (2017)), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds numbers Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular those related to the discovery of (very) large-scale motions. The goals of the paper are as follows: first, to better characterise the physics of the control and assess which external contributions (vortices, forcing, wall motion) are actually needed; then, to investigate the optimal parameters; and finally, to determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution will also address the potential effect of the naturally occurring large-scale motions on frictional drag, and give indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  2. Semihard processes with BLM renormalization scale setting

    Energy Technology Data Exchange (ETDEWEB)

    Caporale, Francesco [Instituto de Física Teórica UAM/CSIC, Nicolás Cabrera 15 and U. Autónoma de Madrid, E-28049 Madrid (Spain); Ivanov, Dmitry Yu. [Sobolev Institute of Mathematics and Novosibirsk State University, 630090 Novosibirsk (Russian Federation); Murdaca, Beatrice; Papa, Alessandro [Dipartimento di Fisica, Università della Calabria, and Istituto Nazionale di Fisica Nucleare, Gruppo collegato di Cosenza, Arcavacata di Rende, I-87036 Cosenza (Italy)

    2015-04-10

    We apply the BLM scale setting procedure directly to amplitudes (cross sections) of several semihard processes. It is shown that, due to the presence of β₀-terms in the NLA results for the impact factors, the obtained optimal renormalization scale is not universal, but depends both on the energy and on the process in question. We illustrate this general conclusion by considering the following semihard processes: (i) inclusive production of two forward high-p_T jets separated by a large interval in rapidity (Mueller-Navelet jets); (ii) high-energy behavior of the total cross section for highly virtual photons; (iii) forward amplitude of the production of two light vector mesons in the collision of two virtual photons.

  3. Large-scale grain growth in the solid-state process: From "Abnormal" to "Normal"

    Science.gov (United States)

    Jiang, Minhong; Han, Shengnan; Zhang, Jingwei; Song, Jiageng; Hao, Chongyan; Deng, Manjiao; Ge, Lingjing; Gu, Zhengfei; Liu, Xinyu

    2018-02-01

    Abnormal grain growth (AGG) has been a common phenomenon during ceramic or metallurgical processing since prehistoric times. However, it has usually been very difficult to grow large single crystals (centimeter scale and above) using the AGG method, owing to its so-called occasionality. Based on AGG, a solid-state crystal growth (SSCG) method was developed. The greatest advantages of the SSCG technology are the simplicity and cost-effectiveness of the technique, but the traditional SSCG technology is still uncontrollable. This article first summarizes the history and current status of AGG, then reports recent technical developments from AGG to SSCG, and further introduces a new seed-free, solid-state crystal growth (SFSSCG) technology. This SFSSCG method allows us to repeatedly and controllably fabricate large-scale single crystals of appreciably high quality and relatively stable chemical composition at a relatively low temperature, at least in the (K0.5Na0.5)NbO3 (KNN) and Cu-Al-Mn systems. In this sense, the exaggerated grain growth is no longer 'abnormal' but 'normal', since it can now be artificially controlled and repeated. This article also provides a crystal growth model to qualitatively explain the mechanism of SFSSCG for the KNN system. Compared with the traditional melt and high-temperature solution growth methods, the SFSSCG method has the advantages of low energy consumption, low investment, simple technique, and composition homogeneity, overcoming the issues with incongruent melting and high volatility. SFSSCG could be helpful for improving the mechanical and physical properties of single crystals, which should be promising for industrial applications.

  4. Radionuclide migration in clayrock host formations for deep geological disposal of radioactive waste: advances in process understanding and up-scaling methods resulting from the EC integrated project 'FUNMIG'

    Science.gov (United States)

    Altmann, S.; Tournassat, C.; Goutelard, F.; Parneix, J. C.; Gimmi, T.; Maes, N.

    2009-04-01

    The migration of most radionuclides in clayrocks, in particular the actinides, is limited by their strong sorption on rock mineral surfaces. Much effort was devoted in FUNMIG to improving understanding of this process in the clayrocks being studied in the Swiss, Belgian and French radwaste management programs. Specific attention was focused on (i) elucidating the effect of dissolved organic matter on Am(III), Th(IV) and Eu(III) sorption on clayrock surfaces and (ii) determining the link between the Kd measured in dispersed rock systems and the Kd operant in intact rock volumes, i.e. during diffusion. Regarding the latter question, results indicate that Kd values for 'dispersed' and 'intact' materials are quite similar for certain elements (Na, Sr, Cs, Co). On the other hand, Kd values obtained by modeling the results of diffusion experiments involving strongly sorbing elements such as Cs, Co and Eu were always significantly smaller than those predicted from sorption data measured in corresponding batch systems. This is an area where additional research is being planned. A major effort was devoted to improving understanding of the effects of small-scale (m to cm) clayrock structure and large-scale (dm to hm) mineralogical composition on radionuclide diffusion-retention. The program focusing on the small scale produced a method for simulating the results of tracer diffusion in an intact rock based on the actual microstructure of the rock sample to be used in the diffusion experiment. This model was used to predict / inverse-model the spatial distribution of highly sorbing tracers (Eu, Cu). This overall approach is also being used to understand how changes in mineralogical composition can affect the values of macroscopic diffusion parameters (De, tortuosity, anisotropy). At a much larger scale, the results of (i) a geostatistical analysis of clayrock mineralogical variability and (ii) measurements of De and Kd dependence on mineralogy for Cs and Cl were combined to create models
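
    For background on how the batch-derived Kd values discussed above enter diffusion calculations, the snippet below encodes the standard porous-medium relation Da = De / (porosity + bulk density × Kd). This is textbook transport theory rather than a FUNMIG result, and the numerical values are purely illustrative assumptions.

        def apparent_diffusivity(De, porosity, bulk_density, Kd):
            """Da = De / (porosity + bulk_density * Kd); Kd in m^3/kg."""
            return De / (porosity + bulk_density * Kd)

        # Illustrative values only: De = 1e-11 m^2/s, porosity 0.15,
        # dry bulk density 2300 kg/m^3, Kd 0.01 m^3/kg.
        print(apparent_diffusivity(1e-11, 0.15, 2300.0, 0.01))  # ~4.3e-13 m^2/s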

  5. Salinization of aquifers at the regional scale by marine transgression: Time scales and processes

    Science.gov (United States)

    Armandine Les Landes, A.; Davy, P.; Aquilina, L.

    2014-12-01

    Saline fluids with moderate concentrations have been sampled and reported in the Armorican basement at the regional scale (northwestern France). The horizontal and vertical distributions of high chloride concentrations (60-1400 mg/L) at the regional scale support a marine origin and provide constraints on the age of these saline fluids. The current distribution of fresh and "saline" groundwater at depth is mostly the result of processes occurring over geological timescales (seawater intrusion followed by fresh groundwater flushing), and only slightly of recent anthropogenic activities. In this study, we focus on seawater intrusion mechanisms in continental aquifers. We argue that one of the most efficient processes in macrotidal environments is the gravity-driven downconing instability below coastal salinized rivers. 2-D numerical experiments were used to quantify this process according to four main parameter types: (1) the permeability of the groundwater system, (2) the salinity of the river, (3) the river width and slope, and (4) the tidal amplitude. A general expression for the salinity inflow rate has been derived and used to estimate groundwater salinization rates in Brittany, given the geomorphological and environmental characteristics (drainage basin area, river widths and slopes, tidal range, aquifer permeability). We found that downconing below coastal rivers entails very high salinization rates, indicating that this process plays a major role in the salinization of regional aquifers. This is also likely to be an issue in the context of climate change, where sea-level rise is expected.

  6. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but its cost in particular limits its use. As computer models have grown in size, such as in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  7. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    Science.gov (United States)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Based Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on computational algorithms and procedures implemented in Matlab to simulate agent-based models, with the simulations run in parallel on computing clusters for high performance. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system. A minimal sketch of the agent-based idea is given below.
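
    The following minimal Python sketch (the paper's own code is in Matlab) illustrates the agent-based idea under stated assumptions: each node agent holds stock and ships one unit per step toward its neediest neighbor, a toy stand-in for balancing inventory across a network, not the authors' model.

        import random

        class NodeAgent:
            def __init__(self, name, stock, demand):
                self.name, self.stock, self.demand = name, stock, demand
                self.neighbors = []  # other NodeAgent instances

            def step(self):
                # Ship one unit toward the neighbor with the largest shortfall.
                short = [n for n in self.neighbors if n.stock < n.demand]
                if short and self.stock > self.demand:
                    target = max(short, key=lambda n: n.demand - n.stock)
                    self.stock -= 1
                    target.stock += 1

        def simulate(agents, steps=100):
            for _ in range(steps):
                for agent in random.sample(agents, len(agents)):  # random activation
                    agent.step()
            return {a.name: a.stock for a in agents}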

  8. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture, and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcomputer, and the software handles the photoelectric autocollimator readings and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device was verified by experimental comparison, and the results meet the requirements of the right-angle verification procedure.

  9. Idealised modelling of storm surges in large-scale coastal basins

    NARCIS (Netherlands)

    Chen, Wenlong

    2015-01-01

    Coastal areas around the world are frequently attacked by various types of storms, threatening human life and property. This study aims to understand storm surge processes in large-scale coastal basins, particularly focusing on the influences of geometry, topography and storm characteristics on the

  10. An efficient method based on the uniformity principle for synthesis of large-scale heat exchanger networks

    International Nuclear Information System (INIS)

    Zhang, Chunwei; Cui, Guomin; Chen, Shang

    2016-01-01

    Highlights: • Two dimensionless uniformity factors are presented for heat exchanger networks. • Grouping of process streams reduces the computational complexity of large-scale HENS problems. • The optimal sub-network can be obtained by a Powell particle swarm optimization algorithm. • The method is illustrated by a case study involving 39 process streams, with a better solution. - Abstract: The optimal design of large-scale heat exchanger networks is a difficult task due to their inherent non-linear characteristics and the combinatorial nature of heat exchangers. To solve large-scale heat exchanger network synthesis (HENS) problems, two dimensionless uniformity factors, describing heat exchanger network (HEN) uniformity in terms of the temperature difference and the accuracy of process stream grouping, are deduced. Additionally, a novel algorithm that combines deterministic and stochastic optimization to obtain an optimal sub-network with a suitable heat load for a given group of streams is proposed, named Powell particle swarm optimization (PPSO). As a result, the synthesis of large-scale heat exchanger networks is divided into two corresponding sub-parts, namely, the grouping of process streams and the optimization of sub-networks. This approach reduces the computational complexity and increases the efficiency of the proposed method. The robustness and effectiveness of the proposed method are demonstrated by solving a large-scale HENS problem involving 39 process streams, and the results obtained are better than those previously published in the literature.
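
    The sketch below shows only the stochastic half of the hybrid scheme, a plain particle swarm optimizer in Python; the deterministic Powell line searches that the authors interleave are omitted, and the objective, bounds, and coefficient values are placeholder assumptions rather than details from the paper.

        import numpy as np

        def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            lo, hi = np.array(bounds).T
            dim = len(bounds)
            x = lo + np.random.rand(n_particles, dim) * (hi - lo)
            v = np.zeros_like(x)
            pbest = x.copy()
            pbest_f = np.apply_along_axis(objective, 1, x)
            g = pbest[np.argmin(pbest_f)]  # global best position
            for _ in range(iters):
                r1, r2 = np.random.rand(2, n_particles, dim)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                f = np.apply_along_axis(objective, 1, x)
                better = f < pbest_f
                pbest[better], pbest_f[better] = x[better], f[better]
                g = pbest[np.argmin(pbest_f)]
            return g, pbest_f.min()

        # e.g. minimizing a toy surrogate for an annualized network cost:
        best_x, best_f = pso(lambda z: np.sum((z - 1.0) ** 2), bounds=[(-5, 5)] * 4)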

  11. Distribution and interplay of geologic processes on Titan from Cassini radar data

    Science.gov (United States)

    Lopes, R.M.C.; Stofan, E.R.; Peckyno, R.; Radebaugh, J.; Mitchell, K.L.; Mitri, Giuseppe; Wood, C.A.; Kirk, R.L.; Wall, S.D.; Lunine, J.I.; Hayes, A.; Lorenz, R.; Farr, Tom; Wye, L.; Craig, J.; Ollerenshaw, R.J.; Janssen, M.; LeGall, A.; Paganelli, F.; West, R.; Stiles, B.; Callahan, P.; Anderson, Y.; Valora, P.; Soderblom, L.

    2010-01-01

    The Cassini Titan Radar Mapper is providing an unprecedented view of Titan's surface geology. Here we use Synthetic Aperture Radar (SAR) image swaths (Ta-T30) obtained from October 2004 to December 2007 to infer the geologic processes that have shaped Titan's surface. These SAR swaths cover about 20% of the surface, at a spatial resolution ranging from ~350 m to ~2 km. The SAR data are distributed over a wide latitudinal and longitudinal range, enabling some conclusions to be drawn about the global distribution of processes. They reveal a geologically complex surface that has been modified by all the major geologic processes seen on Earth - volcanism, tectonism, impact cratering, and erosion and deposition by fluvial and aeolian activity. In this paper, we map geomorphological units from SAR data and analyze their areal distribution and relative ages of modification in order to infer the geologic evolution of Titan's surface. We find that dunes and hummocky and mountainous terrains are more widespread than lakes, putative cryovolcanic features, mottled plains, and craters and crateriform structures that may be due to impact. Undifferentiated plains are the largest areal unit; their origin is uncertain. In terms of latitudinal distribution, dunes and hummocky and mountainous terrains are located mostly at low latitudes (less than 30 degrees), with no dunes being present above 60 degrees. Channels formed by fluvial activity are present at all latitudes, but lakes are at high latitudes only. Crateriform structures that may have been formed by impact appear to be uniformly distributed with latitude, but the well-preserved impact craters are all located at low latitudes, possibly indicating that more resurfacing has occurred at higher latitudes. Cryovolcanic features are not ubiquitous, and are mostly located between 30 degrees and 60 degrees north. We examine temporal relationships between units wherever possible, and conclude that aeolian and fluvial

  12. Distribution and interplay of geologic processes on Titan from Cassini radar data

    Science.gov (United States)

    Lopes, R.M.C.; Stofan, E.R.; Peckyno, R.; Radebaugh, J.; Mitchell, K.L.; Mitri, Giuseppe; Wood, C.A.; Kirk, R.L.; Wall, S.D.; Lunine, J.I.; Hayes, A.; Lorenz, R.; Farr, Tom; Wye, L.; Craig, J.; Ollerenshaw, R.J.; Janssen, M.; LeGall, A.; Paganelli, F.; West, R.; Stiles, B.; Callahan, P.; Anderson, Y.; Valora, P.; Soderblom, L.

    2010-01-01

    The Cassini Titan Radar Mapper is providing an unprecedented view of Titan's surface geology. Here we use Synthetic Aperture Radar (SAR) image swaths (Ta-T30) obtained from October 2004 to December 2007 to infer the geologic processes that have shaped Titan's surface. These SAR swaths cover about 20% of the surface, at a spatial resolution ranging from ~350 m to ~2 km. The SAR data are distributed over a wide latitudinal and longitudinal range, enabling some conclusions to be drawn about the global distribution of processes. They reveal a geologically complex surface that has been modified by all the major geologic processes seen on Earth - volcanism, tectonism, impact cratering, and erosion and deposition by fluvial and aeolian activity. In this paper, we map geomorphological units from SAR data and analyze their areal distribution and relative ages of modification in order to infer the geologic evolution of Titan's surface. We find that dunes and hummocky and mountainous terrains are more widespread than lakes, putative cryovolcanic features, mottled plains, and craters and crateriform structures that may be due to impact. Undifferentiated plains are the largest areal unit; their origin is uncertain. In terms of latitudinal distribution, dunes and hummocky and mountainous terrains are located mostly at low latitudes (less than 30°), with no dunes being present above 60°. Channels formed by fluvial activity are present at all latitudes, but lakes are at high latitudes only. Crateriform structures that may have been formed by impact appear to be uniformly distributed with latitude, but the well-preserved impact craters are all located at low latitudes, possibly indicating that more resurfacing has occurred at higher latitudes. Cryovolcanic features are not ubiquitous, and are mostly located between 30° and 60° north. We examine temporal relationships between units wherever possible, and conclude that aeolian and fluvial/pluvial/lacustrine processes are the

  13. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large-scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  14. The Role of Geologic Mapping in NASA PDSI Planning

    Science.gov (United States)

    Williams, D. A.; Skinner, J. A.; Radebaugh, J.

    2017-12-01

    Geologic mapping is an investigative process designed to derive the geologic history of planetary objects at local, regional, hemispheric or global scales. Geologic maps are critical products that aid future exploration by robotic spacecraft or human missions, support resource exploration, and provide context for and help guide scientific discovery. Creation of these tools, however, can be challenging in that, relative to their terrestrial counterparts, non-terrestrial planetary geologic maps lack expansive field-based observations. They rely, instead, on integrating diverse data types with a range of spatial scales and areal coverage. These data facilitate establishment of geomorphic and geologic context but are generally limited with respect to identifying outcrop-scale textural details and resolving temporal and spatial changes in depositional environments. As a result, planetary maps should be prepared with clearly defined contact and unit descriptions as well as a range of potential interpretations. Today geologic maps can be made from images obtained during the traverses of the Mars rovers, and for every new planetary object visited by NASA orbital or flyby spacecraft (e.g., Vesta, Ceres, Titan, Enceladus, Pluto). As Solar System exploration develops and as NASA prepares to send astronauts back to the Moon and on to Mars, the importance of geologic mapping will increase. In this presentation, we will discuss the past role of geologic mapping in NASA's planetary science activities and our thoughts on the role geologic mapping will have in exploration in the coming decades. Challenges that planetary mapping must address include, among others: 1) determine the geologic framework of all Solar System bodies through the systematic development of geologic maps at appropriate scales, 2) develop digital Geographic Information Systems (GIS)-based mapping techniques and standards to assist with communicating map information to the scientific community and public, 3) develop

  15. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 KW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications
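
    The calorimetric dose estimate described above reduces to a one-line formula: absorbed dose (Gy = J/kg) equals the specific heat of water times the temperature rise across the beam. The sketch below is a back-of-the-envelope illustration with assumed numbers, not the facility's actual data reduction.

        CP_WATER = 4186.0  # J/(kg*K), specific heat of water near room temperature

        def dose_from_delta_t(delta_t_kelvin):
            """Absorbed dose in gray from the measured temperature rise."""
            return CP_WATER * delta_t_kelvin

        def absorbed_beam_power(dose_gy, flow_l_per_min):
            """Power (W) deposited in the stream, taking 1 L of water as 1 kg."""
            mass_flow_kg_per_s = flow_l_per_min / 60.0
            return dose_gy * mass_flow_kg_per_s

        print(dose_from_delta_t(0.24))             # ~1 kGy per 0.24 K temperature rise
        print(absorbed_beam_power(1000.0, 450.0))  # ~7.5 kW at 450 L/min and 1 kGy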

  16. Failure Impact Assessment for Large-Scale Landslides Located Near Human Settlement: Case Study in Southern Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Chien Chung

    2018-05-01

    In 2009, Typhoon Morakot caused over 680 deaths and more than 20,000 landslides in Taiwan. From 2010 to 2015, the Central Geological Survey of the Ministry of Economic Affairs identified 1047 potential large-scale landslides in Taiwan, of which 103 may have affected human settlements. This paper presents an analytical procedure that can be applied to assess the possible impact of a landslide collapse on nearby settlements. In this paper, existing technologies, including interpretation of remote sensing images, hydrogeological investigation, and numerical analysis, are integrated to evaluate potential failure scenarios and the landslide scale of a specific case: the Xinzhuang landslide. GeoStudio and RAMMS analysis modes and hazard classification produced the following results: (1) evaluation of the failure mechanisms and the influence zones of large-scale landslides; (2) assessment of the migration and accumulation of the landslide mass after failure; and (3) a landslide hazard and evacuation map. The results of the case study show that this analytical procedure can quantitatively estimate potential threats to human settlements. Furthermore, it can be applied to other villages and used as a reference in disaster prevention and evacuation planning.

  17. Charm production and mass scales in deep inelastic processes

    International Nuclear Information System (INIS)

    Close, F.E.; Scott, D.M.; Sivers, D.

    1976-07-01

    Because of their large mass, the production of charmed particles offers the possibility of new insight into fundamental dynamics. An approach to deep inelastic processes is discussed in which Generalized Vector Meson Dominance (GVMD) is used to extend parton model results away from the strict Bjorken scaling limit into regions where mass scales play an important role. The processes of e+e- annihilation, photoproduction, deep inelastic leptoproduction, photon-photon scattering and the production of lepton pairs in hadronic collisions are discussed. The GVMD approach provides a reasonably unified framework and makes specific predictions concerning the way in which these reactions reflect an underlying flavour symmetry, broken by large mass differences. (author)

  18. Inducing a health-promoting change process within an organization: the effectiveness of a large-scale intervention on social capital, openness, and autonomous motivation toward health.

    Science.gov (United States)

    van Scheppingen, Arjella R; de Vroome, Ernest M M; Ten Have, Kristin C J M; Bos, Ellen H; Zwetsloot, Gerard I J M; van Mechelen, W

    2014-11-01

    To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324) were used to examine the effects on bonding social capital, openness, and autonomous motivation toward health and on employees' lifestyle, health, vitality, and sustainable employability. Also, the sensitivity of the intervention components was examined. Intervention effects were found for bonding social capital, openness toward health, smoking, healthy eating, and sustainable employability. The effects were primarily attributable to the intervention's dialogue component. The change process initiated by the large-scale intervention contributed to a social climate in the workplace that promoted health and ownership toward health. The study confirms the relevance of collective change processes for health promotion.

  19. Modeling field scale unsaturated flow and transport processes

    International Nuclear Information System (INIS)

    Gelhar, L.W.; Celia, M.A.; McLaughlin, D.

    1994-08-01

    The scales of concern in subsurface transport of contaminants from low-level radioactive waste disposal facilities are in the range of 1 to 1,000 m. Natural geologic materials generally show very substantial spatial variability in hydraulic properties over this range of scales. Such heterogeneity can significantly influence the migration of contaminants. It is also envisioned that complex earth structures will be constructed to isolate the waste and minimize infiltration of water into the facility. The flow of water and gases through such facilities must also be a concern. A stochastic theory describing unsaturated flow and contaminant transport in naturally heterogeneous soils has been enhanced by adopting a more realistic characterization of soil variability. The enhanced theory is used to predict field-scale effective properties and variances of tension and moisture content. Applications illustrate the important effects of small-scale heterogeneity on large-scale anisotropy and hysteresis and demonstrate the feasibility of simulating two-dimensional flow systems at time and space scales of interest in radioactive waste disposal investigations. Numerical algorithms for predicting field-scale unsaturated flow and contaminant transport have been improved by requiring them to respect fundamental physical principles such as mass conservation. These algorithms are able to provide realistic simulations of systems with very dry initial conditions and high degrees of heterogeneity. Numerical simulation of the movement of water and air in unsaturated soils has demonstrated the importance of air pathways for contaminant transport. The stochastic flow and transport theory has been used to develop a systematic approach to performance assessment and site characterization. Hypothesis-testing techniques have been used to determine whether model predictions are consistent with observed data

  20. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here, plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents large-scale synthesis as a three-dimensional problem connecting the time, quantity and quality of the nanostructures produced. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  1. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-systems risk management best practices. We analyze their risk management process and investigate the tools they use in order to support decision making processes within the company. First, we identify the following challenges in the current risk management practices that are in line with literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) the quality of traditional models in such situations is open to debate; (3) the quality of input

  2. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  3. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  4. Formation of Large-scale Coronal Loops Interconnecting Two Active Regions through Gradual Magnetic Reconnection and an Associated Heating Process

    Science.gov (United States)

    Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin

    2018-06-01

    Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and therefore are considered to be an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″–200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona still remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, with successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXRs and EUVs through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.

  5. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  6. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its ...

  7. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN's commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures held in Lund, Sweden, on 13-14 October. [Photo: Participants at the energy management for large-scale scientific infrastructures workshop.] The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. "Without compromising our scientific projects, we can ...

  8. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  9. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large-scale observations hint at possible modifications to the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as "cosmic anomalies", include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies ("bulk flows"), the measurement of inhomogeneous values of the fine structure constant on cosmological scales ("alpha dipole") and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large-scale cosmic anomalies.

  10. The Pilot Lunar Geologic Mapping Project: Summary Results and Recommendations from the Copernicus Quadrangle

    Science.gov (United States)

    Skinner, J. A., Jr.; Gaddis, L. R.; Hagerty, J. J.

    2010-01-01

    The first systematic lunar geologic maps were completed at 1:1M scale for the lunar near side during the 1960s using telescopic and Lunar Orbiter (LO) photographs [1-3]. The program under which these maps were completed established precedents for map base, scale, projection, and boundaries in order to avoid widely discrepant products. A variety of geologic maps were subsequently produced for various purposes, including 1:5M scale global maps [4-9] and large-scale maps of high scientific interest (including the Apollo landing sites) [10]. Since that time, lunar science has benefitted from an abundance of surface information, including high resolution images and diverse compositional data sets, which have yielded a host of topical planetary investigations. The existing suite of lunar geologic maps and topical studies provides exceptional context in which to unravel the geologic history of the Moon. However, there has been no systematic approach to lunar geologic mapping since the flight of the post-Apollo scientific orbiters. Geologic maps provide a spatial and temporal framework wherein observations can be reliably benchmarked and compared. As such, the lack of a systematic mapping program means that modern (post-Apollo) data sets, their scientific ramifications, and the lunar scientists who investigate these data are all marginalized in regard to geologic mapping. Marginalization weakens the overall understanding of the geologic evolution of the Moon and unnecessarily partitions lunar research. To bridge these deficiencies, we began a pilot geologic mapping project in 2005 as a means to assess the interest, relevance, and technical methods required for a renewed lunar geologic mapping program [11]. Herein, we provide a summary of the pilot geologic mapping project, which focused on the geologic materials and stratigraphic relationships within the Copernicus quadrangle (0-30°N, 0-45°W).

  11. Comparative study of large scale simulation of underground explosions inalluvium and in fractured granite using stochastic characterization

    Science.gov (United States)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2014-12-01

    This work describes a methodology used for large-scale modeling of wave propagation from underground explosions conducted at the Nevada Test Site (NTS) in two different geological settings: fractured granitic rock mass and alluvium deposits. We show that the discrete nature of rock masses, as well as the spatial variability of the fabric of alluvium, is very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NTS, as well as historical data from the characterization conducted during the underground nuclear tests at the NTS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints, and those specific to alluvium porous media, mainly the spatial variability of geological alluvium facies characterized by their variances and their integral scales. We have also explored key features common to both geological environments, such as saturation and topography, and assessed which characteristics affect the ground motion the most in the near-field and in the far-field. Stochastic representations of these features, based on the field characterizations, have been implemented in the Geodyn and GeodynL hydrocodes. Both codes were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
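
    The sketch below illustrates the kind of stochastic representation mentioned above: a 2-D correlated Gaussian random field with a prescribed variance and integral scale, generated by spectral filtering of white noise. It is a generic illustration; the spectrum shape, parameter names, and values are assumptions, not the Geodyn implementation or NTS data.

        import numpy as np

        def gaussian_random_field(n, dx, variance, integral_scale, seed=0):
            """Periodic 2-D correlated field via filtering white noise in k-space."""
            rng = np.random.default_rng(seed)
            k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
            kx, ky = np.meshgrid(k, k, indexing="ij")
            # Gaussian-shaped spectrum whose correlation length ~ integral scale.
            spectrum = np.exp(-0.5 * (kx**2 + ky**2) * integral_scale**2)
            noise = np.fft.fft2(rng.standard_normal((n, n)))
            field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
            field *= np.sqrt(variance) / field.std()  # enforce target variance
            return field

        # e.g. a 512 x 512 facies-perturbation map, 2 m cells, 20 m integral scale:
        field = gaussian_random_field(n=512, dx=2.0, variance=0.01, integral_scale=20.0)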

  12. Technique for large-scale structural mapping at uranium deposits in non-metamorphosed sedimentary cover rocks

    International Nuclear Information System (INIS)

    Kochkin, B.T.

    1985-01-01

    A technique for plotting large-scale structural maps (1:1000 - 1:10000), reflecting low-amplitude fracture and fold (plicate) structures, is given for uranium deposits in non-metamorphosed sedimentary cover rocks. Structural drill-log sections, as well as a set of maps with the results of areal analysis of hidden disturbances, structural analysis of isopach lines and facies of platform mantle horizons, serve as source materials for structural map plotting. The steps of structural map construction are considered: 1) construction of the structural framework; 2) reconstruction of the structure contours; 3) determination of the time of structure initiation; 4) plotting of the additional geologic content.

  13. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    Science.gov (United States)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.
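
    As an illustration of technique (3), directional filtering via the Fourier transform, the hedged Python sketch below keeps only the wedge of wavenumbers oriented perpendicular to a chosen structural strike and inverts the transform; the wedge width and conventions are assumptions for illustration, not the author's implementation.

        import numpy as np

        def directional_filter(image, strike_deg, half_width_deg=10.0):
            """Enhance lineaments trending along strike_deg (degrees from x-axis)."""
            f = np.fft.fftshift(np.fft.fft2(image))
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            y, x = y - ny // 2, x - nx // 2  # wavenumber offsets from the centre
            theta = np.degrees(np.arctan2(y, x)) % 180.0
            # A lineament striking at strike_deg concentrates its spectral
            # energy along the direction 90 degrees away from the strike.
            target = (strike_deg + 90.0) % 180.0
            diff = np.abs(theta - target)
            diff = np.minimum(diff, 180.0 - diff)
            mask = (diff <= half_width_deg).astype(float)
            mask[ny // 2, nx // 2] = 1.0  # keep the DC term (mean brightness)
            return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))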

  14. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  15. Large-scale gas dynamical processes affecting the origin and evolution of gaseous galactic halos

    Science.gov (United States)

    Shapiro, Paul R.

    1991-01-01

    Observations of galactic halo gas are consistent with an interpretation in terms of the galactic fountain model in which supernova heated gas in the galactic disk escapes into the halo, radiatively cools and forms clouds which fall back to the disk. The results of a new study of several large-scale gas dynamical effects which are expected to occur in such a model for the origin and evolution of galactic halo gas will be summarized, including the following: (1) nonequilibrium absorption line and emission spectrum diagnostics for radiatively cooling halo gas in our own galaxy, as well as the implications of such absorption line diagnostics for the origin of quasar absorption lines in galactic halo clouds of high redshift galaxies; (2) numerical MHD simulations and analytical analysis of large-scale explosions and superbubbles in the galactic disk and halo; (3) numerical MHD simulations of halo cloud formation by thermal instability, with and without magnetic field; and (4) the effect of the galactic fountain on the galactic dynamo.

  16. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    Science.gov (United States)

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  17. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    Science.gov (United States)

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  18. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment ...

  19. Gas-water-rock interactions induced by reservoir exploitation, CO2 sequestration, and other geological storage

    International Nuclear Information System (INIS)

    Lecourtier, J.

    2005-01-01

    Here is given a summary of the opening address of the IFP International Workshop: 'gas-water-rock interactions induced by reservoir exploitation, CO2 sequestration, and other geological storage' (18-20 November 2003). 'This broad topic is of major interest to the exploitation of geological sites since gas-water-mineral interactions determine the physicochemical characteristics of these sites, the strategies to adopt to protect the environment, and finally, the operational costs. Modelling the phenomena is a prerequisite for the engineering of a geological storage, either for disposal efficiency or for risk assessment and environmental protection. During the various sessions, several papers focus on the great achievements that have been made in the last ten years in understanding and modelling the coupled reaction and transport processes occurring in geological systems, from borehole to reservoir scale. Remaining challenges, such as the coupling of mechanical processes of deformation with chemical reactions, or the influence of microbiological environments on mineral reactions, will also be discussed. A large part of the conference programme will address the problem of mitigating CO2 emissions, one of the most important issues that our society must solve in the coming years. From both a technical and an economic point of view, CO2 geological sequestration is the most realistic solution proposed by the experts today. The results of ongoing pilot operations conducted in Europe and in the United States are strongly encouraging, but geological storage will be developed on a large scale in the future only if it becomes possible to predict the long term behaviour of stored CO2 underground. In order to reach this objective, numerous issues must be solved: thermodynamics of CO2 in brines; mechanisms of CO2 trapping inside the host rock; geochemical modelling of CO2 behaviour in various types of geological formations; compatibility of CO2 with oil-well cements

  20. Large-scale climatic anomalies affect marine predator foraging behaviour and demography

    Science.gov (United States)

    Bost, Charles A.; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-01

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently the distances that penguins foraged from the colony and their feeding depths increased and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies affecting a marine predator through changes in its at-sea behaviour and demography, despite lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  1. Visual attention mitigates information loss in small- and large-scale neural codes

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing, or selective attention, is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502
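
    As a concrete illustration of reconstructing feature representations from large-scale activity patterns, the following sketch implements a simple inverted encoding model on synthetic data: channel weights are estimated by least squares on training patterns and then inverted on test patterns. All array shapes and names are illustrative assumptions, not the authors' pipeline.

        import numpy as np

        rng = np.random.default_rng(0)
        n_trials, n_voxels, n_channels = 200, 50, 8
        C_train = rng.random((n_trials, n_channels))          # known channel responses
        W_true = rng.standard_normal((n_channels, n_voxels))  # unknown in practice
        B_train = C_train @ W_true + 0.1 * rng.standard_normal((n_trials, n_voxels))

        # 1) Encoding: estimate weights W from B = C @ W by least squares.
        W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)
        # 2) Inversion: recover channel responses from new activity patterns.
        B_test = C_train[:10] @ W_true
        C_test = B_test @ W_hat.T @ np.linalg.inv(W_hat @ W_hat.T)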

  2. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    Science.gov (United States)

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study in 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games and gender, followed by reported alcohol consumption, age and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  3. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  4. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-04-30

    We explore visualization and abstraction approaches to represent neuronal data. Neuroscientists acquire electron microscopy volumes to reconstruct a complete wiring diagram of the neurons in the brain, called the connectome. This will be crucial to understanding brains and their development. However, the resulting data is complex and large, posing a big challenge to existing visualization techniques in terms of clarity and scalability. We describe solutions to tackle the problems of scalability and cluttered presentation. We first show how a query-guided interactive approach to visual exploration can reduce the clutter and help neuroscientists explore their data dynamically. We use a knowledge-based query algebra that facilitates the interactive creation of queries. This allows neuroscientists to pose domain-specific questions related to their research. Simple queries can be combined to form complex queries to answer more sophisticated questions. We then show how visual abstractions from 3D to 2D can significantly reduce the visual clutter and add clarity to the visualization so that scientists can focus more on the analysis. We abstract the topology of 3D neurons into a multi-scale, relative distance-preserving subway map visualization that allows scientists to interactively explore the morphological and connectivity features of neuronal cells. We then focus on the process of acquisition, where neuroscientists segment electron microscopy images to reconstruct neurons. The segmentation process of such data is tedious, time-intensive, and usually performed using a diverse set of tools. We present a novel web-based visualization system for tracking the state, progress, and evolution of segmentation data in neuroscience. Our multi-user system seamlessly integrates a diverse set of tools. Our system provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large

  5. Algorithm of search and track of static and moving large-scale objects

    Directory of Open Access Journals (Sweden)

    Kalyaev Anatoly

    2017-01-01

    We suggest an algorithm for processing an image sequence for the search and tracking of static and moving large-scale objects. A possible software implementation of the algorithm, based on multithreaded CUDA processing, is suggested. An experimental analysis of the suggested algorithm implementation is performed.
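
    A minimal single-target sketch of the detect-and-track idea, using frame differencing and centroid tracking on a grayscale image sequence, is shown below. It is a plain NumPy stand-in for illustration; the paper's multithreaded CUDA implementation is not reproduced here.

        import numpy as np

        def detect_centroid(prev, curr, thresh=25.0):
            # Pixels that changed between frames mark a moving object.
            mask = np.abs(curr.astype(float) - prev.astype(float)) > thresh
            if not mask.any():
                return None                      # static scene, no detection
            ys, xs = np.nonzero(mask)
            return ys.mean(), xs.mean()

        def track(frames, thresh=25.0):
            # frames: sequence of 2-D grayscale arrays; returns centroid path.
            path = []
            for prev, curr in zip(frames, frames[1:]):
                c = detect_centroid(prev, curr, thresh)
                if c is not None:
                    path.append(c)
            return path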

  6. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  7. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
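
    The decomposition described above can be sketched as follows: low-pass filter a velocity signal to isolate the large-scale motion, take the Hilbert envelope of the small-scale remainder, and correlate the two. The filter order, cutoff, and synthetic test signal are assumptions for illustration, not the authors' exact processing.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def amplitude_modulation_coeff(u, fs, f_cut):
            # Split the signal at f_cut into large- and small-scale parts.
            b, a = butter(4, f_cut / (fs / 2), btype="low")
            u0 = u - u.mean()
            u_large = filtfilt(b, a, u0)
            u_small = u0 - u_large
            # Envelope of the small scales, low-passed to match the large scales.
            env = np.abs(hilbert(u_small))
            env_large = filtfilt(b, a, env - env.mean())
            return np.corrcoef(u_large, env_large)[0, 1]

        # Example: 0.5 Hz large scales modulating a 50 Hz small-scale carrier.
        fs = 1000.0
        t = np.arange(0.0, 20.0, 1.0 / fs)
        large = np.sin(2 * np.pi * 0.5 * t)
        u = large + (1.0 + 0.5 * large) * 0.1 * np.sin(2 * np.pi * 50.0 * t)
        print(amplitude_modulation_coeff(u, fs, f_cut=5.0))  # positive: AM present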

  8. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE’s newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories including (i) CAD-to-PART software, (ii) selection of energy source, (iii

  9. In the Shadow of Coal: How Large-Scale Industries Contributed to Present-Day Regional Differences in Personality and Well-Being.

    Science.gov (United States)

    Obschonka, Martin; Stuetzer, Michael; Rentfrow, Peter J; Shaw-Taylor, Leigh; Satchell, Max; Silbereisen, Rainer K; Potter, Jeff; Gosling, Samuel D

    2017-11-20

    Recent research has identified regional variation of personality traits within countries but we know little about the underlying drivers of this variation. We propose that the Industrial Revolution, as a key era in the history of industrialized nations, has led to a persistent clustering of well-being outcomes and personality traits associated with psychological adversity via processes of selective migration and socialization. Analyzing data from England and Wales, we examine relationships between the historical employment share in large-scale coal-based industries (coal mining and steam-powered manufacturing industries that used this coal as fuel for their steam engines) and today's regional variation in personality and well-being. Even after controlling for possible historical confounds (historical energy supply, education, wealth, geology, climate, population density), we find that the historical local dominance of large-scale coal-based industries predicts today's markers of psychological adversity (lower Conscientiousness [and order facet scores], higher Neuroticism [and anxiety and depression facet scores], lower activity [an Extraversion facet], and lower life satisfaction and life expectancy). An instrumental variable analysis, using the historical location of coalfields, supports the causal assumption behind these effects (with the exception of life satisfaction). Further analyses focusing on mechanisms hint at the roles of selective migration and persisting economic hardship. Finally, a robustness check in the U.S. replicates the effect of the historical concentration of large-scale industries on today's levels of psychological adversity. Taken together, the results show how today's regional patterns of personality and well-being (which shape the future trajectories of these regions) may have their roots in major societal changes underway decades or centuries earlier. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Separating macroecological pattern and process: comparing ecological, economic, and geological systems.

    Directory of Open Access Journals (Sweden)

    Benjamin Blonder

    Theories of biodiversity rest on several macroecological patterns describing the relationship between species abundance and diversity. A central problem is that all theories make similar predictions for these patterns despite disparate assumptions. A troubling implication is that these patterns may not reflect anything unique about organizational principles of biology or the functioning of ecological systems. To test this, we analyze five datasets from ecological, economic, and geological systems that describe the distribution of objects across categories in the United States. At the level of functional form ('first-order effects'), these patterns are not unique to ecological systems, indicating they may reveal little about biological process. However, we show that mechanism can be better revealed in the scale-dependency of first-order patterns ('second-order effects'). These results provide a roadmap for biodiversity theory to move beyond traditional patterns, and also suggest ways in which macroecological theory can constrain the dynamics of economic systems.

  11. Separating macroecological pattern and process: comparing ecological, economic, and geological systems.

    Science.gov (United States)

    Blonder, Benjamin; Sloat, Lindsey; Enquist, Brian J; McGill, Brian

    2014-01-01

    Theories of biodiversity rest on several macroecological patterns describing the relationship between species abundance and diversity. A central problem is that all theories make similar predictions for these patterns despite disparate assumptions. A troubling implication is that these patterns may not reflect anything unique about organizational principles of biology or the functioning of ecological systems. To test this, we analyze five datasets from ecological, economic, and geological systems that describe the distribution of objects across categories in the United States. At the level of functional form ('first-order effects'), these patterns are not unique to ecological systems, indicating they may reveal little about biological process. However, we show that mechanism can be better revealed in the scale-dependency of first-order patterns ('second-order effects'). These results provide a roadmap for biodiversity theory to move beyond traditional patterns, and also suggest ways in which macroecological theory can constrain the dynamics of economic systems.

  12. Internet-based information system of digital geological data providing

    Science.gov (United States)

    Yuon, Egor; Soukhanov, Mikhail; Markov, Kirill

    2015-04-01

    One of the tasks of the Russian Federal Agency for Mineral Resources is to provide the geological information that was collected during field operations funded by the federal budget. This information should be presented in a current, usable form. Previously, the leading means of presenting geological information were paper geological maps, slices, borehole diagrams, reports, etc. Technologies for database construction, including distributed databases, for building distributed information-analytical systems, and Internet technologies are developing intensively nowadays. Most geological organizations create their own information systems without any possibility of integration into other systems of the same orientation. In 2012, specialists of VNIIgeosystem together with specialists of VSEGEI started a large project: creating a system for providing digital geological materials using modern and promising Internet technologies. The system is based on a web server and a set of special programs that allow users to efficiently obtain rasterized and vectorized geological materials. These materials are: geological maps at scale 1:1M, geological maps at scales 1:200 000 and 1:2 500 000, fragments of seamless geological 1:1M maps, structural zoning maps inside the seamless fragments, the legends for State geological maps 1:200 000 and 1:1 000 000, the full author's set of maps, and also current materials for the international projects 'Atlas of geological maps for Circumpolar Arctic scale 1:5 000 000' and 'Atlas of Geologic maps of central Asia and adjacent areas scale 1:2 500 000'. The most interesting and functional block of the system is the block providing structured and well-formalized geological vector materials, based on the Gosgeolkart database (NGKIS), managed by Oracle, with Internet access supported by the NGKIS web subsystem, which is currently based on the MGS-Framework platform developed by VNIIgeosystem. One of the leading elements

  13. 3D magnetization vector inversion based on fuzzy clustering: inversion algorithm, uncertainty analysis, and application to geology differentiation

    Science.gov (United States)

    Sun, J.; Li, Y.

    2017-12-01

    Magnetic data contain important information about the subsurface rocks that were magnetized in the geological history, which provides an important avenue to the study of the crustal heterogeneities associated with magmatic and hydrothermal activities. Interpretation of magnetic data has been widely used in mineral exploration, basement characterization and large scale crustal studies for several decades. However, interpreting magnetic data has been often complicated by the presence of remanent magnetizations with unknown magnetization directions. Researchers have developed different methods to deal with the challenges posed by remanence. We have developed a new and effective approach to inverting magnetic data for magnetization vector distributions characterized by region-wise consistency in the magnetization directions. This approach combines the classical Tikhonov inversion scheme with fuzzy C-means clustering algorithm, and constrains the estimated magnetization vectors to a specified small number of possible directions while fitting the observed magnetic data to within noise level. Our magnetization vector inversion recovers both the magnitudes and the directions of the magnetizations in the subsurface. Magnetization directions reflect the unique geological or hydrothermal processes applied to each geological unit, and therefore, can potentially be used for the purpose of differentiating various geological units. We have developed a practically convenient and effective way of assessing the uncertainty associated with the inverted magnetization directions (Figure 1), and investigated how geological differentiation results might be affected (Figure 2). The algorithm and procedures we have developed for magnetization vector inversion and uncertainty analysis open up new possibilities of extracting useful information from magnetic data affected by remanence. We will use a field data example from exploration of an iron-oxide-copper-gold (IOCG) deposit in Brazil to
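
    The clustering ingredient of the approach described above can be sketched as a plain fuzzy C-means loop over estimated cell magnetization vectors, pulling each cell's direction toward a small number of cluster directions. This is only the clustering step under assumed inputs, not the coupled Tikhonov inversion.

        import numpy as np

        def fuzzy_cmeans(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
            rng = np.random.default_rng(seed)
            U = rng.random((len(X), n_clusters))
            U /= U.sum(axis=1, keepdims=True)            # fuzzy memberships
            for _ in range(n_iter):
                W = U ** m
                C = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted cluster centers
                d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
                U = d ** (-2.0 / (m - 1.0))              # closer centers get more weight
                U /= U.sum(axis=1, keepdims=True)
            return C, U

        # X: (n_cells, 3) array of estimated magnetization direction vectors.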

  14. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination

    NARCIS (Netherlands)

    Wood, F.; Kowalczuk, J.; Elwyn, G.; Mitchell, C.; Gallacher, J.

    2011-01-01

    BACKGROUND: Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies

  15. Geologic field trip guide to Mount Mazama and Crater Lake Caldera, Oregon

    Science.gov (United States)

    Bacon, Charles R.; Wright, Heather M.

    2017-08-08

    Crater Lake partly fills one of the most spectacular calderas of the world—an 8 by 10 kilometer (km) basin more than 1 km deep formed by collapse of the Mount Mazama volcano during a rapid series of explosive eruptions ~7,700 years ago. Having a maximum depth of 594 meters (m), Crater Lake is the deepest lake in the United States. Crater Lake National Park, dedicated in 1902, encompasses 645 square kilometers (km2) of pristine forested and alpine terrain, including the lake itself, and virtually all of Mount Mazama. The geology of the area was first described in detail by Diller and Patton (1902) and later by Williams (1942), whose vivid account led to international recognition of Crater Lake as the classic collapse caldera. Because of excellent preservation and access, Mount Mazama, Crater Lake caldera, and the deposits formed by the climactic eruption constitute a natural laboratory for study of volcanic and magmatic processes. For example, the climactic ejecta are renowned among volcanologists as evidence for systematic compositional zonation within a subterranean magma chamber. Mount Mazama’s climactic eruption also is important as the source of the widespread Mazama ash, a useful Holocene stratigraphic marker throughout the Pacific Northwest United States, adjacent Canada, and offshore. A detailed bathymetric survey of the floor of Crater Lake in 2000 (Bacon and others, 2002) provides a unique record of postcaldera eruptions, the interplay between volcanism and filling of the lake, and sediment transport within this closed basin. Knowledge of the geology and eruptive history of the Mount Mazama edifice, enhanced by the caldera wall exposures, gives exceptional insight into how large volcanoes of magmatic arcs grow and evolve. In addition, many smaller volcanoes of the High Cascades beyond the limits of Mount Mazama provide information on the flux of mantle-derived magma through the region. General principles of magmatic and eruptive processes revealed by

  16. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years a lot of PKI (Public Key Infrastructures) infrastructures have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, there is a plethora of challenges in these healthcare PKI infrastructures. Especially, there are a lot of challenges for PKI infrastructures deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.

  17. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  18. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation measures based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present, no combustion model exists which can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Therefore the major attention in model development has to be paid to the adaptation of existing approaches or the creation of new ones capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of numerical simulation are presented together with comparisons, critical discussions and conclusions. (authors)

  19. Biotic survival in the cryobiosphere on geological scale: implication for astro/terrestrial biogeoscience

    Science.gov (United States)

    Gilichinsky, D.

    2003-04-01

    In the current opinion, the most fundamental aspect of any environment, the temperature regime, acts as a regulator of all physical-chemical reactions and forms the basis of all biological processes. Hard data now indicate biotic survival over geological periods at temperatures ranging from subzero (down to -27°C in permafrost and to -50°C in ice) to positive ones in amber and halite. All these very different environments nevertheless have common features: complete isolation, stability, and imperviousness to water. In such unique physical-chemical complexes, the dehydration of macromolecules and the reorganization of membrane components apparently lead to a considerable decrease or halt of metabolic activity independently of temperature. This allowed the prolonged survival of ancient microbial lineages that realize unknown possibilities of physiological and biochemical adaptation incomparably longer than in any other known habitat. The ability of microorganisms to survive on a geological scale within the broad limits of natural systems forces us to redefine the spatial and temporal limits of the terrestrial and extraterrestrial biospheres and suggests that universal mechanisms of such adaptation might operate for millions of years. Among the new scientific directions formed on this basis, the most general is the fundamental question: how long might life be preserved, and what mechanisms could ensure survival? Because the length of lifetime cannot be reproduced, this highlights the natural storages that make possible the observation of the results of biotic survival on a geological scale. Of special interest is the integration of knowledge toward understanding the limits of the deep cold biosphere as a depository of ancient active biosignatures (biogases, biominerals, pigments, lipids, enzymes, proteins, RNA/DNA fragments) and viable cells. The latter are the only known huge mass of organisms that have retained viability over geological periods and, upon thawing, renew physiological activity

  20. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  1. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  2. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes-and their responses to environmental change-is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is-to our knowledge-unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  3. Mapping spatial patterns of denitrifiers at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Ramette, A.; Saby, N.; Bru, D.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31 500 km2 region of France, using a 16 x 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 739 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of the denitrifiers at large scale can help close the artificial gap between the investigation of microbial processes and microbial community ecology, therefore facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.
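
    A basic building block for such geostatistical modeling is the empirical semivariogram of abundances over the sampling grid; a minimal sketch is given below (coordinate units, bin edges, and input names are assumed for illustration).

        import numpy as np

        def empirical_variogram(coords, values, bin_edges):
            # Half squared differences for all site pairs, binned by separation.
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
            g = 0.5 * (values[:, None] - values[None, :]) ** 2
            iu = np.triu_indices(len(values), k=1)   # count each pair once
            d, g = d[iu], g[iu]
            idx = np.digitize(d, bin_edges)
            return np.array([g[idx == i].mean() if np.any(idx == i) else np.nan
                             for i in range(1, len(bin_edges))])

        # coords: (107, 2) site positions (km); values: log gene abundances.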

  4. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocess in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30-m(3) bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The STRATAFORM Project: U.S. Geological Survey geotechnical studies

    Science.gov (United States)

    Minasian, Diane L.; Lee, Homa J.; Locat, Jaques; Orzech, Kevin M.; Martz, Gregory R.; Israel, Kenneth

    2001-01-01

    This report presents physical property logs of core samples from an offshore area near Eureka, CA. The cores were obtained as part of the STRATAFORM Program (Nittrouer and Kravitz, 1995, 1996), a study investigating how present sedimentation and sediment transport processes influence long-term stratigraphic sequences preserved in the geologic record. The core samples were collected during four separate research cruises to the northern California study area, and the data shown in the logs of the cores were collected using a multi-sensor whole-core logger. The physical properties collected are useful in identifying stratigraphic units, ground-truthing acoustic imagery and sub-bottom profiles, and understanding mass movement processes. STRATAFORM (STRATA FORmation on Margins) was initiated in 1994 by the Office of Naval Research, Marine Geology and Geophysics Department, as a coordinated multi-investigator study of continental-margin sediment transport processes and stratigraphy (Nittrouer and Kravitz, 1996). The program is investigating the stratigraphic signature of the shelf and slope parts of the continental margins, and is designed to provide a better understanding of the sedimentary record and a better prediction of strata. Specifically, the goals of the STRATAFORM Program are to (Nittrouer and Kravitz, 1995): (1) determine the geological relevance of short-term physical processes that erode, transport, and deposit particles, and of those processes that subsequently rework the seabed over time scales; (2) improve capabilities for identifying the processes that form the strata observed within the upper ~100 m of the seabed, commonly representing 10^4-10^6 years of sedimentation; and (3) synthesize this knowledge and bridge the gap between time scales of sedimentary processes and those of sequence stratigraphy. The STRATAFORM Program is divided into studies of the continental shelf and the continental slope; the geotechnical group within the U.S. Geological Survey provides support to both parts.

  6. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment that negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but are also of major concern for processing and adding value. The study tried to define some basic data formats and standards from the various types of collected data about these reservoirs and then provide a management platform based on these formats and standards. Meanwhile, in order to ensure practicality and convenience, the large-scale landslide disaster database system is built with the ability to both provide and receive information, so that users can work with it on different types of devices. IT technology progresses extremely quickly, and even the most modern system might be out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats and standards and a user-defined system structure. The system established by this study is based on the HTML5 standard language and uses responsive web design technology. This makes it easy for users to handle and develop this large-scale landslide disaster database system.

  7. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    Science.gov (United States)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

    A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain, which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which were acquired over a large area of Southern California (US) that extends for about 90,000 km2. Such an input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which permits accounting for possible regional trends not easily detectable by DInSAR and referring the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting the extension of DInSAR analyses to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.
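
    The per-pixel combination of ascending and descending measurements described above reduces to a small linear system, sketched below: each line-of-sight (LOS) velocity is the projection of the (East, up) motion onto that geometry's LOS unit vector. The unit vectors and velocities in the example are assumed values for illustration, not the actual ENVISAT geometry.

        import numpy as np

        def decompose_los(v_asc, v_desc, asc_dir, desc_dir):
            # Each dir is the (east, up) part of the LOS unit vector; solve
            # [[e_a, u_a], [e_d, u_d]] @ [v_east, v_up] = [v_asc, v_desc].
            A = np.array([asc_dir, desc_dir])
            v_east, v_up = np.linalg.solve(A, np.array([v_asc, v_desc]))
            return v_east, v_up

        # Example with assumed ~23 degree incidence geometries (cm/yr):
        print(decompose_los(2.1, -1.4, asc_dir=(-0.38, 0.92), desc_dir=(0.38, 0.92)))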

  8. Diffusion Experiments with Opalinus and Callovo-Oxfordian Clays: Laboratory, Large-Scale Experiments and Microscale Analysis by RBS

    International Nuclear Information System (INIS)

    Garcia-Gutierrez, M.; Alonso, U.; Missana, T.; Cormenzana, J.L.; Mingarro, M.; Morejon, J.; Gil, P.

    2009-01-01

    Consolidated clays are potential host rocks for deep geological repositories for high-level radioactive waste. Diffusion is the main transport process for radionuclides (RN) in these clays. Radionuclide diffusion coefficients are the most important parameters for Performance Assessment (PA) calculations of clay barriers. Different diffusion methodologies were applied at the laboratory scale to analyse the diffusion behaviour of a wide range of RN. The main aims were to understand the transport properties of different RNs in two different clays and to contribute feasible methodologies to improve in-situ diffusion experiments using samples of larger scale. Classical laboratory tests and a novel experimental set-up for large-scale diffusion experiments were performed, together with a novel application of the nuclear ion beam technique Rutherford Backscattering Spectrometry (RBS) for diffusion analyses at the micrometer scale. The main experimental and theoretical characteristics of the different methodologies, and their advantages and limitations, are discussed here. Experiments were performed with the Opalinus and the Callovo-Oxfordian clays. Both clays are studied as potential host rocks for a repository. Effective diffusion coefficients ranged between 1x10^-10 and 1x10^-12 m2/s for neutral species, low-sorbing cations (such as Na and Sr) and anions. Apparent diffusion coefficients for strongly sorbing elements, such as Cs and Co, are on the order of 1x10^-13 m2/s; europium presents the lowest diffusion coefficient (5x10^-15 m2/s). The results obtained by the different approaches gave a comprehensive database of diffusion coefficients for RN with different transport behaviour within both clays. (Author) 42 refs
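
    For orientation, an apparent diffusion coefficient of the magnitude quoted above can be recovered from a concentration-depth profile with the semi-infinite constant-source solution C/C0 = erfc(x / (2*sqrt(D*t))). The sketch below fits synthetic data; the diffusion time, depths, and noise level are assumptions for illustration, not the experimental conditions.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erfc

        def profile(x, D, t=3.15e7):            # t: one year, in seconds (assumed)
            return erfc(x / (2.0 * np.sqrt(D * t)))

        x = np.linspace(0.0, 0.02, 20)          # depth into the clay (m)
        rng = np.random.default_rng(1)
        c_obs = profile(x, 1e-12) + 0.01 * rng.standard_normal(x.size)
        (D_fit,), _ = curve_fit(profile, x, c_obs, p0=[1e-11])
        print(f"apparent D = {D_fit:.2e} m2/s")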

  9. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
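
    The scaling issue above is easiest to see against the cheap baseline: greedy nearest-neighbor association of detections to tracks, sketched below, which is exactly the approach that breaks down for closely spaced objects. The gate size and array shapes are illustrative assumptions, not from the report.

        import numpy as np

        def associate(tracks, detections, gate=5.0):
            # tracks: (N, 2), detections: (M, 2) positions; greedy assignment.
            assignments, used = {}, set()
            for ti, t in enumerate(tracks):
                d = np.linalg.norm(detections - t, axis=1)
                for di in map(int, np.argsort(d)):
                    if d[di] > gate:
                        break                      # nothing left inside the gate
                    if di not in used:
                        assignments[ti] = di       # may steal a neighbor's detection
                        used.add(di)
                        break
            return assignments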

  10. Practical modeling approaches for geological storage of carbon dioxide.

    Science.gov (United States)

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to modeling geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.
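
    The footprint such injection problems involve can be bounded with a pure volume-balance estimate of plume radius, a far cruder calculation than the authors' vertically averaged models but useful for orientation (all values illustrative):
```python
import math

def plume_radius(Q, t, phi, H, Sg=0.5):
    """Radius (m) of an idealized cylindrical CO2 plume by volume balance.

    Q: injection rate (m^3/s at reservoir conditions), t: time (s),
    phi: porosity, H: formation thickness (m), Sg: mean gas saturation.
    Buoyancy, viscosity contrast and leakage -- the effects simplified
    models are actually built to capture -- are ignored here.
    """
    return math.sqrt(Q * t / (math.pi * phi * H * Sg))

# ~1 Mt/yr of CO2 at ~700 kg/m^3 reservoir density, injected for 30 years:
year = 365.25 * 24 * 3600.0
Q = 1e9 / 700.0 / year                       # m^3/s
print(f"{plume_radius(Q, 30 * year, phi=0.15, H=50.0) / 1000:.1f} km")
```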

  11. Intelligent Learning for Knowledge Graph towards Geological Data

    Directory of Open Access Journals (Sweden)

    Yueqin Zhu

    2017-01-01

    Full Text Available Knowledge graph (KG), as a popular semantic network, has been widely used. It provides an effective way to describe semantic entities and their relationships by extending ontology at the entity level. This article focuses on the application of KG in the traditional geological field and proposes a novel method to construct a KG. On the basis of natural language processing (NLP) and data mining (DM) algorithms, we analyze the key technologies for designing a KG for geological data, including geological knowledge extraction and semantic association. Through extraction of a typical geological ontology from a large number of geological documents and open linked data, semantic interconnection is achieved, a KG framework for geological data is designed, an application system of the KG for geological data is constructed, and dynamic updating of the geological information is completed accordingly. Specifically, an unsupervised intelligent learning method using linked open data is incorporated into the geological document preprocessing, which ultimately generates a geological domain vocabulary. Furthermore, some application cases in the KG system are provided to show the effectiveness and efficiency of our proposed intelligent learning approach for KG.
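
    As a toy illustration of the knowledge-extraction step described above, the sketch below builds triples from pattern matches against a small domain vocabulary. The vocabulary, patterns and triples are invented stand-ins for the article's NLP/DM pipeline:
```python
import re

# Toy domain vocabulary of the kind the unsupervised learning step would
# produce from geological documents (entries invented for illustration).
VOCAB = {"granite": "Rock", "basalt": "Rock", "quartz": "Mineral",
         "sichuan basin": "Region"}

PATTERNS = [  # (regex, relation): hand-written stand-ins for learned rules
    (re.compile(r"([\w ]+?) is found in ([\w ]+)"), "foundIn"),
    (re.compile(r"([\w ]+?) contains ([\w ]+)"), "contains"),
]

def extract_triples(sentence, kg):
    """Append (head, relation, tail) triples to the knowledge graph."""
    text = sentence.lower().strip(".")
    for pattern, rel in PATTERNS:
        for head, tail in pattern.findall(text):
            if head.strip() in VOCAB and tail.strip() in VOCAB:
                kg.append((head.strip(), rel, tail.strip()))

kg = []
extract_triples("Basalt is found in Sichuan Basin.", kg)
extract_triples("Granite contains quartz.", kg)
print(kg)  # [('basalt', 'foundIn', 'sichuan basin'), ('granite', 'contains', 'quartz')]
```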

  12. The Geology of the Marcia Quadrangle of Asteroid Vesta: Assessing the Effects of Large, Young Craters

    Science.gov (United States)

    Williams, David A.; Denevi, Brett W.; Mittlefehldt, David W.; Mest, Scott C.; Schenk, Paul M.; Yingst, R. Aileen; Buczowski, Debra L.; Scully, Jennifer E. C.; Garry, W. Brent; McCord, Thomas B.

    2014-01-01

    We used Dawn spacecraft data to identify and delineate geological units and landforms in the Marcia quadrangle of Vesta as a means to assess the role of the large, relatively young impact craters Marcia (approximately 63 kilometers diameter) and Calpurnia (approximately 53 kilometers diameter) and their surrounding ejecta field on the local geology. We also investigated a local topographic high with a dark-rayed crater named Aricia Tholus, and the impact crater Octavia, which is surrounded by a distinctive diffuse mantle. Crater counts and stratigraphic relations suggest that Marcia is the youngest large crater on Vesta: a putative impact melt on the crater floor ranges in age between approximately 40 and 60 million years (depending upon choice of chronology system), and Marcia's ejecta blanket ranges in age between approximately 120 and 390 million years (depending upon choice of chronology system). We interpret the geologic units in and around Marcia crater as marking a major Vestan time-stratigraphic event, with the Marcia Formation being one of the geologically youngest formations on Vesta. Marcia crater reveals pristine bright and dark material in its walls and smooth and pitted terrains on its floor. We interpret the smooth unit as evidence of the flow of impact melts and (for the pitted terrain) the release of volatiles during or after the impact process. The distinctive dark ejecta surrounding craters Marcia and Calpurnia is enriched in OH- or H-bearing phases and has a variable morphology, suggestive of a complex mixture of impact ejecta and impact melts including dark materials possibly derived from carbonaceous chondrite-rich material. Aricia Tholus, which was originally interpreted as a putative Vestan volcanic edifice based on lower resolution observations, appears to be a fragment of an ancient impact basin rim topped by a dark-rayed impact crater. Octavia crater has a cratering model formation age of approximately 280-990 million years based on counts

  13. Investigating textural controls on Archie's porosity exponent using process-based, pore-scale modelling

    Science.gov (United States)

    Niu, Q.; Zhang, C.

    2017-12-01

    Archie's law is an important empirical relationship linking the electrical resistivity of geological materials to their porosity. It has been found experimentally that the porosity exponent m in Archie's law for sedimentary rocks might be related to the degree of cementation, and therefore m is termed the "cementation factor" in much of the literature. Although it has been known for many years, there is a lack of well-accepted physical interpretations of the porosity exponent. Some theoretical and experimental evidence has also shown that m may be controlled by particle and/or pore shape. In this study, we conduct pore-scale modeling of the porosity exponent that incorporates different geological processes. The evolution of m in eight synthetic samples with different particle sizes and shapes is calculated during two geological processes, i.e., compaction and cementation. The numerical results show that in dilute conditions, m is controlled by the particle shape. As the samples deviate from dilute conditions, m increases gradually due to the strong interaction between particles. When the samples are at static equilibrium, m is noticeably larger than its value at dilute conditions. The numerical simulation results also show that both geological compaction and cementation induce a significant increase in m. In addition, the geometric characteristics of these samples (e.g., pore space/throat size, and their distributions) during compaction and cementation are also calculated. Preliminary analysis shows a unique correlation between pore-size broadness and the porosity exponent for all eight samples. However, such a correlation is not found between m and other geometric characteristics.
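
    For reference, the exponent being tracked is defined through Archie's relation F = φ^(-m), where the formation factor F is the ratio of saturated-rock resistivity to brine resistivity, so m can be recovered directly from a resistivity measurement. A minimal sketch with illustrative numbers:
```python
import math

def porosity_exponent(rho_rock, rho_brine, phi):
    """Archie porosity exponent m from F = rho_rock/rho_brine = phi**(-m)."""
    F = rho_rock / rho_brine
    return -math.log(F) / math.log(phi)

# Illustrative fully brine-saturated sample: F = 3.2/0.2 = 16 at
# porosity 0.25 gives m = 2.0, a typical consolidated-sandstone value;
# cementation and compaction push m upward, as in the simulations above.
print(porosity_exponent(rho_rock=3.2, rho_brine=0.2, phi=0.25))  # 2.0
```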

  14. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  15. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  16. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  17. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

    In this work we introduce hypersingular integral equations and analyze a realistic model of gravitational waveguides in a Cantorian space-time. A waveguiding effect is considered with respect to the large-scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε(∞) Cantorian space-time, in which gravitational lensing and waveguiding effects can explain the apparent Universe. In particular, we consider filamentary and planar large-scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. In this view, supported by three numerical simulations, the Universe appears like a large set of self-similar adaptive mirrors. Consequently, an infinite Universe is just an optical illusion, produced by mirroring effects connected with the large-scale structure of a finite and not very large Universe

  18. Visual attention mitigates information loss in small- and large-scale neural codes.

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.
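
    The population-level reconstruction the authors describe is typically implemented as an inverted encoding model: estimate a weight matrix mapping hypothetical feature channels to measured responses, then invert it on held-out data. A self-contained toy simulation, with all dimensions and tuning functions invented for illustration:
```python
import numpy as np

rng = np.random.default_rng(0)
n_chan, n_vox, n_train = 8, 50, 200

def channel_responses(stimuli_deg):
    """Rectified-cosine basis over a 0-180 degree feature space."""
    centers = np.arange(n_chan) * (180.0 / n_chan)
    d = np.deg2rad(stimuli_deg[:, None] - centers[None, :])
    return np.maximum(np.cos(d), 0.0) ** 5

# Simulated experiment: random channel-to-voxel weights, noisy responses.
W = rng.random((n_vox, n_chan))
C_train = channel_responses(rng.uniform(0, 180, n_train))
B_train = C_train @ W.T + 0.1 * rng.standard_normal((n_train, n_vox))

# Step 1: estimate the weights by least squares (B = C W^T).
W_hat = np.linalg.lstsq(C_train, B_train, rcond=None)[0].T

# Step 2: invert the model to reconstruct channel responses for new data.
B_test = channel_responses(np.array([0.0, 45.0, 90.0, 135.0])) @ W.T
C_hat = np.linalg.lstsq(W_hat, B_test.T, rcond=None)[0].T
print(np.round(C_hat, 2))  # each row should peak near its stimulus channel
```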

  19. Calculating Soil Wetness, Evapotranspiration and Carbon Cycle Processes Over Large Grid Areas Using a New Scaling Technique

    Science.gov (United States)

    Sellers, Piers

    2012-01-01

    Soil wetness typically shows great spatial variability over the length scales of general circulation model (GCM) grid areas (approx 100 km), and the functions relating evapotranspiration and photosynthetic rate to local-scale (approx 1 m) soil wetness are highly non-linear. Soil respiration is also highly dependent on very small-scale variations in soil wetness. We therefore expect significant inaccuracies whenever we insert a single grid-area-average soil wetness value into a function to calculate any of these rates for the grid area. For the particular case of evapotranspiration, this method - use of a grid-averaged soil wetness value - can also provoke severe oscillations in the evapotranspiration rate and soil wetness under some conditions. A method is presented whereby the probability distribution function (pdf) for soil wetness within a grid area is represented by binning, and numerical integration of the binned pdf is performed to provide a spatially-integrated wetness stress term for the whole grid area, which then permits calculation of grid-area fluxes in a single operation. The method is very accurate when 10 or more bins are used, can deal realistically with spatially variable precipitation, conserves moisture exactly and allows for precise modification of the soil wetness pdf after every time step. The method could also be applied to other ecological problems where small-scale processes must be area-integrated, or upscaled, to estimate fluxes over large areas, for example in treatments of the terrestrial carbon budget or trace gas generation.
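
    A minimal sketch of the binned-pdf idea: apply a non-linear stress function either to the single grid-mean wetness or by numerical integration over a 10-bin histogram of sub-grid wetness. The stress function and wetness distribution below are invented stand-ins for the paper's formulations:
```python
import random

def stress(w):
    """Toy non-linear wetness stress: zero below wilting point (0.2),
    unstressed above field capacity (0.6), linear ramp in between."""
    return min(max((w - 0.2) / 0.4, 0.0), 1.0)

random.seed(0)
wetness = [random.betavariate(2, 5) for _ in range(10_000)]  # sub-grid field

# Naive approach: push the single grid-average wetness through stress().
naive = stress(sum(wetness) / len(wetness))

# Binned-pdf approach: integrate stress() over a 10-bin histogram.
nbins, n = 10, len(wetness)
counts = [0] * nbins
for w in wetness:
    counts[min(int(w * nbins), nbins - 1)] += 1
binned = sum(c / n * stress((i + 0.5) / nbins) for i, c in enumerate(counts))

# The two disagree whenever the pdf straddles the function's curvature --
# exactly the bias the upscaling technique removes.
print(f"stress(mean w) = {naive:.3f}, binned integral = {binned:.3f}")
```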

  20. FEBEX project: full-scale engineered barriers experiment for a deep geological repository for high level radioactive waste in crystalline host rock

    Energy Technology Data Exchange (ETDEWEB)

    Alberid, J; Barcala, J M; Campos, R; Cuevas, A M; Fernandez, E [Ciemat. Madrid (Spain)

    2000-07-01

    FEBEX has the multiple objectives of demonstrating the feasibility of manufacturing, handling and constructing the engineered barriers and of developing codes for the thermo-hydro-mechanical and thermo-hydro-geochemical performance assessment of a deep geological repository for high-level radioactive wastes. These objectives require integrated theoretical and experimental development work. The experimental work consists of three parts: an in situ test, a mock-up test and a series of laboratory tests. The experiment is based on the Spanish reference concept for crystalline rock, in which the waste capsules are placed horizontally in drifts surrounded by high-density compacted bentonite blocks. In the two large-scale tests, the thermal effects of the wastes were simulated by means of heaters; hydration was natural in the in situ test and controlled in the mock-up test. The large-scale tests, with their monitoring systems, have been in operation for more than two years. The demonstration has been achieved in the in situ test and there are great expectations that numerical models sufficiently validated for the near-field performance assessment will be achieved. (Author)

  1. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with major…

  2. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, March 1987: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department. Approved for public release. … arises, to reduce the spread in the LSGT 50% gap value. The worst charges, such as those with the highest or lowest densities, the largest re-pressed …

  3. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on the use of radar observations at the Atmospheric Radiation Measurement site in Darwin run under the auspices of the DOE Atmospheric Systems Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for the development of the representation of thunderstorms in weather and climate models, which is carried out by a process termed parametrisation. Through the analysis of radar and wind profiler observations the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth and that the transition between these categories is strongly related to the larger scale dynamical features of the atmosphere more so than its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on the stochastic modelling of the area covered by different convective cloud types.

  4. A full scale approximation of covariance functions for large spatial data sets

    KAUST Repository

    Sang, Huiyan

    2011-10-10

    Gaussian process models have been widely used in spatial statistics but face tremendous computational challenges for very large data sets. The model fitting and spatial prediction of such models typically require O(n³) operations for a data set of size n. Various approximations of the covariance functions have been introduced to reduce the computational cost. However, most existing approximations cannot simultaneously capture both the large- and the small-scale spatial dependence. A new approximation scheme is developed to provide a high quality approximation to the covariance function at both the large and the small spatial scales. The new approximation is the summation of two parts: a reduced rank covariance and a compactly supported covariance obtained by tapering the covariance of the residual of the reduced rank approximation. Whereas the former part mainly captures the large-scale spatial variation, the latter part captures the small-scale, local variation that is unexplained by the former part. By combining the reduced rank representation and sparse matrix techniques, our approach allows for efficient computation for maximum likelihood estimation, spatial prediction and Bayesian inference. We illustrate the new approach with simulated and real data sets. © 2011 Royal Statistical Society.
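
    A compact numerical sketch of the construction: a reduced-rank (knot-based) covariance plus the residual covariance tapered to compact support. This is a simplified stand-in for the authors' full-scale approximation, with an exponential covariance and arbitrary knot placement:
```python
import numpy as np

def exp_cov(x, y, ell=0.3):
    """Exponential covariance between 1-D location vectors."""
    return np.exp(-np.abs(x[:, None] - y[None, :]) / ell)

def taper(x, y, gamma=0.15):
    """Wendland-type compactly supported correlation (zero beyond gamma)."""
    d = np.abs(x[:, None] - y[None, :]) / gamma
    return np.where(d < 1.0, (1.0 - d) ** 4 * (4.0 * d + 1.0), 0.0)

rng = np.random.default_rng(0)
s = np.sort(rng.uniform(0, 1, 400))          # observation locations
knots = np.linspace(0, 1, 15)                # low-rank basis knots

C = exp_cov(s, s)                            # exact covariance
Csk, Ckk = exp_cov(s, knots), exp_cov(knots, knots)
C_lr = Csk @ np.linalg.solve(Ckk, Csk.T)     # reduced rank: large scale
C_fs = C_lr + (C - C_lr) * taper(s, s)       # + tapered residual: small scale

for name, A in [("reduced rank", C_lr), ("full scale", C_fs)]:
    print(name, np.linalg.norm(C - A) / np.linalg.norm(C))
# The tapered residual restores the local variation the low-rank part
# misses, while staying sparse enough for large-n computation.
```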

  5. A full scale approximation of covariance functions for large spatial data sets

    KAUST Repository

    Sang, Huiyan; Huang, Jianhua Z.

    2011-01-01

    Gaussian process models have been widely used in spatial statistics but face tremendous computational challenges for very large data sets. The model fitting and spatial prediction of such models typically require O(n³) operations for a data set of size n. Various approximations of the covariance functions have been introduced to reduce the computational cost. However, most existing approximations cannot simultaneously capture both the large- and the small-scale spatial dependence. A new approximation scheme is developed to provide a high quality approximation to the covariance function at both the large and the small spatial scales. The new approximation is the summation of two parts: a reduced rank covariance and a compactly supported covariance obtained by tapering the covariance of the residual of the reduced rank approximation. Whereas the former part mainly captures the large-scale spatial variation, the latter part captures the small-scale, local variation that is unexplained by the former part. By combining the reduced rank representation and sparse matrix techniques, our approach allows for efficient computation for maximum likelihood estimation, spatial prediction and Bayesian inference. We illustrate the new approach with simulated and real data sets. © 2011 Royal Statistical Society.

  6. Centralized manure digestion. Selection of locations and estimation of costs of large-scale manure storage application

    International Nuclear Information System (INIS)

    1995-03-01

    A study to assess the possibilities and the consequences of using existing Dutch large-scale manure silos at centralised anaerobic digestion plants (CAD-plants) for manure and energy-rich organic wastes was carried out. Reconstruction of these large-scale manure silos into digesters for a CAD-plant is not self-evident, owing to the high height/diameter ratio of these silos and the extra investments that have to be made for additional facilities for roofing, insulation, mixing and heating. From the results of an inventory and selection of large-scale manure silos with a storage capacity above 1,500 m³, it appeared that there are 21 locations in The Netherlands that qualify for the realisation of a CAD-plant with a processing capacity of 100 m³ of biomass (80% manure, 20% additives) per day. These locations are found in particular in the 'shortage areas' for manure fertilisation in the Dutch provinces of Groningen and Drenthe. Three of these 21 locations with large-scale silos are considered the most suitable for the realisation of a large-scale CAD-plant. The selection is based on an optimal scale for a CAD-plant of 300 m³ of material (80% manure, 20% additives) to be processed per day and on the most suitable consuming markets for the biogas produced at the CAD-plant. The three locations are at Middelharnis, Veendam, and Klazinaveen. Applying the conditions used in this study and accounting for all costs for transport of manure, additives and end-product, including the costs for the storage facilities, a break-even operation might be realised at a minimum income for the additives of approximately 50 Dutch guilders per m³ (including TAV). This income price is considerably lower than the prevailing costs for tipping or processing of organic wastes in The Netherlands. This study revealed that a break-even exploitation of a large-scale CAD-plant for the processing of manure with energy-rich additives is possible. (Abstract Truncated)

  7. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show this, we adapted to the visual domain an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, and Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523, by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparison of the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the human visual sensory system, revealed that the visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition, supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.
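
    For concreteness, the two stimulus sequences contrasted in the study can be generated as follows; this is a sketch of the design, not the authors' presentation code:
```python
import random

def fixed_sequence(n_cycles):
    """Fixed condition: a deviant after every four standards (SSSSD...)."""
    return "SSSSD" * n_cycles

def randomized_sequence(n_cycles, seed=0):
    """Randomized condition: the same 20% deviant rate, random positions."""
    rng = random.Random(seed)
    seq = ["S"] * (4 * n_cycles) + ["D"] * n_cycles
    rng.shuffle(seq)
    return "".join(seq)

print(fixed_sequence(3))       # SSSSDSSSSDSSSSD
print(randomized_sequence(3))  # e.g. SDSSSSSSDSSSDSS -- 20% deviants overall
```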

  8. Preparing laboratory and real-world EEG data for large-scale analysis: A containerized approach

    Directory of Open Access Journals (Sweden)

    Nima eBigdely-Shamlo

    2016-03-01

    Full Text Available Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface (BCI) models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a containerized approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data Levels, each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).

  9. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of scale 1000s of processors and be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  10. Detonation and fragmentation modeling for the description of large scale vapor explosions

    International Nuclear Information System (INIS)

    Buerger, M.; Carachalios, C.; Unger, H.

    1985-01-01

    Thermal detonation modeling of large-scale vapor explosions is shown to be indispensable for realistic safety evaluations. Both a steady-state and a transient detonation model have been developed, including detailed descriptions of the dynamics as well as of the fragmentation processes inside a detonation wave. Strong restrictions on large-scale vapor explosions are obtained from this modeling, and they indicate that the reactor pressure vessel would withstand even explosions with unrealistically high masses of corium involved. The modeling is supported by comparisons with a detonation experiment and - concerning its key part - hydrodynamic fragmentation experiments. (orig.) [de

  11. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    …the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large… sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land… commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role…

  12. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Here, 'large-scale' means that the simulations involve such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. This requires analysis of the uncertainty included in simulations: revealing the sensitivity to uncertainty due to randomness, reducing the uncertainty due to lack of knowledge, and establishing a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)', a set of computing processes with deductive and inductive approaches modelled on human reasoning. Our idea is to execute deductive and inductive simulations corresponding to these two approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and in obtaining a degree of certainty. (author)

  13. Geology and geochemistry of Nuku Hiva, Marquesas: temporal trends in a large Polynesian shield volcano

    International Nuclear Information System (INIS)

    Le Dez, A.; Maury, R.C.; Bellon, H.; Cotten, J.; Vidal, P.; Brousse, R.

    1996-01-01

    Nuku Hiva, one of the largest shield volcanoes in French Polynesia, was built up largely between 4.8 and 3.7 Ma. We present a geological sketch map of the island showing three nested calderas opened southward, the origin of which is attributed to submarine gravity landslide collapses. The emergent part of the Tekao shield is made up of thin tholeiitic flows mostly emplaced between 4.8 and 4.5 Ma, overlain by transitional basalts, alkali basalts and hawaiites. The main caldera collapse event is dated at 4.05 ± 0.10 Ma. It was immediately followed by the construction of the Taiohae volcano which exposes an alkalic suite ranging from basalts to trachytes. Major and trace element data document a rapid transition from tholeiites to alkali basalts, which we relate to time-decreasing degrees of melting of a garnet lherzolite source. The isotopic Sr, Nd, Pb variability of Nuku Hiva basalts, and especially of the Tekao shield tholeiites, may reflect small-scale heterogeneities in a plume of dominantly EMII-HIMU composition. (authors). 56 refs., 11 figs., 2 tabs

  14. Shock modification and chemistry and planetary geologic processes

    International Nuclear Information System (INIS)

    Boslough, M.S.

    1991-01-01

    This paper brings the rapid advances in shock processing of materials to the attention of Earth scientists and puts these advances in the context of planetary geologic processes. Most of the recent research in this area has been directed at materials modification and synthesis, and the information gained has direct relevance to shock effects in nature. Research on various types of shock modification and chemistry in both naturally and experimentally shocked rocks and minerals is reviewed, and where appropriate its significance to planetary processes is indicated. As a case study, the surface of Mars is suggested as a place where conditions are optimal for shock processing to be a dominant factor. The various mechanisms of shock modification, activation, synthesis and decomposition are all proposed as major contributors to the evolution of the chemical, mineralogical, and physical properties of the Martian regolith

  15. Geologic map of the east half of the Lime Hills 1:250,000-scale quadrangle, Alaska

    Science.gov (United States)

    Gamble, Bruce M.; Reed, Bruce L.; Richter, Donald H.; Lanphere, Marvin A.

    2013-01-01

    This map is compiled from geologic mapping conducted between 1985 and 1992 by the U.S. Geological Survey as part of the Alaska Mineral Resource Assessment Program. That mapping built upon previous USGS work (1963–1988) unraveling the magmatic history of the Alaska–Aleutian Range batholith. Quaternary unit contacts depicted on this map are derived largely from aerial-photograph interpretation. K-Ar ages made prior to this study have been recalculated using 1977 decay constants. The east half of the Lime Hills 1:250,000-scale quadrangle includes part of the Alaska–Aleutian Range batholith and several sequences of sedimentary rocks or mixed sedimentary and volcanic rocks. The Alaska–Aleutian Range batholith contains rocks that represent three major igneous episodes, (1) Early and Middle Jurassic, (2) Late Cretaceous and early Tertiary, and (3) middle Tertiary; only rocks from the latter two episodes are found in this map area. The map area is one of very steep and rugged terrain; elevations range from a little under 1,000 ft (305 m) to 9,828 ft (2,996 m). Foot traverses are generally restricted to lowermost elevations. Areas suitable for helicopter landings can be scarce at higher elevations. Most of the area was mapped from the air, supplemented by direct examination of rocks where possible. This restricted access greatly complicates understanding some of the more complex geologic units. For example, we know there are plutons whose compositions vary from gabbro to granodiorite, but we have little insight as to how these phases are distributed and what their relations might be to each other. It is also possible that some of what we have described as compositionally complex plutons might actually be several distinct intrusions.

  16. Geology Before Pluto: Pre-encounter Considerations

    Science.gov (United States)

    Moore, J. M.

    2014-12-01

    Pluto, its large satellite Charon, and its four small known satellites represent the first trans-Neptunian Kuiper Belt objects populating the outer-most solar system beyond the gas giant planets to be studied in detail from a spacecraft (New Horizons). A complete picture of the solar nebula and solar system formation cannot be confidently formulated until representatives of this group of bodies at the edge of solar space have been examined. The Pluto system is composed of unique, lunar- and intermediate-sized objects that can tell us much about how objects with volatile icy compositions evolve. Modeling of the interior suggests that geologic activity may have been significant to some degree, and observations of frost on the surface could imply the need for a geologic reservoir for the replenishment of these phases. However, these putative indicators of Pluto's geologic history are inconclusive and unspecific. Detailed examination of Pluto's geologic record is the only plausible means of bridging the gap between theory and observation. In this talk I will examine the potential importance of these tentative indications of geologic activity and how specific spacecraft observations have been designed and used to constrain the Pluto system's geologic history. The cameras of New Horizons will provide robust data sets that should be eminently amenable to geological analysis of the Pluto system's landscapes. In this talk, we begin with a brief discussion of the planned observations by the New Horizons cameras that will bear most directly on geological interpretability. Then I will broadly review major geological processes that could potentially operate on the surfaces of Pluto and its moons. I will first survey exogenic processes (i.e., those for which energy for surface modification is supplied externally to the planetary surface): impact cratering, sedimentary processes (including volatile migration), and the work of wind. I will conclude with an assessment of the

  17. Geology Before Pluto: Pre-Encounter Considerations

    Science.gov (United States)

    Moore, Jeffrey M.

    2014-01-01

    Pluto, its large satellite Charon, and its four known satellites represent the first trans-Neptunian Kuiper Belt objects populating the outer-most solar system beyond the gas giant planets to be studied in detail from a spacecraft (New Horizons). A complete picture of the solar nebula and solar system formation cannot be confidently formulated until representatives of this group of bodies at the edge of solar space have been examined. The Pluto system is composed of unique lunar- and intermediate-sized objects that can tell us much about how objects with volatile icy compositions evolve. Modeling of the interior suggests that geologic activity may have been significant to some degree, and observations of frost on the surface could imply the need for a geologic reservoir for the replenishment of these phases. However, the putative indicators of Pluto's geologic history are inconclusive and unspecific. Detailed examination of Pluto's geologic record is the only plausible means of bridging the gap between theory and observations. In this talk I will examine the potential importance of these tentative indications of geologic activity and how specific spacecraft observations have been designed and used to constrain the Pluto system's geologic history. The cameras of New Horizons will provide robust data sets that should be eminently amenable to geological analysis of the Pluto System's landscapes. In this talk, we begin with a brief discussion of the planned observations by New Horizons' cameras that will bear most directly on geological interpretability. Then I will broadly review major geological processes that could potentially operate on the surfaces of Pluto and its moons. I will first survey exogenic processes (i.e., those for which energy for surface modification is supplied externally to the planetary surface): impact cratering, sedimentary processes (including volatile migration) and the work of wind. I will conclude with an assessment of prospects for endogenic activity

  18. THE DECAY OF A WEAK LARGE-SCALE MAGNETIC FIELD IN TWO-DIMENSIONAL TURBULENCE

    Energy Technology Data Exchange (ETDEWEB)

    Kondić, Todor; Hughes, David W.; Tobias, Steven M., E-mail: t.kondic@leeds.ac.uk [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2016-06-01

    We investigate the decay of a large-scale magnetic field in the context of incompressible, two-dimensional magnetohydrodynamic turbulence. It is well established that a very weak mean field, of strength significantly below the equipartition value, induces a small-scale field strong enough to inhibit the process of turbulent magnetic diffusion. In light of ever-increasing computer power, we revisit this problem to investigate fluid and magnetic Reynolds numbers that were previously inaccessible. Furthermore, by exploiting the relation between the turbulent diffusion of the magnetic potential and that of the magnetic field, we are able to calculate the turbulent magnetic diffusivity extremely accurately through the imposition of a uniform mean magnetic field. We confirm the strong dependence of the turbulent diffusivity on the product of the magnetic Reynolds number and the energy of the large-scale magnetic field. We compare our findings with various theoretical descriptions of this process.

  19. In situ vitrification large-scale operational acceptance test analysis

    International Nuclear Information System (INIS)

    Buelt, J.L.; Carter, J.G.

    1986-05-01

    A thermal treatment process is currently under study to provide possible enhancement of in-place stabilization of transuranic and chemically contaminated soil sites. The process is known as in situ vitrification (ISV). In situ vitrification is a remedial action process that destroys solid and liquid organic contaminants and incorporates radionuclides into a glass-like material that renders contaminants substantially less mobile and less likely to impact the environment. A large-scale operational acceptance test (LSOAT) was recently completed in which more than 180 t of vitrified soil were produced in each of three adjacent settings. The LSOAT demonstrated that the process conforms to the functional design criteria necessary for the large-scale radioactive test (LSRT) to be conducted following verification of the performance capabilities of the process. The energy requirements and vitrified block size, shape, and mass are sufficiently equivalent to those predicted by the ISV mathematical model to confirm its usefulness as a predictive tool. The LSOAT demonstrated an electrode replacement technique, which can be used if an electrode fails, and techniques have been identified to minimize air oxidation, thereby extending electrode life. A statistical analysis was employed during the LSOAT to identify graphite collars and an insulative surface as successful cold cap subsidence techniques. The LSOAT also showed that even under worst-case conditions, the off-gas system exceeds the flow requirements necessary to maintain a negative pressure on the hood covering the area being vitrified. The retention of simulated radionuclides and chemicals in the soil and off-gas system exceeds requirements so that projected emissions are one to two orders of magnitude below the maximum permissible concentrations of contaminants at the stack

  20. Magnetic Properties of Large-Scale Nanostructured Graphene Systems

    DEFF Research Database (Denmark)

    Gregersen, Søren Schou

    The on-going progress in two-dimensional (2D) materials and nanostructure fabrication motivates the study of altered and combined materials. Graphene—the most studied material of the 2D family—displays unique electronic and spintronic properties. Exceptionally high electron mobilities, surpassing those in conventional materials such as silicon, make graphene a very interesting material for high-speed electronics. Simultaneously, long spin-diffusion lengths and spin lifetimes make graphene an eligible spin-transport channel. In this thesis, we explore fundamental features of nanostructured graphene systems using large-scale modeling techniques. Graphene perforations, or antidots, have received substantial interest in the prospect of opening large band gaps in the otherwise gapless graphene. Motivated by recent improvements of fabrication processes, such as forming graphene antidots and layer…

  1. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data…

  2. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large-scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large-scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large-scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large-scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably because the fire module of the code is a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  3. Predicting multi-scale relationships between geomorphology and bedrock geology of the rocky intertidal in Central and Northern California

    Science.gov (United States)

    Wheeler, A.; Aiello, I. W.

    2014-12-01

    Substratum geology is fundamental in shaping rocky shore morphology. Specific lithologies have various responses to wave action, tectonic features (e.g. fractures, faults) and sedimentary structures (e.g. bedding), creating distinctive weathering profiles. Along with local oceanography and climate forcing, different rock substrata create coastal morphologies that can vary distinctly between scales, ranging from mm to km. Despite the complexity of the system, qualitative observations show that coastal areas with similar rock types share similar geomorphologies. Thus, a statistical relationship between geomorphology (expressed, for instance, by the surface parameter rugosity) and geology can be envisaged. There are multiple benefits of finding such a relationship, as rocky intertidal geomorphology can be an important determinant of which organisms can settle, grow, and survive in near-shore communities: it would allow the prediction of geomorphologic parameters determining coastal ecology solely based on substratum geology, a crucial aspect in guiding the selection of marine protected areas. This study presents preliminary results of multi-scale geospatial surveys (cm to tens of meters) of rocky intertidal outcrops from Central to Northern California using a Terrestrial Laser Scanner. The outcrops investigated are representative of the most common igneous and sedimentary rocks in California (granitoids, conglomerates, sandstones, mudstones) and metamorphic units. The statistical analysis of the survey data supports the hypothesis that surface properties can change significantly with changing scale, each rock type having distinct surface characteristics which are similar to those of comparable lithologies exposed at different locations. These scale-dependent variations are controlled by different lithologic and structural characteristics of the outcrop in question. Our data also suggest lithologic variability within a rock unit could be a very significant factor in controlling changes in
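
    Rugosity, a standard surface parameter for this kind of analysis, can be computed as the ratio of true surface area to planar area within windows of increasing size, which makes the scale dependence explicit. A minimal gridded sketch on a synthetic surface (not the TLS data):
```python
import numpy as np

def rugosity(z, cell=0.01):
    """True-surface-area / planar-area for a gridded elevation patch z."""
    dzdy, dzdx = np.gradient(z, cell)
    return float(np.sqrt(1.0 + dzdx**2 + dzdy**2).mean())

def multiscale_rugosity(z, windows, cell=0.01):
    """Mean rugosity over non-overlapping w-by-w windows, for each w."""
    out = {}
    for w in windows:
        vals = [rugosity(z[i:i + w, j:j + w], cell)
                for i in range(0, z.shape[0] - w + 1, w)
                for j in range(0, z.shape[1] - w + 1, w)]
        out[w] = float(np.mean(vals))
    return out

# Synthetic outcrop: a smooth large-scale ramp plus cm-scale roughness.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
z = 0.2 * x[:, None] + 0.005 * rng.standard_normal((200, 200))
print(multiscale_rugosity(z, windows=[10, 50, 200]))
# Rugosity that changes with window size is the scale dependence the
# surveys quantify and compare across lithologies.
```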

  4. Geology

    Data.gov (United States)

    Kansas Data Access and Support Center — This database is an Arc/Info implementation of the 1:500,000 scale Geology Map of Kansas, M­23, 1991. This work wasperformed by the Automated Cartography section of...

  5. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job. The solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the sub-problems' solving efficiency and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed. The unscheduled operations can thereby be decomposed into bottleneck operations and non-bottleneck operations. According to the principle of "the bottleneck leads the performance of the whole manufacturing system" in TOC (Theory of Constraints), the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency (see the sketch after this record). Findings: In the process of the sub-problems' construction, partial operations in the previously scheduled sub-problem are passed into the successive sub-problem for re-optimization. This strategy improves the solution quality of the algorithm. In the process of solving the sub-problems, the strategy of evaluating a chromosome's fitness by predicting the global scheduling objective value improves the solution quality. Research limitations/implications: In this research, there are some assumptions which reduce the complexity of the large-scale scheduling problem. They are as follows: the processing route of each job is predetermined, and the processing time of each operation is fixed. There is no machine breakdown, and no preemption of the operations is allowed. These assumptions should be considered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
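
    The decomposition idea can be skeletonized as follows: detect bottleneck machines, split the operations, and schedule the bottleneck set carefully while dispatching the rest cheaply. In this sketch the bottlenecks are picked by total workload rather than the paper's critical-path method, and longest-processing-time ordering stands in for the genetic algorithm:
```python
from collections import defaultdict

jobs = {  # job -> ordered (machine, processing time) route
    "J1": [("M1", 5), ("M2", 3)],
    "J2": [("M1", 4), ("M3", 2)],
    "J3": [("M1", 6), ("M2", 4)],
}

# Detect bottleneck machines (here: by total workload).
load = defaultdict(int)
for route in jobs.values():
    for machine, p in route:
        load[machine] += p
bottlenecks = {m for m, l in load.items() if l == max(load.values())}

ops = [(j, m, p) for j, route in jobs.items() for m, p in route]
bn_ops = [o for o in ops if o[1] in bottlenecks]
other_ops = [o for o in ops if o[1] not in bottlenecks]

# "The bottleneck leads the performance of the whole manufacturing system":
# sequence bottleneck operations first (LPT here, standing in for the GA),
# then the rest by a shortest-processing-time dispatching rule.
sequence = sorted(bn_ops, key=lambda o: -o[2]) + sorted(other_ops, key=lambda o: o[2])
print("bottlenecks:", bottlenecks)
print("sequence   :", [(j, m) for j, m, _ in sequence])
```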

  6. A new algorithm for coding geological terminology

    Science.gov (United States)

    Apon, W.

    The Geological Survey of The Netherlands has developed an algorithm to convert the plain geological language of lithologic well logs into codes suitable for computer processing and to link these to existing plotting programs. The algorithm is based on the "direct method" and operates in three steps: (1) searching for defined word combinations and assigning codes; (2) deleting duplicated codes; (3) correcting incorrect code combinations. Two simple auxiliary files are used. A simple PC demonstration program is included to enable readers to experiment with this algorithm. The Department of Quaternary Geology of the Geological Survey of The Netherlands possesses a large database of shallow lithologic well logs in plain language and has been using a program based on this algorithm for about 3 yr. Erroneous codes resulting from using this algorithm are less than 2%.
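
    A sketch of the three-step "direct method" described above, with toy auxiliary tables standing in for the Survey's two auxiliary files (vocabulary, codes and correction rules invented for illustration):
```python
WORD_CODES = {  # auxiliary file 1: defined word combinations -> codes
    "coarse sand": "Z3", "sand": "Z2", "clay": "K1", "with gravel": "G1",
}
INVALID_COMBOS = {("K1", "G1"): ["K1"]}  # auxiliary file 2: corrections

def code_log(description):
    text, codes = description.lower(), []
    # Step 1: search for defined word combinations, longest phrase first.
    for phrase in sorted(WORD_CODES, key=len, reverse=True):
        if phrase in text:
            codes.append(WORD_CODES[phrase])
            text = text.replace(phrase, " ")
    # Step 2: delete duplicated codes, keeping the first occurrence.
    codes = list(dict.fromkeys(codes))
    # Step 3: correct incorrect code combinations.
    for combo, fix in INVALID_COMBOS.items():
        if all(c in codes for c in combo):
            codes = fix + [c for c in codes if c not in combo]
    return codes

print(code_log("Coarse sand with gravel"))  # ['Z3', 'G1']
print(code_log("Clay with gravel"))         # ['K1'] after correction
```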

  7. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction

  8. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

    The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should progress with every regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, and the unattended-mode inspection approach, based on the integrated containment/surveillance system coupled with radiation monitoring, should be established in order to reduce the inspection effort. NMCC has been studying the following measures for large scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near real time Shipper and Receiver Difference monitoring; (3) a near real time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; and (7) a timely analysis technique, such as a hybrid K-edge densitometer, operated in the on-site laboratory (J.P.N.).

  9. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  10. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  11. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  12. Proposal for geological site selection for L/ILW and HLW repositories. Statement of requirements, procedure and results. Technical report 08-03

    International Nuclear Information System (INIS)

    2008-10-01

    , reliability of geological findings and engineering suitability; 3) The large-scale geological-tectonic situation is assessed and large-scale areas that remain under consideration are defined. From the viewpoint of long-term stability and explorability of spatial conditions, all large-scale geological-tectonic areas in Switzerland come into consideration for the L/ILW repository. For the HLW repository, the Alps, the Folded Jura, the western Tabular Jura and a small part of the Molasse Basin (western sub-Jurassic zone) are excluded; 4) The preferred host rock formations are chosen within the large-scale areas still under consideration. Proposed for the L/ILW repository are the Opalinus Clay with its confining units, the clay stone sequence 'Brauner Dogger' with its confining units, the Effingen Beds and the marl formations of the Helveticum. For the HLW repository, the Opalinus Clay with its confining units is proposed as the preferred host formation; 5) The configurations of the preferred host rocks within the large-scale areas under consideration are evaluated in the fifth step. Taking into account the presence of regional geological features (regional fault zones, over-deepened valleys resulting from glacial erosion, zones with indications of small-scale tectonic dissection, other zones to be avoided for reasons of neotectonics), preferred areas are identified within which the preferred host rocks can be found at a suitable depth and with sufficient thickness and lateral extent. The preferred areas are used as the basis for delimiting the geological siting regions. Some siting regions contain several preferred areas and sometimes more than one host rock type. In three of the geological siting regions, the possibility exists in principle of siting the L/ILW and HLW repositories together as a so-called 'combined repository'. For the L/ILW repository, the geological siting regions Southern Schaffhausen, Zuercher Weinland and Boezberg (all with Opalinus Clay as host rock) are

  13. Simulation of Anisotropic Rock Damage for Geologic Fracturing

    Science.gov (United States)

    Busetti, S.; Xu, H.; Arson, C. F.

    2014-12-01

    A continuum damage model for differential stress-induced anisotropic crack formation and stiffness degradation is used to study geologic fracturing in rocks. The finite element-based model solves for deformation in the quasi-linear elastic domain and determines the six-component damage tensor at each deformation increment. The model permits an isotropic or anisotropic intact or pre-damaged reference state, and the elasticity tensor evolves depending on the stress path. The damage variable, similar to Oda's fabric tensor, grows when the surface energy dissipated by three-dimensional opened cracks exceeds a threshold defined at the appropriate scale of the representative elementary volume (REV). At scales from the laboratory or wellbore up to large (1000 m) scales, the damaged REV reflects early natural fracturing (background or tectonic fracturing) or shear strain localization (fault process zone, fault-tip damage, etc.). The numerical model was recently benchmarked against triaxial stress-strain data from laboratory rock mechanics tests. However, the utility of the model to predict geologic fabric, such as natural fracturing in hydrocarbon reservoirs, was not fully explored. To test the ability of the model to predict geological fracturing, finite element simulations (Abaqus) of common geologic scenarios with known fracture patterns (borehole pressurization, folding, faulting) are run and the modeled damage tensor is compared against physical fracture observations. Simulated damage anisotropy is similar to that derived using fractured rock-mass upscaling techniques for pre-determined fracture patterns. This suggests that if model parameters are constrained with local data (e.g., lab, wellbore, or reservoir domain), forward modeling could be used to predict mechanical fabric at the relevant REV scale. This reference fabric can also be used as the starting material property to pre-condition subsequent deformation or fluid flow. Continuing efforts are to expand the present damage

  14. In situ vitrification: Preliminary results from the first large-scale radioactive test

    International Nuclear Information System (INIS)

    Buelt, J.L.; Westsik, J.H.

    1988-02-01

    The first large-scale radioactive test (LSRT) of In Situ Vitrification (ISV) has been completed. In Situ Vitrification is a process whereby joule heating immobilizes contaminated soil in place by converting it to a durable glass and crystalline waste form. The LSRT was conducted at an actual transuranic contaminated soil site on the Department of Energy's Hanford Site. The test had two objectives: (1) determine large-scale processing performance and (2) produce a waste form that can be fully evaluated as a potential technique for the final disposal of transuranic-contaminated soil sites at Hanford. This accomplishment has provided technical data to evaluate the ISV process for its potential in the final disposition of transuranic-contaminated soil sites at Hanford. Because of the test's successful completion, within a year technical data on the vitrified soil will be available to determine how well the process incorporates transuranics into the waste form and how well the form resists leaching of transuranics. Preliminary results available include retention of transuranics and other elements within the waste form during processing and the efficiency of the off-gas treatment system in removing contaminants from the gaseous effluents. 13 refs., 10 figs., 5 tabs

  15. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

    In today's world, manufacturing industries must sustain their development and continuity in an increasingly competitive environment by decreasing their costs. The first step in a lean production transformation is to analyze value-adding and non-value-adding activities. This study applies the concepts of Value Stream Mapping (VSM) in a large-scale tractor company in Sakarya. Waste and process time are identified by mapping the current state of the platform production line. A future state is then suggested, with improvements for the elimination of waste and a reduction of lead time from 13.08 to 4.35 days. Analyses of the current and future states support the suggested improvements, and the cycle time of the platform production line is improved by 8%. The results show that VSM is a good alternative in decision-making for change in the production process.
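
    The improvements reported above reduce to simple arithmetic; in the sketch below, only the day figures and the 8% gain come from the record, while the 100 s cycle time is a made-up illustration:

```python
# Reproducing the lead-time arithmetic reported in the abstract.
lead_before, lead_after = 13.08, 4.35   # days, from the study
reduction = (lead_before - lead_after) / lead_before
print(f"lead time reduced by {reduction:.0%}")   # about 67%

# The reported 8% cycle-time gain applied to a hypothetical 100 s cycle:
cycle_before = 100.0                     # seconds, illustrative value only
cycle_after = cycle_before * (1 - 0.08)
print(f"cycle time: {cycle_before:.0f} s -> {cycle_after:.0f} s")
```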

  16. Onshore and offshore geologic map of the Coal Oil Point area, southern California

    Science.gov (United States)

    Dartnell, Pete; Conrad, James E.; Stanley, Richard G.; Cochrane, Guy R.

    2011-01-01

    Geologic maps that span the shoreline and include both onshore and offshore areas are potentially valuable tools that can lead to a more in-depth understanding of coastal environments. Such maps can contribute to the understanding of shoreline change, geologic hazards, and both offshore and alongshore sediment and pollutant transport. They are also useful in assessing geologic and biologic resources. Several intermediate-scale (1:100,000) geologic maps that include both onshore and offshore areas (herein called onshore-offshore geologic maps) have been produced of areas along the California coast (see Saucedo and others, 2003; Kennedy and others, 2007; Kennedy and Tan, 2008), but few large-scale (1:24,000) maps have been produced that can address local coastal issues. A cooperative project between Federal and State agencies and universities has produced an onshore-offshore geologic map at 1:24,000 scale of the Coal Oil Point area and part of the Santa Barbara Channel, southern California (fig. 1). As part of the project, the U.S. Geological Survey (USGS) and the California Geological Survey (CGS) hosted a workshop (May 2nd and 3rd, 2007) for producers and users of coastal map products (see list of participants) to develop a consensus on the content and format of onshore-offshore geologic maps (and accompanying GIS files) so that they have relevance for coastal-zone management. The USGS and CGS are working to develop coastal maps that combine geospatial information from offshore and onshore and serve as an important tool for addressing a broad range of coastal-zone management issues. The workshop was divided into sessions for presentations and discussion of bathymetry and topography, geology, and habitat products and needs of end users. During the workshop, participants reviewed existing maps and discussed their merits and shortcomings. This report addresses a number of items discussed in the workshop and details the onshore and offshore geologic map of the Coal Oil

  17. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.
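
    The record does not spell out the binary integer program itself; purely to convey the flavor of globally balancing error sources over binary decisions, here is a toy brute-force version with invented costs (the real BigSUR formulation is far richer):

```python
# Toy binary integer program: choose on/off decisions that minimize a
# weighted sum of per-decision errors plus a consistency penalty when
# neighbouring decisions disagree. All costs are made up.
from itertools import product

error = [(0.2, 0.9), (0.7, 0.1), (0.4, 0.3)]  # error[i][b]: cost of decision i = b
penalty = 0.5                                  # cost per disagreeing neighbour pair

best = min(
    product((0, 1), repeat=len(error)),
    key=lambda x: sum(error[i][b] for i, b in enumerate(x))
    + penalty * sum(a != b for a, b in zip(x, x[1:])),
)
print("optimal assignment:", best)  # (0, 1, 1) for these costs
```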

  18. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom; Femiani, John; Wonka, Peter; Mitra, Niloy J.

    2017-01-01

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  19. Geology and resource assessment of Costa Rica at 1:500,000 scale; a digital representation of maps of the U.S. Geological Survey's 1987 folio I-1865

    Science.gov (United States)

    Schruben, Paul G.

    1997-01-01

    This CD-ROM contains digital versions of the geology and resource assessment maps of Costa Rica originally published in USGS Folio I-1865 (U.S. Geological Survey, the Direccion General de Geologia, Minas e Hidrocarburos, and the Universidad de Costa Rica, 1987) at a scale of 1:500,000. The following layers are available on the CD-ROM: geology and faults; favorable domains for selected deposit types; Bouguer gravity data; isostatic gravity contours; mineral deposits, prospects, and occurrences; and rock geochemistry sample points. For DOS users, the CD-ROM contains MAPPER, a user-friendly map display program. Some of the maps are also provided in the following additional formats on the CD-ROM: (1) ArcView 1 and 3, (2) ARC/INFO 6.1.2 Export, (3) Digital Line Graph (DLG) Optional, and (4) Drawing Exchange File (DXF).

  20. TensorFlow: A system for large-scale machine learning

    OpenAIRE

    Abadi, Martín; Barham, Paul; Chen, Jianmin; Chen, Zhifeng; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Irving, Geoffrey; Isard, Michael; Kudlur, Manjunath; Levenberg, Josh; Monga, Rajat; Moore, Sherry; Murray, Derek G.

    2016-01-01

    TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments. TensorFlow uses dataflow graphs to represent computation, shared state, and the operations that mutate that state. It maps the nodes of a dataflow graph across many machines in a cluster, and within a machine across multiple computational devices, including multicore CPUs, general-purpose GPUs, and custom designed ASICs known as Tensor Processing Units (TPUs). This architecture gives flexib...
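
    A minimal sketch of the dataflow-graph idea in current TensorFlow (2.x), where tf.function traces Python code into a graph that the runtime can place on CPUs, GPUs, or TPUs:

```python
# Minimal TensorFlow dataflow-graph example (TensorFlow 2.x API).
import tensorflow as tf

@tf.function  # traces the Python function into a dataflow graph
def squared_norm(x):
    return tf.reduce_sum(x * x)

x = tf.constant([1.0, 2.0, 3.0])
print(squared_norm(x).numpy())  # 14.0; device placement is handled by the runtime
```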

  1. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity, there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  2. Prediction of geological and mechanical processes while disposing of high-level waste (HLW) into the earth crust

    International Nuclear Information System (INIS)

    Kedrovsky, O.L.; Morozov, V.N.

    1992-01-01

    Prediction of the geological and mechanical processes involved in disposing of high-level waste from the atomic industry into the earth's crust is the fundamental basis for assessing the ecological risk (possible consequences) when developing repository designs. The subject of this paper is an analytical estimate of rock fracturing mechanisms, aimed at predicting the loss of the isolating properties of the massif, from the crystal lattice of minerals up to large fracture disturbances, under the long-term influence of pressure, temperature, and radiation. To solve the problem possibilities of kinetic

  3. Influences of geomorphology and geology on alpine treeline in the American West - More important than climatic influences?

    Science.gov (United States)

    Butler, D.R.; Malanson, G.P.; Walsh, S.J.; Fagre, D.B.

    2007-01-01

    The spatial distribution and pattern of alpine treeline in the American West reflect the overarching influences of geological history, lithology and structure, and geomorphic processes and landforms; geologic and geomorphic factors, both forms and processes, can control the spatiotemporal response of the ecotone to climate change. These influences occur at spatial scales ranging from the continental scale down to fine-scale processes and landforms at the slope scale. Past geomorphic influences, particularly Pleistocene glaciation, have also left their impact on treeline, and treelines across the West are still adjusting to post-Pleistocene conditions within Pleistocene-created landforms. Current fine-scale processes include solifluction, changes on relict solifluction features, and digging by animals. These processes should be examined in detail in future studies to facilitate a better understanding of where individual tree seedlings become established as a primary response of the ecotone to climate change.

  4. The large-scale environment from cosmological simulations - I. The baryonic cosmic web

    Science.gov (United States)

    Cui, Weiguang; Knebe, Alexander; Yepes, Gustavo; Yang, Xiaohu; Borgani, Stefano; Kang, Xi; Power, Chris; Staveley-Smith, Lister

    2018-01-01

    Using a series of cosmological simulations that includes one dark-matter-only (DM-only) run, one gas cooling-star formation-supernova feedback (CSF) run and one that additionally includes feedback from active galactic nuclei (AGNs), we classify the large-scale structures with both a velocity-shear-tensor code (VWEB) and a tidal-tensor code (PWEB). We find that the baryonic processes have almost no impact on large-scale structures, at least not when classified using the aforementioned techniques. More importantly, our results confirm that the gas component alone can be used to infer the filamentary structure of the universe practically unbiased, which could be applied to cosmology constraints. In addition, the gas filaments are classified with both velocity (VWEB) and density (PWEB) fields, which can theoretically be connected to radio observations such as H I surveys. This will help us link radio observations with dark matter distributions at large scales without bias.
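
    The VWEB-style classification mentioned above conventionally counts velocity-shear-tensor eigenvalues above a threshold (0 = void, 1 = sheet, 2 = filament, 3 = knot); a sketch with an invented tensor and a purely illustrative threshold:

```python
# Web classification from shear-tensor eigenvalues (VWEB-style sketch).
# The threshold and the example tensor are illustrative, not the
# values used in the paper.
import numpy as np

LABELS = ("void", "sheet", "filament", "knot")

def classify(shear_tensor, lam_th=0.44):
    eigvals = np.linalg.eigvalsh(shear_tensor)   # symmetric 3x3 tensor
    return LABELS[int(np.sum(eigvals > lam_th))]

sigma = np.array([[0.9, 0.1, 0.0],
                  [0.1, 0.5, 0.0],
                  [0.0, 0.0, -0.2]])
print(classify(sigma))  # "filament" for this made-up tensor
```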

  5. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others]

    1997-08-01

    This is China's first case study of high-resolution reservoir geological modelling using outcrop information. The key to the modelling process is to build a prototype model and use the model as a geological knowledge bank. Outcrop information used in the geological modelling includes seven aspects: (1) determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) determining the model's direction based on the paleocurrent statistics; (4) estimating the sandbody communication by photomosaic and profiles; (6) estimating reservoir properties distribution within the sandbody by lithofacies analysis; and (7) building the reservoir model at the sandbody scale by architectural element analysis and 3-D sampling. A high-resolution reservoir geological model of the Youshashan oil field has been built using this method.

  6. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    Science.gov (United States)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    -core programming techniques. Currently, Cloud Computing environments make available large collections of computing resources and storage that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at a very large scale in reduced time. This also allows us to deal with the problems connected to the use of the S1 P-SBAS chain in operational contexts related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result, we performed a large spatial-scale SBAS analysis of Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms and thus finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.
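
    SBAS-family processing builds interferograms from acquisition pairs with small temporal and perpendicular baselines; a sketch of the temporal part of pair selection (dates and threshold are invented, and the real chain screens on perpendicular baseline too):

```python
# Small-baseline (SBAS-style) pair selection, temporal criterion only.
from datetime import date, timedelta
from itertools import combinations

acquisitions = [date(2016, 1, 1) + timedelta(days=12 * i) for i in range(30)]
MAX_TEMPORAL_BASELINE = timedelta(days=48)     # illustrative threshold

pairs = [(a, b) for a, b in combinations(acquisitions, 2)
         if b - a <= MAX_TEMPORAL_BASELINE]
print(f"{len(acquisitions)} acquisitions -> {len(pairs)} interferograms")
```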

  7. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    description of efficient large scale explosions it will be necessary to consider three stages: a) the setting up of a quasi-stable initial configuration; b) the triggering of this configuration; c) the propagation of the explosion. In this paper we consider each stage in turn, reviewing the relevant experimental information and theory to see to what extent the requirements for energetic explosions, and the physical processes that can satisfy these requirements, are understood. We pay particular attention to an attractively simple criterion for explosiveness, suggested by Fauske, that the contact temperature should exceed the temperature for spontaneous nucleation of the coolant, because on this criterion, sodium and UO2 in particular are not explosive

  8. A model-based framework for incremental scale-up of wastewater treatment processes

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Sin, Gürkan

    Scale-up is traditionally done following specific ratios or rules of thumb, which do not lead to optimal results. We present a generic framework to assist in the scale-up of wastewater treatment processes based on multiscale modelling, multiobjective optimisation and a validation of the model at the new large scale. The framework is illustrated by the scale-up of a complete autotrophic nitrogen removal process. The model-based multiobjective scale-up offers a promising improvement compared to empirical rule-of-thumb scale-up rules.

  9. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
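
    Hyperuniformity, as used above, means that the variance of the particle count in an observation window grows more slowly than the window volume; a sketch that estimates this number variance for a Poisson-like (non-hyperuniform) reference pattern:

```python
# Number-variance diagnostic for hyperuniformity. A Poisson pattern has
# var(N) ~ window area; hyperuniform patterns grow more slowly.
import numpy as np

rng = np.random.default_rng(1)
points = rng.random((10_000, 2))      # Poisson-like reference pattern in [0,1)^2

def number_variance(R, trials=500):
    centers = R + rng.random((trials, 2)) * (1 - 2 * R)   # windows fully inside
    counts = [np.sum(np.linalg.norm(points - c, axis=1) < R) for c in centers]
    return np.var(counts)

for R in (0.02, 0.04, 0.08):
    poisson = len(points) * np.pi * R**2   # Poisson expectation: var = mean
    print(f"R = {R}: var(N) = {number_variance(R):6.1f} (Poisson ~ {poisson:6.1f})")
```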

  10. Important processes affecting the release and migration of radionuclides from a deep geological repository

    International Nuclear Information System (INIS)

    Barátová, Dana; Nečas, Vladimír

    2017-01-01

    The processes that significantly affect the transport of contaminants through the near field and far field of a deep geological repository for spent nuclear fuel were studied. The processes can be broadly divided into (i) processes related to the release of radionuclides from the spent nuclear fuel; (ii) processes related to the radionuclide transport mechanisms (such as advection and diffusion); and (iii) processes affecting the rate of radionuclide migration through the multi-barrier repository system. A near-field and geosphere model of an unspecified geological repository sited in crystalline rock is also described. The treatment focuses on the effects of the different processes on the activity flow of the major safety-relevant radionuclides. The activity flow was simulated for one spent fuel cask using the GoldSim simulation tool. (orig.)

  11. In-situ vitrification: a large-scale prototype for immobilizing radioactively contaminated waste

    International Nuclear Information System (INIS)

    Carter, J.G.; Buelt, J.L.

    1986-03-01

    Pacific Northwest Laboratory is developing the technology of in situ vitrification, a thermal treatment process for immobilizing radioactively contaminated soil. A permanent remedial action, the process incorporates radionuclides into a glass and crystalline form. The transportable process consists of an electrical power system to vitrify the soil, a hood to contain gaseous effluents, an off-gas treatment and cooling system, and a process control station. Large-scale testing of the in situ vitrification process is currently underway.

  12. Beyond single syllables: large-scale modeling of reading aloud with the Connectionist Dual Process (CDP++) model.

    Science.gov (United States)

    Perry, Conrad; Ziegler, Johannes C; Zorzi, Marco

    2010-09-01

    Most words in English have more than one syllable, yet the most influential computational models of reading aloud are restricted to processing monosyllabic words. Here, we present CDP++, a new version of the Connectionist Dual Process model (Perry, Ziegler, & Zorzi, 2007). CDP++ is able to simulate the reading aloud of mono- and disyllabic words and nonwords, and learns to assign stress in exactly the same way as it learns to associate graphemes with phonemes. CDP++ is able to simulate the monosyllabic benchmark effects its predecessor could, and therefore shows full backwards compatibility. CDP++ also accounts for a number of novel effects specific to disyllabic words, including the effects of stress regularity and syllable number. In terms of database performance, CDP++ accounts for over 49% of the reaction time variance on items selected from the English Lexicon Project, a very large database of several thousand words. With its lexicon of over 32,000 words, CDP++ is therefore a notable example of the successful scaling-up of a connectionist model to a size that more realistically approximates the human lexical system. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using super-computers, which generally requires long CPU times and produces large amounts of data, has been extensively studied as a third pillar in various advanced science fields, in parallel to theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of various complex phenomena that are hardly identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces the idea of up-date processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method that transmits requests for up-date processing from the simulation (client) running on a super-computer to a workstation (server); that is, the simulation running on the super-computer actively controls the timing of the up-date processing. The server, on receiving requests from the ongoing simulation, such as data transfer, data analyses and visualization, starts the corresponding operations during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project on laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another.

  14. Numerical simulation of small-scale mixing processes in the upper ocean and atmospheric boundary layer

    International Nuclear Information System (INIS)

    Druzhinin, O; Troitskaya, Yu; Zilitinkevich, S

    2016-01-01

    The processes of turbulent mixing and of momentum and heat exchange occur in the upper ocean at depths up to several dozen meters, and in the atmospheric boundary layer over an interval from millimeters to dozens of meters, and cannot be resolved by current large-scale climate models. Thus small-scale processes need to be parameterized with respect to the large-scale fields. This parameterization involves the so-called bulk coefficients, which relate turbulent fluxes to the gradients of the large-scale fields. The bulk coefficients depend on the properties of the small-scale mixing processes, which are affected by the upper-ocean stratification and the characteristics of surface and internal waves. These dependencies are not well understood at present and need to be clarified. We employ Direct Numerical Simulation (DNS) as a research tool which resolves all relevant flow scales and does not require the closure assumptions typical of Large-Eddy and Reynolds-Averaged Navier-Stokes simulations (LES and RANS). Thus DNS provides a solid ground for the correct parameterization of small-scale mixing processes and can also be used for improving LES and RANS closure models. In particular, we discuss the interaction between small-scale turbulence and internal gravity waves propagating in the pycnocline of the upper ocean, as well as the impact of surface waves on the properties of the atmospheric boundary layer over a wavy water surface. (paper)
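
    The bulk coefficients referred to above relate turbulent fluxes to resolved large-scale fields; the classic example is the momentum flux (wind stress) tau = rho * C_D * U^2, sketched here with illustrative values:

```python
# Bulk parameterization of the turbulent momentum flux (wind stress).
# All values are illustrative.
rho_air = 1.2    # kg/m^3, air density
C_D = 1.2e-3     # bulk drag coefficient, the quantity DNS helps constrain
U10 = 8.0        # m/s, 10-m wind speed (a resolved large-scale field)

tau = rho_air * C_D * U10**2   # N/m^2, turbulent momentum flux
print(f"wind stress ~ {tau:.3f} N/m^2")
```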

  15. U.S. Department of Energy's site screening, site selection, and initial characterization for storage of CO2 in deep geological formations

    Science.gov (United States)

    Rodosta, T.D.; Litynski, J.T.; Plasynski, S.I.; Hickman, S.; Frailey, S.; Myer, L.

    2011-01-01

    The U.S. Department of Energy (DOE) is the lead Federal agency for the development and deployment of carbon sequestration technologies. As part of its mission to facilitate technology transfer and develop guidelines from lessons learned, DOE is developing a series of best practice manuals (BPMs) for carbon capture and storage (CCS). The "Site Screening, Site Selection, and Initial Characterization for Storage of CO2 in Deep Geological Formations" BPM is a compilation of best practices and includes flowchart diagrams illustrating the general decision making process for Site Screening, Site Selection, and Initial Characterization. The BPM integrates the knowledge gained from various programmatic efforts, with particular emphasis on the Characterization Phase through pilot-scale CO2 injection testing of the Validation Phase of the Regional Carbon Sequestration Partnership (RCSP) Initiative. Key geologic and surface elements that suitable candidate storage sites should possess are identified, along with example Site Screening, Site Selection, and Initial Characterization protocols for large-scale geologic storage projects located across diverse geologic and regional settings. This manual has been written as a working document, establishing a framework and methodology for proper site selection for CO2 geologic storage. This will be useful for future CO2 emitters, transporters, and storage providers. It will also be of use in informing local, regional, state, and national governmental agencies of best practices in proper sequestration site selection. Furthermore, it will educate the inquisitive general public on options and processes for geologic CO2 storage. In addition to providing best practices, the manual presents a geologic storage resource and capacity classification system. The system provides a "standard" to communicate storage and capacity estimates, uncertainty and project development risk, data guidelines and analyses for adequate site characterization, and

  16. USGS Imagery Topo Large-scale Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Imagery Topo Large service from The National Map (TNM) is a dynamic topographic base map service that combines the best available data (Boundaries,...

  17. Lithological history and ductile deformation: the lessons for long-term stability of large-scales structures in the olkiluoto

    International Nuclear Information System (INIS)

    Wikstrom, L.; Aaltonen, I.; Mattila, J.

    2009-01-01

    The Olkiluoto site was chosen in 2001 as the repository site for high-level nuclear waste. Investigations at the site have been ongoing since 1987. The basic premise for a nuclear waste site in crystalline rock is still that a solid repository block surrounded by deformation zones can host a safe repository. While it cannot be claimed that the major ductile or large-scale brittle deformation zones are stable in an absolute sense, it can be said that tectonic processes have acted in a stable manner for billions of years, reactivating the old features time after time, and there are no signs of new large features having formed in the vicinity of the site in recent times, including the post-glacial period. Understanding the geological history, especially the ductile deformation and overthrusting, begins with understanding the lithological features, mainly the rock types, on the island; vice versa, the occurrence and location of the lithological features are interpreted according to the ductile deformation. Moreover, present-day brittle deformation cannot be studied in isolation: the older ductile and lithological features must also be understood in order to explain why the brittle features are where they are and to be able to predict them. (authors)

  18. Large-scale seismic test for soil-structure interaction research in Hualien, Taiwan

    International Nuclear Information System (INIS)

    Ueshima, T.; Kokusho, T.; Okamoto, T.

    1995-01-01

    It is important to evaluate dynamic soil-structure interaction more accurately in the aseismic design of important facilities such as nuclear power plants. A large-scale model structure, about one quarter the size of a commercial nuclear power plant, was constructed on gravelly layers in seismically active Hualien, Taiwan. This international joint project is called 'the Hualien LSST Project', where 'LSST' is short for Large-Scale Seismic Test. In this paper, the research tasks and responsibilities, the progress of the construction work and research tasks along the timeline, and the main results obtained so far are described. (J.P.N.)

  19. Towards understanding how surface life can affect interior geological processes: a non-equilibrium thermodynamics approach

    Directory of Open Access Journals (Sweden)

    J. G. Dyke

    2011-06-01

    Life has significantly altered the Earth's atmosphere, oceans and crust. To what extent has it also affected interior geological processes? To address this question, three models of geological processes are formulated: mantle convection, continental crust uplift and erosion, and oceanic crust recycling. These processes are characterised as non-equilibrium thermodynamic systems. Their states of disequilibrium are maintained by the power generated from the dissipation of energy from the interior of the Earth. Altering the thickness of continental crust via weathering and erosion affects the upper mantle temperature, which leads to changes in the rates of oceanic crust recycling and consequently the rates of outgassing of carbon dioxide into the atmosphere. Estimates for the power generated by various elements in the Earth system are shown. These include, inter alia, the 264 TW of power generated by surface life, much greater than that of geological processes such as mantle convection at 12 TW. This high power results from life's ability to harvest energy directly from the sun. Life need only utilise a small fraction of the generated free chemical energy for geochemical transformations at the surface, such as affecting the rates of weathering and erosion of continental rocks, in order to affect interior geological processes. Consequently, when assessing the effects of life on Earth, and potentially on any planet with a significant biosphere, dynamical models may be required that better capture the coupled nature of biologically-mediated surface and interior processes.

  20. Geologic Data Package for 2001 Immobilized Low-Activity Waste Performance Assessment

    International Nuclear Information System (INIS)

    SP Reidel; DG Horton

    1999-01-01

    This database is a compilation of existing geologic data from both the existing and new immobilized low-activity waste disposal sites for use in the 2001 Performance Assessment. Data were compiled from both surface and subsurface geologic sources. Large-scale surface geologic maps, previously published, cover the entire 200-East Area and the disposal sites. Subsurface information consists of drilling and geophysical logs from nearby boreholes and stored sediment samples. Numerous published geological reports are available that describe the subsurface geology of the area. Site-specific subsurface data are summarized in tables and profiles in this document. Uncertainty in the data is mainly restricted to borehole information. Variations in sampling and drilling techniques introduce some correlation uncertainties across the sites. A greater degree of uncertainty exists for the new site because of restricted borehole coverage. There is also some uncertainty in the location and orientation of clastic dikes across the sites

  1. Inducing a health-promoting change process within an organization the Effectiveness of a Large-Scale Intervention on Social Capital, Openness, and Autonomous Motivation Toward Health

    NARCIS (Netherlands)

    Scheppingen, A.R. van; Vroome, E.M.M. de; Have, K.C.J.M. ten; Bos, E.H.; Zwetsloot, G.I.J.M.; Mechelen, W. van

    2014-01-01

    Objective: To examine the effectiveness of an organizational large-scale intervention applied to induce a health-promoting organizational change process. Design and Methods: A quasi-experimental, "as-treated" design was used. Regression analyses on data of employees of a Dutch dairy company (n = 324)

  2. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment, including e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces through which project management, university administration, researchers and international partners can work together, exchange information and improve processes, in order to recruit, support and keep the brightest heads for a project.

  3. Development of performance assessment methodology for nuclear waste isolation in geologic media

    Science.gov (United States)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to conducting experiments, because the time scales involved are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and the health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  4. Diffusion Experiments with Opalinus and Callovo-Oxfordian Clays: Laboratory, Large-Scale Experiments and Microscale Analysis by RBS

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Gutierrez, M.; Alonso, U.; Missana, T.; Cormenzana, J.L.; Mingarro, M.; Morejon, J.; Gil, P.

    2009-09-25

    Consolidated clays are potential host rocks for deep geological repositories for high-level radioactive waste. Diffusion is the main transport process for radionuclides (RN) in these clays, and RN diffusion coefficients are the most important parameters for Performance Assessment (PA) calculations of clay barriers. Different diffusion methodologies were applied at the laboratory scale to analyse the diffusion behaviour of a wide range of RN. The main aims were to understand the transport properties of different RNs in two different clays and to contribute feasible methodologies to improve in-situ diffusion experiments using samples of larger scale. Classical laboratory assays and a novel experimental set-up for large-scale diffusion experiments were performed, together with a novel application of the nuclear ion beam technique Rutherford Backscattering Spectrometry (RBS) for diffusion analyses at the micrometre scale. The main experimental and theoretical characteristics of the different methodologies, and their advantages and limitations, are discussed here. Experiments were performed with the Opalinus and the Callovo-Oxfordian clays; both clays are studied as potential repository host rocks. Effective diffusion coefficients ranged between 1×10⁻¹⁰ and 1×10⁻¹² m²/s for neutral species, low-sorbing cations (such as Na and Sr) and anions. Apparent diffusion coefficients for strongly sorbing elements, such as Cs and Co, are in the order of 1×10⁻¹³ m²/s; europium presents the lowest diffusion coefficient (5×10⁻¹⁵ m²/s). The results obtained by the different approaches gave a comprehensive database of diffusion coefficients for RN with different transport behaviour within both clays. (Author) 42 refs.
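
    For an order-of-magnitude reading of such coefficients, the characteristic one-dimensional diffusion length after a time t is L ≈ sqrt(2Dt); a sketch using the coefficients from the record over an arbitrary 10,000-year span:

```python
# Characteristic diffusion lengths L = sqrt(2*D*t) for the coefficients
# reported in the abstract; the 10,000-year span is an arbitrary example.
import math

YEAR = 3.156e7   # seconds
t = 1e4 * YEAR   # 10,000 years

for name, D in [("anions / weakly sorbed (De)", 1e-10),
                ("strongly sorbed, e.g. Cs (Da)", 1e-13),
                ("europium (Da)", 5e-15)]:
    L = math.sqrt(2 * D * t)
    print(f"{name}: ~{L:.2f} m after 10,000 years")
```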

  5. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  6. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, an analysis of the possibilities of small-scale model experiments is also given, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  7. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  8. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
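
    Point (2) in the list above, splitting large sequence files for downstream load balance, can be sketched in a few lines; FASTQ records are four lines each, and the file name is hypothetical:

```python
# Split a large FASTQ file into fixed-size chunks for load balancing.
from itertools import islice

def split_fastq(path, reads_per_chunk=1_000_000):
    with open(path) as fh:
        chunk = 0
        while True:
            lines = list(islice(fh, 4 * reads_per_chunk))  # 4 lines per read
            if not lines:
                break
            with open(f"{path}.part{chunk:03d}", "w") as out:
                out.writelines(lines)
            chunk += 1

# split_fastq("sample.fastq")   # -> sample.fastq.part000, .part001, ...
```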

  9. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems that detects and classifies interesting local and global events and disseminates the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
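
    A minimal sketch of the subscription-based event filtering idea, in which consumers register predicates and only matching events are forwarded, reducing monitoring traffic; event fields and predicates are invented:

```python
# Subscription-based event filtering in miniature.
class EventFilter:
    def __init__(self):
        self.subscriptions = []            # (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        for predicate, callback in self.subscriptions:
            if predicate(event):           # filter close to the event source
                callback(event)

bus = EventFilter()
bus.subscribe(lambda e: e["severity"] >= 3,
              lambda e: print("ALERT:", e))
bus.publish({"node": "n17", "severity": 1})   # filtered out
bus.publish({"node": "n42", "severity": 4})   # forwarded to the subscriber
```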

  10. Geology Forsmark. Site descriptive modelling Forsmark - stage 2.2

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. [Geological Survey of Sweden, Uppsala (Sweden)]; Fox, Aaron; La Pointe, Paul [Golder Associates Inc (United States)]; Simeonov, Assen [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)]; Isaksson, Hans [GeoVista AB, Luleaa (Sweden)]; Hermanson, Jan; Oehman, Johan [Golder Associates AB, Stockholm (Sweden)]

    2007-10-15

    The geological work during stage 2.2 has involved the development of deterministic models for rock domains (RFM) and deformation zones (ZFM), the identification and deterministic modelling of fracture domains (FFM) inside the candidate volume, i.e. the parts of rock domains that are not affected by deformation zones, and the development of statistical models for fractures and minor deformation zones (geological discrete fracture network modelling or geological DFN modelling). The geological DFN model addresses brittle structures at a scale of less than 1 km, which is the lower cut-off in the deterministic modelling of deformation zones. In order to take account of variability in data resolution, deterministic models for rock domains and deformation zones are presented in both regional and local model volumes, while the geological DFN model is valid within specific fracture domains inside the north-western part of the candidate volume, including the target volume. The geological modelling work has evaluated and made use of: A revised bedrock geological map at the ground surface. Geological and geophysical data from 21 cored boreholes and 33 percussion boreholes. Detailed mapping of fractures and rock units along nine excavations or large surface outcrops. Data bearing on the characterisation (including kinematics) of deformation zones. Complementary geochronological and other rock and fracture analytical data. Lineaments identified on the basis of airborne and high-resolution ground magnetic data. A reprocessing of both surface and borehole reflection seismic data. Seismic refraction data. The outputs of the deterministic modelling work are geometric models in RVS format and detailed property tables for rock domains and deformation zones, and a description of fracture domains. The outputs of the geological DFN modelling process are recommended parameters or statistical distributions that describe fracture set orientations, radius sizes, volumetric intensities

  11. Geology Forsmark. Site descriptive modelling Forsmark - stage 2.2

    International Nuclear Information System (INIS)

    Stephens, Michael B.; Fox, Aaron; La Pointe, Paul; Simeonov, Assen; Isaksson, Hans; Hermanson, Jan; Oehman, Johan

    2007-10-01

    The geological work during stage 2.2 has involved the development of deterministic models for rock domains (RFM) and deformation zones (ZFM), the identification and deterministic modelling of fracture domains (FFM) inside the candidate volume, i.e. the parts of rock domains that are not affected by deformation zones, and the development of statistical models for fractures and minor deformation zones (geological discrete fracture network modelling or geological DFN modelling). The geological DFN model addresses brittle structures at a scale of less than 1 km, which is the lower cut-off in the deterministic modelling of deformation zones. In order to take account of variability in data resolution, deterministic models for rock domains and deformation zones are presented in both regional and local model volumes, while the geological DFN model is valid within specific fracture domains inside the north-western part of the candidate volume, including the target volume. The geological modelling work has evaluated and made use of: A revised bedrock geological map at the ground surface. Geological and geophysical data from 21 cored boreholes and 33 percussion boreholes. Detailed mapping of fractures and rock units along nine excavations or large surface outcrops. Data bearing on the characterisation (including kinematics) of deformation zones. Complementary geochronological and other rock and fracture analytical data. Lineaments identified on the basis of airborne and high-resolution ground magnetic data. A reprocessing of both surface and borehole reflection seismic data. Seismic refraction data. The outputs of the deterministic modelling work are geometric models in RVS format and detailed property tables for rock domains and deformation zones, and a description of fracture domains. The outputs of the geological DFN modelling process are recommended parameters or statistical distributions that describe fracture set orientations, radius sizes, volumetric intensities
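
    One standard ingredient of such a geological DFN description is a truncated power-law distribution of fracture radii; a sampling sketch with illustrative parameters (not the Forsmark values):

```python
# Inverse-CDF sampling of a truncated power-law fracture radius
# distribution, a common DFN ingredient. Parameters are illustrative.
import numpy as np

def sample_radii(n, k_r=2.6, r_min=0.5, r_max=564.0, rng=None):
    """Draw n radii from p(r) ~ r**(-k_r) on [r_min, r_max]."""
    rng = rng or np.random.default_rng(0)
    u = rng.random(n)
    a = 1.0 - k_r                      # exponent of the integrated density
    return (r_min**a + u * (r_max**a - r_min**a)) ** (1.0 / a)

radii = sample_radii(10_000)
print(f"median radius: {np.median(radii):.2f} m")
```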

  12. Overview of geology, hydrology, geomorphology, and sediment budget of the Deschutes River Basin, Oregon.

    Science.gov (United States)

    Jim E. O'Connor; Gordon E. Grant; Tana L. Haluska

    2003-01-01

    Within the Deschutes River basin of central Oregon, the geology, hydrology, and physiography influence geomorphic and ecologic processes at a variety of temporal and spatial scales. Hydrologic and physiographic characteristics of the basin are related to the underlying geologic materials. In the southwestern part of the basin, Quaternary volcanism and tectonism have created...

  13. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R and D of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In these R and D activities, we have developed visualization technology suitable for the distributed computing environment. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in these R and D activities, is an MPI library executable on a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  14. Reactive transport at the pore-scale: Geological Labs on Chip studies (GLoCs) for CO2 storage in saline aquifers

    Science.gov (United States)

    Azaroual, M. M.; Lassin, A., Sr.; André, L., Sr.; Devau, N., Sr.; Leroy, P., Sr.

    2017-12-01

    The near-wellbore region of a CO2 injection well in a saline aquifer is the most sensitive part of a targeted carbon storage reservoir. The recent development of microfluidic tools mimicking the porous media of geological reservoirs has made it possible to study physical, physico-chemical and thermodynamic mechanisms. We used GLoCs "Geological Labs on Chip" to study dynamic and reactive transport processes at the pore scale induced by CO2 geological storage. The present work is a first attempt to reproduce, by reactive transport modeling, an experiment of calcium carbonate precipitation during the co-injection of two aqueous solutions in a GLoC device. For that purpose, a new kinetics model, based on transition-state theory and on surface complexation modeling, was developed to describe the co-precipitation of amorphous calcium carbonate (ACC) and calcite. ACC precipitates and creates surface complexation sites from which calcite can nucleate and create new surface complexation sites. When the kinetics of calcite precipitation are fast enough, the consumption of matter leads to the dissolution of ACC. The modeling results were first compared to batch experiments (from the literature) and then applied with success to dynamic experimental observations carried out on a GLoC device (from the literature). In addition, we evaluated the solubility of CO2 in capillary waters, which increases by a factor of 5 to 10 under reservoir conditions (200 bar and 100°C) compared to bulk water. The GLoC tools have begun to provide an excellent and much finer degree of control over processes (reactive transport processes, mixing effects, mineral precipitation and dissolution kinetics, etc.) thanks to in situ analysis and characterization techniques allowing access in real time to relevant properties. Current investigations focus on key parameters influencing the flow dynamics and trapping mechanisms (relative permeability, capillary conditions, kinetics of dissolution and precipitation of minerals).
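
    As a concrete illustration of the TST-form rate law referred to in this abstract, the short Python sketch below evaluates r = k * A * (omega**n - 1) for ACC and calcite simultaneously; with a suitable ion activity product, calcite precipitates while ACC dissolves, mirroring the behaviour described above. The rate constants, surface areas and the ACC solubility product are illustrative assumptions, not values from the cited study.

```python
# Minimal sketch of a transition-state-theory (TST) style rate law;
# all constants below are illustrative, not the study's fitted values.
import numpy as np

def tst_rate(k, area, omega, n=1.0):
    """TST-form rate (mol/s): positive = precipitation, negative = dissolution.

    k     : rate constant (mol/m^2/s)
    area  : reactive surface area (m^2)
    omega : saturation ratio, omega = IAP / Ksp
    n     : empirical reaction order
    """
    return k * area * (omega**n - 1.0)

# Hypothetical conditions: ACC is more soluble than calcite
# (Ksp_ACC > Ksp_calcite), so the same ion activity product can be
# undersaturated for ACC but supersaturated for calcite.
iap = 10**-7.5                         # ion activity product a(Ca2+)*a(CO3^2-)
ksp_acc, ksp_cal = 10**-6.4, 10**-8.48
omega_acc, omega_cal = iap / ksp_acc, iap / ksp_cal

r_acc = tst_rate(k=1e-7, area=1e-3, omega=omega_acc)   # negative: ACC dissolves
r_cal = tst_rate(k=1e-9, area=1e-4, omega=omega_cal)   # positive: calcite grows
print(f"ACC rate: {r_acc:.3e} mol/s, calcite rate: {r_cal:.3e} mol/s")
```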

  15. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping, scale effects

  16. Manufacturing Process Simulation of Large-Scale Cryotanks

    Science.gov (United States)

    Babai, Majid; Phillips, Steven; Griffin, Brian

    2003-01-01

    NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in downselecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the varying approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows for cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank. Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing

  17. Geological Feasibility of Underground Oil Storage in Jintan Salt Mine of China

    Directory of Open Access Journals (Sweden)

    Xilin Shi

    2017-01-01

    Full Text Available A number of large underground oil storage spaces will be constructed in deep salt mines in China in the coming years. According to the general geological survey, the first salt cavern oil storage base of China is planned to be built in Jintan salt mine. In this research, the geological feasibility of the salt mine for oil storage is identified in detail as follows. (1) The characteristics of regional structure, strata sediment, and impermeable layer distribution of Jintan salt mine were evaluated and analyzed. (2) The tightness of cap rock was evaluated in reviews of macroscopic geology and microscopic measuring. (3) According to the geological characteristics of Jintan salt mine, the specific targeted formation for building underground oil storage was chosen, and the sealing of nonsalt interlayers was evaluated. (4) Based on the sonar measuring results of the salt caverns, the characteristics of solution mining salt caverns were analyzed. In addition, the preferred way of underground oil storage construction was determined. (5) Finally, the results of closed well observation in solution mining salt caverns were assessed. The research results indicated that Jintan salt mine has the basic geological conditions for building large-scale underground oil storage.

  18. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    Science.gov (United States)

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  19. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    Science.gov (United States)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be at regional to national levels and cover long time periods. As a result of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive combined with other satellite, climate, and weather data is creating never imagined opportunities for assessing ET model behavior and uncertainty, and ultimately providing the ability for more robust operational monitoring and assessment of water use at field scales.
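
    The core SSEBop idea summarized above can be sketched in a few lines: land-surface temperature is scaled between a cold/wet reference and a predefined hot/dry reference to obtain an ET fraction that multiplies gridded reference ET. The calibration factor, the dT value and all inputs below are placeholder values for illustration, not the operational SSEBop parameterization.

```python
# Sketch of the SSEBop ET-fraction idea; constants are placeholders.
import numpy as np

def ssebop_eta(ts, ta_max, eto, c=0.985, dt=20.0, k=1.0):
    """Actual ET (mm) from land-surface temperature ts (K).

    ta_max : gridded daily maximum air temperature (K)
    eto    : reference ET (mm) from gridded weather data
    c      : calibration factor relating ta_max to the cold reference
    dt     : predefined hot/cold temperature difference (K)
    k      : scaling factor on reference ET
    """
    tc = c * ta_max                    # cold/wet boundary temperature
    etf = (tc + dt - ts) / dt          # ET fraction: 1 at Tc, 0 at Th = Tc + dt
    etf = np.clip(etf, 0.0, 1.05)      # allow slightly >1 for very wet pixels
    return etf * k * eto

ts = np.array([295.0, 305.0, 314.0])   # hypothetical Landsat-derived LST (K)
print(ssebop_eta(ts, ta_max=np.full(3, 300.0), eto=np.full(3, 6.0)))
```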

  20. Process for selecting a site for Canada's deep geological repository for used nuclear fuel

    International Nuclear Information System (INIS)

    Facella, J.; Belfadhel, M.B.

    2011-01-01

    The Nuclear Waste Management Organization (NWMO) is responsible for implementing Adaptive Phased Management, the approach selected by the Government of Canada for long-term management of used nuclear fuel waste generated by Canadian nuclear reactors. The ultimate objective of Adaptive Phased Management is the centralized containment and isolation of Canada's used nuclear fuel in a Deep Geological Repository in a suitable crystalline or sedimentary rock formation at a depth of about 500 m. The repository will consist of a series of access and service shafts and a series of tunnels leading to placement rooms where used fuel will be placed and sealed in competent rock using a multi-barrier system which includes long-lived, specially designed containers, sealing materials such as bentonite, and the rock itself. The used fuel will be monitored throughout all phases of implementation and will also remain retrievable for an extended period of time. In May 2010, the NWMO published the site selection process that serves as the road map to decision-making on the location for the deep geological repository. NWMO initiated the process with a first stage that invites communities to learn more about the project and the site selection process. NWMO is actively building awareness of the project and, on request of communities, is delivering briefings, supporting community capacity building and undertaking high-level screenings of site suitability. The paper provides a brief description of Adaptive Phased Management, including the deep geological repository which is its ultimate goal; the design of the site selection process; and, importantly, the approach to assessing the suitability of sites from both a social and a technical perspective. The paper will outline how NWMO sought to develop a socially acceptable site selection process as a firm foundation for future decisions on siting. Through a two-year collaborative process, NWMO sought to understand the expectations of

  1. Bio-inspired wooden actuators for large scale applications.

    Science.gov (United States)

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

    Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular, the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to limitations in either the mechanical stiffness of the material or the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.

  2. Performance Prediction for Large-Scale Nuclear Waste Repositories: Final Report

    International Nuclear Information System (INIS)

    Glassley, W E; Nitao, J J; Grant, W; Boulos, T N; Gokoffski, M O; Johnson, J W; Kercher, J R; Levatin, J A; Steefel, C I

    2001-01-01

    The goal of this project was development of a software package capable of utilizing terascale computational platforms for solving subsurface flow and transport problems important for disposal of high level nuclear waste materials, as well as for DOE-complex clean-up and stewardship efforts. We sought to develop a tool that would diminish reliance on abstracted models, and realistically represent the coupling between subsurface fluid flow, thermal effects and chemical reactions that both modify the physical framework of the rock materials and change the rock mineralogy and the chemistry of the migrating fluid. Providing such a capability would enhance realism in models and increase confidence in long-term predictions of performance. Achieving this goal also allows more cost-effective design and execution of monitoring programs needed to evaluate model results. This goal was successfully accomplished through the development of a new simulation tool (NUFT-C). This capability allows high resolution modeling of complex coupled thermal-hydrological-geochemical processes in the saturated and unsaturated zones of the Earth's crust. The code allows consideration of a virtually unlimited number of chemical species and minerals in a multi-phase, non-isothermal environment. Because the code is constructed to utilize the computational power of the tera-scale IBM ASCI computers, simulations that encompass large rock volumes and complex chemical systems can now be done without sacrificing spatial or temporal resolution. The code is capable of doing one-, two-, and three-dimensional simulations, allowing unprecedented evaluation of the evolution of rock properties and mineralogical and chemical change as a function of time. The code has been validated by comparing results of simulations to laboratory-scale experiments, other benchmark codes, field scale experiments, and observations in natural systems. The results of these exercises demonstrate that the physics and chemistry

  3. Streaming Parallel GPU Acceleration of Large-Scale filter-based Spiking Neural Networks

    NARCIS (Netherlands)

    L.P. Slazynski (Leszek); S.M. Bohte (Sander)

    2012-01-01

    The arrival of graphics processing (GPU) cards suitable for massively parallel computing promises affordable large-scale neural network simulation previously only available at supercomputing facilities. While the raw numbers suggest that GPUs may outperform CPUs by at least an order of

  4. Geology Laxemar. Site descriptive modelling SDM-Site Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Wahlgren, Carl-Henric (Geological Survey of Sweden, Uppsala (Sweden)); Curtis, Philip; Hermanson, Jan; Forssberg, Ola; Oehman, Johan (Golder Associates AB (Sweden)); Fox, Aaron; La Pointe, Paul (Golder Associates Inc (United States)); Drake, Henrik (Dept. of Earth Sciences, Univ. of Goeteborg, Goeteborg (Sweden)); Triumf, Carl-Axel; Mattsson, Haakan; Thunehed, Hans (GeoVista AB, Luleaa (Sweden)); Juhlin, Christopher (Dept. of Earth Sciences, Uppsala Univ., Uppsala (Sweden))

    2008-11-15

    The geological work during the SDM Site Laxemar modelling stage has involved the continued development of deterministic models for rock domains (RSM) and deformation zones (ZSM), the identification and deterministic modelling of fracture domains (FSM), and the development of statistical models for fractures and minor deformation zones (geological discrete fracture network (DFN) modelling). The geological DFN model addresses fractures/structures with a size of less than 1 km, which is the lower cut-off of structures included in the deterministic modelling of deformation zones. In order to take account of variability in data resolution, deterministic models for rock domains and deformation zones are presented in both regional and local scale model volumes, while the geological DFN model is valid only within specific fracture domains inside the Laxemar local model volume. The geological and geophysical data that constitute the basis for the SDM-Site Laxemar modelling work comprise all data that have been acquired from Laxemar, i.e. all data that were available at the data freeze for SDM-Site Laxemar on August 31, 2007. Selected quality controlled data from the complementary cored borehole KLX27A have also been utilised in the modelling work. Data from the following investigations were acquired during the complete site investigation between the data freezes for Laxemar 1.2 and SDM-Site Laxemar as defined above: A revised bedrock geological map at the ground surface. Geological and geophysical data from 40 new cored boreholes and 14 percussion boreholes. Sampling and subsequent modal and geochemical analytical work of bedrock samples taken in connection with excavations in southern Laxemar. Detailed mapping of fractures and rock units along 10 trench excavations and 2 large surface exposures (drill sites for KLX09 and KLX11A/KLX20A). Special studies involving more detailed characterisation of deformation zones identified in the geological single-hole interpretation

  5. Thermal activation of dislocations in large scale obstacle bypass

    Science.gov (United States)

    Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique

    2017-08-01

    Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy in these interactions is possible with a framework provided by harmonic transition state theory (HTST), enabling direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large-scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects plays an important role in the activation energies for bypass. A phenomenological model for the stress dependence of the activation energy is shown to describe well the effect of a distribution of activation energies, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.
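
    A minimal sketch of the HTST/Arrhenius picture used here follows, with a Kocks-type phenomenological form standing in for the paper's fitted stress-dependent activation energy model; the barrier height, attempt frequency and threshold stress are hypothetical values chosen only for illustration.

```python
# Thermally activated bypass rate: nu0 * exp(-dE(stress) / kB*T), with a
# Kocks-type barrier dE(s) = dE0 * (1 - (s/s_hat)^p)^q. Parameters are
# illustrative, not fitted to the cited simulations.
import numpy as np

KB_EV = 8.617e-5  # Boltzmann constant (eV/K)

def activation_energy(stress, de0=1.5, s_hat=500e6, p=0.5, q=1.5):
    """Stress-dependent activation energy (eV); stress in Pa."""
    x = np.clip(stress / s_hat, 0.0, 1.0)   # barrier vanishes at s_hat
    return de0 * (1.0 - x**p) ** q

def bypass_rate(stress, temperature, nu0=1e11):
    """Arrhenius rate (1/s) for dislocation-obstacle bypass."""
    return nu0 * np.exp(-activation_energy(stress) / (KB_EV * temperature))

for t in (300.0, 600.0):
    print(t, "K:", bypass_rate(np.array([100e6, 300e6]), t))
```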

  6. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  7. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with recording on a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  8. The large scale in-situ PRACLAY heater and seal tests in URL HADES, Mol, Belgium

    Energy Technology Data Exchange (ETDEWEB)

    Xiangling Li; Guangjing Chen; Verstricht, Jan; Van Marcke, Philippe; Troullinos, Ioannis [ESV EURIDICE, Mol (Belgium)

    2013-07-01

    In Belgium, the URL HADES was constructed in the Boom Clay formation at the Mol site to investigate the feasibility of geological disposal in a clay formation. Since 1995, the URL R and D programme has focused on large scale demonstration tests like the PRACLAY Heater and Seal tests. The main objective of the Heater Test is to demonstrate that the thermal load generated by the heat-emitting waste will not jeopardise the safety functions of the host rock. The primary objective of the Seal Test is to provide suitable hydraulic boundary conditions for the Heater Test. The Seal Test also provides an opportunity to investigate the in-situ behaviour of a bentonite-based EBS. The PRACLAY gallery was constructed in 2007 and the hydraulic seal was installed in 2010. The bentonite is hydrated both naturally and artificially. The swelling, total pressure and pore pressure of the bentonite are continuously measured and analysed by numerical simulations to get a better understanding of the hydration process. The timing of switching on the heater depends on the progress of the bentonite hydration, as sufficient swelling of the seal is needed for it to fulfil its role. A set of conditions to be met for the heater switch-on, and its schedule, will be given. (authors)

  9. The method of arbitrarily large moments to calculate single scale processes in quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Schneider, Carsten [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation (RISC)

    2017-01-15

    We devise a new method to calculate a large number of Mellin moments of single-scale quantities using the systems of differential and/or difference equations obtained by integration-by-parts identities between the corresponding Feynman integrals of loop corrections to physical quantities. These scalar quantities have a much simpler mathematical structure than the complete quantity. A sufficiently large set of moments may even allow the analytic reconstruction of the whole quantity considered; this holds in the case of first-order factorizing systems. In any case, one may derive highly precise numerical representations using this method, which is otherwise completely analytic.
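
    For readers unfamiliar with the objects involved, the toy sympy example below shows what a Mellin moment is: the N-th moment M[f](N) = Integral_0^1 x^(N-1) f(x) dx of a (here trivially simple) function, evaluated at many integer N and reconstructed in closed form. The paper's actual method obtains such moments from difference equations derived via integration-by-parts identities, which is not reproduced here; the integrand is a made-up stand-in.

```python
# Toy illustration of Mellin moments of a "single-scale" quantity.
import sympy as sp

x, N = sp.symbols('x N', positive=True)
f = x * (1 - x) ** 2                     # stand-in for a far more complex quantity

# Closed form in N (what an analytic reconstruction would recover):
closed = sp.simplify(sp.integrate(sp.expand(x**(N - 1) * f), (x, 0, 1)))
print(closed)                            # simplifies to 2/((N+1)*(N+2)*(N+3))

# A list of integer moments, the kind of raw data the method generates:
moments = [sp.integrate(x**(n - 1) * f, (x, 0, 1)) for n in range(1, 9)]
print(moments)                           # [1/12, 1/30, 1/60, ...]
```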

  10. HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.

    Science.gov (United States)

    Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J

    2016-06-03

    Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets is overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps, such as assay normalization and grouping, and experimental replicate quality control and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and also the direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome mapping data set and demonstrate a 200-fold improvement in the execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. Download HiQuant, sample data sets, and supporting documentation at http://hiquant.primesdb.eu.
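
    The postquantification steps named above (assay normalization, replicate grouping and QC, statistical analysis) can be illustrated with a generic pandas sketch. This is not HiQuant's interface or file format; the data are synthetic and the column names, thresholds and tests are assumptions made for illustration.

```python
# Generic postquantification workflow sketch: normalize, QC, test.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical intensities: 100 proteins x 3 control + 3 treatment runs.
data = pd.DataFrame(
    rng.lognormal(10, 1, (100, 6)),
    columns=["ctrl_1", "ctrl_2", "ctrl_3", "trt_1", "trt_2", "trt_3"],
)

# 1. Assay normalization: equalize per-run medians in log2 space.
logd = np.log2(data)
logd = logd - logd.median() + logd.median().mean()

# 2. Replicate QC: flag proteins with high within-group variability.
cv_ok = (logd[["ctrl_1", "ctrl_2", "ctrl_3"]].std(axis=1) < 1.0) & \
        (logd[["trt_1", "trt_2", "trt_3"]].std(axis=1) < 1.0)

# 3. Statistics: per-protein t-test between treatment and control.
t, p = stats.ttest_ind(logd.loc[cv_ok, ["trt_1", "trt_2", "trt_3"]],
                       logd.loc[cv_ok, ["ctrl_1", "ctrl_2", "ctrl_3"]],
                       axis=1)
print(f"{cv_ok.sum()} proteins pass QC; {(p < 0.05).sum()} nominally significant")
```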

  11. Safeguarding aspects of large-scale commercial reprocessing plants

    International Nuclear Information System (INIS)

    1979-03-01

    The paper points out that several solutions to the problems of safeguarding large-scale plants have been put forward: (1) Increased measurement accuracy. This does not remove the problem of timely detection. (2) Continuous in-process measurement. As yet unproven and likely to be costly. (3) More extensive use of containment and surveillance. The latter appears to be feasible but requires the incorporation of safeguards into plant design and sufficient redundancy to protect the operator's interests. The advantages of shifting the emphasis of safeguards philosophy from quantitative goals to the analysis of diversion strategies should be considered.

  12. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it has been widely recognized that keeping these established geospatial databases up to date is critical for their value. So, more and more efforts have been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field survey materials, and indirectly updating with other updated data, such as newly updated data of a larger scale. The former method is fundamental, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases of different scales are produced and maintained separately by organizations at different levels, such as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from large-scale data in a collaborative updating environment for SDI. The requirements for the application of map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical
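
    One elementary building block of the digital map generalization discussed above is line simplification; the shapely sketch below generalizes a detailed centerline for a smaller-scale database using the Douglas-Peucker algorithm. Real generalization workflows also involve selection, aggregation and displacement; the geometry and tolerance here are hypothetical.

```python
# Line simplification (Douglas-Peucker) as a map generalization step.
from shapely.geometry import LineString

# A large-scale (detailed) road centerline; coordinates in meters.
detailed = LineString([(0, 0), (5, 1), (10, -1), (15, 0.5), (20, 0), (25, 8)])

# Generalize for a smaller-scale database: drop vertices that deviate
# less than 2 m from the simplified shape.
generalized = detailed.simplify(tolerance=2.0, preserve_topology=True)

print(len(detailed.coords), "->", len(generalized.coords), "vertices")
print(list(generalized.coords))
```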

  13. Volcanic Processes and Geology of Augustine Volcano, Alaska

    Science.gov (United States)

    Waitt, Richard B.; Beget, James E.

    2009-01-01

    Augustine Island (volcano) in lower Cook Inlet, Alaska, has erupted repeatedly in late-Holocene and historical times. Eruptions typically beget high-energy volcanic processes. Most notable are bouldery debris avalanches containing immense angular clasts shed from summit domes. Coarse deposits of these avalanches form much of Augustine's lower flanks. A new geologic map at 1:25,000 scale depicts these deposits and processes. We correlate deposits by tephra layers calibrated by many radiocarbon dates. Augustine Volcano began erupting on the flank of a small island of Jurassic clastic-sedimentary rock before the late Wisconsin glaciation (late Pleistocene). The oldest known effusions ranged from olivine basalt explosively propelled by steam, to highly explosive magmatic eruptions of dacite or rhyodacite shed as pumice flows. Late Wisconsin piedmont glaciers issuing from the mountainous western mainland surrounded the island while dacitic eruptive debris swept down the south volcano flank. Evidence is scant for eruptions between the late Wisconsin and about 2,200 yr B.P. On a few south-flank inliers, thick stratigraphically low pumiceous pyroclastic-flow and fall deposits probably represent this period from which we have no radiocarbon dates on Augustine Island. Eruptions between about 5,350 and 2,200 yr B.P. are known with certainty from distal tephras. On Shuyak Island 100 km southeast of Augustine, two distal fall ashes of Augustinian chemical provenance (microprobe analysis of glass) date respectively between about 5,330 and 5,020 yr B.P. and between about 3,620 and 3,360 yr B.P. An Augustine ash along Kamishak Creek 70 km southwest of Augustine dates between about 3,850 and 3,660 yr B.P. A probably Augustinian ash lying within peat near Homer dates to about 2,275 yr B.P. From before 2,200 yr B.P. to the present, Augustine eruptive products abundantly mantle the island. During this period, numerous coarse debris avalanches swept beyond Augustine's coast, most

  14. Geological setting control of flood dynamics in lowland rivers (Poland).

    Science.gov (United States)

    Wierzbicki, Grzegorz; Ostrowski, Piotr; Falkowski, Tomasz; Mazgajski, Michał

    2018-04-27

    We aim to answer the question: how does the geological setting affect flood dynamics in lowland alluvial rivers? The study area covers three river reaches that are untrained, relatively large on the European scale, and flow in broad valleys cut in the landscape of old glacial plains. We focus on the locations where levees (either natural or artificial) were breached during floods. In these locations we identify (1) the erosional traces of floods (crevasse channels) on the floodplain, displayed on a DEM derived from ALS LIDAR. In the main river channel, we perform drillings in order to measure the depth of the suballuvial surface and to locate (2) protrusions of bedrock resistant to erosion. We juxtapose on one map (1) the floodplain geomorphology with (2) the geological data from the river channel. The results from each of the three study reaches are presented on maps prepared in the same manner in order to enable a comparison of the regularities of fluvial processes written in (1) the landscape and driven by (2) the geological setting. These processes act in different river reaches: (a) not embanked and dominated by ice-jam floods, (b) embanked and dominated by rainfall and ice-jam floods. We also analyse hydrological data to present hydrodynamic descriptions of the floods. Our principal results indicate similarity of (1) distinctive erosional patterns and (2) specific geological features in all three study reaches. We draw the conclusion that protrusions of suballuvial bedrock control the flood dynamics in alluvial rivers. This happens in both types of rivers. In areas where the floodplain remains natural, the river inundates freely during every flood. In other areas the floodplain has been reclaimed by humans, who constructed an artificial levee system which protects the flood-prone area from inundation until a levee breach occurs.

  15. A parallel FE-FV scheme to solve fluid flow in complex geologic media

    NARCIS (Netherlands)

    Coumou, Dim; Matthäi, Stephan; Geiger, Sebastian; Driesner, Thomas

    2008-01-01

    Field data-based simulations of geologic systems require much computational time because of their mathematical complexity and the often desired large scales in space and time. To conduct accurate simulations in an acceptable time period, methods to reduce runtime are required. A parallelization

  16. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
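
    A minimal numerical sketch of the idea follows, assuming (as the homogenization result for ecological diffusion indicates) that the large-scale coefficient is, to leading order, the harmonic average of the small-scale motility over each large cell; the habitat motility values and grid sizes are made up for illustration.

```python
# Small-scale habitat-dependent motility mu(x) averaged into a
# large-scale coefficient via the harmonic mean over each block.
import numpy as np

rng = np.random.default_rng(1)

# 10 m cells classified into habitat types with different motilities.
motility = {0: 100.0, 1: 10.0, 2: 1.0}            # m^2/day per habitat type
cells = rng.choice([0, 1, 2], size=(1000, 1000), p=[0.5, 0.3, 0.2])
mu = np.vectorize(motility.get)(cells)

# Harmonic mean over 100x100-cell (1 km) blocks -> homogenized coefficient.
blocks = mu.reshape(10, 100, 10, 100)
mu_bar = 1.0 / np.mean(1.0 / blocks, axis=(1, 3))

print("arithmetic mean:", mu.mean())               # ~53: overestimates spread
print("homogenized (harmonic) means:\n", mu_bar.round(2))   # ~4: slow habitat dominates
```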

  17. Safeguarding of large scale reprocessing and MOX plants

    International Nuclear Information System (INIS)

    Howsley, R.; Burrows, B.; Longevialle, H. de; Kuroi, H.; Izumi, A.

    1997-01-01

    In May 1997, the IAEA Board of Governors approved the final measures of the '93+2' safeguards strengthening programme, thus improving the international non-proliferation regime by enhancing the effectiveness and efficiency of safeguards verification. These enhancements are not, however, a revolution in current practices, but rather an important step in the continuous evolution of the safeguards system. The principles embodied in 93+2, for broader access to information and increased physical access, already apply, in a pragmatic way, to large-scale reprocessing and MOX fabrication plants. In these plants, qualitative measures and process monitoring play an important role, in addition to accountancy and material balance evaluations, in attaining the safeguards goals. This paper will reflect on the safeguards approaches adopted for these large bulk-handling facilities and draw analogies, conclusions and lessons for the forthcoming implementation of the 93+2 Programme. (author)

  18. Large-scale demonstration of waste solidification in saltstone

    International Nuclear Information System (INIS)

    McIntyre, P.F.; Oblath, S.B.; Wilhite, E.L.

    1988-05-01

    The saltstone lysimeters are a large scale demonstration of a disposal concept for decontaminated salt solution resulting from in-tank processing of defense waste. The lysimeter experiment has provided data on the leaching behavior of large saltstone monoliths under realistic field conditions. The results also will be used to compare the effect of capping the wasteform on contaminant release. Biweekly monitoring of sump leachate from three lysimeters has continued on a routine basis for approximately 3 years. An uncapped lysimeter has shown the highest levels of nitrate and 99Tc release. Gravel and clay capped lysimeters have shown levels equivalent to or slightly higher than background rainwater levels. Mathematical model predictions have been compared to lysimeter results. The models will be applied to predict the impact of saltstone disposal on groundwater quality. 9 refs., 5 figs., 3 tabs

  19. Dynamical links between small- and large-scale mantle heterogeneity: Seismological evidence

    Science.gov (United States)

    Frost, Daniel A.; Garnero, Edward J.; Rost, Sebastian

    2018-01-01

    We identify PKP•PKP scattered waves (also known as P′•P′) from earthquakes recorded at small-aperture seismic arrays at distances less than 65°. P′•P′ energy travels as a PKP wave through the core, up into the mantle, then scatters back down through the core to the receiver as a second PKP. P′•P′ waves are unique in that they allow scattering heterogeneities throughout the mantle to be imaged. We use array-processing methods to amplify low-amplitude, coherent scattered energy signals and resolve their incoming direction. We deterministically map scattering heterogeneity locations from the core-mantle boundary to the surface. We use an extensive dataset with sensitivity to a large volume of the mantle and a location method allowing us to resolve and map more heterogeneities than have previously been possible, representing a significant increase in our understanding of small-scale structure within the mantle. Our results demonstrate that the distribution of scattering heterogeneities varies both radially and laterally. Scattering is most abundant in the uppermost and lowermost mantle, and a minimum in the mid-mantle, resembling the radial distribution of tomographically derived whole-mantle velocity heterogeneity. We investigate the spatial correlation of scattering heterogeneities with large-scale tomographic velocities, lateral velocity gradients, the locations of deep-seated hotspots and subducted slabs. In the lowermost 1500 km of the mantle, small-scale heterogeneities correlate with regions of low seismic velocity, high lateral seismic gradient, and proximity to hotspots. In the upper 1000 km of the mantle there is no significant correlation between scattering heterogeneity location and subducted slabs. Between 600 and 900 km depth, scattering heterogeneities are more common in the regions most remote from slabs, and close to hotspots. Scattering heterogeneities show an affinity for regions close to slabs within the upper 200 km of the

  20. FEBEX project: full-scale engineered barriers experiment for a deep geological repository for high level radioactive waste in crystalline host rock. Final report

    International Nuclear Information System (INIS)

    Alberdi, J.; Barcala, J. M.; Campos, R.; Cuevas, A. M.; Fernandez, E.

    2000-01-01

    FEBEX has the multiple objectives of demonstrating the feasibility of manufacturing, handling and constructing the engineered barriers and of developing codes for the thermo-hydro-mechanical and thermo-hydro-geochemical performance assessment of a deep geological repository for high level radioactive wastes. These objectives require integrated theoretical and experimental development work. The experimental work consists of three parts: an in situ test, a mock-up test and a series of laboratory tests. The experiment is based on the Spanish reference concept for crystalline rock, in which the waste capsules are placed horizontally in drifts surrounded by high density compacted bentonite blocks. In the two large-scale tests, the thermal effects of the wastes were simulated by means of heaters; hydration was natural in the in situ test and controlled in the mock-up test. The large-scale tests, with their monitoring systems, have been in operation for more than two years. The demonstration has been achieved in the in situ test and there are great expectations that numerical models sufficiently validated for the near-field performance assessment will be achieved. (Author)

  1. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to its theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  2. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  3. Applications of Data Assimilation to Analysis of the Ocean on Large Scales

    Science.gov (United States)

    Miller, Robert N.; Busalacchi, Antonio J.; Hackert, Eric C.

    1997-01-01

    It is commonplace to begin talks on this topic by noting that oceanographic data are too scarce and sparse to provide complete initial and boundary conditions for large-scale ocean models. Even considering the availability of remotely-sensed data such as radar altimetry from the TOPEX and ERS-1 satellites, a glance at a map of available subsurface data should convince most observers that this is still the case. Data are still too sparse for comprehensive treatment of interannual to interdecadal climate change through the use of models, since the new data sets have not been around for very long. In view of the dearth of data, we must note that the overall picture is changing rapidly. Recently, there have been a number of large scale ocean analysis and prediction efforts, some of which now run on an operational or at least quasi-operational basis, most notably the model based analyses of the tropical oceans. These programs are modeled on numerical weather prediction. Aside from the success of the global tide models, assimilation of data in the tropics, in support of prediction and analysis of seasonal to interannual climate change, is probably the area of large scale ocean modeling and data assimilation in which the most progress has been made. Climate change is a problem which is particularly suited to advanced data assimilation methods. Linear models are useful, and the linear theory can be exploited. For the most part, the data are sufficiently sparse that implementation of advanced methods is worthwhile. As an example of a large scale data assimilation experiment with a recent extensive data set, we present results of a tropical ocean experiment in which the Kalman filter was used to assimilate three years of altimetric data from Geosat into a coarsely resolved linearized long wave shallow water model. Since nonlinear processes dominate the local dynamic signal outside the tropics, subsurface dynamical quantities cannot be reliably inferred from surface height
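
    The tropical assimilation experiment described above rests on the standard Kalman filter forecast/analysis cycle, sketched below for a toy linear system standing in for the linearized long-wave shallow water model; all matrices, noise levels and the "altimeter" sampling pattern are illustrative assumptions, not the experiment's actual configuration.

```python
# Kalman filter cycle: forecast with a linear model, then update the
# state estimate with sparse noisy "sea-surface height" observations.
import numpy as np

rng = np.random.default_rng(2)
n, m = 50, 5                               # 50 grid points, 5 altimeter tracks
F = np.eye(n) + 0.1 * (np.eye(n, k=1) - np.eye(n, k=-1))  # toy linear dynamics
H = np.zeros((m, n)); H[np.arange(m), np.arange(m) * 10] = 1.0  # sparse sampling
Q, R = 0.01 * np.eye(n), 0.1 * np.eye(m)   # model / observation error covariances

x, P = np.zeros(n), np.eye(n)              # initial state estimate and covariance
truth = np.sin(np.linspace(0, 2 * np.pi, n))

for step in range(50):
    truth = F @ truth                      # the "real ocean" evolves
    x, P = F @ x, F @ P @ F.T + Q          # forecast step
    y = H @ truth + rng.normal(0.0, 0.3, m)       # noisy observations
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    x = x + K @ (y - H @ x)                # analysis (update) step
    P = (np.eye(n) - K @ H) @ P

print("RMS error vs truth:", np.sqrt(np.mean((x - truth) ** 2)))
```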

  4. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular products queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large scale deployment in commercial search engines.
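
    The core of VisualRank is ordinary PageRank power iteration run on an image similarity graph rather than a hyperlink graph. A minimal sketch follows, with a random symmetric matrix standing in for real image-feature similarities; the graph size and damping factor are the usual defaults, not values from the paper.

```python
# PageRank power iteration over a visual-similarity graph: images that
# are similar to many other well-connected images rank as "authorities".
import numpy as np

rng = np.random.default_rng(3)
n = 6
S = rng.random((n, n)); S = (S + S.T) / 2   # symmetric similarities
np.fill_diagonal(S, 0.0)                    # no self-similarity

# Column-normalize similarities into a stochastic transition matrix.
P = S / S.sum(axis=0, keepdims=True)

d, rank = 0.85, np.full(n, 1.0 / n)         # damping factor, uniform start
for _ in range(100):
    rank = d * P @ rank + (1 - d) / n       # PageRank iteration

print(np.argsort(rank)[::-1])               # images ordered by visual authority
```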

  5. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy by evaluating each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
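
    The first step described above, learning latent categories with latent semantic analysis, can be sketched as a truncated SVD over an image-by-visual-word count matrix. The data, dimensions and the omitted category-selection step below are illustrative assumptions, not the paper's pipeline.

```python
# Latent semantic analysis via truncated SVD: some latent dimensions
# end up capturing recurring context such as backgrounds ("sky").
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(4)
counts = rng.poisson(2.0, size=(200, 500))   # 200 images x 500 visual words

lsa = TruncatedSVD(n_components=10, random_state=0)
latent = lsa.fit_transform(counts)           # image coordinates in 10 latent categories

# A category-selection step would then score each latent dimension for
# discrimination against image-level class labels (not shown here).
print(latent.shape, lsa.explained_variance_ratio_.round(3))
```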

  6. Parallel Quasi Newton Algorithms for Large Scale Non Linear Unconstrained Optimization

    International Nuclear Information System (INIS)

    Rahman, M. A.; Basarudin, T.

    1997-01-01

    This paper discusses the Quasi-Newton (QN) method for solving non-linear unconstrained minimization problems. One important requirement of the QN method is the choice of the matrix Hk, which must be positive definite and satisfy the QN condition. Our interest here is in parallel QN methods suited to the solution of large-scale optimization problems. QN methods become less attractive for large-scale problems because of their storage and computational requirements. However, it is often the case that the Hessian is a sparse matrix. In this paper we include a mechanism for reducing the Hessian update and preserving the Hessian properties. One major motivation of our research is that a QN method may be good at solving certain types of minimization problems, but its efficiency degenerates when it is applied to other categories of problems. For this reason, we use an algorithm containing several direction strategies which are processed in parallel. We attempt to parallelize the algorithm by exploring different search directions generated by various QN updates during the minimization process. Different line search strategies are employed simultaneously in the process of locating the minimum along each direction. The code for the algorithm is written in the Occam 2 language, which runs on transputer machines.
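
    The serial quasi-Newton machinery the paper builds on can be illustrated with the standard BFGS inverse-Hessian update on a small quadratic test problem; the parallel exploration of several update and line-search strategies on transputers is not reproduced in this sketch, and the test problem is a made-up example.

```python
# BFGS inverse-Hessian update; H stays positive definite while the
# curvature condition s.y > 0 holds.
import numpy as np

A = np.diag([1.0, 10.0, 100.0])          # simple ill-conditioned quadratic

def bfgs_update(H, s, y):
    """BFGS inverse-Hessian update from step s and gradient change y."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    return (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
           + rho * np.outer(s, s)

x, H = np.array([1.0, 1.0, 1.0]), np.eye(3)
for k in range(10):
    g = A @ x                            # gradient of f(x) = x.T A x / 2
    p = -H @ g                           # quasi-Newton search direction
    alpha = -(g @ p) / (p @ (A @ p))     # exact line search (quadratic case)
    s, x = alpha * p, x + alpha * p
    y = A @ x - g                        # change in gradient
    if y @ s > 1e-12:                    # keep H positive definite
        H = bfgs_update(H, s, y)

print("minimizer ~ 0:", x.round(8))
```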

  7. Large scale metal-free synthesis of graphene on sapphire and transfer-free device fabrication.

    Science.gov (United States)

    Song, Hyun Jae; Son, Minhyeok; Park, Chibeom; Lim, Hyunseob; Levendorf, Mark P; Tsen, Adam W; Park, Jiwoong; Choi, Hee Cheul

    2012-05-21

    Metal catalyst-free growth of large scale single layer graphene film on a sapphire substrate by a chemical vapor deposition (CVD) process at 950 °C is demonstrated. A top-gated graphene field effect transistor (FET) device is successfully fabricated without any transfer process. The detailed growth process is investigated by atomic force microscopy (AFM) studies.

  8. Abnormal binding and disruption in large scale networks involved in human partial seizures

    Directory of Open Access Journals (Sweden)

    Bartolomei Fabrice

    2013-12-01

    Full Text Available There is a marked increase in the number of electrophysiological and neuroimaging studies dealing with large-scale brain connectivity in the epileptic brain. Our view of the epileptogenic process in the brain has evolved considerably over the last twenty years, from the historical concept of an "epileptic focus" to a more complex description of "epileptogenic networks" involved in the genesis and "propagation" of epileptic activities. In particular, a large number of studies have been dedicated to the analysis of intracerebral EEG signals to characterize the dynamics of interactions between brain areas during temporal lobe seizures. These studies have reported that large-scale functional connectivity is dramatically altered during seizures, particularly during temporal lobe seizure genesis and development. Dramatic changes in neural synchrony provoked by epileptic rhythms are also responsible for the production of ictal symptoms or changes in the patient's behaviour, such as automatisms, emotional changes or alteration of consciousness. Besides these studies dedicated to seizures, large-scale network connectivity during the interictal state has also been investigated, not only to define biomarkers of epileptogenicity but also to better understand the cognitive impairments observed between seizures.

  9. Report on geologic remote sensing of the Columbia Plateau

    International Nuclear Information System (INIS)

    Sandness, G.A.; Kimball, C.S.; Schmierer, K.E.; Lindberg, J.W.

    1982-05-01

    The purpose of this remote sensing study is to identify faults or other geologic features which may have a significant bearing on the structural and tectonic character of the Hanford Site and the surrounding region. Landsat imagery, Skylab photographs, and U-2 photographs were analyzed to identify and map geologic photolineaments in the Columbia Plateau. The Landsat and Skylab imagery provided a regional perspective and allowed the identification of large-scale linear features. The U-2 photography provided much greater spatial resolution as well as a stereoscopic viewing capability. This allowed identification of smaller structural or geologic features and the identification of many cultural and nongeologic lineaments detected in the Landsat and Skylab imagery. The area studied totals approximately 85,000 square miles and encompasses virtually all exposures of Columbia River Basalt in the states of Washington, Oregon, and Idaho. It also includes an area bordering the Columbia River Basalt outcrop. This border area was studied in order to identify significant structures that may extend into the plateau. Included are a description of the procedures used for image analysis, 20 lineament maps at a scale of 1:250,000, geological summaries for the areas covered by the lineament maps, and discussions of many of the lineaments shown on the maps. Comparisons of the lineament maps with available geologic maps showed that the number of detected lineaments was much greater than the number of known faults and other linear features. Approximately 70% of the faults shown on the geologic maps were detected and are characterized as lineaments. Lineament trends in the northwest-southeast and northeast-southwest directions were found to predominate throughout the study area

  10. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Cleary, Joseph

    2018-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an array of four telescopes designed to measure the polarization of the Cosmic Microwave Background. CLASS aims to detect the B-mode polarization from primordial gravitational waves predicted by cosmic inflation theory, as well as the imprint left by reionization upon the CMB E-mode polarization. This will be achieved through a combination of observing strategy and state-of-the-art instrumentation. CLASS is observing 70% of the sky to characterize the CMB at large angular scales, which will measure the entire CMB power spectrum from the reionization peak to the recombination peak. The four telescopes operate at frequencies of 38, 93, 145, and 217 GHz, in order to estimate Galactic synchrotron and dust foregrounds while avoiding atmospheric absorption. CLASS employs rapid polarization modulation to overcome atmospheric and instrumental noise. Polarization sensitive cryogenic detectors with low noise levels provide CLASS the sensitivity required to constrain the tensor-to-scalar ratio down to levels of r ~ 0.01 while also measuring the optical depth to reionization to sample-variance levels. These improved constraints on the optical depth to reionization are required to pin down the mass of neutrinos from complementary cosmological data. CLASS has completed a year of observations at 38 GHz and is in the process of deploying the rest of the telescope array. This poster provides an overview and update on the CLASS science, hardware and survey operations.

  11. A Novel Architecture for Large-Scale Communication in the IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described the large-scale communication architecture of the IoT in detail. In fact, the non-uniform technologies between IPv6 and access points have left large-scale communication architectures without broadly accepted principles. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  12. Geologic Map of MTM 35337, 40337, and 45337 Quadrangles, Deuteronilus Mensae Region of Mars

    Science.gov (United States)

    Chuang, Frank C.; Crown, David A.

    2009-01-01

    present time. Several scenarios for its formation, including single and multiple large impact events, have been proposed and debated in the literature. Endogenic processes, whereby the crust is thinned by internal mantle convection and tectonic processes, have also been proposed. Planetary accretion models and isotopic data from Martian meteorites suggest that the crust formed very early in Martian history. Using populations of quasi-circular depressions extracted from the topography of Mars, other studies suggest that the age difference between the highlands and lowlands could be ~100 m.y. Furthermore, understanding the origin and age of the dichotomy boundary has been further complicated by significant erosion and deposition that have modified the boundary and its adjacent regions. The resulting diversity of terrains and features is likely a combined result of both ancient and recent events. Detailed geologic analyses of dichotomy boundary zones are important for understanding the spatial and temporal variations in highland evolution. This information, and comparisons to other highland regions, can help elucidate the scale of potential environmental changes. Previous geomorphic and geologic mapping investigations of the Deuteronilus Mensae region have been completed at local to global scales. The regional geology was first mapped by Lucchitta (1978) at 1:5,000,000 scale using Mariner 9 data. That study concluded that a high crater flux early in Martian history formed overlapping craters and basins that were later filled by voluminous lava flows, burying the impacted surface and creating the highlands. After this period of heavy bombardment, fluvial erosion of the highlands formed the canyons and valleys, followed by dissection that created the small mesas and buttes and, later, formation of the steep escarpment marking the present-day northern highland margin. After valley dissection, mass wasting and eolian processes caused lateral retreat of the mesas and buttes.

  13. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, often spread across multiple locations, working on large and complex software tasks. This means that no individual team member, and no single team, holds all the knowledge about the software being developed, so teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  14. Numerical Modeling of Large-Scale Rocky Coastline Evolution

    Science.gov (United States)

    Limber, P.; Murray, A. B.; Littlewood, R.; Valvo, L.

    2008-12-01

    Seventy-five percent of the world's ocean coastline is rocky. On large scales (i.e., greater than a kilometer), many intertwined processes drive rocky coastline evolution, including coastal erosion and sediment transport, tectonics, antecedent topography, and variations in sea-cliff lithology. In areas such as California, an additional aspect of rocky coastline evolution involves submarine canyons that cut across the continental shelf and extend into the nearshore zone. These canyons intercept alongshore sediment transport and flush sand to abyssal depths during periodic turbidity currents, thereby delineating coastal sediment transport pathways and affecting shoreline evolution over large spatial and temporal scales. How tectonic, sediment transport, and canyon processes interact with inherited topographic and lithologic settings to shape rocky coastlines remains an unanswered, and largely unexplored, question. We will present numerical model results of rocky coastline evolution that start with an immature fractal coastline. The initial shape is modified by headland erosion, wave-driven alongshore sediment transport, and submarine canyon placement. Our previous model results have shown that, as expected, an initially sediment-free, irregularly shaped rocky coastline with homogeneous lithology undergoes smoothing in response to wave attack; headlands erode and mobile sediment is swept into bays, forming isolated pocket beaches. As this diffusive process continues, pocket beaches coalesce and a continuous sediment transport pathway results. However, when a randomly placed submarine canyon is introduced to the system as a sediment sink, the end results are wholly different: sediment cover is reduced, which in turn increases weathering and erosion rates and causes the entire shoreline to move landward more rapidly. The canyon's alongshore position also affects coastline morphology. When placed offshore of a headland, the submarine canyon captures local sediment.
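    To make the diffusive-smoothing-plus-sink mechanism described in this abstract concrete, the sketch below evolves a 1-D alongshore sediment-cover profile with explicit diffusion and a single canyon cell that removes sand. The grid size, diffusivity, canyon position, and capture rate are all hypothetical placeholders; the authors' actual model is far richer, so this is only a minimal qualitative illustration, not their implementation.

```python
import numpy as np

n, dt, D = 200, 0.2, 1.0           # cells, time step, transport diffusivity (illustrative)
rng = np.random.default_rng(1)
beach = rng.uniform(0.0, 2.0, n)   # irregular initial beach-sediment cover
canyon, sink = 120, 0.5            # hypothetical canyon cell and capture rate

for _ in range(5000):
    # Wave-driven alongshore transport smooths gradients in sediment cover:
    # headlands shed sand, bays accumulate it (a diffusive process).
    flux = D * np.diff(beach)            # fluxes between adjacent cells
    beach[1:-1] += dt * np.diff(flux)    # update interior cells
    # The submarine canyon intercepts alongshore transport and flushes sand
    # to abyssal depths, acting as a local sediment sink.
    beach[canyon] = max(0.0, beach[canyon] - sink * dt)

print(beach.round(2))
```

    With the sink active, cover near the canyon stays depressed rather than equilibrating, which is the behavior the abstract links to locally enhanced weathering and faster landward retreat.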

  15. Rocky intertidal macrobenthic communities across a large-scale estuarine gradient

    Directory of Open Access Journals (Sweden)

    Luis Giménez

    2010-03-01

    We evaluated relationships between (1) salinity and species richness and (2) frontal zones and community structure for the rocky intertidal macrobenthic community of the Uruguayan coast. A large-scale sampling design (extent ~500 km) covering 9 rocky shores across 3 intertidal levels was carried out between September and November 2002. The linear relationship between salinity and species richness (minimum at the freshwater extreme) and the lack of correlation between variation in salinity and richness rejected two previous empirical models explaining variations in species richness along the salinity gradient. Other factors (e.g. turbidity) may explain this discrepancy. The estuarine front defined two communities, freshwater and estuarine-marine, differing in species composition and richness. The freshwater community was characterised by low richness and few individuals confined to crevices or tide pools, and must be structured by physical processes (e.g. desiccation); the estuarine-marine community, with individuals occupying almost all available substrata, must be structured by both physical and biological processes. A marine front, separating estuarine and marine habitats, had a weak effect on community structure, although estuarine and marine assemblages differed according to species characterising different functional groups. We conclude that the position of the estuarine frontal zones is important for explaining large-scale patterns of community structure in the study area.

  16. Constructing a Geology Ontology Using a Relational Database

    Science.gov (United States)

    Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.

    2013-12-01

    In the geology community, the creation of a common geology ontology has become a useful means of solving problems of data integration, knowledge transformation, and the interoperation of multi-source, heterogeneous, multi-scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Several human-computer interaction methods, such as the Geo-rule based method, the ontology life cycle method, and the module design method, have been proposed for applied geological ontologies. Essentially, the relational database-based method is a reverse engineering of semantic information abstracted from an existing database; the key is to construct rules for the transformation of database entities into the ontology. Relative to human-computer interaction methods, relational database-based methods can reuse existing resources and the semantic relationships already stated among geological entities. However, two problems challenge their development and application. One is the transformation of multiple inheritance and nested relationships and their representation in an ontology. The other is that most of these methods do not measure how much of the semantics the transformation process retains. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. Based on the relational schema of a geological database, a conversion approach is presented that transforms a geological spatial database into an OWL-based geological ontology by identifying semantics such as entities, relationships, inheritance relationships, nested relationships, and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In the geological ontology, inheritance and union operations between superclasses and subclasses were used to represent the nested relationships in a geochronology and the multiple inheritances
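    As a concrete illustration of one such database-to-ontology transformation rule, the sketch below turns rows of a hypothetical geologic-unit table (id, name, parent_id) into OWL classes, mapping the parent_id foreign key to rdfs:subClassOf axioms. The table schema, namespace, and rules are assumptions made for illustration, not the authors' published rule set; rdflib is used simply as a convenient RDF/OWL library.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS

GEO = Namespace("http://example.org/geology#")  # hypothetical namespace

# Stand-in for rows queried from a geological database table `rock_unit`:
# (id, name, parent_id), where parent_id encodes an inheritance relationship.
rows = [
    (1, "RockUnit", None),
    (2, "IgneousUnit", 1),
    (3, "SedimentaryUnit", 1),
]

g = Graph()
g.bind("geo", GEO)

# Rule 1: each table row becomes an OWL class with a human-readable label.
for uid, name, parent in rows:
    g.add((GEO[name], RDF.type, OWL.Class))
    g.add((GEO[name], RDFS.label, Literal(name)))

# Rule 2: a non-null parent_id becomes an rdfs:subClassOf axiom, preserving
# the inheritance relationship stated in the relational schema.
name_by_id = {uid: name for uid, name, _ in rows}
for uid, name, parent in rows:
    if parent is not None:
        g.add((GEO[name], RDFS.subClassOf, GEO[name_by_id[parent]]))

print(g.serialize(format="turtle"))
```

    An inverse mapping check of the kind the abstract mentions would walk the resulting subClassOf triples back into (id, parent_id) pairs and compare them with the source table to confirm no semantics were lost.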

  17. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence reduce the required computing time. (orig.)

  18. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time
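    The two records above describe the same locality idea: periodically reorder the particle arrays by grid cell so the charge-accumulation sweep touches memory (and hence virtual-memory pages) sequentially rather than at random. Below is a minimal NumPy sketch of that sort under assumed conditions (a 1-D grid, nearest-grid-point deposition); the original codes and their exact sorting algorithms are not specified in these abstracts, so every name and parameter here is illustrative.

```python
import numpy as np

nx, n_particles = 1024, 1_000_000
rng = np.random.default_rng(0)
x = rng.uniform(0.0, nx, n_particles)   # particle positions on a 1-D grid
v = rng.normal(size=n_particles)        # particle velocities

# Sort particle arrays by grid-cell index so physically adjacent particles
# occupy neighboring array slots; in a paged virtual-memory system this turns
# random page faults during deposition into a near-sequential sweep.
cell = x.astype(np.int64)
order = np.argsort(cell, kind="stable")
x, v, cell = x[order], v[order], cell[order]

# Charge accumulation (nearest-grid-point weighting) now walks rho and the
# particle arrays in near-sequential memory order.
rho = np.zeros(nx)
np.add.at(rho, cell, 1.0)
```

    In a full particle-in-cell loop this sort would be repeated only every few timesteps, consistent with the papers' point that a nominal amount of sorting suffices: particles drift slowly out of order, so occasional reordering keeps the access pattern mostly sequential.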

  19. New technologies for large-scale micropatterning of functional nanocomposite polymers

    Science.gov (United States)

    Khosla, A.; Gray, B. L.

    2012-04-01

    We present a review of different micropatterning technologies for flexible elastomeric functional nanocomposites, with a particular emphasis on mold materials and processes for the production of large substrates. The functional polymers include electrically conducting and magnetic materials developed at the Micro-instrumentation Laboratory at Simon Fraser University, Canada. We present a chart that compares many of these conductive and magnetic functional nanocomposites and their measured characteristics. Furthermore, we have previously reported hybrid processes for nanocomposite polymers micromolded against SU-8 photoepoxy masters. However, SU-8 is typically limited to substrate sizes compatible with microelectronics processing, as a microelectronics UV-patterning step is typically involved, and de-molding problems are observed. Recently, we have developed new processes that address the problems faced with SU-8 molds. These new technologies for micropatterning nanocomposites involve new substrate materials: a low-cost poly(methyl methacrylate) (PMMA) microfabrication technology has been developed in which micromolds are fabricated via either CO2 laser ablation or deep-UV exposure. We have previously reported this large-scale patterning technique using laser ablation. Finally, we compare the two PMMA processes for producing micromolds for nanocomposites.

  20. Geologic map of the Nepenthes Planum Region, Mars

    Science.gov (United States)

    Skinner, James A.; Tanaka, Kenneth L.

    2018-03-26

    This map product contains a map sheet at 1:1,506,000 scale that shows the geology of the Nepenthes Planum region of Mars, which is located between the cratered highlands that dominate the southern hemisphere and the less-cratered sedimentary plains that dominate the northern hemisphere. The map region contains cone- and mound-shaped landforms as well as lobate materials that are morphologically similar to terrestrial igneous or mud vents and flows. This map is part of an informal series of small-scale (large-area) maps aimed at refining current understanding of the geologic units and structures that make up the highland-to-lowland transition zone. The map base consists of a controlled Thermal Emission Imaging System (THEMIS) daytime infrared image mosaic (100 meters per pixel resolution) supplemented by a Mars Orbiter Laser Altimeter (MOLA) digital elevation model (463 meters per pixel resolution). The map includes a Description of Map Units and a Correlation of Map Units that describe and correlate units identified across the entire map region. The geologic map was assembled using ArcGIS software by Environmental Systems Research Institute (http://www.esri.com). The ArcGIS project, geodatabase, base map, and all map components are included online as supplemental data.