WorldWideScience

Sample records for large-scale heterogeneous integration

  1. On transport in formations of large heterogeneity scales

    International Nuclear Information System (INIS)

    Dagan, Gedeon

    1990-01-01

    It has been suggested that in transport through heterogeneous aquifers, the effective dispersivity increases with the travel distance, since plumes encounter heterogeneity of increasing scales. This conclusion rests, however, on the assumption of ergodicity. If the plume is viewed as made up of different particles, this means that these particles move independently from a statistical point of view. To satisfy ergodicity the solute body has to be of a much larger extent than the heterogeneity scales. Thus, if the latter increase forever and the solute body is finite, ergodicity cannot be obeyed. To demonstrate this thesis we consider the two-dimensional heterogeneity associated with transmissivity variations in the horizontal plane. First, the effective dispersion coefficient is defined as half the rate of change of the expected value of the solute body's second spatial moment relative to its centroid. Subsequently the asymptotic large-time limit of dispersivity is evaluated in terms of the log-transmissivity integral scale and of the dimensions of the initial solute body in the direction of mean flow and normal to it. It is shown that for a thin plume aligned with the mean flow the effective dispersivity is zero and the effect of heterogeneity is a slight and finite expansion determined solely by the solute body size. In the case of a solute body transverse to the mean flow the effective dispersivity is different from zero, but has a maximal value which is again dependent on the solute body size and not on the heterogeneity scale. It is concluded that, from a theoretical standpoint and for the definition of dispersivity adopted here for non-ergodic conditions, the claim of ever-increasing dispersivity with travel distance is not valid for the scale of heterogeneity analyzed here. (Author) (21 refs., 6 figs.)
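The moment-based definition used here can be made concrete in a few lines: the effective dispersion coefficient is half the rate of change of the plume's second central spatial moment about its centroid. A minimal sketch (function names are illustrative, and a two-time finite difference stands in for the paper's ensemble expectation):

```python
def second_central_moment(xs):
    # second spatial moment of particle positions about the centroid
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def effective_dispersion(positions_t1, positions_t2, dt):
    # D_eff = (1/2) * d<S_xx>/dt, estimated between two snapshots
    return 0.5 * (second_central_moment(positions_t2)
                  - second_central_moment(positions_t1)) / dt
```

Under ergodic conditions this estimate would grow with travel distance; the abstract's point is that for a finite solute body it saturates at a value set by the body size.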

  2. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  3. Dynamical links between small- and large-scale mantle heterogeneity: Seismological evidence

    Science.gov (United States)

    Frost, Daniel A.; Garnero, Edward J.; Rost, Sebastian

    2018-01-01

    We identify PKP•PKP scattered waves (also known as P′•P′) from earthquakes recorded at small-aperture seismic arrays at distances less than 65°. P′•P′ energy travels as a PKP wave through the core, up into the mantle, then scatters back down through the core to the receiver as a second PKP. P′•P′ waves are unique in that they allow scattering heterogeneities throughout the mantle to be imaged. We use array-processing methods to amplify low-amplitude, coherent scattered energy signals and resolve their incoming direction. We deterministically map scattering heterogeneity locations from the core-mantle boundary to the surface. We use an extensive dataset with sensitivity to a large volume of the mantle and a location method allowing us to resolve and map more heterogeneities than has previously been possible, representing a significant increase in our understanding of small-scale structure within the mantle. Our results demonstrate that the distribution of scattering heterogeneities varies both radially and laterally. Scattering is most abundant in the uppermost and lowermost mantle, and at a minimum in the mid-mantle, resembling the radial distribution of tomographically derived whole-mantle velocity heterogeneity. We investigate the spatial correlation of scattering heterogeneities with large-scale tomographic velocities, lateral velocity gradients, the locations of deep-seated hotspots and subducted slabs. In the lowermost 1500 km of the mantle, small-scale heterogeneities correlate with regions of low seismic velocity, high lateral seismic gradient, and proximity to hotspots. In the upper 1000 km of the mantle there is no significant correlation between scattering heterogeneity location and subducted slabs. Between 600 and 900 km depth, scattering heterogeneities are more common in the regions most remote from slabs, and close to hotspots. Scattering heterogeneities show an affinity for regions close to slabs within the upper 200 km of the
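The array-processing step described above — amplifying low-amplitude coherent energy across an array — is classically done by delay-and-sum stacking. A minimal sketch with integer sample delays (illustrative function and setup, not the authors' processing code):

```python
def shift_and_stack(traces, delays):
    """Delay-and-sum beamforming: advance each trace by its integer
    sample delay, then average, so coherent arrivals add while
    incoherent noise averages down."""
    n = len(traces[0])
    stack = [0.0] * n
    for tr, d in zip(traces, delays):
        for i in range(n):
            j = i + d
            if 0 <= j < n:
                stack[i] += tr[j] / len(traces)
    return stack
```

Scanning over trial delay vectors (equivalently, slowness and back-azimuth) and picking the strongest stack is what lets the incoming direction of the scattered energy be resolved.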

  4. Large-scale compositional heterogeneity in the Earth's mantle

    Science.gov (United States)

    Ballmer, M.

    2017-12-01

    Seismic imaging of subducted Farallon and Tethys lithosphere in the lower mantle has been taken as evidence for whole-mantle convection, and efficient mantle mixing. However, cosmochemical constraints point to a lower-mantle composition that has a lower Mg/Si compared to upper-mantle pyrolite. Moreover, geochemical signatures of magmatic rocks indicate the long-term persistence of primordial reservoirs somewhere in the mantle. In this presentation, I establish geodynamic mechanisms for sustaining large-scale (primordial) heterogeneity in the Earth's mantle using numerical models. Mantle flow is controlled by rock density and viscosity. Variations in intrinsic rock density, such as due to heterogeneity in basalt or iron content, can induce layering or partial layering in the mantle. Layering can be sustained in the presence of persistent whole mantle convection due to active "unmixing" of heterogeneity in low-viscosity domains, e.g. in the transition zone or near the core-mantle boundary [1]. On the other hand, lateral variations in intrinsic rock viscosity, such as due to heterogeneity in Mg/Si, can strongly affect the mixing timescales of the mantle. In the extreme case, intrinsically strong rocks may remain unmixed through the age of the Earth, and persist as large-scale domains in the mid-mantle due to focusing of deformation along weak conveyor belts [2]. That large-scale lateral heterogeneity and/or layering can persist in the presence of whole-mantle convection can explain the stagnation of some slabs, as well as the deflection of some plumes, in the mid-mantle. These findings indeed motivate new seismic studies for rigorous testing of model predictions. [1] Ballmer, M. D., N. C. Schmerr, T. Nakagawa, and J. Ritsema (2015), Science Advances, doi:10.1126/sciadv.1500815. [2] Ballmer, M. D., C. Houser, J. W. Hernlund, R. Wentzcovitch, and K. Hirose (2017), Nature Geoscience, doi:10.1038/ngeo2898.

  5. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the faultless function of large scale integration circuits (LSI) and very large scale integration circuits (VLSI). The article presents a comparative analysis of the factors which determine the faultless function of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for analysis of the fault rate in LSI and VLSI circuits.

  6. Very large scale heterogeneous integration (VLSHI) and wafer-level vacuum packaging for infrared bolometer focal plane arrays

    Science.gov (United States)

    Forsberg, Fredrik; Roxhed, Niclas; Fischer, Andreas C.; Samel, Björn; Ericsson, Per; Hoivik, Nils; Lapadatu, Adriana; Bring, Martin; Kittilsland, Gjermund; Stemme, Göran; Niklaus, Frank

    2013-09-01

    Imaging in the long wavelength infrared (LWIR) range from 8 to 14 μm is an extremely useful tool for non-contact measurement and imaging of temperature in many industrial, automotive and security applications. However, the cost of the infrared (IR) imaging components has to be significantly reduced to make IR imaging a viable technology for many cost-sensitive applications. This paper demonstrates new and improved fabrication and packaging technologies for next-generation IR imaging detectors based on uncooled IR bolometer focal plane arrays. The proposed technologies include very large scale heterogeneous integration for combining high-performance SiGe quantum-well bolometers with electronic integrated read-out circuits and CMOS compatible wafer-level vacuum packaging. The fabrication and characterization of bolometers with a pitch of 25 μm × 25 μm that are arranged on read-out wafers in arrays with 320 × 240 pixels are presented. The bolometers contain a multi-layer quantum well SiGe thermistor with a temperature coefficient of resistance of -3.0%/K. The proposed CMOS compatible wafer-level vacuum packaging technology uses Cu-Sn solid-liquid interdiffusion (SLID) bonding. The presented technologies are suitable for implementation in cost-efficient fabless business models with the potential to bring about the cost reduction needed to enable low-cost IR imaging products for industrial, security and automotive applications.
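As a rough illustration of what a temperature coefficient of resistance (TCR) of -3.0%/K implies: for a constant coefficient tcr = (1/R) dR/dT, integrating gives R(T0 + ΔT) = R0·exp(tcr·ΔT), so each kelvin of heating drops the thermistor resistance by about 3%. A small sketch (hypothetical helper, not from the paper):

```python
import math

def bolometer_resistance(r0, tcr, d_temp):
    """Resistance after a temperature change d_temp (K), assuming a
    constant TCR defined as (1/R) * dR/dT; integration gives an
    exponential law. Illustrative helper only."""
    return r0 * math.exp(tcr * d_temp)
```

This relative resistance change, read out per pixel, is what the integrated read-out circuit converts into an LWIR image.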

  7. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    Science.gov (United States)

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), and then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.
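The paper's semi-supervised parameterization (SSP) is not reproduced here, but the link-based iteration it builds on is standard PageRank. A minimal power-iteration sketch (illustrative: unweighted edges, uniform teleportation):

```python
def pagerank(adj, damping=0.85, iters=100):
    """Classic link-based PageRank by power iteration.
    adj maps each node to its list of out-neighbors."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:
                # dangling node: redistribute its mass uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank
```

The SSP approach can be thought of as replacing the uniform teleportation and edge weights above with learned functions of the node/edge features, optimized jointly with the ranking scores.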

  8. Large-scale model of flow in heterogeneous and hierarchical porous media

    Science.gov (United States)

    Chabanon, Morgan; Valdés-Parada, Francisco J.; Ochoa-Tapia, J. Alberto; Goyeau, Benoît

    2017-11-01

    Heterogeneous porous structures are very often encountered in natural environments and in bioremediation processes, among many others. Reliable models for momentum transport are crucial whenever mass transport or convective heat transfer occurs in these systems. In this work, we derive a large-scale average model for incompressible single-phase flow in heterogeneous and hierarchical soil porous media composed of two distinct porous regions embedding a solid impermeable structure. The model, based on the local mechanical equilibrium assumption between the porous regions, results in a unique momentum transport equation where the global effective permeability naturally depends on the permeabilities at the intermediate mesoscopic scales and therefore includes the complex hierarchical structure of the soil. The associated closure problem is numerically solved for various configurations and properties of the heterogeneous medium. The results clearly show that the effective permeability increases with the volume fraction of the most permeable porous region. It is also shown that the effective permeability is sensitive to the dimensionality and spatial arrangement of the porous regions and in particular depends on the contact between the impermeable solid and the two porous regions.
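While the paper's effective permeability comes from solving a closure problem, the classical Wiener bounds already show the dependence on volume fraction and spatial arrangement noted in the abstract: flow parallel to a two-region layering gives the arithmetic mean of the permeabilities, flow across it the harmonic mean. A sketch (illustrative names; these bounds bracket, not replace, the closure-problem result):

```python
def k_eff_parallel(k1, k2, phi1):
    # arithmetic mean: flow parallel to the layering (upper bound)
    return phi1 * k1 + (1.0 - phi1) * k2

def k_eff_series(k1, k2, phi1):
    # harmonic mean: flow perpendicular to the layering (lower bound)
    return 1.0 / (phi1 / k1 + (1.0 - phi1) / k2)
```

Both expressions increase with the volume fraction of the more permeable region, but the gap between them is exactly the sensitivity to spatial arrangement that the closure problem resolves.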

  9. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogenous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  10. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems

    OpenAIRE

    Abadi, Martín; Agarwal, Ashish; Barham, Paul; Brevdo, Eugene; Chen, Zhifeng; Citro, Craig; Corrado, Greg S.; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Goodfellow, Ian; Harp, Andrew; Irving, Geoffrey; Isard, Michael

    2016-01-01

    TensorFlow is an interface for expressing machine learning algorithms, and an implementation for executing such algorithms. A computation expressed using TensorFlow can be executed with little or no change on a wide variety of heterogeneous systems, ranging from mobile devices such as phones and tablets up to large-scale distributed systems of hundreds of machines and thousands of computational devices such as GPU cards. The system is flexible and can be used to express a wide variety of algo...

  11. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  12. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated in the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of the electric vehicles are introduced. The possible impacts of large scale integration of electric vehicles on the power systems, especially the advantage to the integration of the renewable energies, are discussed. Finally, the research projects related to the large scale integration of electric vehicles into the power systems are introduced; they will provide reference for large scale integration of Electric Vehicles into power grids.

  13. The use of semantic similarity measures for optimally integrating heterogeneous Gene Ontology data from large scale annotation pipelines

    Directory of Open Access Journals (Sweden)

    Gaston K Mazandu

    2014-08-01

    With the advancement of new high throughput sequencing technologies, there has been an increase in the number of genome sequencing projects worldwide, which has yielded complete genome sequences of human, animals and plants. Subsequently, several labs have focused on genome annotation, consisting of assigning functions to gene products, mostly using Gene Ontology (GO) terms. As a consequence, there is an increased heterogeneity in annotations across genomes due to different approaches used by different pipelines to infer these annotations and also due to the nature of the GO structure itself. This makes a curator's task difficult, even if they adhere to the established guidelines for assessing these protein annotations. Here we develop a genome-scale approach for integrating GO annotations from different pipelines using semantic similarity measures. We used this approach to identify inconsistencies and similarities in functional annotations between orthologs of human and Drosophila melanogaster, to assess the quality of GO annotations derived from InterPro2GO mappings compared to manually curated GO annotations for the Drosophila melanogaster proteome from a FlyBase dataset and human, and to filter GO annotation data for these proteomes. Results obtained indicate that an efficient integration of GO annotations eliminates redundancy of up to 27.08% and 22.32% in the Drosophila melanogaster and human GO annotation datasets, respectively. Furthermore, we identified missing annotations for some orthologs, and annotation mismatches between the InterPro2GO and manual pipelines in these two proteomes, thus requiring further curation. This simplifies and facilitates the tasks of curators in assessing protein annotations, reduces redundancy and eliminates inconsistencies in large annotation datasets for ease of comparative functional genomics.
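The paper relies on ontology-aware semantic similarity; as a much simpler proxy, set overlap between the term sets produced by two pipelines already illustrates how redundant annotation sets can be detected and collapsed. A sketch (the GO IDs and threshold below are placeholders, and Jaccard similarity is a stand-in for the semantic measures actually used):

```python
def jaccard(a, b):
    # set-overlap similarity between two GO term sets
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 1.0

def merge_annotations(pipelines, threshold=0.9):
    """Keep one representative term set per near-duplicate group:
    a set is dropped if it is too similar to one already kept."""
    kept = []
    for terms in pipelines:
        if all(jaccard(terms, k) < threshold for k in kept):
            kept.append(set(terms))
    return kept
```

A true semantic measure would additionally count ancestor/descendant GO terms as partially matching, which is why the ontology structure matters for the integration described above.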

  14. Report of the Workshop on Petascale Systems Integration for LargeScale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld; not research, not facilities, not procurement, not operations, not user services. Taken together, these issues along with the impact of sub-optimal integration technology mean the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  15. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we
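The turnover times discussed above follow from the simplest pool model: a single first-order carbon pool dC/dt = I − kC has steady-state stock I/k and turnover time 1/k. A toy sketch (parameter values are illustrative, and this is far simpler than the multi-pool, depth-resolved, microbially explicit models the abstract describes):

```python
def steady_state_stock(inputs, k):
    # dC/dt = I - k*C  =>  C* = I / k; turnover time is 1/k
    return inputs / k

def simulate_pool(inputs, k, c0, dt, steps):
    # explicit-Euler integration of the single-pool balance
    c = c0
    for _ in range(steps):
        c += dt * (inputs - k * c)
    return c
```

Depth-resolved 14C observations constrain k per layer; the "process-scaling" point in the abstract is precisely that such per-site constants cannot simply be reused unchanged at ESM grid scales.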

  16. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project "DC grids for large scale integration of offshore wind power – OffshoreDC". The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  17. Coordinated SLNR based Precoding in Large-Scale Heterogeneous Networks

    KAUST Repository

    Boukhedimi, Ikram; Kammoun, Abla; Alouini, Mohamed-Slim

    2017-03-06

    This work focuses on the downlink of large-scale two-tier heterogeneous networks composed of a macro-cell overlaid by micro-cell networks. Our interest is in the design of coordinated beamforming techniques that mitigate the inter-cell interference. In particular, we consider the case in which the coordinating base stations (BSs) have imperfect knowledge of the channel state information. Under this setting, we propose a regularized SLNR based precoding design in which the regularization factor is used to allow better resilience with respect to the channel estimation errors. Based on tools from random matrix theory, we provide an analytical analysis of the SINR and SLNR performances. These results are then exploited to propose a proper setting of the regularization factor. Simulation results are finally provided to validate our findings and to confirm the performance of the proposed precoding scheme.
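Regularized SLNR precoding has a well-known closed form: for user k, w_k ∝ (Σ_j h_j h_jᵀ + αI)⁻¹ h_k, where α is the regularization factor. A toy real-valued two-antenna sketch with an explicit 2×2 inverse (illustrative only; the paper treats complex channels and tunes α via random-matrix asymptotics):

```python
def slnr_precoders_2tx(H, alpha):
    """SLNR-style regularized precoders for a 2-antenna transmitter:
    w_k ~ (sum_j h_j h_j^T + alpha*I)^(-1) h_k, normalized.
    H is a list of real-valued 2-element channel vectors."""
    # A = sum_j h_j h_j^T + alpha * I
    A = [[alpha, 0.0], [0.0, alpha]]
    for h in H:
        for i in range(2):
            for j in range(2):
                A[i][j] += h[i] * h[j]
    # explicit 2x2 inverse
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    inv = [[A[1][1] / det, -A[0][1] / det],
           [-A[1][0] / det, A[0][0] / det]]
    ws = []
    for h in H:
        w = [inv[0][0] * h[0] + inv[0][1] * h[1],
             inv[1][0] * h[0] + inv[1][1] * h[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        ws.append([w[0] / norm, w[1] / norm])
    return ws
```

Larger α makes the inverse closer to a scaled identity, i.e. closer to matched filtering, which is what provides resilience to channel estimation errors at the cost of less interference suppression.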

  19. A generic library for large scale solution of PDEs on modern heterogeneous architectures

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    2012-01-01

    Adapting to new programming models for modern multi- and many-core architectures requires code-rewriting and changing algorithms and data structures, in order to achieve good efficiency and scalability. We present a generic library for solving large scale partial differential equations (PDEs), capable of utilizing heterogeneous CPU/GPU environments. The library can be used for fast proto-typing of PDE solvers, based on finite difference approximations of spatial derivatives in one, two, or three dimensions. In order to efficiently solve large scale problems, we keep memory consumption and memory access low, using a low-storage implementation of flexible-order finite difference operators. We will illustrate the use of library components by assembling such matrix-free operators to be used with one of the supported iterative solvers, such as GMRES, CG, Multigrid or Defect Correction...
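Matrix-free operators of the kind described pair naturally with Krylov solvers such as CG, which only need the action of the operator on a vector, never the assembled matrix. A minimal sketch for a 1-D negative Laplacian with homogeneous Dirichlet boundaries (illustrative; the library itself targets GPUs, flexible-order stencils and higher dimensions):

```python
def laplacian_1d(u, h):
    # matrix-free second-order central difference, zero Dirichlet BCs
    n = len(u)
    out = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        out[i] = (left - 2.0 * u[i] + right) / h ** 2
    return out

def cg(apply_A, b, iters=200, tol=1e-12):
    """Conjugate gradient with A given only as a function (SPD assumed)."""
    x = [0.0] * len(b)
    r = [bi - ai for bi, ai in zip(b, apply_A(x))]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Ap = apply_A(p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x
```

Solving -u'' = f only requires passing `lambda u: [-v for v in laplacian_1d(u, h)]` as `apply_A`; no matrix is ever stored, which is the low-memory property the abstract emphasizes.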

  20. Dynamic Reactive Power Compensation of Large Scale Wind Integrated Power System

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    Due to progressive displacement of conventional power plants by wind turbines, dynamic security of large scale wind integrated power systems gets significantly compromised. In this paper we first highlight the importance of dynamic reactive power support/voltage security in large scale wind integrated power systems with least presence of conventional power plants. Then we propose a mixed integer dynamic optimization based method for optimal dynamic reactive power allocation in large scale wind integrated power systems. Both i) wind turbines, especially wind farms with additional grid support functionalities like dynamic reactive power support, and ii) refurbishment of existing conventional central power plants to synchronous condensers could be among the efficient, reliable and cost effective options. One of the important aspects of the proposed methodology is that unlike...

  1. The Plant Phenology Ontology: A New Informatics Resource for Large-Scale Integration of Plant Phenology Data.

    Science.gov (United States)

    Stucky, Brian J; Guralnick, Rob; Deck, John; Denny, Ellen G; Bolmgren, Kjell; Walls, Ramona

    2018-01-01

    Plant phenology - the timing of plant life-cycle events, such as flowering or leafing out - plays a fundamental role in the functioning of terrestrial ecosystems, including human agricultural systems. Because plant phenology is often linked with climatic variables, there is widespread interest in developing a deeper understanding of global plant phenology patterns and trends. Although phenology data from around the world are currently available, truly global analyses of plant phenology have so far been difficult because the organizations producing large-scale phenology data are using non-standardized terminologies and metrics during data collection and data processing. To address this problem, we have developed the Plant Phenology Ontology (PPO). The PPO provides the standardized vocabulary and semantic framework that is needed for large-scale integration of heterogeneous plant phenology data. Here, we describe the PPO, and we also report preliminary results of using the PPO and a new data processing pipeline to build a large dataset of phenology information from North America and Europe.

  2. Effects of reservoir heterogeneity on scaling of effective mass transfer coefficient for solute transport

    Science.gov (United States)

    Leung, Juliana Y.; Srinivasan, Sanjay

    2016-09-01

    Modeling transport processes at large scale requires proper scale-up of subsurface heterogeneity and an understanding of its interaction with the underlying transport mechanisms. A technique based on volume averaging is applied to quantitatively assess the scaling characteristics of the effective mass transfer coefficient in heterogeneous reservoir models. The effective mass transfer coefficient represents the combined contribution from diffusion and dispersion to the transport of non-reactive solute particles within a fluid phase. Although treatment of transport problems with the volume averaging technique has been published in the past, application to geological systems exhibiting realistic spatial variability remains a challenge. Previously, the authors developed a new procedure where results from a fine-scale numerical flow simulation reflecting the full physics of the transport process, albeit over a sub-volume of the reservoir, are integrated with the volume averaging technique to provide an effective description of transport properties. The procedure is extended such that spatial averaging is performed at the local-heterogeneity scale. In this paper, the transport of a passive (non-reactive) solute is simulated on multiple reservoir models exhibiting different patterns of heterogeneities, and the scaling behavior of the effective mass transfer coefficient (Keff) is examined and compared. One such set of models exhibits power-law (fractal) characteristics, and the variability of dispersion and Keff with scale is in good agreement with analytical expressions described in the literature. This work offers an insight into the impacts of heterogeneity on the scaling of effective transport parameters. A key finding is that spatial heterogeneity models with similar univariate and bivariate statistics may exhibit different scaling characteristics because of the influence of higher order statistics. More mixing is observed in the channelized models with higher-order continuity. It
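The scaling behavior examined here — e.g. power-law growth of an effective parameter with averaging scale — is typically summarized by the exponent of a log-log fit, Keff ~ L^β. A small sketch (illustrative function name; the analytical expressions referenced in the abstract are not reproduced):

```python
import math

def powerlaw_exponent(scales, keff):
    # least-squares slope in log-log space: Keff ~ L**beta
    xs = [math.log(s) for s in scales]
    ys = [math.log(k) for k in keff]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Comparing fitted exponents across reservoir models with identical univariate/bivariate statistics is one way to expose the influence of higher-order statistics noted in the abstract.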

  3. Base Station Placement Algorithm for Large-Scale LTE Heterogeneous Networks.

    Science.gov (United States)

    Lee, Seungseob; Lee, SuKyoung; Kim, Kyungsoo; Kim, Yoon Hyuk

    2015-01-01

    Data traffic demands in cellular networks today are increasing at an exponential rate, giving rise to the development of heterogeneous networks (HetNets), in which small cells complement traditional macro cells by extending coverage to indoor areas. However, the deployment of small cells as part of HetNets makes careful network planning a key challenge for operators. In particular, massive and unplanned deployment of base stations can cause high interference, severely degrading network performance. Although different mathematical modeling and optimization methods have been used to approach various problems related to this issue, most traditional network planning models are ill-equipped to deal with HetNet-specific characteristics because of their focus on classical cellular network designs. Furthermore, increased wireless data demands have driven mobile operators to roll out large-scale networks of small long term evolution (LTE) cells. Therefore, in this paper, we aim to derive an optimal network planning algorithm for large-scale LTE HetNets. Recently, attempts have been made to apply evolutionary algorithms (EAs) to the field of radio network planning, since they are characterized as global optimization methods. Yet, EA performance often deteriorates rapidly as search-space dimensionality grows. To overcome this limitation when designing optimal network deployments for large-scale LTE HetNets, we decompose the problem and tackle its subcomponents individually. Noting in particular that some HetNet cells have strong correlations due to inter-cell interference, we propose a correlation grouping approach in which cells are grouped together according to their mutual interference. Both the simulation and analytical results indicate that the proposed solution outperforms the random-grouping-based EA as well as an EA that detects interacting variables by monitoring changes in the objective function, in terms of system
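The correlation-grouping idea summarized above, in which cells are partitioned into subproblems according to their mutual interference, can be sketched as connected-component grouping over an interference graph. This is a minimal illustration; the function names, the thresholding rule and the interference values are hypothetical rather than taken from the paper:

```python
from collections import defaultdict

def group_by_interference(num_cells, interference, threshold):
    """Group cells whose pairwise interference exceeds `threshold` into
    the same subproblem (connected components of the interference graph),
    so each group can be optimized by a separate EA subpopulation.
    Simplified sketch; the paper's grouping criterion may differ."""
    adj = defaultdict(set)
    for (a, b), level in interference.items():
        if level >= threshold:
            adj[a].add(b)
            adj[b].add(a)
    seen, groups = set(), []
    for cell in range(num_cells):
        if cell in seen:
            continue
        stack, comp = [cell], []
        seen.add(cell)
        while stack:
            c = stack.pop()
            comp.append(c)
            for n in adj[c]:
                if n not in seen:
                    seen.add(n)
                    stack.append(n)
        groups.append(sorted(comp))
    return groups

# Hypothetical interference levels between 5 cells:
links = {(0, 1): 0.9, (1, 2): 0.8, (3, 4): 0.7, (0, 3): 0.1}
print(group_by_interference(5, links, threshold=0.5))  # [[0, 1, 2], [3, 4]]
```

Each group can then be handed to its own EA subpopulation, shrinking the effective search-space dimensionality per subproblem.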

  4. Upscaling of Large-Scale Transport in Spatially Heterogeneous Porous Media Using Wavelet Transformation

    Science.gov (United States)

    Moslehi, M.; de Barros, F.; Ebrahimi, F.; Sahimi, M.

    2015-12-01

    Modeling flow and solute transport in large-scale heterogeneous porous media involves substantial computational burdens. A common approach to alleviating this complexity is to use upscaling methods. These methods generate upscaled models of reduced complexity while preserving hydrogeological properties comparable to those of the original fine-scale model. We use Wavelet Transformations (WT) of the spatial distribution of the aquifer's properties to upscale the hydrogeological models and, consequently, the transport processes. In particular, we apply the technique to a porous formation with broadly distributed and correlated transmissivity to verify the performance of the WT. First, transmissivity fields are coarsened using WT in such a way that the high-transmissivity zones, in which the more important information is embedded, mostly remain intact, while the low-transmissivity zones are averaged out since they contain less information about the hydrogeological formation. Next, flow and non-reactive transport are simulated in both the fine-scale and upscaled models to predict both the concentration breakthrough curves at a control location and the large-scale spreading of the plume around its centroid. The results reveal that the WT of the fields generates non-uniform grids with, on average, 2.1% of the number of grid blocks in the original fine-scale models, which leads to a significant reduction in computational costs. We show that the upscaled model obtained through the WT accurately reconstructs the concentration breakthrough curves and the spreading of the plume at different times. Furthermore, the impacts of the Hurst coefficient, the size of the flow domain and orders-of-magnitude differences in transmissivity values on the results have been investigated. It is observed that as the heterogeneity and the size of the domain increase, better agreement between the results of the fine-scale and upscaled models is achieved. Having this framework at hand aids
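The adaptive coarsening described above, keeping high-transmissivity zones fine while averaging out low-variability zones, can be illustrated with a Haar-style block rule. This is a simplified sketch under stated assumptions; the authors' actual wavelet transformation and thresholding are more elaborate:

```python
import numpy as np

def haar_coarsen(T, threshold):
    """One level of Haar-wavelet-style coarsening of a transmissivity
    field T (dimensions divisible by 2). Blocks whose 2x2 detail (local
    variability) falls below `threshold` are replaced by their mean;
    high-variability (typically high-transmissivity) blocks are kept at
    the fine scale, mirroring the non-uniform grids described above.
    Illustration only, not the authors' implementation."""
    out = T.astype(float).copy()
    for i in range(0, T.shape[0], 2):
        for j in range(0, T.shape[1], 2):
            block = out[i:i+2, j:j+2]
            if block.max() - block.min() < threshold:
                block[:] = block.mean()   # average out low-detail block
    return out

rng = np.random.default_rng(0)
field = rng.lognormal(mean=0.0, sigma=1.0, size=(8, 8))  # synthetic transmissivity
coarse = haar_coarsen(field, threshold=1.0)
```

Applying the rule recursively over levels would reproduce the multi-resolution character of a full wavelet upscaling.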

  5. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality…

  6. Scaling the heterogeneously heated convective boundary layer

    Science.gov (United States)

    Van Heerwaarden, C.; Mellado, J.; De Lozar, A.

    2013-12-01

    We have studied the heterogeneously heated convective boundary layer (CBL) by means of large-eddy simulations (LES) and direct numerical simulations (DNS). Our study differs from previous work on this subject in its very long simulations, in which the system travels through multiple states and from which we have derived scaling laws. In our setup, a stratified atmosphere is heated from below by square patches with a high surface buoyancy flux, surrounded by regions with little or no flux. By letting a boundary layer grow in time, we let the system evolve from the so-called meso-scale to the micro-scale regime. In the former, the heterogeneity is large and strong circulations can develop, while in the latter, the heterogeneity is small and no longer influences the boundary layer structure. Within each simulation we can observe the formation of a peak in kinetic energy, which represents the 'optimal' heterogeneity size in the meso-scale, and the subsequent decay of the peak and the evolution towards the transition to the micro-scale. We have created a non-dimensional parameter space that describes all properties of this system. By studying this evolution for different combinations of parameters, we have reached three important conclusions. First, there exists a horizontal length scale of the heterogeneity (L) that is a function of the boundary layer height (h) and the Richardson (Ri) number of the inversion at the top of the boundary layer. This relationship has the form L = h Ri^(3/8). Second, this horizontal length scale L allows the time evolution, and thus the state of the system, to be expressed as the ratio of this length scale to the distance Xp between two patches. This ratio describes the extent to which the circulation fills the space between two patch centers. The timings of the transition from the meso- to the micro-scale collapse under this scaling for all simulations sharing the same flux
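The two scaling relations quoted in this abstract, L = h Ri^(3/8) and the regime ratio L/Xp, are simple enough to evaluate directly. The numerical values below are illustrative, not from the study:

```python
def optimal_heterogeneity_scale(h, Ri):
    """Horizontal length scale of the heterogeneity, L = h * Ri**(3/8),
    as reported in the abstract above (h: boundary-layer height in m,
    Ri: Richardson number of the capping inversion)."""
    return h * Ri ** (3.0 / 8.0)

def regime_ratio(h, Ri, Xp):
    """Ratio L / Xp: near 1, the circulation fills the space between
    patch centers (meso-scale); much smaller than 1 indicates the
    micro-scale regime."""
    return optimal_heterogeneity_scale(h, Ri) / Xp

# Illustrative (hypothetical) values: 1 km deep CBL, Ri = 10, 20 km patch spacing.
L = optimal_heterogeneity_scale(1000.0, 10.0)
print(f"L = {L:.0f} m, L/Xp = {regime_ratio(1000.0, 10.0, 20000.0):.3f}")
```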

  7. Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity

    Science.gov (United States)

    Morris, C. K.; Knighton, J.

    2017-12-01

    Nitrous oxide is produced by the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that the representation of spatial heterogeneity in inputs, specifically soil moisture, causes inaccuracies in estimating average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is underpredicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5. The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide wider context for the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
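The underprediction mechanism described above is Jensen's inequality: for a convex response, the rate evaluated at the mean soil moisture lies below the mean of the patch-wise rates. A toy illustration with a hypothetical quartic response (CLM's actual rate equations are far more involved):

```python
import numpy as np

def n2o_rate(theta):
    """Hypothetical convex response of N2O production to soil moisture
    theta; a stand-in for the model's highly nonlinear rate equations."""
    return theta ** 4

theta = np.array([0.2, 0.8])          # heterogeneous surface: two patches
mean_first = n2o_rate(theta.mean())   # rate of the spatially averaged moisture
rate_first = n2o_rate(theta).mean()   # average of the per-patch rates
print(mean_first < rate_first)        # True: averaging first under-predicts
```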

  8. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to deliver wind power and to improve wind power availability and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. To address the SSO problem caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI) and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their areas of applicability. Furthermore, this paper summarizes suppression measures from actual SSO projects based on different control objectives. Finally, research prospects in this field are discussed.

  9. HETEROGENEOUS INTEGRATION TECHNOLOGY

    Science.gov (United States)

    2017-08-24

    AFRL-RY-WP-TR-2017-0168, Heterogeneous Integration Technology; Dr. Burhan Bayraktaroglu, Devices for Sensing Branch, Aerospace Components & Subsystems… Final report, September 1, 2016 – May 1, 2017 (contract: in-house; grant: N/A). …provide a structure for this review. The history and the current status of integration technologies in each category are examined and product examples are

  10. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

    EU energy strategy is strongly focused on the large-scale integration of renewable energy sources. The dominant contribution comes from variable sources, above all wind power plants. Grid integration of intermittent sources while keeping the system stable and secure is one of the biggest challenges for TSOs. This aspect is often neglected by energy policy makers, so this paper deals with the expected future conditions for secure power system operation with large-scale wind integration. It gives an overview of expected wind integration development in the EU, as well as expected P/f regulation and control needs. The paper concludes with several recommendations. (author).

  11. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  12. Heterogeneous integration of lithium niobate and silicon nitride waveguides for wafer-scale photonic integrated circuits on silicon.

    Science.gov (United States)

    Chang, Lin; Pfeiffer, Martin H P; Volet, Nicolas; Zervas, Michael; Peters, Jon D; Manganelli, Costanza L; Stanton, Eric J; Li, Yifei; Kippenberg, Tobias J; Bowers, John E

    2017-02-15

    An ideal photonic integrated circuit for nonlinear photonic applications requires high optical nonlinearities and low loss. This work demonstrates a heterogeneous platform by bonding lithium niobate (LN) thin films onto a silicon nitride (Si3N4) waveguide layer on silicon. It not only provides large second- and third-order nonlinear coefficients, but also shows low propagation loss in both the Si3N4 and the LN-Si3N4 waveguides. The tapers enable low-loss-mode transitions between these two waveguides. This platform is essential for various on-chip applications, e.g., modulators, frequency conversions, and quantum communications.

  13. Investigation on the integral output power model of a large-scale wind farm

    Institute of Scientific and Technical Information of China (English)

    BAO Nengsheng; MA Xiuqian; NI Weidou

    2007-01-01

    The integral output power model of a large-scale wind farm is needed when estimating the wind farm's output over a period of time in the future. The actual wind speed power model and calculation method of a wind farm made up of many wind turbine units are discussed. After analyzing the incoming wind flow characteristics and their energy distributions, and after considering the mutual effects among the wind turbine units under certain assumptions, the incoming wind flow model for multiple units is built. The calculation algorithms and steps of the integral output power model of a large-scale wind farm are provided. Finally, the actual power output of a wind farm is calculated and analyzed using measured wind speed data. The characteristics of a large-scale wind farm are also discussed.
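The kind of unit-to-farm aggregation the paper describes can be sketched as a piecewise power curve summed over units, with a single factor standing in for the multi-unit (wake) effects. All curve parameters and the wake factor below are hypothetical placeholders, not the paper's model:

```python
def turbine_power(v, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, p_rated=2.0):
    """Simplified piecewise turbine power curve (output in MW):
    zero outside the operating window, cubic ramp-up, then rated power.
    Parameter values are illustrative."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    return p_rated * ((v - v_cut_in) / (v_rated - v_cut_in)) ** 3

def farm_output(speeds, wake_factor=0.9):
    """Integral farm output: sum of unit outputs, derated by a single
    wake (array-effect) factor; a crude stand-in for the paper's
    multi-unit incoming wind flow model."""
    return wake_factor * sum(turbine_power(v) for v in speeds)

# Hypothetical incident speeds (m/s) at four turbines:
print(f"farm output: {farm_output([5.0, 8.0, 13.0, 26.0]):.2f} MW")
```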

  14. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship at the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, such simulations often require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better way to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution
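The two routes to the Kirkwood-Buff integrals compared in this abstract can both be written in a few lines. A hedged sketch: the standard textbook definitions are used, but the discretization details and finite-size corrections of the actual study are omitted:

```python
import numpy as np

def kb_running_integral(r, g):
    """Traditional route: running Kirkwood-Buff integral
    G(R) = integral_0^R 4*pi*r^2 * (g(r) - 1) dr, evaluated by the
    trapezoid rule from a radial distribution function g sampled at r."""
    integrand = 4.0 * np.pi * r**2 * (g - 1.0)
    return np.concatenate(([0.0], np.cumsum(
        0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))))

def kb_from_fluctuations(Ni, Nj, volume, same_species=False):
    """Fluctuation route used by finite size scaling:
    G = V * (<Ni*Nj> - <Ni><Nj> - delta_ij*<Ni>) / (<Ni><Nj>),
    with Ni, Nj particle counts sampled in an open sub-volume V."""
    Ni, Nj = np.asarray(Ni, float), np.asarray(Nj, float)
    cov = (Ni * Nj).mean() - Ni.mean() * Nj.mean()
    if same_species:
        cov -= Ni.mean()   # subtract the self term delta_ij*<Ni>
    return volume * cov / (Ni.mean() * Nj.mean())
```

For an ideal (uncorrelated) system both routes return zero, which makes a convenient sanity check.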

  15. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship at the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, such simulations often require excessively long simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better way to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution

  16. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  17. Integral criteria for large-scale multiple fingerprint solutions

    Science.gov (United States)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large-scale multimodal civil ID systems. First, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. We then analyze the joint score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. Explicit and approximate relations for the optimal integral score, which yields the smallest FRR for a predefined FAR, have been obtained. The results of real multiple fingerprint tests show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
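The criterion described above, minimizing FRR at a predefined FAR, can be evaluated empirically by placing the decision threshold at the appropriate quantile of the impostor score distribution. This is a generic sketch with synthetic Gaussian scores, not the paper's closed-form relations:

```python
import numpy as np

def threshold_for_far(impostor_scores, target_far):
    """Pick the decision threshold on the fused (integral) score so that
    the empirical FAR on impostor matches equals `target_far`; the FRR at
    that threshold is the quantity the optimal criterion minimizes.
    Generic empirical sketch."""
    s = np.sort(np.asarray(impostor_scores))
    return s[int(np.ceil((1.0 - target_far) * len(s))) - 1]

def frr_at(genuine_scores, thr):
    """Empirical false rejection rate: genuine scores at or below thr."""
    return float((np.asarray(genuine_scores) <= thr).mean())

rng = np.random.default_rng(1)
impostor = rng.normal(0.0, 1.0, 10000)   # hypothetical fused impostor scores
genuine = rng.normal(4.0, 1.0, 10000)    # hypothetical fused genuine scores
thr = threshold_for_far(impostor, target_far=0.001)
print(f"threshold={thr:.2f}, FRR={frr_at(genuine, thr):.4f}")
```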

  18. Dynamic Arrest in Charged Colloidal Systems Exhibiting Large-Scale Structural Heterogeneities

    International Nuclear Information System (INIS)

    Haro-Perez, C.; Callejas-Fernandez, J.; Hidalgo-Alvarez, R.; Rojas-Ochoa, L. F.; Castaneda-Priego, R.; Quesada-Perez, M.; Trappe, V.

    2009-01-01

    Suspensions of charged liposomes are found to exhibit typical features of strongly repulsive fluid systems at short length scales, while exhibiting structural heterogeneities at larger length scales that are characteristic of attractive systems. We model the static structure factor of these systems using effective pair interaction potentials composed of a long-range attraction and a shorter-range repulsion. Our modeling of the static structure yields conditions for dynamically arrested states at larger volume fractions, which we find to agree with the experimentally observed dynamics.

  19. Two-scale large deviations for chemical reaction kinetics through second quantization path integral

    International Nuclear Information System (INIS)

    Li, Tiejun; Lin, Feng

    2016-01-01

    Motivated by the study of rare events for a typical genetic switching model in systems biology, in this paper we aim to establish general two-scale large deviations for chemical reaction systems. We build a formal approach to explicitly obtain the large deviation rate functionals for the considered two-scale processes based upon the second quantization path integral technique. We obtain three important types of large deviation results when the underlying two timescales are in three different regimes. This is realized by singular perturbation analysis of the rate functionals obtained by the path integral. We find that the three regimes possess the same deterministic mean-field limit but completely different chemical Langevin approximations. The obtained results are natural extensions of the classical large volume limit for chemical reactions. We also discuss the implications for single-molecule Michaelis–Menten kinetics. Our framework and results can be applied to understand general multi-scale systems including diffusion processes. (paper)

  20. Fuel pin integrity assessment under large scale transients

    International Nuclear Information System (INIS)

    Dutta, B.K.

    2006-01-01

    The integrity of fuel rods under normal, abnormal and accident conditions is an important consideration during fuel design of advanced nuclear reactors. The fuel matrix and the sheath form the first barrier preventing the release of radioactive materials into the primary coolant. An understanding of fuel and clad behaviour under different reactor conditions, particularly under the beyond-design-basis accident scenarios leading to large-scale transients, is always desirable to assess the inherent safety margins in fuel pin design and to plan for the mitigation of the consequences of accidents, if any. Severe accident conditions are typically characterized by energy deposition rates far exceeding the heat removal capability of the reactor coolant system. This may lead to clad failure due to fission gas pressure at high temperature, large-scale pellet-clad interaction and clad melting. Fuel rod performance is affected by many interdependent complex phenomena involving extremely complex material behaviour. The versatile experimental database available in this area has led to the development of powerful analytical tools to characterize fuel under extreme scenarios.

  1. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Math and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. The study collected the responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously scored open-ended items can determine whether students hold normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge integration ability.
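The Rasch Partial Credit Model used in this analysis assigns to each score category a probability proportional to the exponential of the accumulated person-ability-minus-step-difficulty differences. A minimal sketch with illustrative ability and step-difficulty values:

```python
import numpy as np

def pcm_probabilities(theta, deltas):
    """Rasch Partial Credit Model: probabilities of score categories
    k = 0..m for a person of ability `theta` on an item with step
    difficulties `deltas` (delta_1..delta_m):
    P(X = k) proportional to exp(sum_{j<=k} (theta - delta_j)).
    Values below are illustrative only."""
    steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    p = np.exp(steps - steps.max())   # subtract max for numerical stability
    return p / p.sum()

# Hypothetical 3-category item (easy first step, hard second step):
print(pcm_probabilities(0.5, [-1.0, 1.2]).round(3))
```

Partial-credit scoring of open-ended responses, as the abstract argues, lets the model distinguish intermediate levels of knowledge integration rather than only right/wrong.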

  2. XML-based approaches for the integration of heterogeneous bio-molecular data.

    Science.gov (United States)

    Mesiti, Marco; Jiménez-Ruiz, Ernesto; Sanz, Ismael; Berlanga-Llavori, Rafael; Perlasca, Paolo; Valentini, Giorgio; Manset, David

    2009-10-15

    Today's public database infrastructure spans a very large collection of heterogeneous biological data, opening new opportunities for molecular biology, bio-medical and bioinformatics research, but also raising new problems for their integration and computational processing. In this paper we survey the most interesting and novel approaches for the representation, integration and management of different kinds of biological data by exploiting XML and the related recommendations and approaches. Moreover, we present new and interesting cutting-edge approaches for the appropriate management of heterogeneous biological data represented through XML. XML has succeeded in the integration of heterogeneous biomolecular information, and has established itself as the syntactic glue for biological data sources. Nevertheless, a large variety of XML-based data formats have been proposed, making effective integration of bioinformatics data schemes difficult. The adoption of a few semantically rich standard formats is urgently needed to achieve seamless integration of the current biological resources.

  3. Mining the Mind Research Network: A Novel Framework for Exploring Large Scale, Heterogeneous Translational Neuroscience Research Data Sources

    Science.gov (United States)

    Bockholt, Henry J.; Scully, Mark; Courtney, William; Rachakonda, Srinivas; Scott, Adam; Caprihan, Arvind; Fries, Jill; Kalyanam, Ravi; Segall, Judith M.; de la Garza, Raul; Lane, Susan; Calhoun, Vince D.

    2009-01-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such a NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from seven different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining. PMID:20461147

  4. Mining the mind research network: a novel framework for exploring large scale, heterogeneous translational neuroscience research data sources.

    Directory of Open Access Journals (Sweden)

    Henry Jeremy Bockholt

    2010-04-01

    A neuroinformatics (NI) system is critical to brain imaging research in order to shorten the time between study conception and results. Such a NI system is required to scale well when large numbers of subjects are studied. Further, when multiple sites participate in research projects, organizational issues become increasingly difficult. Optimized NI applications mitigate these problems. Additionally, NI software enables coordination across multiple studies, leveraging advantages potentially leading to exponential research discoveries. The web-based Mind Research Network (MRN) database system has been designed and improved through our experience with 200 research studies and 250 researchers from 7 different institutions. The MRN tools permit the collection, management, reporting and efficient use of large-scale, heterogeneous data sources, e.g., multiple institutions, multiple principal investigators, multiple research programs and studies, and multimodal acquisitions. We have collected and analyzed data sets on thousands of research participants and have set up a framework to automatically analyze the data, thereby making efficient, practical data mining of this vast resource possible. This paper presents a comprehensive framework for capturing and analyzing heterogeneous neuroscience research data sources that has been fully optimized for end-users to perform novel data mining.

  5. IoT European Large-Scale Pilots – Integration, Experimentation and Testing

    OpenAIRE

    Guillén, Sergio Gustavo; Sala, Pilar; Fico, Giuseppe; Arredondo, Maria Teresa; Cano, Alicia; Posada, Jorge; Gutierrez, Germán; Palau, Carlos; Votis, Konstantinos; Verdouw, Cor N.; Wolfert, Sjaak; Beers, George; Sundmaeker, Harald; Chatzikostas, Grigoris; Ziegler, Sébastien

    2017-01-01

    The IoT European Large-Scale Pilots Programme includes the innovation consortia that are collaborating to foster the deployment of IoT solutions in Europe through the integration of advanced IoT technologies across the value chain, demonstration of multiple IoT applications at scale and in a usage context, and as close as possible to operational conditions. The programme projects are targeted, goal-driven initiatives that propose IoT approaches to specific real-life industrial/societal challe...

  6. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  7. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply...... was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...... integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  8. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local self-consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
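    The self-consumption comparison described in this record can be sketched with synthetic hourly profiles; all numbers below are illustrative stand-ins, not the Scharnhauser Park data:

```python
import numpy as np

hours = np.arange(8760)
hod = hours % 24  # hour of day

# Hypothetical hourly profiles (kWh): a bell-shaped PV curve repeated daily,
# and a building load with morning and evening peaks (both invented).
pv = np.clip(np.sin((hod - 6) / 12 * np.pi), 0, None) * 3.0
load = 1.0 + 0.8 * np.exp(-((hod - 8) ** 2) / 8) + 1.2 * np.exp(-((hod - 19) ** 2) / 8)

self_consumed = np.minimum(pv, load)          # PV used on site each hour
pv_share = pv.sum() / load.sum()              # PV generation vs total consumption
direct_use = self_consumed.sum() / pv.sum()   # fraction of PV used directly

print(f"PV covers {pv_share:.0%} of annual load; {direct_use:.0%} of PV is used on site")
```

    With measured hourly data in place of the synthetic arrays, the same two ratios reproduce the kind of figures quoted in the abstract.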

  9. Vertical and lateral heterogeneous integration

    Science.gov (United States)

    Geske, Jon; Okuno, Yae L.; Bowers, John E.; Jayaraman, Vijay

    2001-09-01

    A technique for achieving large-scale monolithic integration of lattice-mismatched materials in the vertical direction and the lateral integration of dissimilar lattice-matched structures has been developed. The technique uses a single nonplanar direct-wafer-bond step to transform vertically integrated epitaxial structures into lateral epitaxial variation across the surface of a wafer. Nonplanar wafer bonding is demonstrated by integrating four different unstrained multi-quantum-well active regions lattice matched to InP on a GaAs wafer surface. Microscopy is used to verify the quality of the bonded interface, and photoluminescence is used to verify that the bonding process does not degrade the optical quality of the laterally integrated wells. The authors propose this technique as a means to achieve greater levels of wafer-scale integration in optical, electrical, and micromechanical devices.

  10. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for large scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation; models are being developed, and their validation is essential. Coordination of wind and hydro generation is a key to allowing more wind power capacity in areas with limited transmission corridors. For the case study grid, depending on technology and control, the allowed wind farm size is increased from 50 to 200 MW. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the flexibility of the power system and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy; combining wind and hydro constitutes a win-win system.

  11. Accounting for small scale heterogeneity in ecohydrologic watershed models

    Science.gov (United States)

    Burke, W.; Tague, C.

    2017-12-01

    Spatially distributed ecohydrologic models are inherently constrained by the spatial resolution of their smallest units, below which land and processes are assumed to be homogeneous. At coarse scales, heterogeneity is often accounted for by computing stores and fluxes of interest over a distribution of land cover types (or other sources of heterogeneity) within spatially explicit modeling units. However, this approach ignores spatial organization and the lateral transfer of water and materials downslope. The challenge is to account for both the role of flow-network topology and fine-scale heterogeneity. We present a new approach that defines two levels of spatial aggregation and integrates a spatially explicit network approach with a flexible representation of finer-scale aspatial heterogeneity. Critically, this solution does not simply increase the resolution of the smallest spatial unit, and by comparison it results in improved computational efficiency. The approach is demonstrated by adapting the Regional Hydro-Ecologic Simulation System (RHESSys), an ecohydrologic model widely used to simulate climate, land use, and land management impacts. We illustrate the utility of our approach by showing how the model can be used to better characterize forest thinning impacts on ecohydrology. Forest thinning is typically done at the scale of individual trees, and yet management responses of interest include impacts on watershed-scale hydrology and on downslope riparian vegetation. Our approach allows us to characterize the variability in tree size/carbon reduction and water transfers between neighboring trees while still capturing hillslope- to watershed-scale effects. Our illustrative example demonstrates that accounting for these fine-scale effects can substantially alter model estimates, in some cases shifting the impacts of thinning on downslope water availability from increases to decreases. We conclude by describing other use cases that may benefit from this approach.
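    The two-level aggregation idea can be sketched as follows; the patch layout, cover types and parameter values are invented for illustration and are not RHESSys internals:

```python
# Spatially explicit "patches" linked downslope, each holding an aspatial
# distribution of cover types. Fluxes are computed per cover type, aggregated
# at the patch, and water is routed along the flow network.
patches = [
    {"covers": {"thinned": 0.3, "forest": 0.7}, "downslope": 1},
    {"covers": {"riparian": 1.0}, "downslope": None},
]
et_rate = {"forest": 2.0, "thinned": 1.2, "riparian": 2.5}  # mm/day, assumed

rain = 6.0  # mm/day
outflow = [0.0, 0.0]
for i, p in enumerate(patches):
    inflow = outflow[i]  # lateral transfer received from upslope
    # aggregate ET over the aspatial cover distribution within the patch
    et = sum(frac * et_rate[c] for c, frac in p["covers"].items())
    drainage = max(rain + inflow - et, 0.0)
    if p["downslope"] is not None:
        outflow[p["downslope"]] += drainage

print(f"water delivered downslope to riparian patch: {outflow[1]:.2f} mm/day")
```

    Changing the cover fractions inside the upslope patch (e.g. more "thinned" area) changes the water delivered to the riparian patch without refining the spatial grid, which is the computational advantage the abstract describes.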

  12. Heterogeneously Integrated Microwave Signal Generators with Narrow Linewidth Lasers

    Science.gov (United States)

    2017-03-20

    ...have shown that heterogeneous integration not only allows for reduced cost due to economies of scale, but also allows for the same or even better... An advantage of introducing SOAs in a microwave generator is the control and boosting of optical power before the detector, providing higher RF powers.

  13. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    OpenAIRE

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-makin...

  14. Probing Mantle Heterogeneity Across Spatial Scales

    Science.gov (United States)

    Hariharan, A.; Moulik, P.; Lekic, V.

    2017-12-01

    Inferences of mantle heterogeneity in terms of temperature, composition, grain size, melt and crystal structure may vary across local, regional and global scales. Probing these scale-dependent effects requires quantitative comparisons and reconciliation of tomographic models that vary in their regional scope, parameterization, regularization and observational constraints. While a range of techniques like radial correlation functions and spherical harmonic analyses have revealed global features like the dominance of long-wavelength variations in mantle heterogeneity, they have limited applicability for specific regions of interest like subduction zones and continental cratons. Moreover, issues like discrepant 1-D reference Earth models and related baseline corrections have impeded the reconciliation of heterogeneity between various regional and global models. We implement a new wavelet-based approach that allows structure to be filtered simultaneously in both the spectral and spatial domains, allowing us to characterize heterogeneity on a range of scales and in different geographical regions. Our algorithm extends a recent method that expanded lateral variations into the wavelet domain constructed on a cubed sphere. The isolation of reference velocities in the wavelet scaling function facilitates comparisons between models constructed with arbitrary 1-D reference Earth models. The wavelet transformation allows us to quantify the scale-dependent consistency between tomographic models in a region of interest and investigate the fits to data afforded by heterogeneity at various dominant wavelengths. We find substantial and spatially varying differences in the spectrum of heterogeneity between two representative global Vp models constructed using different data and methodologies. Applying the orthonormality of the wavelet expansion, we isolate detailed variations in velocity from models and evaluate additional fits to data afforded by adding such complexities to long
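    The multiscale filtering idea can be illustrated with a plain Haar transform on synthetic 1-D profiles, a drastic simplification of the cubed-sphere wavelet machinery this record describes; all data below are made up:

```python
import numpy as np

def haar_levels(x):
    """Split a 1-D profile (power-of-two length) into Haar detail
    coefficients per scale, returned coarsest first."""
    levels = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / 2.0
        det = (x[0::2] - x[1::2]) / 2.0
        levels.append(det)  # detail at the current finest remaining scale
        x = avg
    return levels[::-1]

rng = np.random.default_rng(1)
n = 256
base = np.cumsum(rng.normal(size=n))        # shared long-wavelength structure
model_a = base + 0.1 * rng.normal(size=n)   # two synthetic "tomographic" profiles
model_b = base + 0.5 * rng.normal(size=n)   # differing mainly at fine scales

# scale-by-scale consistency between the two models
for lev, (da, db) in enumerate(zip(haar_levels(model_a), haar_levels(model_b))):
    r = np.corrcoef(da, db)[0, 1] if len(da) > 1 else float("nan")
    print(f"scale level {lev}: correlation {r:.2f}")
```

    The per-scale correlations drop toward the fine levels, which is the kind of scale-dependent (in)consistency between models that the wavelet approach quantifies.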

  15. Hypersingular integral equations, waveguiding effects in Cantorian Universe and genesis of large scale structures

    International Nuclear Information System (INIS)

    Iovane, G.; Giordano, P.

    2005-01-01

    In this work we introduce the hypersingular integral equations and analyze a realistic model of gravitational waveguides on a Cantorian space-time. A waveguiding effect is considered with respect to the large scale structure of the Universe, where structure formation appears as if it were a classically self-similar random process at all astrophysical scales. The result is that it seems we live in an El Naschie ε(∞) Cantorian space-time, where gravitational lensing and waveguiding effects can explain the appearance of the Universe. In particular, we consider filamentary and planar large scale structures as possible refraction channels for electromagnetic radiation coming from cosmological structures. From this point of view the Universe appears as a large set of self-similar adaptive mirrors, as illustrated by three numerical simulations. Consequently, an infinite Universe is just an optical illusion, produced by mirroring effects connected with the large scale structure of a finite and not large Universe

  16. Climate forcing and infectious disease transmission in urban landscapes: integrating demographic and socioeconomic heterogeneity.

    Science.gov (United States)

    Santos-Vega, Mauricio; Martinez, Pamela P; Pascual, Mercedes

    2016-10-01

    Urbanization and climate change are the two major environmental challenges of the 21st century. The dramatic expansion of cities around the world creates new conditions for the spread, surveillance, and control of infectious diseases. In particular, urban growth generates pronounced spatial heterogeneity within cities, which can modulate the effect of climate factors at local spatial scales in large urban environments. Importantly, the interaction between environmental forcing and socioeconomic heterogeneity at local scales remains an open area in infectious disease dynamics, especially for urban landscapes of the developing world. A quantitative and conceptual framework on urban health with a focus on infectious diseases would benefit from integrating aspects of climate forcing, population density, and level of wealth. In this paper, we review what is known about these drivers acting independently and jointly on urban infectious diseases; we then outline elements that are missing and would contribute to building such a framework. © 2016 New York Academy of Sciences.

  17. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast-emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels) along with large scale renewable energy targets (100GW solar, 60GW wind, and 10GW biomass energy by 2022) in the INDCs submitted under the Paris Agreement. But large scale integration of renewable energy is a complex process which faces a number of challenges, such as capital intensity, matching intermittent generation with demand under limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze the implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal and gas fired units discretely with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. The base case scenario (no RE addition) along with an INDC scenario (with 100GW solar, 60GW wind, 10GW biomass) and a low RE scenario (50GW solar, 30GW wind) have been created to analyze the implications of large scale integration of variable renewable energy. The results provide insights into the trade-offs and investment decisions involved in achieving mitigation targets. The study also examines operational reliability and flexibility requirements of the system for integrating renewables.

  18. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors that offer label-free, portable and rapid detection. Nevertheless, their large scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, composed of randomly oriented silicon nanowires, are also studied. Compatible integration on the back end of CMOS readout and promising electrical performance open new opportunities for sensing applications.

  19. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law has motivated different understandings of the dependence between these two scalings, which has yet to be clarified. In this article, we observe an evolution of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover emerges as the two become inconsistent at later times, before the system reaches a stable state in which Heaps' law persists while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help explain the temporal evolution of the scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread underlines the significance of targeted containment strategies early in a pandemic.
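    The Heaps-type sublinear growth of newly reached sites can be reproduced with a minimal sampling sketch; the Zipf-like visitation weights are an assumption for illustration, not the paper's metapopulation model:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sites = 10_000
ranks = np.arange(1, n_sites + 1)
p = 1.0 / ranks          # Zipf-like visitation probabilities (exponent 1, assumed)
p /= p.sum()

# draw a sequence of "visits" (e.g. imported infections) and track how many
# distinct sites have been reached after each visit
visits = rng.choice(n_sites, size=50_000, p=p)
seen = set()
heaps = []
for v in visits:
    seen.add(v)
    heaps.append(len(seen))

# Heaps' law: the number of distinct sites grows sublinearly with total visits
print(f"after {len(visits)} visits, {heaps[-1]} distinct sites reached")
```

    Doubling the number of visits adds far fewer than double the distinct sites, the sublinear signature the record attributes to heterogeneous visitation.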

  20. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  1. Impact of spatially correlated pore-scale heterogeneity on drying porous media

    Science.gov (United States)

    Borgman, Oshri; Fantinel, Paolo; Lühder, Wieland; Goehring, Lucas; Holtzman, Ran

    2017-07-01

    We study the effect of spatially correlated heterogeneity on the isothermal drying of porous media. We combine a minimal pore-scale model with microfluidic experiments sharing the same pore geometry. Our simulated drying behavior compares favorably with the experiments, considering the large sensitivity of the emergent behavior to the uncertainty associated with even small manufacturing errors. We show that increasing the correlation length in particle sizes promotes preferential drying of clusters of large pores, prolonging liquid connectivity and surface wetness and thus sustaining higher drying rates for longer periods. Our findings improve the quantitative understanding of how pore-scale heterogeneity impacts drying, which plays a role in a wide range of processes, from fuel cells to the curing of paints and cements to the global budgets of energy, water and solutes in soils.
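    A common way to impose a correlation length on a particle-size field, shown here purely to illustrate the kind of spatially correlated input such pore-scale models take (not the authors' exact construction), is to low-pass filter white noise:

```python
import numpy as np

def correlated_field(shape, corr_len, rng):
    """Gaussian-filtered white noise with an imposed correlation length,
    normalised to zero mean and unit variance."""
    noise = rng.normal(size=shape)
    ny, nx = shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    # low-pass filter in Fourier space; cutoff set by the correlation length
    filt = np.exp(-2 * (np.pi * corr_len) ** 2 * (fx**2 + fy**2))
    field = np.fft.ifft2(np.fft.fft2(noise) * filt).real
    return (field - field.mean()) / field.std()

rng = np.random.default_rng(3)
short = correlated_field((64, 64), corr_len=1.0, rng=rng)
long_ = correlated_field((64, 64), corr_len=8.0, rng=rng)

def neighbour_corr(f):
    """Correlation between horizontally adjacent cells."""
    return np.corrcoef(f[:, :-1].ravel(), f[:, 1:].ravel())[0, 1]

print(f"lag-1 correlation: short={neighbour_corr(short):.2f}, long={neighbour_corr(long_):.2f}")
```

    Mapping such a field onto particle (and hence pore-throat) sizes produces the clustered large pores whose preferential drying the record describes.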

  2. Alpine Ecohydrology Across Scales: Propagating Fine-scale Heterogeneity to the Catchment and Beyond

    Science.gov (United States)

    Mastrotheodoros, T.; Pappas, C.; Molnar, P.; Burlando, P.; Hadjidoukas, P.; Fatichi, S.

    2017-12-01

    In mountainous ecosystems, complex topography and landscape heterogeneity govern ecohydrological states and fluxes. Here, we investigate topographic controls on water, energy and carbon fluxes across different climatic regimes and vegetation types representative of the European Alps. We use an ecohydrological model to perform fine-scale numerical experiments on a synthetic domain that comprises a symmetric mountain with eight catchments draining along the cardinal and intercardinal directions. Distributed meteorological model input variables are generated using observations from Switzerland. The model computes the incoming solar radiation based on the local topography. We implement a multivariate statistical framework to disentangle the impact of landscape heterogeneity (i.e., elevation, aspect, flow contributing area, vegetation type) on the simulated water, carbon, and energy dynamics. This allows us to identify the sensitivities of several ecohydrological variables (including leaf area index, evapotranspiration, snow cover and net primary productivity) to topographic and meteorological inputs at different spatial and temporal scales. We also use an alpine catchment as a real case study to investigate how the natural variability of soil and land cover affects the idealized relationships that arise from the synthetic domain. In accordance with previous studies, our analysis shows a complex pattern of vegetation response to radiation. We also find different patterns of ecosystem sensitivity to topography-driven heterogeneity depending on the hydrological regime (i.e., wet vs. dry conditions). Our results suggest that topography-driven variability in ecohydrological variables (e.g. transpiration) at the fine spatial scale can exceed 50%, but it is substantially reduced (∼5%) when integrated at the catchment scale.

  3. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  4. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes predictions based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
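    The core idea of scoring unseen drug-target pairs from a low-rank representation can be sketched with a toy factorization; the data and rank below are synthetic, and DTINet's actual pipeline (network diffusion plus matrix completion) is more elaborate:

```python
import numpy as np

rng = np.random.default_rng(4)
n_drugs, n_targets, rank = 30, 20, 4

# hidden low-rank "true" interaction structure (synthetic stand-in)
U = rng.normal(size=(n_drugs, rank))
V = rng.normal(size=(n_targets, rank))
truth = (U @ V.T) > 1.0

# observe only 70% of the true links; the rest are withheld for evaluation
observed = truth & (rng.random(truth.shape) < 0.7)

# learn low-dimensional node representations via truncated SVD of the
# observed matrix, then score all pairs by projecting back
u, s, vt = np.linalg.svd(observed.astype(float), full_matrices=False)
scores = (u[:, :rank] * s[:rank]) @ vt[:rank]

held_out = truth & ~observed
mean_pos = scores[held_out].mean()
mean_neg = scores[~truth].mean()
print(f"mean score on withheld links {mean_pos:.2f} vs non-links {mean_neg:.2f}")
```

    Withheld true links score higher than non-links because they sit inside the low-rank structure the factorization recovers, which is the intuition behind embedding-based link prediction.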

  5. Symplectic integrators for large scale molecular dynamics simulations: A comparison of several explicit methods

    International Nuclear Information System (INIS)

    Gray, S.K.; Noid, D.W.; Sumpter, B.G.

    1994-01-01

    We test the suitability of a variety of explicit symplectic integrators for molecular dynamics calculations on Hamiltonian systems. These integrators are extremely simple algorithms with low memory requirements, and appear to be well suited for large scale simulations. We first apply all the methods to a simple test case using the ideas of Berendsen and van Gunsteren. We then use the integrators to generate long-time trajectories of a 1000-unit polyethylene chain. Calculations are also performed with two popular but nonsymplectic integrators. The most efficient integrators of the set investigated are identified. We also discuss certain variations on the basic symplectic integration technique
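    The long-time energy behaviour that motivates symplectic methods can be demonstrated with velocity Verlet versus explicit Euler on a harmonic oscillator, a standard textbook test rather than the paper's polyethylene system:

```python
import numpy as np

def accel(x):  # unit mass, unit spring constant
    return -x

def velocity_verlet(x, v, dt, steps):
    """Velocity Verlet: a simple explicit symplectic integrator."""
    a = accel(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt**2
        a_new = accel(x)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

def euler(x, v, dt, steps):
    """Explicit (non-symplectic) Euler, for comparison."""
    for _ in range(steps):
        x, v = x + v * dt, v + accel(x) * dt
    return x, v

def energy(x, v):
    return 0.5 * v**2 + 0.5 * x**2

x0, v0, dt, steps = 1.0, 0.0, 0.05, 20_000
e0 = energy(x0, v0)
ev = energy(*velocity_verlet(x0, v0, dt, steps))
ee = energy(*euler(x0, v0, dt, steps))
print(f"energy drift: verlet {abs(ev - e0):.2e}, euler {abs(ee - e0):.2e}")
```

    Verlet's energy error stays bounded over 20,000 steps while Euler's grows without limit, which is why symplectic schemes are preferred for long molecular dynamics trajectories.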

  6. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising the UK capabilities in application of the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. Installations of BIPV described include University buildings, commercial centres, and a sports stadium, wildlife park, church hall, and district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  7. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability.Development of more time efficient and airborne geophysical data acquisition...... platforms (e.g. SkyTEM) have made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored...

  8. Satellite-based remote sensing of running water habitats at large riverscape scales: Tools to analyze habitat heterogeneity for river ecosystem management

    Science.gov (United States)

    Hugue, F.; Lapointe, M.; Eaton, B. C.; Lepoutre, A.

    2016-01-01

    We illustrate an approach to quantify patterns in hydraulic habitat composition and local heterogeneity, applicable at low cost over very large river extents with selectable reach window scales. Ongoing developments in remote sensing and geographical information science massively improve efficiencies in analyzing earth surface features. With the development of new satellite sensors and drone platforms, and with the lowered cost of high resolution multispectral imagery, fluvial geomorphology is experiencing a revolution in mapping streams at high resolution. Exploiting the power of aerial or satellite imagery is particularly useful in a riverscape research framework (Fausch et al., 2002), where high resolution sampling of fluvial features and very large coverage extents are needed. This study presents a satellite remote sensing method that requires very limited field calibration data to estimate, over scales ranging from 1 m to many tens of river kilometers, (i) spatial composition metrics for key hydraulic mesohabitat types and (ii) reach-scale wetted habitat heterogeneity indices such as the hydromorphological index of diversity (HMID). When the purpose is hydraulic habitat characterization applied over long river networks, the proposed method (although less accurate) is much less computationally expensive and less data demanding than two-dimensional computational fluid dynamics (CFD). Here, we illustrate the tools based on a Worldview 2 satellite image of the Kiamika River, near Mont Laurier, Quebec, Canada, specifically over a 17-km river reach below the Kiamika dam. In the first step, a high resolution water depth (D) map is produced from a spectral band ratio (calculated from the multispectral image), calibrated with limited field measurements. Next, based only on known river discharge and estimated cross-section depths at the time of image capture, empirically based pseudo-2D hydraulic rules are used to rapidly generate a two-dimensional map of flow velocity
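    A band-ratio depth retrieval of this kind can be sketched as follows, in the spirit of the Stumpf log-ratio method; the reflectance curves, coefficients and calibration points below are all synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# synthetic scene: true depths and toy water-leaving reflectances in two bands,
# with the green band attenuating faster with depth than the blue band
true_depth = rng.uniform(0.2, 3.0, size=200)                # metres
blue = 0.12 * np.exp(-0.4 * true_depth) + 0.02
green = 0.10 * np.exp(-0.9 * true_depth) + 0.02

# depth is assumed linear in the log-ratio of the two bands
n = 1000.0                                                  # scaling constant
ratio = np.log(n * blue) / np.log(n * green)

# "limited field calibration": fit the two coefficients on 10 sampled points
idx = rng.choice(200, size=10, replace=False)
m1, m0 = np.polyfit(ratio[idx], true_depth[idx], 1)
est = m1 * ratio + m0

rmse = np.sqrt(np.mean((est - true_depth) ** 2))
print(f"depth retrieval RMSE on synthetic data: {rmse:.2f} m")
```

    Only the handful of calibration depths requires field work; the fitted relation is then applied to every pixel, which is what makes the approach cheap over tens of river kilometres.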

  9. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  10. The prospect of modern thermomechanics in structural integrity calculations of large-scale pressure vessels

    Science.gov (United States)

    Fekete, Tamás

    2018-05-01

    Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific-engineering paradigm, structural integrity, has been developing; it is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has greatly contributed to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well

  11. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analysis with a focus on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international electricity market by locating exports in hours of high prices are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the first is the degree of excess electricity production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emissions in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed in the range of a wind power input from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced by CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies of large-scale integration of wind power

  12. Modelling solute dispersion in periodic heterogeneous porous media: Model benchmarking against intermediate scale experiments

    Science.gov (United States)

    Majdalani, Samer; Guinot, Vincent; Delenne, Carole; Gebran, Hicham

    2018-06-01

    This paper is devoted to theoretical and experimental investigations of solute dispersion in heterogeneous porous media. Dispersion in heterogeneous porous media has been reported to be scale-dependent, a likely indication that the proposed dispersion models are incompletely formulated. A high-quality experimental data set of breakthrough curves in periodic model heterogeneous porous media is presented. In contrast with most previously published experiments, the present experiments involve numerous replicates. This allows the statistical variability of experimental data to be accounted for. Several models are benchmarked against the data set: the Fickian-based advection-dispersion, mobile-immobile, multirate, and multiple-region advection-dispersion models, and a newly proposed transport model based on pure advection. A salient property of the latter model is that its solutions exhibit a ballistic behaviour for small times, while tending to the Fickian behaviour for large time scales. Model performance is assessed using a novel objective function accounting for the statistical variability of the experimental data set, while putting equal emphasis on both small and large time scale behaviours. Besides being as accurate as the other models, the new purely advective model has the advantages that (i) it does not exhibit the undesirable effects associated with the usual Fickian operator (namely the infinite solute front propagation speed), and (ii) it allows dispersive transport to be simulated on every heterogeneity scale using scale-independent parameters.
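
    The infinite front-propagation speed that motivates the purely advective model can be seen directly from the textbook 1-D advection-dispersion (Fickian) solution for an instantaneous pulse. The sketch below uses illustrative parameter values (not taken from the paper) to show that the Fickian solution is already nonzero far ahead of the advective front at an arbitrarily small time.

```python
import math

def ade_concentration(x, t, v, D, m=1.0):
    """Standard 1-D advection-dispersion (Fickian) solution for an
    instantaneous unit-mass pulse injected at x = 0."""
    return (m / math.sqrt(4.0 * math.pi * D * t)
            * math.exp(-(x - v * t) ** 2 / (4.0 * D * t)))

v, D = 1.0, 0.01   # illustrative mean velocity and dispersion coefficient
t = 1e-3           # a very small time after injection; front sits at x = vt = 0.001

# The Fickian solution is already nonzero at x = 0.05, fifty times
# farther than the advective front has travelled.
print(ade_concentration(0.05, t, v, D) > 0.0)
```

    A ballistic (purely advective) solution would instead be exactly zero everywhere the front has not yet reached.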

  13. Basin-scale heterogeneity in Antarctic precipitation and its impact on surface mass variability

    Directory of Open Access Journals (Sweden)

    J. Fyke

    2017-11-01

    Annually averaged precipitation in the form of snow, the dominant term of the Antarctic Ice Sheet surface mass balance, displays large spatial and temporal variability. Here we present an analysis of spatial patterns of regional Antarctic precipitation variability and their impact on integrated Antarctic surface mass balance variability simulated as part of a preindustrial 1800-year global, fully coupled Community Earth System Model simulation. Correlation and composite analyses based on this output allow for a robust exploration of Antarctic precipitation variability. We identify statistically significant relationships between precipitation patterns across Antarctica that are corroborated by climate reanalyses, regional modeling and ice core records. These patterns are driven by variability in large-scale atmospheric moisture transport, which itself is characterized by decadal- to centennial-scale oscillations around the long-term mean. We suggest that this heterogeneity in Antarctic precipitation variability has a dampening effect on overall Antarctic surface mass balance variability, with implications for regulation of Antarctic-sourced sea level variability, detection of an emergent anthropogenic signal in Antarctic mass trends and identification of Antarctic mass loss accelerations.

  14. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller scale "integral" migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  15. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  16. Test methods of total dose effects in very large scale integrated circuits

    International Nuclear Information System (INIS)

    He Chaohui; Geng Bin; He Baoping; Yao Yujuan; Li Yonghong; Peng Honglun; Lin Dongsheng; Zhou Hui; Chen Yusheng

    2004-01-01

    A test method for total dose effects (TDE) in very large scale integrated circuits (VLSI) is presented. The consumption current of the devices is measured while the function parameters of the devices (or circuits) are measured. The relation between data errors and consumption current can then be analyzed, and the mechanism of TDE in VLSI can be proposed. Experimental results of 60Co γ TDE tests are given for SRAMs, EEPROMs, FLASH ROMs and a kind of CPU

  17. Heterogeneous Monolithic Integration of Single-Crystal Organic Materials.

    Science.gov (United States)

    Park, Kyung Sun; Baek, Jangmi; Park, Yoonkyung; Lee, Lynn; Hyon, Jinho; Koo Lee, Yong-Eun; Shrestha, Nabeen K; Kang, Youngjong; Sung, Myung Mo

    2017-02-01

    Manufacturing high-performance organic electronic circuits requires the effective heterogeneous integration of different nanoscale organic materials with uniform morphology and high crystallinity in a desired arrangement. In particular, the development of high-performance organic electronic and optoelectronic devices relies on high-quality single crystals that show optimal intrinsic charge-transport properties and electrical performance. Moreover, the heterogeneous integration of organic materials on a single substrate in a monolithic way is in high demand for the production of fundamental organic electronic components as well as complex integrated circuits. The various methods that have been designed to pattern multiple heterogeneous organic materials on a substrate, and approaches to the heterogeneous integration of organic single crystals during their crystal growth, are described here. Critical issues that have been encountered in the development of high-performance organic integrated electronics are also addressed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Deciphering the clinical effect of drugs through large-scale data integration

    DEFF Research Database (Denmark)

    Kjærulff, Sonny Kim

    [...] This work demonstrates the power of a strategy that uses clinical data mining in association with chemical biology in order to reduce the search space and aid identification of novel drug actions. The second article, described in chapter 3, outlines a high-confidence side-effect-drug interaction dataset. [...] demonstrates the importance of using high-confidence drug-side-effect data in deciphering the effect of small molecules in humans. In summary, this thesis presents computational systems chemical biology approaches that can help identify clinical effects of small molecules through large-scale data integration [...]

  19. The effects of spatial heterogeneity and subsurface lateral transfer on evapotranspiration estimates in large scale Earth system models

    Science.gov (United States)

    Rouholahnejad, E.; Fan, Y.; Kirchner, J. W.; Miralles, D. G.

    2017-12-01

    Most Earth system models (ESMs) average over considerable sub-grid heterogeneity in land surface properties, and overlook subsurface lateral flow. This could potentially bias evapotranspiration (ET) estimates and has implications for future temperature predictions, since overestimation of ET implies greater latent heat fluxes and potential underestimation of dry and warm conditions in the context of climate change. Here we quantify the bias in evaporation estimates that may arise from the fact that ESMs average over considerable heterogeneity in surface properties, and also neglect lateral transfer of water across heterogeneous landscapes at the global scale. We use a Budyko framework to express ET as a function of precipitation (P) and potential evapotranspiration (PET) and derive simple sub-grid closure relations that quantify how spatial heterogeneity and lateral transfer could affect average ET as seen from the atmosphere. We show that averaging over sub-grid heterogeneity in P and PET, as typical Earth system models do, leads to overestimation of average ET. Our analysis at the global scale shows that the effects of sub-grid heterogeneity will be most pronounced in steep mountainous areas where the topographic gradient is high and where P is inversely correlated with PET across the landscape. In addition, we use Total Water Storage (TWS) anomaly estimates from the Gravity Recovery and Climate Experiment (GRACE) remote sensing product and assimilate them into the Global Land Evaporation Amsterdam Model (GLEAM) to correct for the existing free-drainage lower boundary condition in GLEAM and to quantify whether, and how much, accounting for changes in terrestrial storage can improve the simulation of soil moisture and regional ET fluxes at the global scale.
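
    The overestimation that arises from averaging P and PET before applying a Budyko-type curve follows from the concavity of the curve (Jensen's inequality). A minimal sketch, using the Turc-Mezentsev form of the Budyko curve as an assumed stand-in for the paper's closure relations, and two invented sub-grid cells with inversely correlated P and PET:

```python
def budyko_et(p, pet, n=2.0):
    """Turc-Mezentsev form of the Budyko curve (an assumption here;
    the paper does not specify which Budyko formulation it uses)."""
    return p * pet / (p**n + pet**n) ** (1.0 / n)

# Two hypothetical sub-grid cells (mm/yr) with inversely correlated
# P and PET, as in steep mountainous terrain.
cells = [(1000.0, 500.0), (500.0, 1500.0)]

# ET computed per cell, then averaged: the "true" grid-mean ET.
mean_of_ets = sum(budyko_et(p, pet) for p, pet in cells) / len(cells)

# ET computed from grid-averaged forcing, as a coarse ESM would do.
p_bar = sum(p for p, _ in cells) / len(cells)
pet_bar = sum(pet for _, pet in cells) / len(cells)
et_of_means = budyko_et(p_bar, pet_bar)

print(round(mean_of_ets, 1), round(et_of_means, 1))
```

    With these numbers the coarse estimate exceeds the true grid mean by roughly 30%, illustrating the sign of the bias the abstract describes.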

  20. An integration strategy for large enterprises

    Directory of Open Access Journals (Sweden)

    Risimić Dejan

    2007-01-01

    Integration is the process of enabling communication between disparate software components. Integration has been a burning issue for large enterprises in the last twenty years, due to the fact that 70% of the development and deployment budget is spent on integrating complex and heterogeneous back-end and front-end IT systems. Existing applications need to be integrated to support newer, faster, more accurate business processes and to provide meaningful, consistent management information. Historically, integration started with the introduction of point-to-point approaches, evolving into simpler hub-and-spoke topologies. These topologies were combined with custom remote procedure calls, distributed object technologies and message-oriented middleware (MOM), continued with enterprise application integration (EAI), and used an application server as a primary vehicle for integration. The current phase of the evolution is service-oriented architecture (SOA) combined with an enterprise service bus (ESB). Technical aspects of the comparison between the aforementioned technologies are analyzed and presented. The result of the study is a recommended integration strategy for large enterprises.

  1. Ecoregion-Based Conservation Planning in the Mediterranean: Dealing with Large-Scale Heterogeneity

    Science.gov (United States)

    Giakoumi, Sylvaine; Sini, Maria; Gerovasileiou, Vasilis; Mazor, Tessa; Beher, Jutta; Possingham, Hugh P.; Abdulla, Ameer; Çinar, Melih Ertan; Dendrinos, Panagiotis; Gucu, Ali Cemal; Karamanlidis, Alexandros A.; Rodic, Petra; Panayotidis, Panayotis; Taskin, Ergun; Jaklin, Andrej; Voultsiadou, Eleni; Webster, Chloë; Zenetos, Argyro; Katsanevakis, Stelios

    2013-01-01

    socioeconomically heterogeneous basin, and (c) it adopts ecoregions as the most appropriate level for large-scale planning. PMID:24155901

  2. Field scale heterogeneity of redox conditions in till-upscaling to a catchment nitrate model

    DEFF Research Database (Denmark)

    Hansen, J.R.; Erntsen, V.; Refsgaard, J.C.

    2008-01-01

    Point scale studies in different settings of glacial geology show a large local variation of redox conditions. There is a need to develop an upscaling methodology for catchment scale models. This paper describes a study of field-scale heterogeneity of redox-interfaces in a till aquitard within an...

  3. Transfer Printed Nanomembranes for Heterogeneously Integrated Membrane Photonics

    Directory of Open Access Journals (Sweden)

    Hongjun Yang

    2015-11-01

    Heterogeneous crystalline semiconductor nanomembrane (NM) integration is investigated for single-layer and double-layer silicon (Si) NM photonics, III-V/Si NM lasers, and graphene/Si NM total-absorption devices. Both homogeneous and heterogeneous integration are realized by the versatile transfer-printing technique. The performance of these integrated membrane devices shows not only optical and electrical characteristics intact with respect to their bulk counterparts, but also unique light-matter interactions, such as Fano resonance, slow light, and critical coupling in photonic crystal cavities. Such a heterogeneous integration approach offers tremendous practical application potential for unconventional, Si CMOS compatible, and high-performance optoelectronic systems.

  4. Word Sense Disambiguation Based on Large Scale Polish CLARIN Heterogeneous Lexical Resources

    Directory of Open Access Journals (Sweden)

    Paweł Kędzia

    2015-12-01

    Lexical resources can be applied in many different Natural Language Engineering tasks, but the most fundamental task is the recognition of the word senses used in text contexts. The problem is difficult, not yet fully solved, and different lexical resources provide varied support for it. Polish CLARIN lexical semantic resources are based on plWordNet — a very large wordnet for Polish — as a central structure which is a basis for linking together several resources of different types. In this paper, several Word Sense Disambiguation (henceforth WSD) methods developed for Polish that utilise plWordNet are discussed. Textual sense descriptions in a traditional lexicon can be compared with text contexts using Lesk's algorithm in order to find the best matching senses. In the case of a wordnet, lexico-semantic relations provide the main description of word senses. Thus, we first adapted and applied to Polish a WSD method based on PageRank: text words are mapped onto their senses in the plWordNet graph and the PageRank algorithm is run to find the senses with the highest scores. The method yields results lower than, but comparable to, those reported for English. The error analysis showed that the main problems are the fine-grained sense distinctions in plWordNet and the limited number of connections between words of different parts of speech. In the second approach, plWordNet expanded with a mapping onto SUMO ontology concepts was used. Two scenarios for WSD were investigated: two-step disambiguation and disambiguation based on the combined networks of plWordNet and SUMO. In the former scenario, words are first assigned SUMO concepts and then plWordNet senses are disambiguated. In the latter, plWordNet and SUMO are combined in one large network which is then used for the disambiguation of senses. The additional knowledge sources used in WSD improved the performance
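
    The graph-based scenario can be sketched as a personalized PageRank over a miniature sense graph: teleport probability is concentrated on the senses of the context words, and the ambiguous word's highest-ranked sense wins. All node names and edges below are invented for illustration and are not taken from plWordNet.

```python
def personalized_pagerank(graph, teleport, d=0.85, iters=50):
    """Power iteration for PageRank with a personalization vector."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            incoming = sum(rank[m] / len(graph[m])
                           for m in graph if n in graph[m])
            new[n] = (1 - d) * teleport.get(n, 0.0) + d * incoming
        rank = new
    return rank

# Hypothetical miniature sense graph; edges stand in for plWordNet
# lexico-semantic relations. "zamek" is ambiguous (castle vs. lock).
graph = {
    "zamek_castle": {"krol_king", "wieza_tower"},
    "zamek_lock":   {"klucz_key"},
    "krol_king":    {"zamek_castle"},
    "wieza_tower":  {"zamek_castle"},
    "klucz_key":    {"zamek_lock"},
}

# Teleport mass on the senses of the context words ("king", "tower").
teleport = {"krol_king": 0.5, "wieza_tower": 0.5}
rank = personalized_pagerank(graph, teleport)
best = max(["zamek_castle", "zamek_lock"], key=rank.get)
print(best)
```

    With a castle-related context the castle sense accumulates rank through its neighbours, which is the essence of the graph-based disambiguation step.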

  5. Architectures of adaptive integration in large collaborative projects

    Directory of Open Access Journals (Sweden)

    Lois Wright Morton

    2015-12-01

    Collaborations to address complex societal problems associated with managing human-natural systems often require large teams comprised of scientists from multiple disciplines. For many such problems, large-scale, transdisciplinary projects whose members include scientists, stakeholders, and other professionals are necessary. The success of very large, transdisciplinary projects can be facilitated by attending to the diversity of types of collaboration that inevitably occur within them. As projects progress and evolve, the resulting dynamic collaborative heterogeneity within them constitutes architectures of adaptive integration (AAI). Management that acknowledges this dynamic and fosters and promotes awareness of it within a project can better facilitate the creativity and innovation required to address problems from a systems perspective. In successful large projects, AAI (1) functionally meets objectives and goals, (2) uses disciplinary expertise and concurrently bridges many disciplines, (3) has mechanisms to enable connection, (4) delineates boundaries to keep focus but retain flexibility, (5) continuously monitors and adapts, and (6) encourages project-wide awareness. These principles are illustrated using as case studies three large climate change and agriculture projects funded by the U.S. Department of Agriculture-National Institute of Food and Agriculture.

  6. WE-E-17A-06: Assessing the Scale of Tumor Heterogeneity by Complete Hierarchical Segmentation On MRI

    International Nuclear Information System (INIS)

    Gensheimer, M; Trister, A; Ermoian, R; Hawkins, D

    2014-01-01

    Purpose: In many cancers, intratumoral heterogeneity exists in vascular and genetic structure. We developed an algorithm which uses clinical imaging to interrogate different scales of heterogeneity. We hypothesize that heterogeneity of perfusion at large distance scales may correlate with propensity for disease recurrence. We applied the algorithm to initial-diagnosis MRI of rhabdomyosarcoma patients to predict recurrence. Methods: The Spatial Heterogeneity Analysis by Recursive Partitioning (SHARP) algorithm recursively segments the tumor image. The tumor is repeatedly subdivided, with each dividing line chosen to maximize the signal intensity difference between the two subregions. This process continues to the voxel level, producing segments at multiple scales. Heterogeneity is measured by comparing signal intensity histograms between each segmented region and the adjacent region. We measured the scales of contrast-enhancement heterogeneity of the primary tumor in 18 rhabdomyosarcoma patients. Using Cox proportional hazards regression, we explored the influence of heterogeneity parameters on relapse-free survival (RFS). To compare with existing methods, fractal and Haralick texture features were also calculated. Results: The complete segmentation produced by SHARP allows extraction of diverse features, including the amount of heterogeneity at various distance scales, the area of the tumor with the most heterogeneity at each scale, and, for a given point in the tumor, the heterogeneity at different scales. 10/18 rhabdomyosarcoma patients suffered disease recurrence. On contrast-enhanced MRI, a larger scale of maximum signal intensity heterogeneity, relative to tumor diameter, predicted shorter RFS (p=0.05). Fractal dimension, fractal fit, and three Haralick features did not predict RFS (p=0.09-0.90). Conclusion: SHARP produces an automatic segmentation of tumor regions and reports the amount of heterogeneity at various distance scales. In rhabdomyosarcoma, RFS was
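
    The recursive splitting step can be made concrete with a one-dimensional analogue: each split is placed where the mean-intensity difference between the two resulting segments is largest, and the contrast at each recursion depth gives a heterogeneity measure per scale. This is a simplified sketch, not the authors' 2-D implementation.

```python
def split_point(signal):
    """Return the index that maximizes the mean-intensity difference
    between the two resulting segments, and that difference."""
    best_i, best_d = 1, -1.0
    for i in range(1, len(signal)):
        left, right = signal[:i], signal[i:]
        d = abs(sum(left) / len(left) - sum(right) / len(right))
        if d > best_d:
            best_i, best_d = i, d
    return best_i, best_d

def sharp_1d(signal, depth=0, out=None):
    """Recursively segment down to single samples, recording the
    (depth, contrast) of every split in pre-order."""
    if out is None:
        out = []
    if len(signal) < 2:
        return out
    i, d = split_point(signal)
    out.append((depth, d))
    sharp_1d(signal[:i], depth + 1, out)
    sharp_1d(signal[i:], depth + 1, out)
    return out

profile = [1, 1, 2, 1, 9, 8, 9, 9]  # toy contrast-enhancement profile
splits = sharp_1d(profile)
print(splits[0])  # the coarsest-scale (depth-0) split
```

    Here the depth-0 split lands between the low- and high-intensity halves with a large contrast, while deeper splits capture progressively smaller-scale heterogeneity.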

  7. Seismic modeling of multidimensional heterogeneity scales of Mallik gas hydrate reservoirs, Northwest Territories of Canada

    Science.gov (United States)

    Huang, Jun-Wei; Bellefleur, Gilles; Milkereit, Bernd

    2009-07-01

    In hydrate-bearing sediments, the velocity and attenuation of compressional and shear waves depend primarily on the spatial distribution of hydrates in the pore space of the subsurface lithologies. Recent characterizations of gas hydrate accumulations based on seismic velocity and attenuation generally assume homogeneous sedimentary layers and neglect effects from large- and small-scale heterogeneities of hydrate-bearing sediments. We present an algorithm, based on stochastic medium theory, to construct heterogeneous multivariable models that mimic heterogeneities of hydrate-bearing sediments at the level of detail provided by borehole logging data. Using this algorithm, we model some key petrophysical properties of gas hydrates within heterogeneous sediments near the Mallik well site, Northwest Territories, Canada. The modeled density and P and S wave velocities, used in combination with a modified Biot-Gassmann theory, provide a first-order estimate of the in situ volume of gas hydrate near the Mallik 5L-38 borehole. Our results suggest a range of 528 to 768 × 10⁶ m³/km² of natural gas trapped within hydrates, nearly an order of magnitude lower than earlier estimates which did not include effects of small-scale heterogeneities. Further, the petrophysical models are combined with a 3-D finite difference modeling algorithm to study seismic attenuation due to scattering and leaky-mode propagation. Simulations of a near-offset vertical seismic profile and cross-borehole numerical surveys demonstrate that attenuation of seismic energy may not be directly related to the intrinsic attenuation of hydrate-bearing sediments but, instead, may be largely attributed to scattering from small-scale heterogeneities and highly attenuated leaky-mode propagation of seismic waves through larger-scale heterogeneities in the sediments.

  8. Enhancing yeast transcription analysis through integration of heterogeneous data

    DEFF Research Database (Denmark)

    Grotkjær, Thomas; Nielsen, Jens

    2004-01-01

    DNA microarray technology enables the simultaneous measurement of the transcript level of thousands of genes. Primary analysis can be done with basic statistical tools and cluster analysis, but effective and in-depth analysis of the vast amount of transcription data requires integration with data from several heterogeneous data sources, such as upstream promoter sequences, genome-scale metabolic models, annotation databases and other experimental data. In this review, we discuss how experimental design, normalisation, heterogeneous data and mathematical modelling can enhance analysis of Saccharomyces cerevisiae whole genome transcription data. A special focus is on the quantitative aspects of normalisation and mathematical modelling approaches, since they are expected to play an increasing role in future DNA microarray analysis studies. Data analysis is exemplified with cluster analysis.

  9. An efficient implementation of 3D high-resolution imaging for large-scale seismic data with GPU/CPU heterogeneous parallel computing

    Science.gov (United States)

    Xu, Jincheng; Liu, Wei; Wang, Jin; Liu, Linong; Zhang, Jianfeng

    2018-02-01

    De-absorption pre-stack time migration (QPSTM) compensates for the absorption and dispersion of seismic waves by introducing an effective Q parameter, thereby making it an effective tool for 3D, high-resolution imaging of seismic data. Although the optimal aperture obtained via stationary-phase migration reduces the computational cost of 3D QPSTM and yields 3D stationary-phase QPSTM, the associated computational efficiency is still the main problem in the processing of 3D, high-resolution images for real large-scale seismic data. In the current paper, we proposed a division method for large-scale, 3D seismic data to optimize the performance of stationary-phase QPSTM on clusters of graphics processing units (GPUs). Then, we designed an imaging-point parallel strategy to achieve an optimal parallel computing performance. Afterward, we adopted an asynchronous double-buffering scheme for multi-stream to perform the GPU/CPU parallel computing. Moreover, several key optimization strategies of computation and storage based on the compute unified device architecture (CUDA) were adopted to accelerate the 3D stationary-phase QPSTM algorithm. Compared with the initial GPU code, the implementation of the key optimization steps, including thread optimization, shared memory optimization, register optimization and special function units (SFUs), greatly improved the efficiency. A numerical example employing real large-scale, 3D seismic data showed that our scheme is nearly 80 times faster than the CPU-QPSTM algorithm. Our GPU/CPU heterogeneous parallel computing framework significantly reduces the computational cost and facilitates 3D high-resolution imaging for large-scale seismic data.

  10. Built-In Data-Flow Integration Testing in Large-Scale Component-Based Systems

    Science.gov (United States)

    Piel, Éric; Gonzalez-Sanchez, Alberto; Gross, Hans-Gerhard

    Modern large-scale component-based applications and service ecosystems are built following a number of different component models and architectural styles, such as the data-flow architectural style. In this style, each building block receives data from a previous one in the flow and sends output data to other components. This organisation expresses information flows adequately, and also favours decoupling between the components, leading to easier maintenance and quicker evolution of the system. Integration testing is a major means to ensure the quality of large systems. Their size and complexity, together with the fact that they are developed and maintained by several stakeholders, make Built-In Testing (BIT) an attractive approach to manage their integration testing. However, so far no technique has been proposed that combines BIT and data-flow integration testing. We have introduced the notion of a virtual component in order to realize such a combination. It permits the behaviour of several components, assembled to process a flow of data, to be defined using BIT. Test cases are defined in a way that makes them simple to write and flexible to adapt. We present two implementations of our proposed virtual component integration testing technique, and we extend our previous proposal to detect and handle errors in the definition by the user. The evaluation of the virtual component testing approach suggests that more issues can be detected in systems with data-flows than through other integration testing approaches.
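
    The virtual-component idea can be sketched as a wrapper that exposes an assembled data-flow chain as one testable unit carrying its own built-in test cases. The class and method names below are invented for illustration and are not taken from the paper's framework.

```python
class Component:
    """A data-flow building block: transforms its input and passes it on."""
    def __init__(self, fn):
        self.fn = fn

    def process(self, data):
        return self.fn(data)


class VirtualComponent:
    """Wraps a chain of components so the assembly can be exercised as a
    single unit, in the spirit of the paper's virtual component."""
    def __init__(self, *components):
        self.components = components

    def process(self, data):
        # Push the datum through the flow, component by component.
        for c in self.components:
            data = c.process(data)
        return data

    def run_built_in_tests(self, cases):
        """cases: iterable of (input, expected_output) pairs."""
        return [self.process(inp) == expected for inp, expected in cases]


# Hypothetical three-stage flow: parse -> scale -> format.
flow = VirtualComponent(
    Component(int),
    Component(lambda x: x * 10),
    Component(str),
)
results = flow.run_built_in_tests([("4", "40"), ("7", "70")])
print(results)
```

    The built-in test cases travel with the assembled flow, so the same checks can be re-run after any component in the chain is replaced or redeployed.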

  11. Upscaling two-phase flow in heterogeneous porous media; Mise a l'echelle des ecoulements diphasiques dans les milieux poreux heterogenes

    Energy Technology Data Exchange (ETDEWEB)

    Artus, V.

    2003-11-01

    For two-phase flow in heterogeneous media, the emergence of different flow regimes at large scale is driven by local interactions between the viscous coupling and the heterogeneity. In particular, when the viscosity ratio is favorable, viscous effects induce a transverse flow that stabilizes the front during flooding. However, most recent stochastic models neglect the influence of the viscous coupling. We developed a stochastic model for the dynamics of the front, taking the viscous coupling into account. For stable cases, this model relates the statistical properties of the front to the statistical properties of the permeability field. For stable flow in stratified media, we show that the front is stationary by parts in the reservoir. These parts can be identified as large-scale hydrodynamic layers and separately coarsened in the large-scale simulation model. For flows with favorable viscosity ratios in isotropic reservoirs, we show that a stationary front occurs in a statistical sense. For unfavorable viscosity ratios, the flow is driven by the development of viscous fingering. These different regimes lead to different large-scale saturation profiles that can be matched with a macro-dispersion equation, if the effective convective flux is modified to take into account stabilizing or destabilizing viscous effects. (author)

  12. Approaches to large scale unsaturated flow in heterogeneous, stratified, and fractured geologic media

    International Nuclear Information System (INIS)

    Ababou, R.

    1991-08-01

    This report develops a broad review and assessment of quantitative modeling approaches and data requirements for large-scale subsurface flow at a radioactive waste geologic repository. The data review includes discussions of controlled field experiments, existing contamination sites, and site-specific hydrogeologic conditions at Yucca Mountain. Local-scale constitutive models for the unsaturated hydrodynamic properties of geologic media are analyzed, with particular emphasis on the effect of structural characteristics of the medium. The report further reviews and analyzes large-scale hydrogeologic spatial variability from aquifer data, unsaturated soil data, and fracture network data gathered from the literature. Finally, various modeling strategies for large-scale flow simulations are assessed, including direct high-resolution simulation and coarse-scale simulation based on auxiliary hydrodynamic models such as a single equivalent continuum and a dual-porosity continuum. The roles of anisotropy, fracturing, and broad-band spatial variability are emphasized. 252 refs

  13. Earthquake scaling laws for rupture geometry and slip heterogeneity

    Science.gov (United States)

    Thingbaijam, Kiran K. S.; Mai, P. Martin; Goda, Katsuichiro

    2016-04-01

    We analyze an extensive compilation of finite-fault rupture models to investigate earthquake scaling of source geometry and slip heterogeneity to derive new relationships for seismic and tsunami hazard assessment. Our dataset comprises 158 earthquakes with a total of 316 rupture models selected from the SRCMOD database (http://equake-rc.info/srcmod). We find that fault-length does not saturate with earthquake magnitude, while fault-width reveals inhibited growth due to the finite seismogenic thickness. For strike-slip earthquakes, fault-length grows more rapidly with increasing magnitude compared to events of other faulting types. Interestingly, our derived relationship falls between the L-model and W-model end-members. In contrast, both reverse and normal dip-slip events are more consistent with self-similar scaling of fault-length. However, fault-width scaling relationships for large strike-slip and normal dip-slip events, occurring on steeply dipping faults (δ~90° for strike-slip faults, and δ~60° for normal faults), deviate from self-similarity. Although reverse dip-slip events in general show self-similar scaling, the restricted growth of down-dip fault extent (with upper limit of ~200 km) can be seen for mega-thrust subduction events (M~9.0). Despite this fact, for a given earthquake magnitude, subduction reverse dip-slip events occupy relatively larger rupture area, compared to shallow crustal events. In addition, we characterize slip heterogeneity in terms of its probability distribution and spatial correlation structure to develop a complete stochastic random-field characterization of earthquake slip. We find that truncated exponential law best describes the probability distribution of slip, with observable scale parameters determined by the average and maximum slip. Applying Box-Cox transformation to slip distributions (to create quasi-normal distributed data) supports cube-root transformation, which also implies distinctive non-Gaussian slip
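
    Sampling slip values from a truncated exponential law can be done by inverting its CDF. The sketch below uses illustrative scale and maximum-slip values; in the paper these parameters are determined by the average and maximum slip of the rupture model.

```python
import math
import random

def truncated_exponential(u, lam, s_max):
    """Inverse-CDF sample of an exponential distribution truncated at
    s_max. u is a uniform deviate in [0, 1); lam is the scale parameter."""
    return -lam * math.log(1.0 - u * (1.0 - math.exp(-s_max / lam)))

random.seed(0)
lam, s_max = 2.0, 10.0  # illustrative values (metres), not from the paper
samples = [truncated_exponential(random.random(), lam, s_max)
           for _ in range(10000)]

# Truncation guarantees no sample exceeds the maximum slip, and pulls
# the sample mean slightly below the untruncated scale lam.
print(max(samples) <= s_max)
```

    Fields of such samples, combined with the spatial correlation structure the authors estimate, give the stochastic random-field characterization of slip mentioned in the abstract.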

  14. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widespread in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system. Based on this method, a virtual robot noumenon is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is then solved. A validation experiment verifies the proposed algorithms: first, the hand-eye transformation matrix is solved; then a car-body rear is measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully.
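Hand-eye calibration of this kind comes down to chaining homogeneous transforms; a minimal sketch (all poses hypothetical) of mapping a camera-frame point into the tracker/world frame:

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Hypothetical poses: the robot end-effector in the tracker/world frame, and
# the hand-eye matrix (camera in the end-effector frame) found by calibration.
T_world_end = transform(rot_z(np.pi / 4), [1.0, 0.2, 0.5])
T_end_cam = transform(rot_z(-np.pi / 8), [0.0, 0.0, 0.1])

# A point measured in the camera frame maps to world coordinates by chaining:
p_cam = np.array([0.3, 0.0, 1.0, 1.0])  # homogeneous point
p_world = T_world_end @ T_end_cam @ p_cam
```

Global fusion of the 16 scans then amounts to applying each scan's chained transform before merging point clouds.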

  15. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in the control of muscular systems, low-level subsystems automatically provide crude approximations to the proper response. Through low-level tuning of these approximations, the proper response variant can emerge from standardized high-level commands. Such systems are expressly suited to emerging large-scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape the responses of low-level digital or analog microcircuits. A mathematical theory that reveals the significant informational units in this style of control, and software for realizing such information structures, are formulated.

  16. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion, to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operation: 1. Congestion of transmission lines, 2. Transmission line expansion, 3. Large-scale wind energy integration, and 4. Optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, methods for expanding transmission line capacity must be evaluated to ensure optimal electric grid operation: the expansion must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are "where" to add capacity, "how much" to add, and "at which voltage level". Because of electric grid deregulation, transmission line expansion is more complicated, as building new transmission lines is now open to investors whose main interest is to generate revenue. Adding new transmission capacity will help the system to relieve the transmission system congestion, create
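A minimal congestion screen of the kind such an analysis starts from can be sketched as follows; the line flows and thermal limits are hypothetical:

```python
import numpy as np

# Hypothetical MW flows and thermal limits for five transmission lines.
flow = np.array([310.0, 120.0, 480.0, 95.0, 505.0])
limit = np.array([400.0, 150.0, 450.0, 100.0, 500.0])

# Per-unit loading; lines above 1.0 are congested and are candidates for
# capacity expansion ("where to add") in a longer-term planning study.
loading = flow / limit
congested = np.flatnonzero(loading > 1.0)
```

A real study would derive the flows from a power-flow model over many scenarios rather than from a single snapshot.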

  17. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  18. Neural networks supporting audiovisual integration for speech: A large-scale lesion study.

    Science.gov (United States)

    Hickok, Gregory; Rogalsky, Corianne; Matchin, William; Basilakos, Alexandra; Cai, Julia; Pillay, Sara; Ferrill, Michelle; Mickelsen, Soren; Anderson, Steven W; Love, Tracy; Binder, Jeffrey; Fridriksson, Julius

    2018-06-01

    Auditory and visual speech information are often strongly integrated, resulting in perceptual enhancements for audiovisual (AV) speech over audio alone and sometimes yielding compelling illusory fusion percepts when AV cues are mismatched (the McGurk-MacDonald effect). Previous research has identified three candidate regions thought to be critical for AV speech integration: the posterior superior temporal sulcus (STS), early auditory cortex, and the posterior inferior frontal gyrus. We assess the causal involvement of these regions (and others) in the first large-scale (N = 100) lesion-based study of AV speech integration. Two primary findings emerged. First, behavioral performance and lesion maps for AV enhancement and illusory fusion measures indicate that classic metrics of AV speech integration are not necessarily measuring the same process. Second, lesions involving superior temporal auditory, lateral occipital visual, and multisensory zones in the STS are the most disruptive to AV speech integration. Further, when AV speech integration fails, the nature of the failure (auditory vs visual capture) can be predicted from the location of the lesions. These findings show that AV speech processing is supported by unimodal auditory and visual cortices as well as multimodal regions such as the STS at their boundary. Motor-related frontal regions do not appear to play a role in AV speech integration.

  19. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate change and increasing demand due to population growth and economic development will strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop growth differs by region and crop. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model are input to the river routing sub-model of the H08 model. Part of the regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields.
To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
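The Markov Chain Monte Carlo calibration of region-specific parameters mentioned above can be sketched with a toy one-parameter crop model; the model, data, and tuning constants are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model: yield = a * seasonal heat index, observed with noise.
heat = np.linspace(1.0, 3.0, 20)
a_true = 1.8
y_obs = a_true * heat + rng.normal(0.0, 0.1, heat.size)

def log_post(a, sigma=0.1):
    """Log-posterior with a flat prior on a and a Gaussian likelihood."""
    r = y_obs - a * heat
    return -0.5 * np.sum((r / sigma) ** 2)

# Random-walk Metropolis sampler for the growth parameter a.
a, samples = 1.0, []
lp = log_post(a)
for _ in range(5000):
    prop = a + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        a, lp = prop, lp_prop
    samples.append(a)

a_hat = np.mean(samples[1000:])  # posterior mean after burn-in
```

The same machinery generalizes to several parameters per region, which is the setting the abstract describes.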

  20. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large-scale cross hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous; this heterogeneity is not smoothed out even over scales of hundreds of metres. Results of the interpretation validate the hypothesis of the major fracture zones A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by specifically testing the high hydraulic conductivity regions. The Piezomac and single hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response; this could be either because there is no hydraulic connection, or because a connection exists but its response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)
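The sinusoidal testing idea can be illustrated on a synthetic record: the spectral peak at the pumping frequency flags a dominant hydraulic connection and estimates its response amplitude (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical observation-borehole record: a head oscillation driven by
# sinusoidal pumping at frequency f0, buried in noise. f0 * n is an integer
# so the pumping frequency falls exactly on a DFT bin (no leakage).
f0, n = 0.02, 2000            # cycles per sample, record length
t = np.arange(n)
head = 0.4 * np.sin(2 * np.pi * f0 * t + 0.6) + rng.normal(0.0, 0.3, n)

# Peak height at the pumping frequency estimates the response amplitude; a
# borehole with no hydraulic connection would show only the noise floor there.
spec = np.fft.rfft(head)
freqs = np.fft.rfftfreq(n)
k = int(np.argmin(np.abs(freqs - f0)))
amplitude = 2.0 * np.abs(spec[k]) / n
```

Comparing this amplitude (and phase) across observation boreholes ranks the strength of the connections to the pumped interval.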

  1. Integration of heterogeneous features for remote sensing scene classification

    Science.gov (United States)

    Wang, Xin; Xiong, Xingnan; Ning, Chen; Shi, Aiye; Lv, Guofang

    2018-01-01

    Scene classification is one of the most important issues in remote sensing (RS) image processing. Features from different channels (shape, spectral, texture, etc.), levels (low-level and middle-level), or perspectives (local and global) provide complementary properties of RS images; we therefore propose a heterogeneous feature framework to extract and integrate heterogeneous features of different types for RS scene classification. The proposed method is composed of three modules: (1) heterogeneous feature extraction, where three heterogeneous feature types, called DS-SURF-LLC, mean-Std-LLC, and MS-CLBP, are calculated; (2) heterogeneous feature fusion, where multiple kernel learning (MKL) is utilized to integrate the heterogeneous features; and (3) an MKL support vector machine classifier for RS scene classification. The proposed method is extensively evaluated on three challenging benchmark datasets (a 6-class dataset, a 12-class dataset, and a 21-class dataset), and the experimental results show that it achieves good classification performance and produces informative features for describing RS image scenes. Moreover, the integration of heterogeneous features outperforms some state-of-the-art features on RS scene classification tasks.
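The MKL-style fusion step can be sketched as a convex combination of per-channel kernel matrices; a full MKL solver would learn the weights, which are fixed here for illustration (the feature data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf_kernel(X, gamma):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

# Two hypothetical heterogeneous feature channels for the same 8 scenes,
# e.g. a texture descriptor and a spectral descriptor of different lengths.
X_texture = rng.normal(size=(8, 16))
X_spectral = rng.normal(size=(8, 5))

# Convex combination of per-channel kernels; the fused kernel stays symmetric
# and positive semidefinite, so it can feed any kernel SVM directly.
w = [0.6, 0.4]
K = w[0] * rbf_kernel(X_texture, 0.1) + w[1] * rbf_kernel(X_spectral, 0.2)
```

The fused matrix K would then be passed to an SVM as a precomputed kernel.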

  2. Integrating heterogeneous healthcare call centers.

    Science.gov (United States)

    Peschel, K M; Reed, W C; Salter, K

    1998-01-01

    In a relatively short period, OHS has absorbed multiple call centers supporting different LOBs from various acquisitions, functioning with diverse standards, processes, and technologies. However, customer and employee satisfaction is predicated on OHS's ability to thoroughly integrate these heterogeneous call centers. The integration was initiated and has successfully progressed through a balanced program of focused leadership and a defined strategy which includes site consolidation, sound performance management philosophies, and enabling technology. Benefits have already been achieved with even more substantive ones to occur as the integration continues to evolve.

  3. Promoting landscape heterogeneity to improve the biodiversity benefits of certified palm oil production: Evidence from Peninsular Malaysia

    Directory of Open Access Journals (Sweden)

    Badrul Azhar

    2015-01-01

    The Roundtable on Sustainable Palm Oil (RSPO) is responsible for the certification of palm oil producers that comply with sustainability standards. However, it is not known whether RSPO-certified plantations are effective in maintaining biodiversity. Focusing on Peninsular Malaysia, we show that both RSPO-certified plantations and uncertified large-scale plantations are characterized by very low levels of landscape heterogeneity. By contrast, heterogeneity measures were many times higher in palm oil producing smallholdings, despite their lack of RSPO certification. The low heterogeneity of large-scale oil palm plantations, including those certified by the RSPO, is likely to severely limit their value for biodiversity conservation. Uncertified smallholdings, in contrast, are much more heterogeneous and therefore hold substantially greater promise for the integration of palm oil production and biodiversity conservation than large-scale plantations. With oil palm agriculture further expanding, certification schemes should mandate producers to improve biodiversity conservation through landscape management that promotes greater landscape heterogeneity.
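One common landscape-heterogeneity measure of the kind compared above is the Shannon diversity of land-cover proportions; a sketch with hypothetical estate and smallholding compositions:

```python
import numpy as np

def shannon_diversity(proportions):
    """Shannon index H' = -sum(p * ln p) over land-cover proportions."""
    p = np.asarray(proportions, float)
    p = p[p > 0]  # zero-cover classes contribute nothing
    return float(-np.sum(p * np.log(p)))

# Hypothetical land-cover proportions: a near-monoculture estate plantation
# vs a smallholding mosaic (oil palm, orchard, paddy, settlement, forest).
estate = [0.97, 0.03]
smallholding = [0.40, 0.20, 0.15, 0.15, 0.10]

h_estate = shannon_diversity(estate)
h_small = shannon_diversity(smallholding)
```

The mosaic scores several times higher, which is the pattern the study reports between smallholdings and large-scale plantations.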

  4. High-Throughput Multiple Dies-to-Wafer Bonding Technology and III/V-on-Si Hybrid Lasers for Heterogeneous Integration of Optoelectronic Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Xianshu eLuo

    2015-04-01

    Integrated optical light sources on silicon are one of the key building blocks for optical interconnect technology. Great research effort has been devoted worldwide to exploring various approaches to integrating optical light sources onto silicon substrates. The achievements so far include the successful demonstration of III/V-on-Si hybrid lasers through bonding of III/V gain material to silicon wafers. However, for potential large-scale integration leveraging mature silicon complementary metal oxide semiconductor (CMOS) fabrication technology and infrastructure, a more effective bonding scheme with high bonding yield is in great demand for manufacturing. In this paper, we propose and demonstrate a high-throughput multiple dies-to-wafer (D2W) bonding technology, which is then applied to the demonstration of hybrid silicon lasers. By temporarily bonding III/V dies to a handle silicon wafer for simultaneous batch processing, an essentially unlimited number of III/V dies can be bonded to a silicon device wafer with high yield. As proof-of-concept, bonding of more than 100 III/V dies to a 200 mm silicon wafer is demonstrated, and the high quality of the bonding interface is examined with various characterization techniques. Repeated demonstrations of 16-die III/V bonding to pre-patterned 200 mm silicon wafers have been performed for various hybrid silicon lasers, with a device library including the Fabry-Perot (FP) laser, the lateral-coupled distributed feedback (LC-DFB) laser with sidewall gratings, and the mode-locked laser (MLL). From these results, the presented multiple D2W bonding technology can be a key enabler for the large-scale heterogeneous integration of optoelectronic integrated circuits (H-OEICs).

  5. Multi-format all-optical processing based on a large-scale, hybridly integrated photonic circuit.

    Science.gov (United States)

    Bougioukos, M; Kouloumentas, Ch; Spyropoulou, M; Giannoulis, G; Kalavrouziotis, D; Maziotis, A; Bakopoulos, P; Harmon, R; Rogers, D; Harrison, J; Poustie, A; Maxwell, G; Avramopoulos, H

    2011-06-06

    We investigate through numerical studies and experiments the performance of a large-scale, silica-on-silicon photonic integrated circuit for multi-format regeneration and wavelength-conversion. The circuit encompasses a monolithically integrated array of four SOAs inside two parallel Mach-Zehnder structures, four delay interferometers and a large number of silica waveguides and couplers. Exploiting phase-incoherent techniques, the circuit is capable of processing OOK signals at variable bit rates, DPSK signals at 22 or 44 Gb/s and DQPSK signals at 44 Gbaud. Simulation studies reveal the wavelength-conversion potential of the circuit with enhanced regenerative capabilities for OOK and DPSK modulation formats and acceptable quality degradation for the DQPSK format. Regeneration of 22 Gb/s OOK signals with amplified spontaneous emission (ASE) noise and of DPSK data signals degraded with amplitude, phase and ASE noise is experimentally validated, demonstrating a power penalty improvement of up to 1.5 dB.

  6. Constraints on small-scale heterogeneity in the lowermost mantle from observations of near podal PcP precursors

    Science.gov (United States)

    Zhang, Baolong; Ni, Sidao; Sun, Daoyuan; Shen, Zhichao; Jackson, Jennifer M.; Wu, Wenbo

    2018-05-01

    Volumetric heterogeneities at large (∼>1000 km) and intermediate scales (∼>100 km) in the lowermost mantle have been established with seismological approaches. However, there are controversies regarding the level of heterogeneity in the lowermost mantle at small scales (a few kilometers to tens of kilometers), with lower-bound estimates ranging from 0.1% to a few percent. We take advantage of the small-amplitude PcP waves at near podal distances (0-12°) to constrain the level of small-scale heterogeneity within 250 km above the CMB. First, we compute short-period synthetic seismograms with a finite difference code for a series of volumetric heterogeneity models in the lowermost mantle, and find that PcP is not identifiable if the small-scale heterogeneity in the lowermost mantle is above 2.5%. We then use a functional form appropriate for coda decay to suppress P coda contamination. By comparing the corrected envelope of PcP and its precursors with synthetic seismograms, we find that perturbations of small-scale (∼8 km) heterogeneity in the lowermost mantle are ∼0.2-0.5% beneath the China-Myanmar border area, the Okhotsk Sea and South America, whereas stronger perturbations (∼1.0%) are found beneath Central America. In the regions studied, we find that this particular type of small-scale heterogeneity in the lowermost mantle is weak, yet some regions require heterogeneity up to 1.0%. Where scattering is stronger, such as under Central America, more chemically complex mineral assemblages may be present at the core-mantle boundary.
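The coda-decay correction can be sketched by fitting a power-law decay in log-log space outside the PcP window and removing it from the envelope (synthetic envelope; all parameters hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical envelope: power-law P-coda decay A0 * t^-b plus a small PcP
# arrival riding on it near t = 40 s, with weak measurement noise.
t = np.linspace(10.0, 60.0, 500)
coda = 5.0 * t ** -0.8
pcp = 0.05 * np.exp(-0.5 * ((t - 40.0) / 0.8) ** 2)
env = coda + pcp + rng.normal(0.0, 0.002, t.size)

# Fit the decay in log-log space, excluding the PcP window, then remove it
# so the small PcP arrival stands out above the residual.
mask = (t < 35.0) | (t > 45.0)
slope, intercept = np.polyfit(np.log(t[mask]), np.log(env[mask]), 1)
fit = np.exp(intercept) * t ** slope
corrected = env - fit
```

The corrected envelope is what gets compared against the synthetic seismograms to bound the heterogeneity level.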

  7. Large scale integration of intermittent renewable energy sources in the Greek power sector

    International Nuclear Information System (INIS)

    Voumvoulakis, Emmanouil; Asimakopoulou, Georgia; Danchev, Svetoslav; Maniatis, George; Tsakanikas, Aggelos

    2012-01-01

    As a member of the European Union, Greece has committed to achieving ambitious targets for the penetration of renewable energy sources (RES) in gross electricity consumption by 2020. Large-scale integration of RES requires a suitable mixture of compatible generation units in order to deal with the intermittency of wind velocity and solar irradiation. The scope of this paper is to examine the impact of the large-scale integration of intermittent energy sources, required to meet the 2020 RES target, on the generation expansion plan, the fuel mix and the spinning reserve requirements of the Greek electricity system. We perform hourly simulation of the intermittent RES generation to estimate residual load curves on a monthly basis, which are then input into a WASP-IV model of the Greek power system. We find that the decarbonisation effort, with the rapid entry of RES and the abolishment of the grandfathering of CO2 allowances, will radically transform the Greek electricity sector over the next 10 years, which has wide-reaching policy implications. - Highlights: ► Greece needs 8.8 to 9.3 GW of additional RES installations by 2020. ► RES capacity credit varies between 12.2% and 15.3%, depending on interconnections. ► Without institutional changes, the reserve requirements will more than double. ► New CCGT installed capacity will probably exceed the cost-efficient level. ► Competitive pressures should be introduced in segments other than the day-ahead market.
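The residual-load computation described above can be sketched as follows; the hourly load, wind, and solar series are synthetic stand-ins for the simulated Greek system data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical hourly series for one month (MW): system load with a daily
# cycle, noisy wind output, and a clipped daytime solar profile.
hours = 24 * 30
t = np.arange(hours)
load = 6000 + 1500 * np.sin(2 * np.pi * (t % 24) / 24)
wind = np.clip(1200 + 800 * rng.standard_normal(hours), 0, 3000)
solar = np.clip(900 * np.sin(np.pi * ((t % 24) - 6) / 12), 0, None)

# Residual load = load minus intermittent RES; sorting descending gives the
# monthly residual load duration curve fed to the expansion-planning model.
residual = load - wind - solar
duration_curve = np.sort(residual)[::-1]
peak_residual = duration_curve[0]
```

The peak of the curve drives the firm-capacity need, while its shape drives the fuel mix and reserve requirements.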

  8. Heterogeneous Compression of Large Collections of Evolutionary Trees.

    Science.gov (United States)

    Matthews, Suzanne J

    2015-01-01

    Compressing heterogeneous collections of trees is an open problem in computational phylogenetics. In a heterogeneous tree collection, each tree can contain a unique set of taxa. An ideal compression method would allow for the efficient archival of large tree collections and enable scientists to identify common evolutionary relationships over disparate analyses. In this paper, we extend TreeZip to compress heterogeneous collections of trees. TreeZip is the most efficient algorithm for compressing homogeneous tree collections. To the best of our knowledge, no other domain-based compression algorithm exists for large heterogeneous tree collections or enables their rapid analysis. Our experimental results indicate that TreeZip averages 89.03 percent (72.69 percent) space savings on unweighted (weighted) collections of trees when the level of heterogeneity in a collection is moderate. The organization of the TRZ file allows for efficient computations over heterogeneous data; for example, consensus trees can be computed in mere seconds. Lastly, combining the TreeZip compressed (TRZ) file with general-purpose compression yields average space savings of 97.34 percent (81.43 percent) on unweighted (weighted) collections of trees. Our results lead us to believe that TreeZip will prove invaluable in the efficient archival of tree collections and will enable scientists to develop novel methods for relating heterogeneous collections of trees.
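The space-savings metric reported above is simply 100·(1 − compressed/original); a sketch using the standard library's zlib as a stand-in for the general-purpose compression stage (the Newick strings are hypothetical):

```python
import zlib

def space_savings(original: bytes, compressed: bytes) -> float:
    """Space savings in percent: 100 * (1 - compressed/original)."""
    return 100.0 * (1.0 - len(compressed) / len(original))

# Hypothetical heterogeneous collection: Newick strings over differing taxa.
trees = [
    "((A,B),(C,D));", "((A,C),B);", "(((A,B),C),(D,E));",
] * 200
raw = "\n".join(trees).encode()

# General-purpose compression alone; TreeZip would first produce its
# domain-specific TRZ encoding, to which this stage is then applied.
packed = zlib.compress(raw, level=9)
savings = space_savings(raw, packed)
```

Because tree collections are highly repetitive, even the general-purpose stage alone achieves large savings on this toy input.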

  9. Integrating CLIPS applications into heterogeneous distributed systems

    Science.gov (United States)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
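The wrapper-agent pattern described above can be sketched with a toy message-based interface; all class and message names below are hypothetical illustrations, not the SOCIAL API:

```python
# Illustrative sketch: each agent embeds an application behind a
# message-based interface, so heterogeneous systems interact only via
# messages, never via direct calls into each other's internals.
class Agent:
    def __init__(self, name, registry):
        self.name = name
        self.registry = registry
        registry[name] = self
        self.inbox = []

    def send(self, to, payload):
        self.registry[to].inbox.append({"from": self.name, "payload": payload})

    def handle(self, message):
        raise NotImplementedError

    def run(self):
        while self.inbox:
            self.handle(self.inbox.pop(0))

class ClipsAgent(Agent):
    """Stands in for a wrapped CLIPS expert system: reasons over a fact."""
    def handle(self, message):
        fact = message["payload"]
        self.send(message["from"], f"diagnosis-for:{fact}")

class MonitorAgent(Agent):
    """Stands in for a conventional monitoring application."""
    def __init__(self, name, registry):
        super().__init__(name, registry)
        self.results = []

    def handle(self, message):
        self.results.append(message["payload"])

registry = {}
clips = ClipsAgent("clips", registry)
monitor = MonitorAgent("monitor", registry)

monitor.send("clips", "valve-17-stuck")
clips.run()    # the wrapped expert system processes the request...
monitor.run()  # ...and the monitor receives the reply.
```

In the real framework the registry and message transport are distributed across hosts, which is what makes the wrapped applications location-transparent.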

  10. Properties Important To Mixing For WTP Large Scale Integrated Testing

    International Nuclear Information System (INIS)

    Koopman, D.; Martino, C.; Poirier, M.

    2012-01-01

    Large Scale Integrated Testing (LSIT) is being planned by Bechtel National, Inc. to address uncertainties in the full scale mixing performance of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. External review boards have raised questions regarding the overall representativeness of simulants used in previous mixing tests. Accordingly, WTP requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in LSIT. Among the first tasks assigned to SRNL was to develop a list of waste properties that matter to pulse-jet mixer (PJM) mixing of WTP tanks. This report satisfies Commitment 5.2.3.1 of the Department of Energy Implementation Plan for Defense Nuclear Facilities Safety Board Recommendation 2010-2: physical properties important to mixing and scaling. In support of waste simulant development, the following two objectives are the focus of this report: (1) Assess physical and chemical properties important to the testing and development of mixing scaling relationships; (2) Identify the governing properties and associated ranges for LSIT to achieve the Newtonian and non-Newtonian test objectives. This includes the properties to support testing of sampling and heel management systems. The test objectives for LSIT relate to transfer and pump out of solid particles, prototypic integrated operations, sparger operation, PJM controllability, vessel level/density measurement accuracy, sampling, heel management, PJM restart, design and safety margin, Computational Fluid Dynamics (CFD) Verification and Validation (V and V) and comparison, performance testing and scaling, and high temperature operation. The slurry properties that are most important to Performance Testing and Scaling depend on the test objective and rheological classification of the slurry (i

  12. Emerging heterogeneous integrated photonic platforms on silicon

    Directory of Open Access Journals (Sweden)

    Fathpour Sasan

    2015-05-01

    Silicon photonics has been established as a mature and promising technology for optoelectronic integrated circuits, mostly based on the silicon-on-insulator (SOI) waveguide platform. However, not all optical functionalities can be satisfactorily achieved merely with silicon in general, and on the SOI platform in particular. Long-known shortcomings of silicon-based integrated photonics are optical absorption (in the telecommunication wavelengths) and the feasibility of electrically-injected lasers (at least at room temperature). More recently, the high two-photon and free-carrier absorptions incurred at the high optical intensities required for third-order optical nonlinear effects, the inherent lack of second-order optical nonlinearity, the low extinction ratio of modulators based on the free-carrier plasma effect, and the loss of the buried oxide layer of SOI waveguides at mid-infrared wavelengths have been recognized as other shortcomings. Accordingly, several novel waveguide platforms are being developed to address these shortcomings of the SOI platform. Most of these emerging platforms are based on heterogeneous integration of other material systems on silicon substrates, and in some cases silicon is integrated on other substrates. Germanium and its binary alloys with silicon, III–V compound semiconductors, silicon nitride, tantalum pentoxide and other high-index dielectric or glass materials, as well as lithium niobate, are some of the materials heterogeneously integrated on silicon substrates. The materials are typically integrated by a variety of epitaxial growth, bonding, ion implantation and slicing, etch-back, spin-on-glass or other techniques. This wide range of efforts is reviewed here holistically to stress that there is no pure silicon or even group IV photonics per se. Rather, the future of the field of integrated photonics appears to be one of heterogenization, where a variety of different materials and waveguide platforms will be used for

  13. Large scale validation of the M5L lung CAD on heterogeneous CT datasets

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Torres, E., E-mail: Ernesto.Lopez.Torres@cern.ch, E-mail: cerello@to.infn.it [CEADEN, Havana 11300, Cuba and INFN, Sezione di Torino, Torino 10125 (Italy); Fiorina, E.; Pennazio, F.; Peroni, C. [Department of Physics, University of Torino, Torino 10125, Italy and INFN, Sezione di Torino, Torino 10125 (Italy); Saletta, M.; Cerello, P., E-mail: Ernesto.Lopez.Torres@cern.ch, E-mail: cerello@to.infn.it [INFN, Sezione di Torino, Torino 10125 (Italy); Camarlinghi, N.; Fantacci, M. E. [Department of Physics, University of Pisa, Pisa 56127, Italy and INFN, Sezione di Pisa, Pisa 56127 (Italy)

    2015-04-15

    Purpose: M5L, a fully automated computer-aided detection (CAD) system for the detection and segmentation of lung nodules in thoracic computed tomography (CT), is presented and validated on several image datasets. Methods: M5L is the combination of two independent subsystems, based on the Channeler Ant Model as a segmentation tool [lung channeler ant model (lungCAM)] and on the voxel-based neural approach. The lungCAM was upgraded with a scan equalization module and a new procedure to recover the nodules connected to other lung structures; its classification module, which makes use of a feed-forward neural network, is based on a small number of features (13), so as to minimize the risk of poor generalization, given the large difference in size between the training and testing datasets, which contain 94 and 1019 CTs, respectively. The lungCAM (standalone) and M5L (combined) performance was extensively tested on 1043 CT scans from three independent datasets, including a detailed analysis of the full Lung Image Database Consortium/Image Database Resource Initiative database, which has not yet been reported in the literature. Results: The lungCAM and M5L performance is consistent across the databases, with sensitivities of about 70% and 80%, respectively, at eight false positive findings per scan, despite the variable annotation criteria and acquisition and reconstruction conditions. A reduced sensitivity is found for subtle nodules and ground glass opacity (GGO) structures. A comparison with other CAD systems is also presented. Conclusions: The M5L performance on a large and heterogeneous dataset is stable and satisfactory, although the development of a dedicated module for GGO detection, as well as an iterative optimization of the training procedure, could further improve it. 
The main aim of the present study was accomplished: M5L results do not deteriorate when increasing the dataset size, making it a candidate for supporting radiologists on large

  14. Assessing the scale of tumor heterogeneity by complete hierarchical segmentation of MRI.

    Science.gov (United States)

    Gensheimer, Michael F; Hawkins, Douglas S; Ermoian, Ralph P; Trister, Andrew D

    2015-02-07

    In many cancers, intratumoral heterogeneity has been found in histology, genetic variation and vascular structure. We developed an algorithm to interrogate different scales of heterogeneity using clinical imaging. We hypothesize that heterogeneity of perfusion at coarse scale may correlate with treatment resistance and propensity for disease recurrence. The algorithm recursively segments the tumor image into increasingly smaller regions. Each dividing line is chosen so as to maximize signal intensity difference between the two regions. This process continues until the tumor has been divided into single voxels, resulting in segments at multiple scales. For each scale, heterogeneity is measured by comparing each segmented region to the adjacent region and calculating the difference in signal intensity histograms. Using digital phantom images, we showed that the algorithm is robust to image artifacts and various tumor shapes. We then measured the primary tumor scales of contrast enhancement heterogeneity in MRI of 18 rhabdomyosarcoma patients. Using Cox proportional hazards regression, we explored the influence of heterogeneity parameters on relapse-free survival. Coarser scale of maximum signal intensity heterogeneity was prognostic of shorter survival (p = 0.05). By contrast, two fractal parameters and three Haralick texture features were not prognostic. In summary, our algorithm produces a biologically motivated segmentation of tumor regions and reports the amount of heterogeneity at various distance scales. If validated on a larger dataset, this prognostic imaging biomarker could be useful to identify patients at higher risk for recurrence and candidates for alternative treatment.
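    The recursive bisection described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' published code: the greedy axis-aligned cut and all function names are our assumptions. Each region is split along the straight line that maximizes the mean-intensity difference between the two halves, down to single voxels, and the contrast of each cut is recorded at its depth (scale):

```python
import numpy as np

def best_cut(region):
    """Find the axis-aligned cut maximizing the mean-intensity
    difference between the two resulting sub-regions."""
    best = None  # (score, axis, position)
    for axis in (0, 1):
        n = region.shape[axis]
        for pos in range(1, n):
            a, b = np.split(region, [pos], axis=axis)
            score = abs(a.mean() - b.mean())
            if best is None or score > best[0]:
                best = (score, axis, pos)
    return best

def segment(region, depth=0, out=None):
    """Recursively bisect until single voxels remain, recording the
    mean-intensity contrast of each cut at its depth (scale)."""
    if out is None:
        out = []
    if region.size <= 1:
        return out
    score, axis, pos = best_cut(region)
    out.append((depth, score))
    a, b = np.split(region, [pos], axis=axis)
    segment(a, depth + 1, out)
    segment(b, depth + 1, out)
    return out

img = np.array([[0.0, 0.1, 0.9, 1.0],
                [0.1, 0.0, 0.8, 1.0]])
cuts = segment(img)
# The first (coarsest) cut separates the dark and bright halves.
print(cuts[0])
```

    On this toy image, the coarsest cut cleanly separates the dark left half from the bright right half; in the study, the per-scale contrasts would then feed histogram comparisons between adjacent regions.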

  15. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  16. Spatial Heterogeneity of the Forest Canopy Scales with the Heterogeneity of an Understory Shrub Based on Fractal Analysis

    Directory of Open Access Journals (Sweden)

    Catherine K. Denny

    2017-04-01

    Spatial heterogeneity of vegetation is an important landscape characteristic, but is difficult to assess due to scale-dependence. Here we examine how spatial patterns in the forest canopy affect those of understory plants, using the shrub Canada buffaloberry (Shepherdia canadensis (L.) Nutt.) as a focal species. Evergreen and deciduous forest canopy and buffaloberry shrub presence were measured with line-intercept sampling along ten 2-km transects in the Rocky Mountain foothills of west-central Alberta, Canada. Relationships between overstory canopy and understory buffaloberry presence were assessed for scales ranging from 2 m to 502 m. Fractal dimensions of both canopy and buffaloberry were estimated and then related using box-counting methods to evaluate spatial heterogeneity based on patch distribution and abundance. Effects of canopy presence on buffaloberry were scale-dependent, with shrub presence negatively related to evergreen canopy cover and positively related to deciduous cover. The effect of evergreen canopy was significant at a local scale between 2 m and 42 m, while that of deciduous canopy was significant at a meso-scale between 150 m and 358 m. Fractal analysis indicated that buffaloberry heterogeneity positively scaled with evergreen canopy heterogeneity, but was unrelated to that of deciduous canopy. This study demonstrates that evergreen canopy cover is a determinant of buffaloberry heterogeneity, highlighting the importance of spatial scale and canopy composition in understanding canopy-understory relationships.
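    The box-counting estimate of fractal dimension used to relate canopy and shrub heterogeneity can be illustrated on a 1D presence/absence transect, matching the line-intercept sampling above. This is a generic sketch (the function name and box sizes are ours, not the authors'):

```python
import numpy as np

def box_counting_dimension(presence, box_sizes):
    """Estimate the fractal (box-counting) dimension of a 1D
    presence/absence transect: count the boxes containing at least
    one presence record at each box size, then fit the slope of
    log N(s) against log(1/s)."""
    counts = []
    for s in box_sizes:
        n_boxes = int(np.ceil(len(presence) / s))
        occupied = sum(
            presence[i * s:(i + 1) * s].any() for i in range(n_boxes)
        )
        counts.append(occupied)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)),
                          np.log(counts), 1)
    return slope

# A transect fully covered by the shrub fills every box at every
# scale, so N(s) ~ 1/s and the estimated dimension is ~1.
transect = np.ones(1024, dtype=bool)
d = box_counting_dimension(transect, box_sizes=[2, 4, 8, 16, 32])
print(round(d, 2))  # prints 1.0
```

    A fragmented, patchy distribution leaves boxes empty at fine scales and yields a dimension below 1, which is what makes the estimate a heterogeneity measure.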

  17. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large scale integration of wind power in future power systems when 50% of load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish...... power system model with large-scale wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load generation...... imbalances, caused by inaccurate wind speed forecast, by an appropriate control of the active power production from power plants....

  18. Experimental Quantification of Pore-Scale Flow Phenomena in 2D Heterogeneous Porous Micromodels: Multiphase Flow Towards Coupled Solid-Liquid Interactions

    Science.gov (United States)

    Li, Y.; Kazemifar, F.; Blois, G.; Christensen, K. T.

    2017-12-01

    Geological sequestration of CO2 within saline aquifers is a viable technology for reducing CO2 emissions. Central to this goal is accurately predicting both the fidelity of candidate sites pre-injection of CO2 and its post-injection migration. Moreover, local fluid pressure buildup may cause activation of small pre-existing unidentified faults, leading to micro-seismic events, which could prove disastrous for societal acceptance of CCS, and possibly compromise seal integrity. Recent evidence shows that large-scale events are coupled with pore-scale phenomena, which necessitates the representation of pore-scale stress, strain, and multiphase flow processes in large-scale modeling. To this end, the pore-scale flow of water and liquid/supercritical CO2 is investigated under reservoir-relevant conditions, over a range of wettability conditions in 2D heterogeneous micromodels that reflect the complexity of a real sandstone. High-speed fluorescent microscopy, complemented by a fast differential pressure transmitter, allows for simultaneous measurement of the flow field within and the instantaneous pressure drop across the micromodels. A flexible micromodel is also designed and fabricated, to be used in conjunction with the micro-PIV technique, enabling the quantification of coupled solid-liquid interactions.

  19. Data warehousing technologies for large-scale and right-time data

    DEFF Research Database (Denmark)

    Xiufeng, Liu

    heterogeneous sources into a central data warehouse (DW) by Extract-Transform-Load (ETL) at regular time intervals, e.g., monthly, weekly, or daily. This becomes challenging for large-scale data, however, and makes it hard to support near real-time/right-time business decisions. This thesis considers some...

  20. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

    Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost for optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter–rotator with a very simple fabrication process is presented. Realization of silicon-based light non-reciprocity devices (e.g., an optical isolator), which is very important for transmitters to avoid sensitivity to reflections, is also demonstrated with the help of magneto-optical material applied by bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB/m with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low loss Si3N4 optical waveguides, some devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (de)multiplexers, and high-extinction-ratio polarizers.
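    To see why pushing the propagation loss below 0.1 dB/m matters for the ultra-high-Q ring resonators mentioned above, the intrinsic quality factor can be estimated from the loss via Q_i = 2πn_g/(λα). The sketch below is illustrative only; the group index of 1.5 is an assumed value for a low-confinement Si3N4 waveguide, not a figure from the paper:

```python
import math

def intrinsic_q(loss_db_per_m, wavelength_m, group_index):
    """Estimate the intrinsic Q of a ring resonator from the
    propagation loss: Q_i = 2*pi*n_g / (lambda * alpha), where
    alpha is the power loss coefficient in 1/m."""
    alpha = loss_db_per_m * math.log(10) / 10.0  # dB/m -> 1/m
    return 2 * math.pi * group_index / (wavelength_m * alpha)

# 0.1 dB/m loss at 1550 nm with an assumed group index of 1.5
q = intrinsic_q(0.1, 1550e-9, 1.5)
print(f"{q:.3g}")
```

    At 0.1 dB/m this estimate lands in the hundreds of millions, i.e., the ultra-high-Q regime the low-loss Si3N4 platform targets.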

  1. Heterogeneity and scale of sustainable development in cities.

    Science.gov (United States)

    Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A

    2017-08-22

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.

  2. Heterogeneity and scale of sustainable development in cities

    Science.gov (United States)

    Brelsford, Christa; Lobo, José; Hand, Joe

    2017-01-01

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals. PMID:28461489

  3. Impacts of large-scale offshore wind farm integration on power systems through VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Hongzhi; Chen, Zhe

    2013-01-01

    The potential of offshore wind energy has been commonly recognized and explored globally. Many countries have implemented and planned offshore wind farms to meet their increasing electricity demands and public environmental appeals, especially in Europe. With relatively less space limitation......, an offshore wind farm could have a capacity rating up to hundreds of MWs or even GWs, large enough to compete with conventional power plants. Thus the impacts of a large offshore wind farm on power system operation and security should be thoroughly studied and understood. This paper investigates...... the impacts of integrating a large-scale offshore wind farm into the transmission system of a power grid through VSC-HVDC connection. The concerns are focused on steady-state voltage stability, dynamic voltage stability and transient angle stability. Simulation results based on an exemplary power system...

  4. Impurity engineering of Czochralski silicon used for ultra-large-scale integrated circuits

    Science.gov (United States)

    Yang, Deren; Chen, Jiahe; Ma, Xiangyang; Que, Duanlin

    2009-01-01

    Impurities in Czochralski silicon (Cz-Si) used for ultra-large-scale integrated (ULSI) circuits have been believed to deteriorate the performance of devices. In this paper, a review of recent progress from our investigation of internal gettering in Cz-Si wafers doped with nitrogen, germanium and/or a high content of carbon is presented. It has been suggested that those impurities enhance oxygen precipitation and create both denser bulk microdefects and a denuded zone of the desirable width, which benefits the internal gettering of metal contamination. Based on the experimental facts, a potential mechanism of the effect of impurity doping on the internal gettering structure is interpreted, and a new concept of 'impurity engineering' for Cz-Si used for ULSI is proposed.

  5. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique incorporated collimated scanning and combined experimental measurements with Monte Carlo simulations for the identification of inhomogeneities in large volume samples and the correction of their effect on the interpretation of gamma-spectrometry data. Corrections were applied for the effects of neutron self-shielding, gamma-ray attenuation, the geometrical factor and the heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA and a satisfactory agreement between the two methods was observed. This study showed that LSNAA is a technique capable of performing accurate non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  6. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. Yet adaptive behaviour towards flood risk reduction, and the interaction between governments, insurers and individuals, has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed, including agent representatives for the administrative stakeholders of European Member States, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach this study is a first contribution to overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.

  7. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  8. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    Science.gov (United States)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region comprises two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the scale-intensive data requirements and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present the strengths and weaknesses of integrated modeling at such a large scale, along with how it can be improved on the basis of the current modeling structure and results. 
The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential

  9. TradeWind. Integrating wind. Developing Europe's power market for the large-scale integration of wind power. Final report

    Energy Technology Data Exchange (ETDEWEB)

    2009-02-15

    Based on a single European grid and power market system, the TradeWind project explores to what extent large-scale wind power integration challenges could be addressed by reinforcing interconnections between Member States in Europe. Additionally, the project looks at the conditions required for a sound power market design that ensures a cost-effective integration of wind power at EU level. In this way, the study addresses two issues of key importance for the future integration of renewable energy, namely the weak interconnectivity levels between control zones and the inflexibility and fragmented nature of the European power market. Work on critical transmission paths and interconnectors is slow for a variety of reasons including planning and administrative barriers, lack of public acceptance, insufficient economic incentives for TSOs, and the lack of a joint European approach by the key stakeholders. (au)

  10. Assessing the scale of tumor heterogeneity by complete hierarchical segmentation of MRI

    International Nuclear Information System (INIS)

    Gensheimer, Michael F; Ermoian, Ralph P; Hawkins, Douglas S; Trister, Andrew D

    2015-01-01

    In many cancers, intratumoral heterogeneity has been found in histology, genetic variation and vascular structure. We developed an algorithm to interrogate different scales of heterogeneity using clinical imaging. We hypothesize that heterogeneity of perfusion at coarse scale may correlate with treatment resistance and propensity for disease recurrence. The algorithm recursively segments the tumor image into increasingly smaller regions. Each dividing line is chosen so as to maximize signal intensity difference between the two regions. This process continues until the tumor has been divided into single voxels, resulting in segments at multiple scales. For each scale, heterogeneity is measured by comparing each segmented region to the adjacent region and calculating the difference in signal intensity histograms. Using digital phantom images, we showed that the algorithm is robust to image artifacts and various tumor shapes. We then measured the primary tumor scales of contrast enhancement heterogeneity in MRI of 18 rhabdomyosarcoma patients. Using Cox proportional hazards regression, we explored the influence of heterogeneity parameters on relapse-free survival. Coarser scale of maximum signal intensity heterogeneity was prognostic of shorter survival (p = 0.05). By contrast, two fractal parameters and three Haralick texture features were not prognostic. In summary, our algorithm produces a biologically motivated segmentation of tumor regions and reports the amount of heterogeneity at various distance scales. If validated on a larger dataset, this prognostic imaging biomarker could be useful to identify patients at higher risk for recurrence and candidates for alternative treatment. (paper)

  11. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large scale hydrogen production plants will be needed. In this context, the development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the currently available electrolysis modules was drawn up. A review of the large scale electrolysis plants installed around the world was also carried out, and the main projects related to large scale electrolysis were listed. The economics of large scale electrolysers are discussed, and the influence of energy prices on the cost of hydrogen produced by large scale electrolysis is evaluated. (authors)
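    The influence of energy prices on the hydrogen production cost can be illustrated with a back-of-the-envelope calculation. The specific consumption of 50 kWh per kg of hydrogen used below is an assumed, typical figure for industrial alkaline electrolysers, not a number from the study:

```python
def electricity_cost_per_kg_h2(price_eur_per_kwh, kwh_per_kg=50.0):
    """Electricity contribution to the hydrogen production cost.
    The default of ~50 kWh per kg H2 is an assumed, typical
    specific consumption for industrial alkaline electrolysers."""
    return price_eur_per_kwh * kwh_per_kg

# At 0.05 EUR/kWh, electricity alone contributes 2.5 EUR per kg H2.
print(electricity_cost_per_kg_h2(0.05))
```

    Because the electricity term scales linearly with the power price, it dominates the sensitivity of large scale electrolysis economics, which is why access to cheap 'clean power' matters.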

  12. Integrating mean and variance heterogeneities to identify differentially expressed genes.

    Science.gov (United States)

    Ouyang, Weiwei; An, Qiang; Zhao, Jinying; Qin, Huaizhen

    2016-12-06

    In functional genomics studies, tests on mean heterogeneity have been widely employed to identify differentially expressed genes with distinct mean expression levels under different experimental conditions. Variance heterogeneity (aka, the difference between condition-specific variances) of gene expression levels is simply neglected or calibrated for as an impediment. The mean heterogeneity in the expression level of a gene reflects one aspect of its distribution alteration, and variance heterogeneity induced by condition change may reflect another aspect. Change in condition may alter both the mean and some higher-order characteristics of the distributions of expression levels of susceptible genes. In this report, we put forth a conception of mean-variance differentially expressed (MVDE) genes, whose expression means and variances are sensitive to the change in experimental condition. We mathematically proved the null independence of existing mean heterogeneity tests and variance heterogeneity tests. Based on this independence, we proposed an integrative mean-variance test (IMVT) to combine gene-wise mean heterogeneity and variance heterogeneity induced by condition change. The IMVT outperformed its competitors under comprehensive simulations of normality and Laplace settings. For moderate samples, the IMVT well controlled type I error rates, as did the existing mean heterogeneity tests (i.e., the Welch t test (WT) and the moderated Welch t test (MWT)) and the procedure of separate tests on mean and variance heterogeneities (SMVT), but the likelihood ratio test (LRT) severely inflated type I error rates. In the presence of variance heterogeneity, the IMVT appeared noticeably more powerful than all the valid mean heterogeneity tests. Application to the gene profiles of peripheral circulating B cells raised solid evidence of informative variance heterogeneity. After adjusting for background data structure, the IMVT replicated previous discoveries and identified novel experiment
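    The combination step behind such an integrative test can be sketched under the stated null independence of the two component tests. The actual IMVT statistic is not reproduced here; this sketch substitutes standard components (a Welch t test for mean heterogeneity, a Levene test for variance heterogeneity) and merges their p-values with Fisher's method, which is valid precisely when the two p-values are independent under the null:

```python
import numpy as np
from scipy import stats

def combined_mean_variance_test(x, y):
    """Combine a Welch t test (mean heterogeneity) and a Levene
    test (variance heterogeneity) via Fisher's method: under the
    null, -2*(log p1 + log p2) ~ chi-square with 4 df when the
    two p-values are independent."""
    p_mean = stats.ttest_ind(x, y, equal_var=False).pvalue
    p_var = stats.levene(x, y).pvalue
    chi2 = -2.0 * (np.log(p_mean) + np.log(p_var))
    return stats.chi2.sf(chi2, df=4)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 50)   # condition A
y = rng.normal(1.0, 3.0, 50)   # condition B: mean and variance shift
p = combined_mean_variance_test(x, y)
print(p < 0.01)  # prints True
```

    A gene whose expression shifts in both mean and variance can be flagged by the combined test even when neither component test alone is decisive.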

  13. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high throughput experiments and high throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become urgent, and have gradually turned into a hot topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates a semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two well-known first-principles computational databases, OQMD and Materials Project, are used as the integration targets, demonstrating the availability and effectiveness of our method.

  14. Response of Moist Convection to Multi-scale Surface Flux Heterogeneity

    Science.gov (United States)

    Kang, S. L.; Ryu, J. H.

    2015-12-01

    We investigate the response of moist convection to the multi-scale features of the spatial variation of surface sensible heat fluxes (SHF) in the afternoon evolution of the convective boundary layer (CBL), utilizing a mesoscale-domain large eddy simulation (LES) model. The multi-scale surface heterogeneity feature is analytically created as a function of the spectral slope, in the wavelength range from a few tens of km to a few hundreds of m, of the surface SHF spectrum on a log-log scale. The response of moist convection to the κ⁻³-slope (where κ is wavenumber) surface SHF field is compared with that to the κ⁻²-slope surface, which has a relatively weak mesoscale feature, and the homogeneous κ⁰-slope surface. Given the surface energy balance with a spatially uniform available energy, the prescribed SHF has a 180° phase lag with the latent heat flux (LHF) in a horizontal domain of (several tens of km)². Thus, a warmer (cooler) surface is relatively dry (moist). For all the cases, the same observation-based sounding is prescribed as the initial condition. For all the κ⁻³-slope surface heterogeneity cases, early non-precipitating shallow clouds further develop into precipitating deep thunderstorms, but for all the κ⁻²-slope cases, only shallow clouds develop. We compare the vertical profiles of domain-averaged fluxes and variances, and the mesoscale and turbulence contributions to them, between the κ⁻³- and κ⁻²-slope cases. The cross-scale processes are also investigated.
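    The abstract does not specify how the analytic surface is constructed; a common way to build such a field, sketched below under our own assumptions about normalization (the function name is ours), is to shape random-phase Fourier amplitudes so that the power spectrum follows the prescribed κ-slope:

```python
import numpy as np

def heterogeneity_field(n, slope, seed=0):
    """Synthesize a 1D surface-flux anomaly whose power spectrum
    follows S(k) ~ k**slope (e.g. slope = -3 or -2), by shaping
    white noise in Fourier space with random phases."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)[1:]                  # drop the mean (k = 0)
    amplitude = k ** (slope / 2.0)              # power ~ amplitude**2
    phases = rng.uniform(0, 2 * np.pi, k.size)
    spectrum = np.concatenate(([0.0], amplitude * np.exp(1j * phases)))
    field = np.fft.irfft(spectrum, n)
    return field / field.std()                  # normalize variance to 1

f3 = heterogeneity_field(4096, -3.0)   # strong mesoscale variability
f2 = heterogeneity_field(4096, -2.0)   # weaker mesoscale variability

def lowpass_power(f):
    """Spectral power in the 8 largest resolved scales."""
    return (np.abs(np.fft.rfft(f)[1:9]) ** 2).sum()

# For equal total variance, the steeper slope concentrates more
# variance at large (mesoscale) wavelengths.
print(lowpass_power(f3) > lowpass_power(f2))  # prints True
```

    With the total variance held fixed, the κ⁻³ field carries most of its variance at the largest scales, which is what makes its mesoscale forcing of the CBL stronger than that of the κ⁻² field.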

  15. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches and multiplexers can be built, hence the name of the technology, “microfluidic Very Large-Scale Integration” (mVLSI). The paper presents the state-of-the-art in mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design.

  16. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, also developed in this R&D, is an MPI library executable in a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  17. Nanoscale heterogeneity at the aqueous electrolyte-electrode interface

    Science.gov (United States)

    Limmer, David T.; Willard, Adam P.

    2015-01-01

    Using molecular dynamics simulations, we reveal emergent properties of hydrated electrode interfaces that, while molecular in origin, are integral to the behavior of the system across long time scales and large length scales. Specifically, we describe the impact of a disordered and slowly evolving adsorbed layer of water on the molecular structure and dynamics of the electrolyte solution adjacent to it. Generically, we find that densities and mobilities of both water and dissolved ions are spatially heterogeneous in the plane parallel to the electrode over nanosecond timescales. These and other recent results are analyzed in the context of available experimental literature from surface science and electrochemistry. We speculate on the implications of this emerging microscopic picture for the catalytic proficiency of hydrated electrodes, offering a new direction for study in heterogeneous catalysis at the nanoscale.

  18. Heterogeneous grain-scale response in ferroic polycrystals under electric field

    DEFF Research Database (Denmark)

    Daniels, John E.; Majkut, Marta; Cao, Qingua

    2016-01-01

    Three-dimensional X-ray diffraction (3D-XRD) is used to resolve the non-180° ferroelectric domain switching strain components of 191 grains from the bulk of a polycrystalline electro-ceramic that has undergone an electric-field-induced phase transformation. It is found that while the orientation of a given grain relative to the field direction has a significant influence on the phase and resultant domain texture, there are large deviations from the average behaviour at the grain scale. It is suggested that these deviations arise from local strain and electric field neighbourhoods being highly heterogeneous within the bulk […]

  19. Simplified nonplanar wafer bonding for heterogeneous device integration

    Science.gov (United States)

    Geske, Jon; Bowers, John E.; Riley, Anton

    2004-07-01

    We demonstrate a simplified nonplanar wafer bonding technique for heterogeneous device integration. The improved technique can be used to laterally integrate dissimilar semiconductor device structures on a lattice-mismatched substrate. Using the technique, two different InP-based vertical-cavity surface-emitting laser active regions have been integrated onto GaAs without compromising the quality of the photoluminescence. Experimental and numerical simulation results are presented.

  20. Integration Of Data From Heterogeneous Sources Using Etl Technology.

    Directory of Open Access Journals (Sweden)

    Marek Macura

    2014-01-01

    Data integration is a crucial issue in environments of heterogeneous data sources, and such heterogeneity is becoming widespread. Whenever we want to gain useful information and knowledge from various data sources, we must solve the data integration problem in order to apply appropriate analytical methods to comprehensive and uniform data. This activity is known as the knowledge discovery from data process. Approaches to the data integration problem are therefore of great interest and bring us closer to the "age of information". The paper presents an architecture which implements the knowledge discovery from data process. The solution combines ETL technology with the wrapper layer known from mediated systems. It also provides semantic integration through a connection mechanism between data elements. The solution allows for the integration of arbitrary data sources and the implementation of analytical methods in one environment. The proposed environment is verified by applying it to data sources in the foundry industry.

  1. Shallow to Deep Convection Transition over a Heterogeneous Land Surface Using the Land Model Coupled Large-Eddy Simulation

    Science.gov (United States)

    Lee, J.; Zhang, Y.; Klein, S. A.

    2017-12-01

    The triggering of the land breeze, and hence the development of deep convection over heterogeneous land, should be understood as a consequence of complex processes involving various factors from the land surface and atmosphere simultaneously. This is a sub-grid scale process that many large-scale models have difficulty incorporating into their parameterization schemes, partly due to lack of understanding. Thus, it is imperative to approach the problem using a high-resolution modeling framework. In this study, we use SAM-SLM (Lee and Khairoutdinov, 2015), a large-eddy simulation model coupled to a land model, to explore cloud effects such as the cold pool, cloud shading and soil moisture memory on the land breeze structure and the further development of cloud and precipitation over a heterogeneous land surface. The atmospheric large-scale forcing and the initial sounding are taken from the new composite case study of fair-weather, non-precipitating shallow cumuli at ARM SGP (Zhang et al., 2017). We model the land surface as a chessboard pattern with alternating leaf area index (LAI). The patch contrast of the LAI is adjusted to span weak to strong heterogeneity amplitudes. The surface sensible and latent heat fluxes are computed according to the given LAI, representing the differential surface heating over a heterogeneous land surface. Separate from the surface forcing imposed by the originally modeled surface, the cases that transition into moist convection can induce another layer of surface heterogeneity from 1) radiation shading by clouds, 2) the soil moisture pattern adjusted by rain, and 3) the spreading cold pool. First, we assess and quantify the individual cloud effects on the land breeze and moist convection under weak wind to simplify the feedback processes. Then the same set of experiments is repeated under sheared background wind with a low-level jet, a typical summertime wind pattern at the ARM SGP site, to

  2. Thesaurus-based search in large heterogeneous collections

    NARCIS (Netherlands)

    J. Wielemaker (Jan); M. Hildebrand (Michiel); J.R. van Ossenbruggen (Jacco); G. Schreiber (Guus); A. Sheth; et al

    2008-01-01

    In cultural heritage, large virtual collections are coming into existence. Such collections contain heterogeneous sets of metadata and vocabulary concepts, originating from multiple sources. In the context of the E-Culture demonstrator we have shown earlier that such virtual

  3. Micro-scale heterogeneity of spiders (Arachnida: Araneae) in the ...

    African Journals Online (AJOL)

    Coarse-scale studies that focus on species distributions and richness neglect heterogeneity that may be present at finer scales. Studies of arthropod assemblage structure at fine (1 × 1 km) scales are rare, but important, because these are the spatial levels at which real world applications are viable. Here we investigate ...

  4. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    Science.gov (United States)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method reduces both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes benefiting from this approach, through reduced development and design cycle time, include: creation of analysis models for the aerodynamic discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  5. Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.

    Science.gov (United States)

    Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve

    2011-11-01

    Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.
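
    The Izhikevich model mentioned above has a well-known two-variable formulation (v' = 0.04v² + 5v + 140 − u + I, u' = a(bv − u), with reset v ← c, u ← u + d on spiking). A minimal host-side reference simulation, using the standard regular-spiking parameters rather than SpiNNaker's on-chip fixed-point implementation, looks like this:

```python
# Reference Izhikevich neuron (regular-spiking parameters a=0.02,
# b=0.2, c=-65, d=8), forward-Euler with a 1 ms step. This is a
# host-side sketch, not SpiNNaker's on-chip implementation.

def izhikevich(I=10.0, T=1000.0, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    v, u, spikes = c, b * c, 0
    for _ in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:          # spike: reset membrane, bump recovery
            v, u = c, u + d
            spikes += 1
    return spikes

# A constant 10 pA-equivalent drive produces regular spiking over 1 s.
print(izhikevich() > 0)  # → True
```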

  6. TensorFlow: A system for large-scale machine learning

    OpenAIRE

    Abadi, Martín; Barham, Paul; Chen, Jianmin; Chen, Zhifeng; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Irving, Geoffrey; Isard, Michael; Kudlur, Manjunath; Levenberg, Josh; Monga, Rajat; Moore, Sherry; Murray, Derek G.

    2016-01-01

    TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments. TensorFlow uses dataflow graphs to represent computation, shared state, and the operations that mutate that state. It maps the nodes of a dataflow graph across many machines in a cluster, and within a machine across multiple computational devices, including multicore CPUs, general-purpose GPUs, and custom designed ASICs known as Tensor Processing Units (TPUs). This architecture gives flexib...
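
    The dataflow execution model the abstract describes (operations as graph nodes, values on edges, a node firing once all its inputs are available) can be illustrated with a toy evaluator. This is a conceptual sketch, not TensorFlow's API:

```python
# Toy dataflow-graph evaluator: nodes are operations, edges carry
# values, and a node runs as soon as all its inputs are ready.
from collections import defaultdict, deque

def run_graph(nodes, edges, feeds):
    """nodes: name -> callable; edges: name -> list of input names."""
    indegree = {n: len(edges.get(n, [])) for n in nodes}
    values = dict(feeds)
    ready = deque(n for n in nodes if indegree[n] == 0)
    consumers = defaultdict(list)
    for n, inputs in edges.items():
        for i in inputs:
            consumers[i].append(n)
    while ready:                       # topological, data-driven order
        n = ready.popleft()
        args = [values[i] for i in edges.get(n, [])]
        values[n] = nodes[n](*args)
        for c in consumers[n]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    return values

nodes = {"x": lambda: 2.0, "y": lambda: 3.0,
         "mul": lambda a, b: a * b, "add": lambda a, b: a + b}
edges = {"mul": ["x", "y"], "add": ["mul", "y"]}
print(run_graph(nodes, edges, {})["add"])  # 2*3 + 3 → 9.0
```

    In a real system like TensorFlow the same ready-set scheduling is what lets independent subgraphs run concurrently on different devices.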

  7. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
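
    Why implicit time integration filters the fast modes can be seen from the amplification factor of a single oscillatory mode dy/dt = iωy: explicit Euler amplifies it at any step size, while backward (implicit) Euler damps it, permitting the long climate-scale time steps the LSG model uses. A one-screen check of this standard result:

```python
# Amplification factors for a fast mode dy/dt = i*omega*y:
# explicit Euler:  y_{n+1} = (1 + i*omega*dt) * y_n
# implicit Euler:  y_{n+1} = y_n / (1 - i*omega*dt)
omega, dt = 10.0, 0.5              # fast gravity-mode analogue, large step

explicit_gain = abs(1.0 + 1j * omega * dt)
implicit_gain = abs(1.0 / (1.0 - 1j * omega * dt))

# Explicit Euler blows up; implicit Euler damps (filters) the mode.
print(explicit_gain > 1.0, implicit_gain < 1.0)  # → True True
```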

  8. Thesaurus-based search in large heterogeneous collections

    NARCIS (Netherlands)

    Wielemaker, J.; Hildebrand, M.; van Ossenbruggen, J.; Schreiber, G.

    2008-01-01

    In cultural heritage, large virtual collections are coming into existence. Such collections contain heterogeneous sets of metadata and vocabulary concepts, originating from multiple sources. In the context of the E-Culture demonstrator we have shown earlier that such virtual collections can be

  9. Integration, Provenance, and Temporal Queries for Large-Scale Knowledge Bases

    OpenAIRE

    Gao, Shi

    2016-01-01

    Knowledge bases that summarize web information in RDF triples deliver many benefits, including support for natural language question answering and powerful structured queries that extract encyclopedic knowledge via SPARQL. Large scale knowledge bases grow rapidly in terms of scale and significance, and undergo frequent changes in both schema and content. Two critical problems have thus emerged: (i) how to support temporal queries that explore the history of knowledge bases or flash-back to th...

  10. Seismic Modeling Of Reservoir Heterogeneity Scales: An Application To Gas Hydrate Reservoirs

    Science.gov (United States)

    Huang, J.; Bellefleur, G.; Milkereit, B.

    2008-12-01

    Natural gas hydrates, a type of inclusion compound or clathrate, are composed of gas molecules trapped within a cage of water molecules. The occurrence of gas hydrates in permafrost regions has been confirmed by core samples recovered from the Mallik gas hydrate research wells located within the Mackenzie Delta in the Northwest Territories of Canada. Strong vertical variations of compressional and shear sonic velocities and weak surface seismic expressions of gas hydrates indicate that lithological heterogeneities control the distribution of hydrates. Seismic scattering studies predict that typical scales and strong physical contrasts due to gas hydrate concentration will generate strong forward scattering, leaving only weak energy captured by surface receivers. In order to understand the distribution of hydrates and the seismic scattering effects, an algorithm was developed to construct heterogeneous petrophysical reservoir models. The algorithm is based on well logs showing power-law features and Gaussian or non-Gaussian probability density distributions, and is designed to honor the full statistical features of the well logs, such as the characteristic scales and the correlation among rock parameters. Multi-dimensional and multi-variable heterogeneous models representing the same statistical properties were constructed and applied to the heterogeneity analysis of gas hydrate reservoirs. The petrophysical models provide the platform to estimate rock physics properties as well as to study the impact of seismic scattering, wave mode conversion, and their integration on wave behavior in heterogeneous reservoirs. Using the Biot-Gassmann theory, the statistical parameters obtained from Mallik 5L-38, and the correlation length estimated from acoustic impedance inversion, the gas hydrate volume fraction in the Mallik area was estimated to be 1.8%, corresponding to approximately 2×10^8 m^3 of natural gas stored in a hydrate-bearing interval within 0.25 km^2 lateral extent and between 889 m and 1115 m depth
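
    The quoted gas volume is consistent with a back-of-envelope check, assuming the commonly cited expansion factor of roughly 164 m³ of gas (at surface conditions) per m³ of hydrate; that factor is an assumption here, not a value stated in the record:

```python
# Back-of-envelope check of the ~2 x 10^8 m^3 gas volume quoted above.
area = 0.25e6           # 0.25 km^2 lateral extent, in m^2
thickness = 1115 - 889  # hydrate-bearing interval, m
fraction = 0.018        # estimated hydrate volume fraction (1.8%)
expansion = 164.0       # m^3 gas per m^3 hydrate (assumed standard value)

gas = area * thickness * fraction * expansion
print(round(gas / 1e8, 1))  # → 1.7, i.e. roughly 2 x 10^8 m^3
```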

  11. Temporal integration and 1/f power scaling in a circuit model of cerebellar interneurons.

    Science.gov (United States)

    Maex, Reinoud; Gutkin, Boris

    2017-07-01

    Inhibitory interneurons interconnected via electrical and chemical (GABA-A receptor) synapses form extensive circuits in several brain regions. They are thought to be involved in timing and synchronization through fast feedforward control of principal neurons. Theoretical studies have shown, however, that whereas self-inhibition does indeed reduce response duration, lateral inhibition, in contrast, may generate slow response components through a process of gradual disinhibition. Here we simulated a circuit of interneurons (stellate and basket cells) of the molecular layer of the cerebellar cortex and observed circuit time constants that could rise, depending on parameter values, to >1 s. The integration time scaled both with the strength of inhibition, vanishing completely when inhibition was blocked, and with the average connection distance, which determined the balance between lateral and self-inhibition. Electrical synapses could further enhance the integration time by limiting heterogeneity among the interneurons and by introducing a slow capacitive current. The model can explain several observations, such as the slow time course of OFF-beam inhibition, the phase lag of interneurons during vestibular rotation, or the phase lead of Purkinje cells. Interestingly, the interneuron spike trains displayed power that scaled approximately as 1/f at low frequencies. In conclusion, stellate and basket cells in cerebellar cortex, and interneuron circuits in general, may not only provide fast inhibition to principal cells but also act as temporal integrators that build a very short-term memory. NEW & NOTEWORTHY The most common function attributed to inhibitory interneurons is feedforward control of principal neurons. In many brain regions, however, the interneurons are densely interconnected via both chemical and electrical synapses but the function of this coupling is largely unknown. Based on large-scale simulations of an interneuron circuit of cerebellar cortex, we

  12. Microstructure and nonlinear signatures of yielding in a heterogeneous colloidal gel under large amplitude oscillatory shear

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Juntae; Helgeson, Matthew E., E-mail: helgeson@engineering.ucsb.edu [Department of Chemical Engineering, University of California Santa Barbara, Santa Barbara, California 93106 (United States); Merger, Dimitri; Wilhelm, Manfred [Institute for Chemical Technology and Polymer Chemistry, Karlsruhe Institute of Technology, 76131 Karlsruhe (Germany)

    2014-09-01

    We investigate yielding in a colloidal gel that forms a heterogeneous structure, consisting of a two-phase bicontinuous network of colloid-rich domains of fractal clusters and colloid-poor domains. Combining large amplitude oscillatory shear measurements with simultaneous small and ultra-small angle neutron scattering (rheo-SANS/USANS), we characterize both the nonlinear mechanical processes and strain amplitude-dependent microstructure underlying yielding. We observe a broad, three-stage yielding process that evolves over an order of magnitude in strain amplitude between the onset of nonlinearity and flow. Analyzing the intracycle response as a sequence of physical processes reveals a transition from elastic straining to elastoplastic thinning (which dominates in region I) and eventually yielding (which evolves through region II) and flow (which saturates in region III), and allows quantification of instantaneous nonlinear parameters associated with yielding. These measures exhibit significant strain rate amplitude dependence above a characteristic frequency, which we argue is governed by poroelastic effects. Correlating these results with time-averaged rheo-USANS measurements reveals that the material passes through a cascade of structural breakdown from large to progressively smaller length scales. In region I, compression of the fractal domains leads to the formation of large voids. In regions II and III, cluster-cluster correlations become increasingly homogeneous, suggesting breakage and eventually depercolation of intercluster bonds at the yield point. All significant structural changes occur on the micron-scale, suggesting that large-scale rearrangements of hundreds or thousands of particles, rather than the homogeneous rearrangement of particle-particle bonds, dominate the initial yielding of heterogeneous colloidal gels.

  13. Operation Modeling of Power Systems Integrated with Large-Scale New Energy Power Sources

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-10-01

    In most current methods of probabilistic power system production simulation, the output characteristics of new energy power generation (NEPG) have not been comprehensively considered. In this paper, the power output characteristics of wind power generation and photovoltaic power generation are first analyzed based on statistical methods applied to their historical operating data. Then the characteristic indexes and the filtering principle of the NEPG historical output scenarios are introduced with the confidence level, and a calculation model of NEPG's credible capacity is proposed. On this basis, taking the minimum production cost or the best energy-saving and emission-reduction effect as the optimization objective, a power system operation model with large-scale integration of NEPG is established considering the power balance, the electricity balance and the peak balance. In addition, constraints on the operating characteristics of the different generation types, the maintenance schedule, the load reserve, the emergency reserve, water abandonment and the transmission capacity between areas are also considered. With the proposed model, operation simulations are carried out on the actual Northwest power grid of China, resolving new energy power accommodation under different system operating conditions. The simulation results verify the validity of the proposed operation model for accommodation analysis of a power system penetrated with large-scale NEPG.
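
    The cost-minimizing dispatch at the heart of such an operation model can be sketched, at its simplest, as merit-order dispatch: conventional units fill the load left after the (credible) new-energy output, cheapest first. The unit data below are invented for illustration and bear no relation to the paper's Northwest-grid case:

```python
# Minimal merit-order dispatch sketch (illustrative unit data).

def dispatch(load, ne_output, units):
    """units: list of (name, capacity_MW, cost_per_MWh), any order."""
    residual = max(load - ne_output, 0.0)   # load left for conventional units
    schedule, cost = {}, 0.0
    for name, cap, price in sorted(units, key=lambda u: u[2]):
        p = min(cap, residual)              # cheapest units dispatched first
        schedule[name] = p
        cost += p * price
        residual -= p
    if residual > 1e-9:
        raise ValueError("insufficient capacity to balance load")
    return schedule, cost

units = [("coal", 800, 40.0), ("gas", 400, 70.0), ("hydro", 300, 10.0)]
sched, cost = dispatch(load=1200, ne_output=250, units=units)
print(sched, cost)  # hydro 300, coal 650, gas 0; cost 29000.0
```

    A full production-simulation model replaces this greedy rule with an optimization subject to reserve, maintenance and inter-area transmission constraints, as the abstract describes.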

  14. Large-scale offshore wind energy. Cost analysis and integration in the Dutch electricity market

    International Nuclear Information System (INIS)

    De Noord, M.

    1999-02-01

    The results of an analysis of the construction and integration costs of large-scale offshore wind energy (OWE) farms in 2010 are presented. The integration of these farms (1 and 5 GW) in the Dutch electricity distribution system has been examined against the background of a liberalised electricity market. A first step is taken towards determining the costs involved in solving integration problems. Three different types of foundations are examined: the mono-pile, the jacket and a new type of foundation, the concrete caisson pile, all single-turbine-single-support structures. For real offshore applications (>10 km offshore, at sea depths >20 m), the concrete caisson pile is regarded as the most suitable. The price/power ratios of wind turbines are analysed. It is assumed that in 2010 turbines in the power range of 3-5 MW are available. The main calculations have been conducted for a 3 MW turbine. The main choice in electrical infrastructure is between AC and DC. Calculations show that at distances of 30 km offshore and more, the use of HVDC will result in higher initial costs but lower operating costs. The share of operation and maintenance (O&M) costs in the kWh cost price is approximately 3.3%. To be able to compare the two farms, a base case is derived with a construction time of 10 years for both. The energy yield is calculated for an offshore wind regime with a 9.0 m/s annual mean wind speed. Per 3 MW turbine this results in an annual energy production of approximately 12 GWh. The total farm efficiency amounts to 82%, resulting in a total farm capacity factor of 38%. With a required internal rate of return of 15%, the kWh cost price amounts to 0.24 DFl and 0.21 DFl for the 1 GW and 5 GW farms respectively in the base case. The required internal rate of return has a large effect on the kWh cost price, followed by the costs of subsystems. O&M costs have little effect on the cost price.
Parameter studies show that a small cost reduction of 5% is possible when
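
    The quoted capacity factor follows directly from the figures in the record: 12 GWh/yr from a 3 MW turbine gives a single-turbine capacity factor of about 46%, and multiplying by the 82% farm efficiency reproduces the ~38% farm value (0.37 here, matching 38% within the rounding of the "approximately 12 GWh" figure):

```python
# Check of the farm capacity factor quoted above.
turbine_mw = 3.0
annual_gwh = 12.0        # "approximately 12 GWh" per turbine per year
farm_efficiency = 0.82

hours_per_year = 8760
turbine_cf = annual_gwh * 1000 / (turbine_mw * hours_per_year)
farm_cf = turbine_cf * farm_efficiency

print(round(turbine_cf, 2), round(farm_cf, 2))  # → 0.46 0.37
```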

  15. A Large Scale Code Resolution Service Network in the Internet of Things

    Science.gov (United States)

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-01-01

    In the Internet of Things, a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large-scale application scenarios a code resolution service faces serious issues involving heterogeneity, big data and data ownership, and a code resolution service network is required to address them. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services, and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits the DHT's advantages but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207
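
    The way a DHT-style overlay partitions a uniform code namespace among resolution services can be sketched with a plain consistent-hash ring. This is a generic illustration only: SkipNet additionally preserves name locality and administrative control, which a plain hash ring does not, and the resolver names below are invented:

```python
# Consistent-hash ring: each code hashes onto a ring and is owned by
# the first resolver clockwise. A generic DHT sketch, not SkipNet-OCRS.
import hashlib
from bisect import bisect

def h(key):
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % 2**32

class HashRing:
    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)
        self.points = [p for p, _ in self.ring]

    def owner(self, code):
        i = bisect(self.points, h(code)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["resolver-a", "resolver-b", "resolver-c"])
codes = [f"urn:product:{i}" for i in range(1000)]
owners = [ring.owner(c) for c in codes]

# Every code deterministically resolves to exactly one service.
print(all(o in {"resolver-a", "resolver-b", "resolver-c"} for o in owners))  # → True
```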

  16. A large scale code resolution service network in the Internet of Things.

    Science.gov (United States)

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-11-07

    In the Internet of Things, a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large-scale application scenarios a code resolution service faces serious issues involving heterogeneity, big data and data ownership, and a code resolution service network is required to address them. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services, and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits the DHT's advantages but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.

  17. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent the trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time step.
This approach
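The per-scale regression idea can be sketched with a toy additive multiresolution decomposition. Here a Haar-like smoothing cascade stands in for the paper's formal discrete wavelet analysis, and the sinusoidal predictor/predictand pair is a synthetic assumption, not the Seine or SLP data:

```python
import numpy as np

def multires(x, levels=5):
    """Additive multiresolution-style decomposition (smoothing cascade):
    x = d_1 + ... + d_L + a_L, with a_j a moving average at scale 2**j."""
    comps, approx = [], x.astype(float)
    for j in range(1, levels + 1):
        w = 2 ** j
        smooth = np.convolve(approx, np.ones(w) / w, mode='same')
        comps.append(approx - smooth)   # detail at scale 2**j
        approx = smooth
    comps.append(approx)                # residual trend
    return comps

# Hypothetical large-scale predictor x and local predictand y with a
# scale-dependent linear link (slow band amplified, fast band damped/inverted)
n = 1024
t = np.arange(n)
slow = np.sin(2 * np.pi * t / 128)
fast = np.sin(2 * np.pi * t / 8)
x = slow + fast
y = 2.0 * slow - 0.5 * fast

xc, yc = multires(x), multires(y)
# One least-squares slope per temporal scale, then recombine the predictions
yhat = sum((np.dot(xj, yj) / np.dot(xj, xj)) * xj for xj, yj in zip(xc, yc))
r = np.corrcoef(yhat, y)[0, 1]
```

Because the link changes sign between scales, a single global regression of y on x would fail, while the per-scale fit recovers the predictand well.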

  18. Vortex forcing model for turbulent flow over spanwise-heterogeneous topographies: scaling arguments and similarity solution

    Science.gov (United States)

    Anderson, William; Yang, Jianzhi

    2017-11-01

    Spanwise surface heterogeneity beneath high-Reynolds number, fully-rough wall turbulence is known to induce mean secondary flows in the form of counter-rotating streamwise vortices. The secondary flows are a manifestation of Prandtl's secondary flow of the second kind - driven and sustained by spatial heterogeneity of components of the turbulent (Reynolds averaged) stress tensor. The spacing between adjacent surface heterogeneities serves as a control on the spatial extent of the counter-rotating cells, while their intensity is controlled by the spanwise gradient in imposed drag (where larger gradients associated with more dramatic transitions in roughness induce stronger cells). In this work, we have performed an order of magnitude analysis of the mean (Reynolds averaged) streamwise vorticity transport equation, revealing the scaling dependence of circulation upon spanwise spacing. The scaling arguments are supported by simulation data. Then, we demonstrate that mean streamwise velocity can be predicted a priori via a similarity solution to the mean streamwise vorticity transport equation. A vortex forcing term was used to represent the effects of spanwise topographic heterogeneity within the flow. Efficacy of the vortex forcing term was established with large-eddy simulation cases, wherein vortex forcing model parameters were altered to capture different values of spanwise spacing.
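The starting point of such an order-of-magnitude analysis is, schematically, the Reynolds-averaged transport equation for mean streamwise vorticity. The form below is a standard textbook statement of Prandtl's secondary flow of the second kind, not necessarily the paper's exact formulation:

```latex
\bar{u}_j \frac{\partial \bar{\Omega}_x}{\partial x_j}
= \underbrace{\frac{\partial^2}{\partial y\,\partial z}\!\left(\overline{v'^2}-\overline{w'^2}\right)
+ \left(\frac{\partial^2}{\partial z^2}-\frac{\partial^2}{\partial y^2}\right)\overline{v'w'}}_{\text{production by Reynolds-stress heterogeneity}}
\;+\; \nu \nabla^2 \bar{\Omega}_x
```

Balancing the production term against the transport terms over a spanwise length comparable to the heterogeneity spacing is what yields a scaling estimate for the circulation of the secondary cells as a function of that spacing.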

  19. Quantifying seismic anisotropy induced by small-scale chemical heterogeneities

    Science.gov (United States)

    Alder, C.; Bodin, T.; Ricard, Y.; Capdeville, Y.; Debayle, E.; Montagner, J. P.

    2017-12-01

    Observations of seismic anisotropy are usually used as a proxy for lattice-preferred orientation (LPO) of anisotropic minerals in the Earth's mantle. In this way, seismic anisotropy observed in tomographic models provides important constraints on the geometry of mantle deformation associated with thermal convection and plate tectonics. However, in addition to LPO, small-scale heterogeneities that cannot be resolved by long-period seismic waves may also produce anisotropy. The observed (i.e. apparent) anisotropy is then a combination of an intrinsic and an extrinsic component. Assuming the Earth's mantle exhibits petrological inhomogeneities at all scales, tomographic models built from long-period seismic waves may thus display extrinsic anisotropy. In this paper, we investigate the relation between the amplitude of seismic heterogeneities and the level of induced S-wave radial anisotropy as seen by long-period seismic waves. We generate simple 1-D and 2-D isotropic models that exhibit a power spectrum of heterogeneities similar to that expected for the Earth's mantle, that is, varying as 1/k, with k the wavenumber of the heterogeneities. The 1-D toy models correspond to simple layered media. In the 2-D case, our models depict marble-cake patterns in which an anomaly in shear wave velocity has been advected within convective cells. The long-wavelength equivalents of these models are computed using upscaling relations that link properties of a rapidly varying elastic medium to properties of the effective, that is, apparent, medium as seen by long-period waves. The resulting homogenized media exhibit extrinsic anisotropy and represent what would be observed in tomography. In the 1-D case, we analytically show that the level of anisotropy increases with the square of the amplitude of heterogeneities. This relation is numerically verified for both 1-D and 2-D media. In addition, we predict that 10 per cent of chemical heterogeneities in 2-D marble-cake models can
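The quadratic dependence in the 1-D layered case can be checked with a minimal numerical sketch using Backus-style long-wavelength averaging of a binary stack of isotropic layers (the alternating ±a shear-modulus perturbation is an illustrative choice, not the paper's model):

```python
import numpy as np

def radial_s_anisotropy(mu):
    # Long-wavelength (Backus) equivalent of an isotropic layer stack:
    # N = <mu> governs horizontally travelling SH waves, L = <1/mu>^-1 vertical
    # ones; xi = N/L >= 1 quantifies extrinsic radial S-wave anisotropy.
    return mu.mean() * (1.0 / mu).mean()

layers = np.where(np.arange(4096) % 2 == 0, 1.0, -1.0)   # alternating medium
small = radial_s_anisotropy(1.0 + 0.05 * layers) - 1.0   # amplitude a
large = radial_s_anisotropy(1.0 + 0.10 * layers) - 1.0   # amplitude 2a
ratio = large / small   # ~4: anisotropy grows with the square of the amplitude
```

For small perturbations mu = 1 ± a, xi - 1 = a²/(1 - a²) ≈ a², so doubling the heterogeneity amplitude roughly quadruples the apparent anisotropy.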

  20. Large Scale Functional Brain Networks Underlying Temporal Integration of Audio-Visual Speech Perception: An EEG Study.

    Science.gov (United States)

    Kumar, G Vinodh; Halder, Tamesh; Jaiswal, Amit K; Mukherjee, Abhishek; Roy, Dipanjan; Banerjee, Arpan

    2016-01-01

    Observable lip movements of the speaker influence perception of auditory speech. A classical example of this influence is reported by listeners who perceive an illusory (cross-modal) speech sound (McGurk-effect) when presented with incongruent audio-visual (AV) speech stimuli. Recent neuroimaging studies of AV speech perception emphasize the role of frontal, parietal, and the integrative brain sites in the vicinity of the superior temporal sulcus (STS) for multisensory speech perception. However, whether and how the network across the whole brain participates in multisensory perception processing remains an open question. We posit that large-scale functional connectivity among neural populations situated in distributed brain sites may provide valuable insights into the processing and fusion of AV speech. Varying the psychophysical parameters in tandem with electroencephalogram (EEG) recordings, we exploited the trial-by-trial perceptual variability of incongruent audio-visual (AV) speech stimuli to identify the characteristics of the large-scale cortical network that facilitates multisensory perception during synchronous and asynchronous AV speech. We evaluated the spectral landscape of EEG signals during multisensory speech perception at varying AV lags. Functional connectivity dynamics for all sensor pairs were computed using the time-frequency global coherence, the vector sum of pairwise coherence changes over time. During synchronous AV speech, we observed enhanced global gamma-band coherence and decreased alpha and beta-band coherence underlying cross-modal (illusory) perception compared to unisensory perception around a temporal window of 300-600 ms following onset of stimuli. During asynchronous speech stimuli, a global broadband coherence was observed during cross-modal perception at earlier times, along with pre-stimulus decreases of lower frequency power, e.g., alpha rhythms for positive AV lags and theta rhythms for negative AV lags.
Thus, our
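The sensor-level computation can be sketched crudely by aggregating pairwise magnitude-squared coherence across channels. The 40 Hz synthetic "gamma" source and three-channel montage are illustrative assumptions, and this Welch-based average is a simplification of the paper's time-frequency global coherence:

```python
import numpy as np
from itertools import combinations
from scipy.signal import coherence

fs, n = 256.0, 4096
t = np.arange(n) / fs
rng = np.random.default_rng(1)
source = np.sin(2 * np.pi * 40.0 * t)                    # shared gamma rhythm
chans = [source + rng.normal(size=n) for _ in range(3)]  # three noisy sensors

# Aggregate pairwise magnitude-squared coherence into one "global" profile
gc = None
for xa, xb in combinations(chans, 2):
    f, cxy = coherence(xa, xb, fs=fs, nperseg=256)
    gc = cxy if gc is None else gc + cxy
gc /= 3.0                                                # number of pairs

peak = gc[np.argmin(np.abs(f - 40.0))]                   # at the shared rhythm
off = gc[np.argmin(np.abs(f - 100.0))]                   # away from it
```

The shared oscillation produces high coherence only in its own band, which is the kind of band-limited global-coherence enhancement the study reports.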

  1. Heterogeneous slip and rupture models of the San Andreas fault zone based upon three-dimensional earthquake tomography

    Energy Technology Data Exchange (ETDEWEB)

    Foxall, William [Univ. of California, Berkeley, CA (United States)

    1992-11-01

    Crustal fault zones exhibit spatially heterogeneous slip behavior at all scales, slip being partitioned between stable frictional sliding, or fault creep, and unstable earthquake rupture. An understanding of the mechanisms underlying slip segmentation is fundamental to research into fault dynamics and the physics of earthquake generation. This thesis investigates the influence that large-scale along-strike heterogeneity in fault zone lithology has on slip segmentation. Large-scale transitions from the stable block sliding of the Central Creeping Section of the San Andreas fault to the locked 1906 and 1857 earthquake segments take place along the Loma Prieta and Parkfield sections of the fault, respectively, the transitions being accomplished in part by the generation of earthquakes in the magnitude range 6 (Parkfield) to 7 (Loma Prieta). Information on sub-surface lithology interpreted from the Loma Prieta and Parkfield three-dimensional crustal velocity models computed by Michelini (1991) is integrated with information on slip behavior provided by the distributions of earthquakes located using the three-dimensional models and by surface creep data to study the relationships between large-scale lithological heterogeneity and slip segmentation along these two sections of the fault zone.

  2. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-01-01

    Our multi-user system seamlessly integrates a diverse set of tools. It provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large volumes interactively

  3. Generic evolution of mixing in heterogeneous media

    Science.gov (United States)

    De Dreuzy, J.; Carrera, J.; Dentz, M.; Le Borgne, T.

    2011-12-01

    Mixing in heterogeneous media results from the competition between flow fluctuations and local-scale diffusion. Flow fluctuations quickly create concentration contrasts, and thus heterogeneity of the concentration field, which is slowly homogenized by local-scale diffusion. Mixing first deviates from the Gaussian mixing state, which represents the potential mixing induced by spreading, before eventually approaching it. This deviation fundamentally expresses the evolution of the interaction between spreading and local-scale diffusion. We characterize it by the ratio γ of the non-Gaussian to the Gaussian mixing states. We define the Gaussian mixing state as the integrated squared concentration of the Gaussian plume that has the same longitudinal dispersion as the real plume. The non-Gaussian mixing state is the difference between the overall mixing state, defined as the integrated squared concentration, and the Gaussian mixing state. The main advantage of this definition is to use the full knowledge previously acquired on dispersion for characterizing mixing even when the solute concentration field is highly non-Gaussian. Using high-precision numerical simulations, we show that γ quickly increases, peaks and slowly decreases. γ can be derived from two scales characterizing spreading and local mixing, at least for large flux-weighted solute injections into classical log-normal, Gaussian-correlated permeability fields. The spreading scale is directly related to the longitudinal dispersion. The local mixing scale is the largest scale over which solute concentrations can be considered locally uniform. More generally, beyond the characteristics of its maximum, γ turns out to have a highly generic scaling form. Its fast increase and slow decrease depend neither on the heterogeneity level, nor on the ratio of diffusion to advection, nor on the injection conditions. They might even not depend on the particularities of the flow fields, as the same generic features also prevail for
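The γ diagnostic can be illustrated in one dimension with plain numerics (the bimodal test plume below is an arbitrary stand-in for a poorly mixed concentration field, not one of the paper's simulated plumes):

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gamma_ratio(c):
    # gamma = (overall - Gaussian) / Gaussian mixing state, where the reference
    # Gaussian has the same mass and the same longitudinal dispersion as c.
    c = c / (c.sum() * dx)                   # normalize to unit mass
    mean = (x * c).sum() * dx
    var = ((x - mean) ** 2 * c).sum() * dx   # second centered spatial moment
    M = (c ** 2).sum() * dx                  # overall mixing state
    Mg = 1.0 / (2.0 * np.sqrt(np.pi * var))  # same for the equivalent Gaussian
    return (M - Mg) / Mg

g_gauss = gamma_ratio(np.exp(-x ** 2 / 2.0))
g_bimodal = gamma_ratio(np.exp(-(x - 3) ** 2 / 2) + np.exp(-(x + 3) ** 2 / 2))
```

A Gaussian plume gives γ ≈ 0, while a spread-out but poorly mixed bimodal plume is much more concentrated than its dispersion-equivalent Gaussian, so γ > 0.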

  4. GIGGLE: a search engine for large-scale integrated genome analysis.

    Science.gov (United States)

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
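Conceptually, the core operation is counting overlaps between a query interval and a sorted interval file. A minimal stand-in can be written with two sorted endpoint arrays and binary search; this is not GIGGLE's actual unified index, which is far more sophisticated, but it shows why counting can avoid touching individual intervals:

```python
from bisect import bisect_left, bisect_right

def count_overlaps(starts, ends, q_start, q_end):
    # Half-open intervals [s, e). An interval misses the query iff it ends at
    # or before q_start, or starts at or after q_end; everything else overlaps.
    return bisect_left(starts, q_end) - bisect_right(ends, q_start)

# One hypothetical "interval file"; sorted endpoints are precomputed once
peaks = [(100, 200), (150, 300), (400, 500)]
starts = sorted(s for s, _ in peaks)
ends = sorted(e for _, e in peaks)
hits = count_overlaps(starts, ends, 180, 450)
```

Ranking thousands of files then reduces to comparing each file's overlap count against what would be expected by chance.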

  5. An integration bridge for heterogeneous e-service environments

    OpenAIRE

    Baeta, Henrique Jorge Lourenço

    2012-01-01

    Dissertation submitted for the degree of Master in Electrical and Computer Engineering. Home automation has evolved from a single integration of services (provided by devices, equipment, etc.) in the environment to a broader integration of these core services with others (external to the environment) to create added-value services for home users. This presents a key challenge: how to integrate disparate and heterogeneous e-service networks. To address this, there already exist...

  6. Integrating transient heterogeneity of non-photochemical quenching in shade-grown heterobaric leaves of avocado (Persea americana L.): responses to CO2 concentration, stomatal occlusion, dehydration and relative humidity.

    Science.gov (United States)

    Takayama, Kotaro; King, Diana; Robinson, Sharon A; Osmond, Barry

    2013-11-01

    Long-lived shade leaves of avocado had extremely low rates of photosynthesis. Gas exchange measurements of photosynthesis were of limited use, so we resorted to Chl fluorescence imaging (CFI) and spot measurements to evaluate photosynthetic electron transport rates (ETRs) and non-photochemical quenching (NPQ). Imaging revealed a remarkable transient heterogeneity of NPQ during photosynthetic induction in these hypostomatous, heterobaric leaves, but was adequately integrated by spot measurements, despite long-lasting artifacts from repeated saturating flashes during assays. Major veins (mid-vein, first- and second-order veins) defined areas of more static large-scale heterogeneous NPQ, with more dynamic small-scale heterogeneity most strongly expressed in mesophyll cells between third- and fourth-order veins. Both responded to external CO2 concentration ([CO2]), occlusion of stomata with Vaseline™, leaf dehydration and relative humidity (RH). We interpreted these responses in terms of independent behavior of stomata in adjacent areoles that was largely expressed through CO2-limited photosynthesis. Heterogeneity was most pronounced and prolonged in the absence of net CO2 fixation in 100 p.p.m. [CO2] when respiratory and photorespiratory CO2 cycling constrained the inferred ETR to ~75% of values in 400 or 700 p.p.m. [CO2]. Likewise, sustained higher NPQ under Vaseline™, after dehydration or at low RH, also restricted ETR to ~75% of control values. Low NPQ in chloroplast-containing cells adjacent to major veins but remote from stomata suggested internal sources of high [CO2] in these tissues.

  7. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    The ever increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of the wind turbine continue to increase, so does the need to place these significantly large structures at off-shore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and electricity grid impact minimization, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing the risk of the investment as well as power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  8. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  9. Semantic Representation and Scale-Up of Integrated Air Traffic Management Data

    Science.gov (United States)

    Keller, Richard M.; Ranjan, Shubha; Wei, Mie; Eshow, Michelle

    2016-01-01

    Each day, the global air transportation industry generates a vast amount of heterogeneous data from air carriers, air traffic control providers, and secondary aviation entities handling baggage, ticketing, catering, fuel delivery, and other services. Generally, these data are stored in isolated data systems, separated from each other by significant political, regulatory, economic, and technological divides. These realities aside, integrating aviation data into a single, queryable, big data store could enable insights leading to major efficiency, safety, and cost advantages. In this paper, we describe an implemented system for combining heterogeneous air traffic management data using semantic integration techniques. The system transforms data from its original disparate source formats into a unified semantic representation within an ontology-based triple store. Our initial prototype stores only a small sliver of air traffic data covering one day of operations at a major airport. The paper also describes our analysis of difficulties ahead as we prepare to scale up data storage to accommodate successively larger quantities of data -- eventually covering all US commercial domestic flights over an extended multi-year timeframe. We review several approaches to mitigating scale-up related query performance concerns.
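A toy version of such an ontology-based triple store can convey the idea of a unified semantic representation over heterogeneous sources. The atm: terms and flight identifiers below are invented for illustration and are not the system's actual ontology:

```python
# Minimal in-memory triple store: every fact, whatever its source system,
# becomes a (subject, predicate, object) triple in one shared vocabulary.
triples = {
    ("flight:UA123", "rdf:type", "atm:Flight"),
    ("flight:UA123", "atm:departureAirport", "airport:SFO"),
    ("flight:UA123", "atm:delayMinutes", "42"),
    ("flight:DL456", "rdf:type", "atm:Flight"),
    ("flight:DL456", "atm:departureAirport", "airport:SFO"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts like a SPARQL variable."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

sfo_flights = sorted(t[0] for t in query(p="atm:departureAirport",
                                         o="airport:SFO"))
```

Once ticketing, surveillance and weather records share one triple vocabulary, a single pattern query can span what were previously isolated databases; scaling this to years of nationwide traffic is exactly the challenge the paper analyzes.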

  10. Reorganizing Complex Network to Improve Large-Scale Multiagent Teamwork

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2014-01-01

    Large-scale multiagent teamwork has become popular in various domains. As in human society's infrastructure, agents coordinate with only some of the others, through a peer-to-peer complex network structure. Their organization has been proven to be a key factor influencing their performance. To improve team performance, we identified three key factors. First, complex network effects may be able to promote team performance. Second, coordination interactions originating at their sources must be routed to capable agents. Although they can be transferred across the network via different paths, their sources and sinks depend on the intrinsic nature of the team, which is irrelevant to the network connections. In addition, the agents involved in the same plan often form a subteam and communicate with each other more frequently. Therefore, if the interactions between agents can be statistically recorded, we are able to set up an integrated network adjustment algorithm combining the three key factors. Based on our abstracted teamwork simulations and the coordination statistics, we implemented the adaptive reorganization algorithm. The experimental results support our design: the reorganized network is more capable of coordinating heterogeneous agents.

  11. Integrating heterogeneous databases in clustered medical care environments using object-oriented technology

    Science.gov (United States)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side-effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.

  12. 8th international workshop on large-scale integration of wind power into power systems as well as on transmission networks for offshore wind farms. Proceedings

    International Nuclear Information System (INIS)

    Betancourt, Uta; Ackermann, Thomas

    2009-01-01

    Within the 8th International Workshop on Large-Scale Integration of Wind Power into Power Systems as well as on Transmission Networks for Offshore Wind Farms, held 14-15 October 2009 in Bremen (Federal Republic of Germany), lectures and posters were presented in the following sessions: (1) Keynote session and panel; (2) Grid integration studies and experience: Europe; (3) Connection of offshore wind farms; (4) Wind forecast; (5) High voltage direct current (HVDC); (6) German grid code issues; (7) Offshore grid connection; (8) Grid integration studies and experience: North America; (9) SUPWIND - Decision support tools for large scale integration of wind; (10) Windgrid - Wind on the grid: An integrated approach; (11) IEA Task 25; (12) Grid code issues; (13) Market Issues; (14) Offshore Grid; (15) Modelling; (16) Wind power and storage; (17) Power system balancing; (18) Wind turbine performance; (19) Modelling and offshore transformer.

  13. GIGGLE: a search engine for large-scale integrated genome analysis

    Science.gov (United States)

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061

  14. Thin Film Magnetless Faraday Rotators for Compact Heterogeneous Integrated Optical Isolators (Postprint)

    Science.gov (United States)

    2017-06-15

    Interim report AFRL-RX-WP-JA-2017-0348 (reporting period 9 May 2016 - 1 December 2016), by Dolendra Karki et al. The work demonstrates transfer of ultra-compact thin-film magnetless Faraday rotators to silicon photonic substrates. Thin films of magnetization latching bismuth

  15. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula

    2008-01-01

    The rapidly increasing amount of publicly available knowledge in biology and chemistry enables scientists to revisit many open problems by the systematic integration and analysis of heterogeneous novel data. The integration of relevant data not only allows analyses at the network level, but a

  16. Integrated reservoir characterization: Improvement in heterogeneities stochastic modelling by integration of additional external constraints

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, B.; Eschard, R. [Institut Francais du Petrole, Rueil Malmaison (France)]; Geffroy, F. [Centre de Geostatistique, Fontainebleau (France)]; and others

    1997-08-01

    The classical approach to constructing reservoir models is to start with a fine-scale geological model informed with petrophysical properties. Scaling-up techniques then yield a reservoir model compatible with fluid flow simulators. Geostatistical modelling techniques are widely used to build the geological models before scaling-up. These methods provide equiprobable images of the area under investigation, which honor the well data and whose variability matches that computed from the data. At an appraisal phase, when few data are available, or when the wells are insufficient to describe all the heterogeneities and the behavior of the field, additional constraints are needed to obtain a more realistic geological model. For example, seismic data or stratigraphic models can provide average reservoir information with excellent areal coverage, but with poor vertical resolution. New advances in modelling techniques now make it possible to integrate this type of additional external information in order to constrain the simulations. In particular, 2D or 3D seismic-derived information grids, or sand-shale ratio maps coming from stratigraphic models, can be used as external drifts to compute the geological image of the reservoir at the fine scale. Examples are presented to illustrate the use of these new tools, their impact on the final reservoir model, and their sensitivity to some key parameters.

  17. Changing the scale of hydrogeophysical aquifer heterogeneity characterization

    Science.gov (United States)

    Paradis, Daniel; Tremblay, Laurie; Ruggeri, Paolo; Brunet, Patrick; Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Holliger, Klaus; Irving, James; Molson, John; Lefebvre, Rene

    2015-04-01

    Contaminant remediation and management require the quantitative predictive capabilities of groundwater flow and mass transport numerical models. Such models have to encompass source zones and receptors, and thus typically cover several square kilometers. To predict the path and fate of contaminant plumes, these models have to represent the heterogeneous distribution of hydraulic conductivity (K). However, hydrogeophysics has generally been used to image relatively restricted areas of the subsurface (small fractions of km2), so there is a need for approaches defining heterogeneity at larger scales and providing data to constrain conceptual and numerical models of aquifer systems. This communication describes a workflow defining aquifer heterogeneity that was applied over a 12 km2 sub-watershed surrounding a decommissioned landfill emitting landfill leachate. The aquifer is a shallow, 10 to 20 m thick, highly heterogeneous and anisotropic assemblage of littoral sand and silt. Field work involved the acquisition of a broad range of data: geological, hydraulic, geophysical, and geochemical. The emphasis was put on high resolution and continuous hydrogeophysical data, the use of direct-push fully-screened wells and the acquisition of targeted high-resolution hydraulic data covering the range of observed aquifer materials. The main methods were: 1) surface geophysics (ground-penetrating radar and electrical resistivity); 2) direct-push operations with a geotechnical drilling rig (cone penetration tests with soil moisture resistivity CPT/SMR; full-screen well installation); and 3) borehole operations, including high-resolution hydraulic tests and geochemical sampling. New methods were developed to acquire high vertical resolution hydraulic data in direct-push wells, including both vertical and horizontal K (Kv and Kh). Various data integration approaches were used to represent aquifer properties in 1D, 2D and 3D. Using relevance vector machines (RVM), the mechanical and

  18. Large-scale complementary macroelectronics using hybrid integration of carbon nanotubes and IGZO thin-film transistors.

    Science.gov (United States)

    Chen, Haitian; Cao, Yu; Zhang, Jialu; Zhou, Chongwu

    2014-06-13

    Carbon nanotubes and metal oxide semiconductors have emerged as important materials for p-type and n-type thin-film transistors, respectively; however, realizing sophisticated macroelectronics operating in complementary mode has been challenging due to the difficulty in making n-type carbon nanotube transistors and p-type metal oxide transistors. Here we report a hybrid integration of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors to achieve large-scale (>1,000 transistors for 501-stage ring oscillators) complementary macroelectronic circuits on both rigid and flexible substrates. This approach of hybrid integration allows us to combine the strength of p-type carbon nanotube and n-type indium-gallium-zinc-oxide thin-film transistors, and offers high device yield and low device variation. Based on this approach, we report the successful demonstration of various logic gates (inverter, NAND and NOR gates), ring oscillators (from 51 stages to 501 stages) and dynamic logic circuits (dynamic inverter, NAND and NOR gates).

  19. Scale-specific correlations between habitat heterogeneity and soil fauna diversity along a landscape structure gradient.

    Science.gov (United States)

    Vanbergen, Adam J; Watt, Allan D; Mitchell, Ruth; Truscott, Anne-Marie; Palmer, Stephen C F; Ivits, Eva; Eggleton, Paul; Jones, T Hefin; Sousa, José Paulo

    2007-09-01

    Habitat heterogeneity contributes to the maintenance of diversity, but the extent to which landscape-scale rather than local-scale heterogeneity influences the diversity of soil invertebrates (species with small range sizes) is less clear. Using a Scottish habitat heterogeneity gradient, we correlated Collembola and lumbricid worm species richness and abundance with different elements (forest cover, habitat richness and patchiness) and qualities (plant species richness, soil variables) of habitat heterogeneity, at landscape (1 km(2)) and local (up to 200 m(2)) scales. Soil fauna assemblages showed considerable turnover in species composition along this habitat heterogeneity gradient. Soil fauna species richness and turnover were greatest in landscapes that were a mosaic of habitats. Soil fauna diversity was hump-shaped along a gradient of forest cover, peaking where there was a mixture of forest and open habitats in the landscape. Landscape-scale habitat richness was positively correlated with lumbricid diversity, while Collembola and lumbricid abundances were, respectively, negatively and positively related to landscape spatial patchiness. Furthermore, soil fauna diversity was positively correlated with plant diversity, which in turn peaked in the sites that were a mosaic of forest and open habitat patches. There was less evidence that local-scale habitat variables (habitat richness, tree cover, plant species richness, litter cover, soil pH, depth of organic horizon) affected soil fauna diversity: Collembola diversity was independent of all these measures, while lumbricid diversity correlated positively with vascular plant species richness and negatively with tree canopy density. Landscape-scale habitat heterogeneity affects soil diversity regardless of taxon, while the influence of habitat heterogeneity at local scales depends on taxon identity, and hence on ecological traits, e.g. body size. Landscape-scale habitat heterogeneity, by providing different niches and refuges, together

  20. Longitudinal heterogeneity of flow and heat fluxes in a large lowland river: A study of the San Joaquin River, CA, USA during a large-scale flow experiment

    Science.gov (United States)

    Bray, E. N.; Dunne, T.; Dozier, J.

    2011-12-01

    Systematic downstream variation of channel characteristics, scaled by flow, affects the transport and distribution of heat throughout a large river. As water moves through a river channel, streamflow and velocity may fluctuate by orders of magnitude, primarily due to channel geometry, slope and resistance to flow, and the time scales of those fluctuations range from days to decades (Constantz et al., 1994; Lundquist and Cayan, 2002; McKerchar and Henderson, 2003). It is well understood that the heat budget of a river is primarily governed by surface exchanges, with the most significant surface flux coming from net shortwave radiation. The absorption of radiation at a given point in a river is determined by the wavelength-dependent index of refraction, expressed by the angle of refraction, and the optical depth as a function of physical depth and the absorption coefficient (Dozier, 1980). Few studies consider the influence of hydrologic alteration on the optical properties governing net radiative heat transfer in a large lowland river, yet it is the most significant component of the heat budget and decisive for a river's thermal regime. We seek a physically based model without calibration to incorporate scale-dependent physical processes governing heat and flow dynamics in large rivers, how they change across the longitudinal profile, and how they change under different flow regimes. Longitudinal flow and heat flux analyses require synoptic flow time series from multiple sites along rivers, and few hydrometric networks meet this requirement (Larned et al., 2011). We model the energy budget in a regulated 240-km mainstem reach of the San Joaquin River, California, USA, equipped with multiple gaging stations from Friant Dam to its confluence with the Merced River during a large-scale flow experiment.
We use detailed hydroclimatic observations distributed across the longitudinal gradient creating a non-replicable field experiment of heat fluxes across a range of flow regime
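The absorption mechanism the record describes (refraction at the air-water interface setting the slant path, then exponential attenuation with optical depth) can be sketched as a simple Beer-Lambert calculation. This is an illustrative sketch only: the function name `absorbed_fraction`, the bulk absorption coefficient and the incidence angle are assumptions, not values from the study, and a real model would use wavelength-dependent coefficients and surface reflectance.

```python
import math

def absorbed_fraction(depth_m, k_abs, theta_inc_deg, n_water=1.33):
    """Fraction of penetrating shortwave radiation absorbed by depth_m.

    Illustrative Beer-Lambert sketch: refraction at the surface (Snell's
    law) sets the slant path through the water column; attenuation then
    follows exp(-k * path length). k_abs is a bulk absorption
    coefficient (1/m) standing in for the wavelength-dependent one.
    """
    theta_i = math.radians(theta_inc_deg)
    # Snell's law: sin(theta_r) = sin(theta_i) / n_water
    theta_r = math.asin(math.sin(theta_i) / n_water)
    optical_depth = k_abs * depth_m / math.cos(theta_r)
    return 1.0 - math.exp(-optical_depth)

# Deeper water absorbs a larger fraction of the penetrating beam.
shallow = absorbed_fraction(depth_m=0.5, k_abs=0.8, theta_inc_deg=40.0)
deep = absorbed_fraction(depth_m=2.0, k_abs=0.8, theta_inc_deg=40.0)
```

Because flow regime sets the depth (and hence the optical depth), a change in managed releases changes the absorbed fraction directly, which is the coupling the study exploits.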

  1. Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.

    Science.gov (United States)

    Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'- 45°7' N; 5°3'W - 28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contradicted both the traditional view, based on the hump-shaped theory for bathymetric patterns, and the commonly admitted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology, which is based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness.
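The complementarity/redundancy question can be made concrete with a toy calculation of three common diversity indices: two samples can tie on one index yet rank differently on the others. The species counts below are invented for illustration and are not from the trawl data:

```python
import math

def richness(counts):
    """Species richness: number of species present."""
    return sum(1 for c in counts if c > 0)

def shannon(counts):
    """Shannon entropy H' = -sum(p * ln p)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def simpson(counts):
    """Gini-Simpson diversity 1 - sum(p^2)."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

# Two hypothetical hauls with identical richness but different evenness:
# richness cannot separate them, while Shannon and Simpson can, which is
# why a battery of indices may carry complementary information.
even = [10, 10, 10, 10]
skewed = [37, 1, 1, 1]
```

Running the three indices on `even` and `skewed` shows equal richness but strictly higher Shannon and Simpson values for the even haul.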

  2. InP-DHBT-on-BiCMOS technology with fT/fmax of 400/350 GHz for heterogeneous integrated millimeter-wave sources

    DEFF Research Database (Denmark)

    Kraemer, Tomas; Ostermay, Ina; Jensen, Thomas

    2013-01-01

This paper presents a novel InP-SiGe BiCMOS technology using wafer-scale heterogeneous integration. The vertical stacking of the InP double heterojunction bipolar transistor (DHBT) circuitry directly on top of the BiCMOS wafer enables ultra-broadband interconnects with ...-100 GHz. The 0.8 × 5 μm2 InP DHBTs show fT/fmax of 400/350 GHz with an output power of more than 26 mW at 96 GHz. These are record values for a heterogeneously integrated transistor on silicon. As a circuit example, a 164-GHz signal source is presented. It features a voltage-controlled oscillator in Bi...

  3. Scaling Effects of Cr(VI) Reduction Kinetics. The Role of Geochemical Heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li [Pennsylvania State Univ., State College, PA (United States); Li, Li [Pennsylvania State Univ., State College, PA (United States)

    2015-10-22

The natural subsurface is highly heterogeneous, with minerals distributed in different spatial patterns. Fundamental understanding of how mineral spatial distribution patterns regulate sorption is important for predicting the transport and fate of chemicals. Existing studies of sorption were carried out in well-mixed batch reactors or uniformly packed columns, with few data available on the effects of spatial heterogeneities. As a result, there is a lack of data and understanding on how spatial heterogeneities control sorption processes. In this project, we aim to understand and develop modeling capabilities to predict the sorption of Cr(VI), an omnipresent contaminant in natural systems due to its natural occurrence and industrial utilization. We systematically examine the role of spatial patterns of illite, a common clay, in determining the extent of transport limitation and the scaling effects associated with Cr(VI) sorption capacity and kinetics, using column experiments and reactive transport modeling. Our results showed that sorbed mass and rates can differ by an order of magnitude because of the illite spatial heterogeneities and transport limitation. With constraints from data, we also developed capabilities for modeling Cr(VI) in heterogeneous media. The developed model is then utilized to understand the general principles that govern the relationship between sorption and connectivity, a key measure of spatial pattern characteristics. This correlation can be used to estimate Cr(VI) sorption characteristics in heterogeneous porous media. Insights gained here bridge gaps between laboratory and field application in the fields of hydrogeology and geochemistry, and advance predictive understanding of reactive transport processes in the natural heterogeneous subsurface. We believe that these findings will be of interest to a large number of environmental geochemists and engineers, hydrogeologists, and those interested in contaminant fate and transport.
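As a minimal sketch of the rate-limited sorption such column experiments probe, consider reversible first-order kinetics at constant aqueous concentration. The rate constants, and the idea of representing transport limitation as a reduced effective attachment rate, are illustrative assumptions, not the project's calibrated reactive transport model:

```python
import numpy as np

def sorbed_mass(t, c_aq, k_att, k_det, s0=0.0):
    """Reversible first-order sorption, dS/dt = k_att*C - k_det*S,
    solved analytically for constant aqueous concentration C.
    A toy stand-in for rate-limited Cr(VI) uptake on illite; a
    transport-limited zone is mimicked by a smaller effective k_att."""
    s_eq = k_att * c_aq / k_det          # equilibrium sorbed mass
    return s_eq + (s0 - s_eq) * np.exp(-k_det * t)

t = np.linspace(0.0, 10.0, 50)
well_mixed = sorbed_mass(t, c_aq=1.0, k_att=0.5, k_det=0.2)
# A transport-limited zone "sees" less solute: lower effective uptake.
limited = sorbed_mass(t, c_aq=1.0, k_att=0.05, k_det=0.2)
```

With a tenfold smaller attachment rate, the sorbed mass is an order of magnitude lower at every time, echoing the order-of-magnitude differences the experiments report.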

  4. Large scale healthcare data integration and analysis using the semantic web.

    Science.gov (United States)

    Timm, John; Renly, Sondra; Farkash, Ariel

    2011-01-01

Healthcare data interoperability can only be achieved when the semantics of the content is well defined and consistently implemented across heterogeneous data sources. Achieving these objectives of interoperability requires the collaboration of experts from several domains. This paper describes tooling that integrates Semantic Web technologies with common tools to facilitate cross-domain collaborative development for the purposes of data interoperability. Our approach is divided into stages of data harmonization and representation, model transformation, and instance generation. We demonstrated our approach on Hypergenes, an EU-funded project, where we applied our method to the Essential Hypertension disease model using a CDA template. Our domain expert partners include clinical providers, clinical domain researchers, healthcare information technology experts, and a variety of clinical data consumers. We show that bringing Semantic Web technologies into the healthcare interoperability toolkit increases opportunities for beneficial collaboration, thus improving patient care and clinical research outcomes.

  5. Principle of Parsimony, Fake Science, and Scales

    Science.gov (United States)

    Yeh, T. C. J.; Wan, L.; Wang, X. S.

    2017-12-01

Considering the difficulty of predicting the exact motions of water molecules, and the scale of our interest (the bulk behavior of many molecules), Fick's law (the diffusion concept) was created to predict the solute diffusion process in space and time. G.I. Taylor (1921) demonstrated that the random motion of molecules reaches the Fickian regime in less than a second if our sampling scale is large enough to reach the ergodic condition. Fick's law is widely accepted for describing molecular diffusion as such. This fits the definition of the parsimony principle at the scale of our concern. Similarly, the advection-dispersion or convection-dispersion equation (ADE or CDE) has been found quite satisfactory for analysis of concentration breakthroughs of solute transport in uniformly packed soil columns. This is attributed to the fact that the solute is often released over the entire cross-section of the column, which samples many pore-scale heterogeneities and meets the ergodicity assumption. Further, the uniformly packed column contains a large amount of stationary pore-size heterogeneity. The solute thus reaches the Fickian regime after traveling a short distance along the column. Moreover, breakthrough curves are concentrations integrated over the column cross-section (the scale of our interest), and they meet the ergodicity assumption embedded in the ADE and CDE. To the contrary, the scales of heterogeneity in most groundwater pollution problems evolve as contaminants travel. They are much larger than the scale of our observations and our interests, so that the ergodic and Fickian conditions are difficult to meet. Upscaling Fick's law for solute dispersion, and deriving universal rules of dispersion for field- or basin-scale pollution migration, are merely a misuse of the parsimony principle and lead to a fake science (i.e., the development of theories for predicting processes that cannot be observed). The appropriate principle of parsimony for these situations dictates mapping of large-scale
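For the uniformly packed column case, where the ADE is satisfactory, the breakthrough curve for a continuous injection has a classical closed form (the Ogata-Banks solution), which can be evaluated directly; the velocity, dispersion coefficient and observation point below are illustrative values, not data from any experiment:

```python
import math

def ade_breakthrough(x, t, v, D, c0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation
    for continuous injection at x = 0:
        dC/dt = D * d2C/dx2 - v * dC/dx,  C(0,t) = c0, C(x,0) = 0.
    Valid once pore-scale mixing has reached the Fickian regime."""
    if t <= 0:
        return 0.0
    s = 2.0 * math.sqrt(D * t)
    return 0.5 * c0 * (math.erfc((x - v * t) / s)
                       + math.exp(v * x / D) * math.erfc((x + v * t) / s))

# Concentration at x = 1 m rises toward c0 as the solute front passes
# (v in m/s, D in m^2/s; illustrative values only).
early = ade_breakthrough(x=1.0, t=5.0, v=0.1, D=0.01)
late = ade_breakthrough(x=1.0, t=50.0, v=0.1, D=0.01)
```

The sigmoidal rise of this curve is exactly the column-scale, cross-section-integrated behavior the abstract argues is ergodic; the point of the abstract is that no analogous closed form can be trusted at field or basin scale.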

  6. Optimal Siting and Sizing of Energy Storage System for Power Systems with Large-scale Wind Power Integration

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Huang, Shaojun

    2015-01-01

This paper proposes algorithms for optimal siting and sizing of Energy Storage System (ESS) for the operation planning of power systems with large scale wind power integration. The ESS in this study aims to mitigate the wind power fluctuations during the interval between two rolling Economic Dispatches (EDs) in order to maintain generation-load balance. The charging and discharging of ESS is optimized considering operation cost of conventional generators, capital cost of ESS and transmission losses. The statistics from simulated system operations are then coupled to the planning process to determine the...

  7. A Large Scale Code Resolution Service Network in the Internet of Things

    Directory of Open Access Journals (Sweden)

    Xiangzhan Yu

    2012-11-01

Full Text Available In the Internet of Things, a code resolution service provides a discovery mechanism for a requester to immediately obtain the information resources associated with a particular product code. In large-scale application scenarios a code resolution service faces serious issues involving heterogeneity, big data and data ownership, and a code resolution service network is required to address them. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT’s advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.

  8. Large epidemic thresholds emerge in heterogeneous networks of heterogeneous nodes

    Science.gov (United States)

    Yang, Hui; Tang, Ming; Gross, Thilo

    2015-08-01

    One of the famous results of network science states that networks with heterogeneous connectivity are more susceptible to epidemic spreading than their more homogeneous counterparts. In particular, in networks of identical nodes it has been shown that network heterogeneity, i.e. a broad degree distribution, can lower the epidemic threshold at which epidemics can invade the system. Network heterogeneity can thus allow diseases with lower transmission probabilities to persist and spread. However, it has been pointed out that networks in which the properties of nodes are intrinsically heterogeneous can be very resilient to disease spreading. Heterogeneity in structure can enhance or diminish the resilience of networks with heterogeneous nodes, depending on the correlations between the topological and intrinsic properties. Here, we consider a plausible scenario where people have intrinsic differences in susceptibility and adapt their social network structure to the presence of the disease. We show that the resilience of networks with heterogeneous connectivity can surpass those of networks with homogeneous connectivity. For epidemiology, this implies that network heterogeneity should not be studied in isolation, it is instead the heterogeneity of infection risk that determines the likelihood of outbreaks.
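The classic result for networks of identical nodes can be made concrete with the heterogeneous mean-field SIS epidemic threshold for uncorrelated networks, lambda_c = &lt;k&gt;/&lt;k^2&gt;. The two degree distributions below (a regular network versus a truncated power law) are illustrative stand-ins, not data from the paper:

```python
import numpy as np

def sis_threshold(k, p):
    """Heterogeneous mean-field SIS epidemic threshold for an
    uncorrelated network with degree distribution p(k):
        lambda_c = <k> / <k^2>.
    A broad degree distribution inflates <k^2> and lowers the
    threshold -- the classic result for networks of identical nodes."""
    k = np.asarray(k, dtype=float)
    p = np.asarray(p, dtype=float) / np.sum(p)
    return np.sum(k * p) / np.sum(k ** 2 * p)

k = np.arange(1, 1001)

# Homogeneous network: every node has exactly 6 contacts.
p_homog = (k == 6).astype(float)
lam_homog = sis_threshold(k, p_homog)          # exactly 6/36

# Heterogeneous network: truncated power law p(k) ~ k^(-2.5).
# Its mean degree is lower, yet its threshold is far lower still.
p_heter = k.astype(float) ** -2.5
lam_heter = sis_threshold(k, p_heter)
```

The abstract's point is that this comparison inverts once nodes themselves are heterogeneous and adapt their ties, so the structural threshold alone no longer predicts outbreak likelihood.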

  9. Large epidemic thresholds emerge in heterogeneous networks of heterogeneous nodes.

    Science.gov (United States)

    Yang, Hui; Tang, Ming; Gross, Thilo

    2015-08-21

    One of the famous results of network science states that networks with heterogeneous connectivity are more susceptible to epidemic spreading than their more homogeneous counterparts. In particular, in networks of identical nodes it has been shown that network heterogeneity, i.e. a broad degree distribution, can lower the epidemic threshold at which epidemics can invade the system. Network heterogeneity can thus allow diseases with lower transmission probabilities to persist and spread. However, it has been pointed out that networks in which the properties of nodes are intrinsically heterogeneous can be very resilient to disease spreading. Heterogeneity in structure can enhance or diminish the resilience of networks with heterogeneous nodes, depending on the correlations between the topological and intrinsic properties. Here, we consider a plausible scenario where people have intrinsic differences in susceptibility and adapt their social network structure to the presence of the disease. We show that the resilience of networks with heterogeneous connectivity can surpass those of networks with homogeneous connectivity. For epidemiology, this implies that network heterogeneity should not be studied in isolation, it is instead the heterogeneity of infection risk that determines the likelihood of outbreaks.

  10. The impacts of pore-scale physical and chemical heterogeneities on the transport of radionuclide-carrying colloids

    Energy Technology Data Exchange (ETDEWEB)

    WU, Ning

    2018-04-24

Independent of the method of nuclear waste disposal, the degradation of packaging materials could lead to mobilization and transport of radionuclides into the geosphere. This process can be significantly accelerated by the association of radionuclides with backfill materials or mobile colloids in groundwater. The transport of these colloids is complicated by the inherent coupling of physical and chemical heterogeneities (e.g., pore space geometry, grain size, charge heterogeneity, and surface hydrophobicity) in natural porous media, which can exist on the length scale of a few grains. In addition, natural colloids themselves are often heterogeneous in their surface properties (e.g., clay platelets possess opposite charges on the surface and along the rim). Both physical and chemical heterogeneities influence the transport and retention of radionuclides under various groundwater conditions. However, the precise mechanisms by which these coupled heterogeneities influence colloidal transport remain largely elusive. This knowledge gap is a major source of uncertainty in developing accurate models to represent the transport process and to predict the distribution of radionuclides in the geosphere.

  11. Integrated modeling and up-scaling of landfill processes and heterogeneity using stochastic approach

    NARCIS (Netherlands)

    Bun, A.; Heimovaara, T.J.; Baviskar, S.M.; van Turnhout, A.G.; Konstantaki, L.A.

    2012-01-01

Municipal solid waste landfills are very complex and heterogeneous systems. The waste in a landfill body is a heterogeneous mixture of a wide range of materials containing high levels of organic matter, high amounts of salts and a wide range of different organic and inorganic substances, such as

  12. Energy modeling and analysis for optimal grid integration of large-scale variable renewables using hydrogen storage in Japan

    International Nuclear Information System (INIS)

    Komiyama, Ryoichi; Otsuki, Takashi; Fujii, Yasumasa

    2015-01-01

Although the extensive introduction of VRs (variable renewables) will play an essential role in resolving energy and environmental issues in Japan after the Fukushima nuclear accident, their large-scale integration poses a technical challenge for grid management; as one technical countermeasure, hydrogen storage, along with rechargeable batteries, receives much attention for controlling the intermittency of VR power output. For properly planning renewable energy policies, energy system modeling is important to quantitatively and qualitatively understand their potential benefits and impacts. This paper analyzes the optimal grid integration of large-scale VRs using hydrogen storage in Japan by developing a high time-resolution optimal power generation mix model. Simulation results suggest that the installation of hydrogen storage is promoted by both its cost reduction and CO2 regulation policy. In addition, hydrogen storage turns out to be suitable for storing VR energy over long periods of time. Finally, a sensitivity analysis of rechargeable battery cost shows that hydrogen storage is economically competitive with rechargeable batteries; the cost of both technologies should be more elaborately recognized for formulating effective energy policies to integrate massive VRs into the country's power system in an economical manner. - Highlights: • Authors analyze hydrogen storage coupled with VRs (variable renewables). • Simulation analysis is done by developing an optimal power generation mix model. • Hydrogen storage installation is promoted by its cost decline and CO2 regulation. • Hydrogen storage is suitable for storing VR energy over long periods of time. • Hydrogen storage is economically competitive with rechargeable battery

  13. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve on the traditional stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
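The dimensionality-reduction step can be sketched with plain (unsupervised) RBF kernel PCA, the building block of the supervised variant the abstract mentions. The function name, kernel width and synthetic two-cluster "moisture flux" data below are assumptions for illustration:

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=1.0):
    """Minimal RBF kernel PCA: project samples onto the leading
    eigenvectors of the double-centered kernel (Gram) matrix, giving
    the low-dimensional space in which events can be clustered."""
    X = np.asarray(X, dtype=float)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                      # RBF Gram matrix
    n = K.shape[0]
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    Kc = J @ K @ J                               # double centering
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]
    # Scale eigenvectors so each row is a projected coordinate.
    return vecs[:, :n_components] * np.sqrt(np.clip(vals[:n_components], 0.0, None))

# Two synthetic clusters of 5-dimensional "flux" vectors separate
# cleanly along the first kernel principal component.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 5)), rng.normal(3.0, 0.3, (20, 5))])
Z = rbf_kernel_pca(X, n_components=2, gamma=0.5)
```

Clustering in `Z` rather than in the raw flux field is what lets the study distinguish, for example, regionally recycled from teleconnected moisture.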

  14. Integration of crosswell seismic data for simulating porosity in a heterogeneous carbonate aquifer

    Science.gov (United States)

    Emery, Xavier; Parra, Jorge

    2013-11-01

    A challenge for the geostatistical simulation of subsurface properties in mining, petroleum and groundwater applications is the integration of well logs and seismic measurements, which can provide information on geological heterogeneities at a wide range of scales. This paper presents a case study conducted at the Port Mayaca aquifer, located in western Martin County, Florida, in which it is of interest to simulate porosity, based on porosity logs at two wells and high-resolution crosswell seismic measurements of P-wave impedance. To this end, porosity and impedance are transformed into cross-correlated Gaussian random fields, using local transformations. The model parameters (transformation functions, mean values and correlation structure of the transformed fields) are inferred and checked against the data. Multiple realizations of porosity can then be constructed conditionally to the impedance information in the interwell region, which allow identifying one low-porosity structure and two to three flow units that connect the two wells, mapping heterogeneities within these units and visually assessing fluid paths in the aquifer. In particular, the results suggest that the paths in the lower flow units, formed by a network of heterogeneous conduits, are not as smooth as in the upper flow unit.
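The core construction, two cross-correlated Gaussian random fields sharing one spatial correlation structure, can be sketched on a 1-D grid with a Cholesky factorization of the joint covariance. The exponential correlation model, correlation length and cross-correlation below are illustrative assumptions, not the fitted Port Mayaca parameters:

```python
import numpy as np

def correlated_fields(n, corr_len, rho, seed=0):
    """Sketch of two cross-correlated standard Gaussian random fields on
    a 1-D grid (intrinsic coregionalization: one exponential spatial
    structure shared by both variables, pointwise cross-correlation
    rho). A toy stand-in for the transformed impedance/porosity pair."""
    x = np.arange(n)
    R = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # spatial corr.
    B = np.array([[1.0, rho], [rho, 1.0]])                   # cross-corr.
    C = np.kron(B, R)                                        # joint covariance
    L = np.linalg.cholesky(C + 1e-10 * np.eye(2 * n))
    z = np.random.default_rng(seed).standard_normal(2 * n)
    y = L @ z
    return y[:n], y[n:]          # e.g. "impedance" and "porosity" fields

imp, por = correlated_fields(n=200, corr_len=10.0, rho=0.8)
r = np.corrcoef(imp, por)[0, 1]  # empirical cross-correlation, close to rho
```

Conditioning the porosity field on observed impedance (the study's actual step) amounts to the usual conditional-Gaussian update on this joint covariance.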

  15. Upscaling of permeability heterogeneities in reservoir rocks; an integrated approach

    NARCIS (Netherlands)

    Mikes, D.

    2002-01-01

    This thesis presents a hierarchical and geologically constrained deterministic approach to incorporate small-scale heterogeneities into reservoir flow simulators. We use a hierarchical structure to encompass all scales from laminae to an entire depositional system. For the geological models under

  16. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

Full Text Available A new package for accelerating large-scale phase-field simulations was developed by using GPU based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated in the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were solved with the algorithm running on GPU to test the performance of the package. Comparing the results of the solver executed on a single CPU with those on GPU, execution on the GPU was found to be up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
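The semi-implicit Fourier scheme such packages build on can be sketched for the Allen-Cahn equation in a few lines of NumPy, as a CPU stand-in for the CUDA kernels; the grid size, mobility and gradient coefficient below are illustrative:

```python
import numpy as np

def allen_cahn_step(u, dt, M=1.0, kappa=0.5):
    """One semi-implicit Fourier spectral step of the Allen-Cahn
    equation du/dt = -M*(u^3 - u - kappa*lap(u)) on a periodic grid
    of unit spacing. The stiff linear gradient term is treated
    implicitly in Fourier space, the cubic term explicitly -- the
    split that makes large time steps stable."""
    n = u.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)
    k2 = k[:, None] ** 2 + k[None, :] ** 2
    u_hat = np.fft.fft2(u)
    nonlin_hat = np.fft.fft2(u ** 3 - u)
    u_hat = (u_hat - dt * M * nonlin_hat) / (1.0 + dt * M * kappa * k2)
    return np.real(np.fft.ifft2(u_hat))

# A small random initial field separates into u = +1 / -1 domains.
rng = np.random.default_rng(2)
u = 0.1 * rng.standard_normal((64, 64))
for _ in range(200):
    u = allen_cahn_step(u, dt=0.1)
```

On a GPU the same update is a pair of FFTs plus pointwise arithmetic per step, which is why the method parallelizes so well.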

  17. Vedic division methodology for high-speed very large scale integration applications

    Directory of Open Access Journals (Sweden)

    Prabir Saha

    2014-02-01

Full Text Available Transistor level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The ‘Dhvajanka’ (on top of the flag) formula was adopted from Vedic mathematics to implement this type of divider for practical very large scale integration applications. The division methodology was implemented using half of the divisor bits instead of the actual divisor, subtraction and a little multiplication. Propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked and performance parameters like propagation delay and dynamic power consumption were calculated through Spice Spectre with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16)-bit divider circuitry was only ∼300 ns, and it consumed ∼32.5 mW power for a layout area of 17.39 mm^2. By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations was eliminated, yielding ∼47, ∼38 and ∼34% reductions in delay and ∼34, ∼21 and ∼18% reductions in power compared with the most widely used architectures (e.g. digit-recurrence, Newton–Raphson, Goldschmidt).

  18. Large-scale integration of optimal combinations of PV, wind and wave power into the electricity supply

    DEFF Research Database (Denmark)

    Lund, Henrik

    2006-01-01

This article presents the results of analyses of large-scale integration of wind power, photovoltaic (PV) and wave power into a Danish reference energy system. The possibility of integrating Renewable Energy Sources (RES) into the electricity supply is expressed in terms of the ability to avoid...... ancillary services are needed in order to secure the electricity supply system. The idea is to benefit from the different patterns in the fluctuations of different renewable sources, and the purpose is to identify optimal mixtures from a technical point of view. The optimal mixture seems to be when onshore...... wind power produces approximately 50% of the total electricity production from RES. Meanwhile, the mixture between PV and wave power seems to depend on the total amount of electricity production from RES. When the total RES input is below 20% of demand, PV should cover 40% and wave power only 10%. When...

  19. Accounting for Scale Heterogeneity in Healthcare-Related Discrete Choice Experiments when Comparing Stated Preferences: A Systematic Review.

    Science.gov (United States)

    Wright, Stuart J; Vass, Caroline M; Sim, Gene; Burton, Michael; Fiebig, Denzil G; Payne, Katherine

    2018-02-28

Scale heterogeneity, or differences in the error variance of choices, may account for a significant amount of the observed variation in the results of discrete choice experiments (DCEs) when comparing preferences between different groups of respondents. The aim of this study was to identify if, and how, scale heterogeneity has been addressed in healthcare DCEs that compare the preferences of different groups. A systematic review identified all healthcare DCEs published between 1990 and February 2016. The full text of each DCE was then screened to identify studies that compared preferences using data generated from multiple groups. Data were extracted and tabulated on year of publication, samples compared, tests for scale heterogeneity, and analytical methods to account for scale heterogeneity. Narrative analysis was used to describe if, and how, scale heterogeneity was accounted for when preferences were compared. A total of 626 healthcare DCEs were identified. Of these, 199 (32%) aimed to compare the preferences of different groups specified at the design stage, while 79 (13%) compared the preferences of groups identified at the analysis stage. Of the 278 included papers, 49 (18%) discussed potential scale issues, 18 (7%) used a formal method of analysis to account for scale between groups, and 2 (1%) accounted for scale differences between preference groups at the analysis stage. Scale heterogeneity was present in 65% (n = 13) of studies that tested for it. Analytical methods to test for scale heterogeneity included coefficient plots (n = 5, 2%), heteroscedastic conditional logit models (n = 6, 2%), Swait and Louviere tests (n = 4, 1%), generalised multinomial logit models (n = 5, 2%), and scale-adjusted latent class analysis (n = 2, 1%). Scale heterogeneity is a prevalent issue in healthcare DCEs. Despite this, few published DCEs have discussed such issues, and fewer still have used formal methods to identify and account for the impact of scale
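The scale confound is easy to demonstrate: in a multinomial logit, multiplying utilities by a scale parameter (inversely related to the error variance) changes choice shares even when preferences are identical. The utilities and scale values below are invented for illustration:

```python
import math

def choice_probs(utilities, scale=1.0):
    """Multinomial logit choice probabilities with an explicit scale
    parameter mu: P(i) = exp(mu*V_i) / sum_j exp(mu*V_j).
    Two respondent groups can share the same utilities V yet report
    different choice shares purely because mu differs -- the confound
    the review finds is rarely tested for."""
    e = [math.exp(scale * v) for v in utilities]
    s = sum(e)
    return [x / s for x in e]

v = [1.0, 0.5, 0.0]                 # identical preferences in both groups
sharp = choice_probs(v, scale=3.0)  # low error variance: decisive choices
noisy = choice_probs(v, scale=0.5)  # high error variance: flatter shares
```

Comparing `sharp` and `noisy` shows the best option appears far more popular in the low-variance group despite identical utilities, which is why between-group comparisons need a formal scale test.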

  20. Scaling net ecosystem production and net biome production over a heterogeneous region in the Western United States

    Science.gov (United States)

    D.P. Turner; W.D. Ritts; B.E. Law; W.B. Cohen; Z. Yan; T. Hudiburg; J.L. Campbell; M. Duane

    2007-01-01

Bottom-up scaling of net ecosystem production (NEP) and net biome production (NBP) was used to generate a carbon budget for a large heterogeneous region (the state of Oregon, 2.5×10^5 km^2) in the Western United States. Landsat resolution (30 m) remote sensing provided the basis for mapping land cover and disturbance history...

  1. Integrated biodosimetry in large scale radiological events. Opportunities for civil military co-operation

    International Nuclear Information System (INIS)

    Port, M.; Eder, S.F.; Lamkowski, A.; Majewski, M.; Abend, M.

    2016-01-01

Radiological events such as large-scale radiological or nuclear accidents or terrorist attacks with radionuclide dispersal devices require rapid and precise medical classification ("triage") and medical management of a large number of patients. Estimates of the absorbed dose, and in particular predictions of radiation-induced health effects, are mandatory for optimized allocation of limited medical resources and initiation of patient-centred treatment. Within the German Armed Forces Medical Service, the Bundeswehr Institute of Radiobiology offers a wide range of tools for medical management in different scenarios. The forward-deployable mobile Medical Task Force has access to state-of-the-art methodologies summarized into approaches such as physical dosimetry (including mobile gamma spectroscopy), clinical "dosimetry" (prodromi, H-Modul) and different means of biological dosimetry (e.g. dicentrics, high-throughput gene expression techniques, gamma-H2AX). The integration of these different approaches enables trained physicians of the Medical Task Force to assess individual health injuries and to evaluate prognosis, considering modern treatment options. To enhance the capacity of single institutions, networking has been recognized as an important emergency response strategy. The capabilities of physical, biological and clinical "dosimetry" approaches, spanning from low up to high radiation exposures, will be discussed. Furthermore, civil-military opportunities for combined efforts will be demonstrated.

  2. 1 million-Q optomechanical microdisk resonators for sensing with very large scale integration

    Science.gov (United States)

    Hermouet, M.; Sansa, M.; Banniard, L.; Fafin, A.; Gely, M.; Allain, P. E.; Santos, E. Gil; Favero, I.; Alava, T.; Jourdan, G.; Hentz, S.

    2018-02-01

Cavity optomechanics has become a promising route towards the development of ultrasensitive sensors for a wide range of applications including mass, chemical and biological sensing. In this study, we demonstrate the potential of Very Large Scale Integration (VLSI) with state-of-the-art low-loss silicon optomechanical microdisks for sensing applications. We report microdisks exhibiting optical Whispering Gallery Modes (WGM) with quality factors of 1 million, yielding high displacement sensitivity and strong coupling between optical WGMs and in-plane mechanical Radial Breathing Modes (RBM). Such high-Q microdisks, with mechanical resonance frequencies in the 10^2 MHz range, were fabricated on 200 mm wafers with Variable Shape Electron Beam lithography. Benefiting from ultrasensitive readout, their Brownian motion could be resolved with a good signal-to-noise ratio at ambient pressure, as well as in liquid, despite high-frequency operation and large fluidic damping: the mechanical quality factor dropped from a few 10^3 in air to a few tens in liquid, and the mechanical resonance frequency shifted down by a few percent. Proceeding one step further, we performed all-optical operation of the resonators in air using a pump-probe scheme. Our results show that our VLSI process is a viable approach for the next generation of sensors operating in vacuum, gas or liquid phase.

  3. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, 
Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. A limited number of studies have now also investigated molecular genetic candidate gene-environment interactions (G × E); however, thorough replication of findings remains rare, and G × E research still faces several conceptual and methodological challenges. In this article, we review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. Calculation of large scale relative permeabilities from stochastic properties of the permeability field and fluid properties

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R.; Thiele, M.R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    The paper describes the method and presents preliminary results for the calculation of homogenized relative permeabilities using stochastic properties of the permeability field. In heterogeneous media, the spreading of an injected fluid is mainly due to permeability heterogeneity and viscous fingering. At large scale, when the heterogeneous medium is replaced by a homogeneous one, a homogenized (or pseudo) relative permeability must be introduced to obtain the same spreading. Generally, the pseudo relative permeability is derived by using fine-grid numerical simulations (Kyte and Berry). However, this operation is time consuming and cannot be performed for all the meshes of the reservoir. We propose an alternative method which uses the information given by the stochastic properties of the field without any numerical simulation. The method is based on recent developments on homogenized transport equations (the "MHD" equation, Lenormand SPE 30797). The MHD equation accounts for the three basic mechanisms of spreading of the injected fluid: (1) dispersive spreading due to small-scale randomness, characterized by a macrodispersion coefficient D; (2) convective spreading due to large-scale heterogeneities (layers), characterized by a heterogeneity factor H; (3) viscous fingering, characterized by an apparent viscosity ratio M. In the paper, we first derive the parameters D and H as functions of the variance and correlation length of the permeability field. The results are shown to be in good agreement with fine-grid simulations. The pseudo relative permeabilities are then derived as functions of D, H and M. The main result is that this approach leads to time-dependent pseudo relative permeabilities. Finally, the calculated pseudo relative permeabilities are compared to the values derived by history matching using fine-grid numerical simulations.
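The abstract does not give the closed-form expressions for D and H, but in classical first-order stochastic theory the asymptotic longitudinal macrodispersivity scales as the log-permeability variance times its correlation (integral) scale. A hedged sketch of estimating these two statistics from a sampled log-permeability series (illustrative only; the paper's own D and H may differ):

```python
import numpy as np

def integral_scale(y, dx=1.0, max_lag=200):
    """Correlation (integral) scale: area under the normalized autocorrelation
    function, truncated at its first zero crossing."""
    y = np.asarray(y, float) - np.mean(y)
    var = np.mean(y * y)
    acf = np.array([np.mean(y[:len(y) - k] * y[k:])
                    for k in range(max_lag + 1)]) / var
    if np.any(acf <= 0):
        acf = acf[:np.argmax(acf <= 0)]        # keep the positive head only
    return float(np.sum(0.5 * (acf[:-1] + acf[1:])) * dx)   # trapezoid rule

def macrodispersivity(log_k, dx=1.0):
    """First-order asymptotic longitudinal macrodispersivity:
    alpha_L ~ sigma_Y^2 * lambda_Y, with Y = ln k."""
    y = np.asarray(log_k, float)
    return float(np.var(y) * integral_scale(y, dx=dx))
```

For an exponentially correlated field the integral scale equals the correlation length, so the estimate can be checked against synthetic AR(1) data.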

  5. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity.

    Science.gov (United States)

    Zhong, Qing; Rüschoff, Jan H; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J; Rupp, Niels J; Fankhauser, Christian; Buhmann, Joachim M; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C; Jochum, Wolfram; Wild, Peter J

    2016-04-07

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (FISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility.

  6. Monolithic Ge-on-Si lasers for large-scale electronic-photonic integration

    Science.gov (United States)

    Liu, Jifeng; Kimerling, Lionel C.; Michel, Jurgen

    2012-09-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic-photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review of the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate for the energy difference between the direct and indirect band gaps of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminating in the recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500-1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  7. Monolithic Ge-on-Si lasers for large-scale electronic–photonic integration

    International Nuclear Information System (INIS)

    Liu, Jifeng; Kimerling, Lionel C; Michel, Jurgen

    2012-01-01

    A silicon-based monolithic laser source has long been envisioned as a key enabling component for large-scale electronic–photonic integration in future generations of high-performance computation and communication systems. In this paper we present a comprehensive review of the development of monolithic Ge-on-Si lasers for this application. Starting with a historical review of light emission from the direct gap transition of Ge dating back to the 1960s, we focus on the rapid progress in band-engineered Ge-on-Si lasers in the past five years after a nearly 30-year gap in this research field. Ge has become an interesting candidate for active devices in Si photonics in the past decade due to its pseudo-direct gap behavior and compatibility with Si complementary metal oxide semiconductor (CMOS) processing. In 2007, we proposed combining tensile strain with n-type doping to compensate for the energy difference between the direct and indirect band gaps of Ge, thereby achieving net optical gain for CMOS-compatible diode lasers. Here we systematically present theoretical modeling, material growth methods, spontaneous emission, optical gain, and lasing under optical and electrical pumping from band-engineered Ge-on-Si, culminating in the recently demonstrated electrically pumped Ge-on-Si lasers with >1 mW output in the communication wavelength window of 1500–1700 nm. The broad gain spectrum enables on-chip wavelength division multiplexing. A unique feature of band-engineered pseudo-direct gap Ge light emitters is that the emission intensity increases with temperature, exactly opposite to conventional direct gap semiconductor light-emitting devices. This extraordinary thermal anti-quenching behavior greatly facilitates monolithic integration on Si microchips where temperatures can reach up to 80 °C during operation. The same band-engineering approach can be extended to other pseudo-direct gap semiconductors, allowing us to achieve efficient light emission at wavelengths previously

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement (''the Task'') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  9. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is investment decision, while the second phase is production...... optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...
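The MOPSO mentioned above maintains an archive of nondominated (Pareto-optimal) candidate plans, trading off objectives such as investment cost against operation cost. The dominance test at its core can be sketched as follows (a generic minimization formulation, not the paper's specific implementation):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Nondominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

Note that `dominates(p, p)` is False (no strict improvement), so identical points need no special casing.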

  10. Multiphase flow towards coupled solid-liquid interactions in 2D heterogeneous porous micromodels: a fluorescent microscopy and micro-PIV measurement at pore scale

    Science.gov (United States)

    Li, Yaofa; Kazemifar, Farzan; Blois, Gianluca; Christensen, Kenneth; Kenneth Christensen, Notre Dame Team

    2017-11-01

    Multiphase flow in porous media is relevant to a range of applications in the energy and environmental sectors. Recently, interest has been renewed by geological storage of CO2 within saline aquifers. Central to this goal are the pre-injection assessment of candidate sites and the prediction of post-injection CO2 migration. Moreover, local pressure buildup may cause micro-seismic events, which could prove disastrous and possibly compromise seal integrity. Evidence shows that such large-scale events are coupled with pore-scale phenomena, necessitating an understanding of pore-scale stress, strain, and flow processes and their representation in large-scale modeling. To this end, the pore-scale flow of water and supercritical CO2 is investigated under reservoir-relevant conditions over a range of wettability conditions in 2D heterogeneous micromodels that reflect the complexity of real sandstone. High-speed fluorescent microscopy, complemented by a fast differential pressure transmitter, allows for simultaneous measurement of the flow field within, and the instantaneous pressure drop across, the micromodels. A flexible micromodel is also designed to be used in conjunction with the micro-PIV technique, enabling the quantification of coupled solid-liquid interactions. This work was supported as part of the GSCO2, an EFRC funded by the US DOE, Office of Science, and partially supported by WPI-I2CNER.

  11. FOREWORD: Heterogenous nucleation and microstructure formation—a scale- and system-bridging approach Heterogenous nucleation and microstructure formation—a scale- and system-bridging approach

    Science.gov (United States)

    Emmerich, H.

    2009-11-01

    Scope and aim of this volume. Nucleation and initial microstructure formation play an important role in almost all aspects of materials science [1-5]. The relevance of the prediction and control of nucleation and the subsequent microstructure formation is fully accepted across many areas of modern surface and materials science and technology. One reason is that a large range of material properties, from mechanical ones such as ductility and hardness to electrical and magnetic ones such as electric conductivity and magnetic hardness, depend largely on the specific crystalline structure that forms during nucleation and the subsequent initial microstructure growth. A demonstrative example of the latter is the so-called bamboo structure of an integrated circuit interconnect: for resistance against electromigration [6], an alignment of grain boundaries perpendicular to the direction of current flow is most favorable. Despite the large relevance of predicting and controlling nucleation and the subsequent microstructure formation, and despite significant progress in the experimental analysis of the later stages of crystal growth in line with new theoretical computer simulation concepts [7], details about the initial stages of solidification are still far from being satisfactorily understood. This is particularly true when the nucleation event occurs as heterogenous nucleation. The Priority Program SPP 1296 'Heterogenous Nucleation and Microstructure Formation—a Scale- and System-Bridging Approach' [8], sponsored by the German Research Foundation, DFG, intends to contribute to this open issue via a six-year research program that enables approximately twenty research groups in Germany to work together across disciplines towards this goal. Moreover, it enables the participants to embed themselves in the international community focused on this issue via internationally open joint workshops, conferences and summer schools. An outline of such activities can be found

  12. Scaling impacts on environmental controls and spatial heterogeneity of soil organic carbon stocks

    Science.gov (United States)

    Mishra, U.; Riley, W. J.

    2015-07-01

    The spatial heterogeneity of land surfaces affects energy, moisture, and greenhouse gas exchanges with the atmosphere. However, representing the heterogeneity of terrestrial hydrological and biogeochemical processes in Earth system models (ESMs) remains a critical scientific challenge. We report the impact of spatial scaling on environmental controls, spatial structure, and statistical properties of soil organic carbon (SOC) stocks across the US state of Alaska. We used soil profile observations and environmental factors such as topography, climate, land cover types, and surficial geology to predict the SOC stocks at a 50 m spatial scale. These spatially heterogeneous estimates provide a data set with reasonable fidelity to the observations at a sufficiently high resolution to examine the environmental controls on the spatial structure of SOC stocks. We upscaled both the predicted SOC stocks and environmental variables from finer to coarser spatial scales (s = 100, 200, and 500 m and 1, 2, 5, and 10 km) and generated various statistical properties of SOC stock estimates. We found different environmental factors to be statistically significant predictors at different spatial scales. Only elevation, temperature, potential evapotranspiration, and scrub land cover types were significant predictors at all scales. The strengths of control (the median value of geographically weighted regression coefficients) of these four environmental variables on SOC stocks decreased with increasing scale and were accurately represented using mathematical functions (R2 = 0.83-0.97). The spatial structure of SOC stocks across Alaska changed with spatial scale. Although the variance (sill) and unstructured variability (nugget) of the calculated variograms of SOC stocks decreased exponentially with scale, the correlation length (range) remained relatively constant across scale. 
The variance of predicted SOC stocks decreased with spatial scale over the range of 50 m to ~ 500 m, and remained
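The sill, nugget and range statistics in this record come from empirical semivariograms of the SOC stock estimates. A minimal sketch of computing such a variogram from scattered observations (generic; the study's geographically weighted regression step is not reproduced):

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Isotropic empirical semivariogram:
    gamma(h) = 0.5 * mean[(z_i - z_j)^2] over point pairs whose separation
    falls in each lag bin. Returns (gamma per bin, bin centers); empty bins
    give NaN."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    dist, gam = d[iu], g[iu]
    idx = np.digitize(dist, bin_edges)
    gamma = np.array([gam[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(1, len(bin_edges))])
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    return gamma, centers
```

For spatially uncorrelated data the semivariance is flat at the field variance (pure nugget); the sill, nugget and range are then read off (or fitted) from the curve.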

  13. Large scale integration of flexible non-volatile, re-addressable memories using P(VDF-TrFE) and amorphous oxide transistors

    International Nuclear Information System (INIS)

    Gelinck, Gerwin H; Cobb, Brian; Van Breemen, Albert J J M; Myny, Kris

    2015-01-01

    Ferroelectric polymers and amorphous metal oxide semiconductors have emerged as important materials for re-programmable non-volatile memories and high-performance, flexible thin-film transistors, respectively. However, realizing sophisticated transistor memory arrays has proven to be a challenge, and reliable writing to and reading from such large-scale memories had thus far not been demonstrated. Here, we report the integration of ferroelectric P(VDF-TrFE) transistor memory arrays with thin-film circuitry that can address each individual memory element in the array. n-type indium gallium zinc oxide is used as the active channel material in both the memory and logic thin-film transistors. The maximum process temperature is 200 °C, allowing plastic films to be used as the substrate material. The technology was scaled up to 150 mm wafer size, and offers good reproducibility, high device yield and low device variation. This forms the basis for the successful demonstration of memory arrays, read and write circuitry, and their integration. (paper)

  14. Innovation-driven efficient development of the Longwangmiao Fm large-scale sulfur gas reservoir in Moxi block, Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Xinhua Ma

    2016-03-01

    Full Text Available The Lower Cambrian Longwangmiao Fm gas reservoir in the Moxi block of the Anyue Gas field, Sichuan Basin, is the largest single-sandbody integrated carbonate gas reservoir proved so far in China. Notwithstanding this reservoir's advantages, such as large-scale reserves and high single-well productivity, multiple complicated factors restrict its efficient development: a medium content of hydrogen sulfide, low porosity and strong heterogeneity of the fracture–cave formation, various modes of gas–water occurrence, and a close relation between overpressure and stress sensitivity. Since only a few Cambrian large-scale carbonate gas reservoirs have ever been developed in the world, blind spots remain, especially concerning exploration and production rules. Besides, for large-scale sulfur gas reservoirs, exploration and construction are costly and production testing in the early evaluation stage is severely limited, all of which brings great challenges and high potential risks to productivity construction. In this regard, in line with China's strategic demand for strengthening clean energy supply security, the PetroChina Southwest Oil & Gas Field Company has carried out research and field tests aimed at delivering high-production wells, optimizing development design, rapidly constructing high-quality productivity and upgrading HSE security in the Longwangmiao Fm gas reservoir in the Moxi block. Through innovations in technology and management mode within 3 years, this gas reservoir has been built into a modern large-scale gas field with high quality, high efficiency and high benefit, and its annual capacity is now over 100 × 10^8 m3, with production capacity and development indexes as originally anticipated. It has become a new model of efficient development of large-scale gas reservoirs, providing a reference for other types of gas reservoirs in China.

  15. Microenvironmental Heterogeneity Parallels Breast Cancer Progression: A Histology-Genomic Integration Analysis.

    Directory of Open Access Journals (Sweden)

    Rachael Natrajan

    2016-02-01

    Full Text Available The intra-tumor diversity of cancer cells is under intense investigation; however, little is known about the heterogeneity of the tumor microenvironment that is key to cancer progression and evolution. We aimed to assess the degree of microenvironmental heterogeneity in breast cancer and correlate this with genomic and clinical parameters. We developed a quantitative measure of microenvironmental heterogeneity along three spatial dimensions (3-D) in solid tumors, termed the tumor ecosystem diversity index (EDI), using fully automated histology image analysis coupled with statistical measures commonly used in ecology. This measure was compared with disease-specific survival, key mutations, genome-wide copy number, and expression profiling data in a retrospective study of 510 breast cancer patients as a test set and 516 breast cancer patients as an independent validation set. In high-grade (grade 3) breast cancers, we uncovered a striking link between high microenvironmental heterogeneity measured by EDI and a poor prognosis that cannot be explained by tumor size, genomics, or any other data types. However, this association was not observed in low-grade (grade 1 and 2) breast cancers. The prognostic value of EDI was superior to known prognostic factors and was enhanced with the addition of TP53 mutation status (multivariate analysis test set, p = 9 × 10^-4, hazard ratio = 1.47, 95% CI 1.17-1.84; validation set, p = 0.0011, hazard ratio = 1.78, 95% CI 1.26-2.52). Integration with genome-wide profiling data identified losses of specific genes on 4p14 and 5q13 that were enriched in grade 3 tumors with high microenvironmental diversity and that also substratified patients into poor prognostic groups. Limitations of this study include the number of cell types included in the model, the fact that EDI has prognostic value only in grade 3 tumors, and that our spatial heterogeneity measure was dependent on spatial scale and tumor size. To our knowledge, this is the first
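The exact EDI formula is not given in the abstract, but ecological diversity measures of this kind are typically Shannon-type indices computed over the cell-type composition of spatial regions. A hedged sketch of that idea (the grid-quadrat scheme and the choice of the Shannon index are illustrative assumptions, not the paper's definition):

```python
import numpy as np
from collections import Counter

def shannon(counts):
    """Shannon diversity index H = -sum(p * ln p) over category proportions."""
    p = np.array(list(counts.values()), dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def ecosystem_diversity(cell_xy, cell_types, grid=10):
    """Mean Shannon diversity of cell-type composition over spatial quadrats:
    a crude stand-in for a tumor 'ecosystem diversity index'."""
    xy = np.asarray(cell_xy, float)
    span = np.ptp(xy, axis=0) + 1e-9                 # avoid division by zero
    q = np.floor((xy - xy.min(axis=0)) / span * grid).astype(int)
    types = np.asarray(cell_types)
    scores = [shannon(Counter(types[(q == cell).all(axis=1)]))
              for cell in set(map(tuple, q))]
    return float(np.mean(scores))
```

A tumor whose quadrats each contain a single cell type scores 0; a well-mixed two-type tumor approaches ln 2 per quadrat. Note the last limitation in the abstract: such a measure depends on the quadrat scale chosen.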

  16. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from intensive computational requirement for detailed modeling investigations of real-world reservoirs. This paper presents the application of a massive parallel-computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance
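A basic building block of such a distributed-memory simulation is partitioning the million-plus gridblocks among processors. Production codes typically use graph partitioners that minimize inter-domain connections (e.g. METIS); a minimal contiguous partition that only illustrates the load-balancing idea is:

```python
def partition(n_blocks, n_procs):
    """Contiguous gridblock partition: every rank gets n_blocks // n_procs
    blocks, and the first n_blocks % n_procs ranks get one extra."""
    base, extra = divmod(n_blocks, n_procs)
    parts, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)
        parts.append(range(start, start + size))
        start += size
    return parts
```

Each rank then assembles and solves only its rows of the (here, 3,226,566-equation) Jacobian per Newton iteration, exchanging ghost-cell values with neighbors.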

  17. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several
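The core of a multisite weather generator is drawing daily fields that honor a target inter-site correlation matrix. A minimal sketch using a Cholesky factor (illustrative only; the actual generator must also reproduce temporal autocorrelation, seasonality and non-Gaussian marginals such as rainfall intermittency):

```python
import numpy as np

def correlated_daily_fields(corr, n_days, rng):
    """Daily standard-normal anomalies at several sites whose sample
    correlation matches the target matrix 'corr' (via its Cholesky factor):
    if z ~ N(0, I) and corr = L L^T, then L z has covariance corr."""
    corr = np.asarray(corr, float)
    L = np.linalg.cholesky(corr)
    return rng.standard_normal((n_days, len(corr))) @ L.T
```

The Gaussian anomalies would then be transformed site by site to the observed marginal distributions (e.g. by quantile mapping) before driving the catchment models.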

  18. Silicon hybrid integration

    International Nuclear Information System (INIS)

    Li Xianyao; Yuan Taonu; Shao Shiqian; Shi Zujun; Wang Yi; Yu Yude; Yu Jinzhong

    2011-01-01

    Recently, much attention has been focused on silicon-based photonic integrated circuits (PICs), which provide a cost-effective solution for high-speed, wide-bandwidth optical interconnection and optical communication. To integrate III-V compound and germanium semiconductors on silicon substrates, two kinds of manufacturing methods are currently used, i.e., heteroepitaxy and bonding. Low-temperature wafer bonding, which can overcome the high growth temperature, lattice mismatch and incompatibility of thermal expansion coefficients encountered in heteroepitaxy, has opened the possibility of large-scale heterogeneous integration. In this paper, several commonly used bonding methods are reviewed, and future trends in low-temperature wafer bonding are envisaged. (authors)

  19. Spatial Heterogeneity, Scale, Data Character and Sustainable Transport in the Big Data Era

    Science.gov (United States)

    Jiang, Bin

    2018-04-01

    In light of the emergence of big data, I have advocated and argued for a paradigm shift from Tobler's law to scaling law, from Euclidean geometry to fractal geometry, from Gaussian statistics to Paretian statistics, and - more importantly - from Descartes' mechanistic thinking to Alexander's organic thinking. Fractal geometry falls under the third definition of fractal - that is, a set or pattern is fractal if the scaling of far more small things than large ones recurs multiple times (Jiang and Yin 2014) - rather than under the second definition of fractal, which requires a power law between scales and details (Mandelbrot 1982). The new fractal geometry is more towards living geometry that "follows the rules, constraints, and contingent conditions that are, inevitably, encountered in the real world" (Alexander et al. 2012, p. 395), not only for understanding complexity, but also for creating complex or living structure (Alexander 2002-2005). This editorial attempts to clarify why the paradigm shift is essential and to elaborate on several concepts, including spatial heterogeneity (scaling law), scale (or the fourth meaning of scale), data character (in contrast to data quality), and sustainable transport in the big data era.
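The third definition of fractal invoked above ("far more small things than large ones, recurring multiple times") is commonly operationalized by head/tail breaks: recursively split a heavy-tailed data set at its mean for as long as the head stays a minority. A minimal sketch (the 40% head limit is a commonly used default and an assumption here):

```python
def head_tail_breaks(values, head_limit=0.4):
    """Head/tail breaks: split at the mean while the head (values above the
    mean) remains a minority. The number of splits plus one is the ht-index,
    i.e. how many times 'far more small things than large ones' recurs."""
    breaks, vals = [], list(values)
    while len(vals) > 1:
        m = sum(vals) / len(vals)
        head = [v for v in vals if v > m]
        if not head or len(head) / len(vals) > head_limit:
            break                     # head no longer a clear minority
        breaks.append(m)
        vals = head
    return breaks
```

A uniform data set yields no breaks (ht-index 1), while a power-law-like one yields several, which is the scaling-law signature the editorial contrasts with Gaussian statistics.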

  20. 'Take the long way down': Integration of large-scale North Sea wind using HVDC transmission

    International Nuclear Information System (INIS)

    Weigt, Hannes; Jeske, Till; Leuthold, Florian; Hirschhausen, Christian von

    2010-01-01

    We analyze the impact of extensive wind development in Germany for the year 2015, focusing on grid extensions and price signals. We apply the electricity generation and network model ELMOD to compare zonal, nodal, and uniform pricing approaches. In addition to a reference case of network extensions recommended by the German Energy Agency (Dena), we develop a scenario to transmit wind energy to major load centers in Western and Southern Germany via high-voltage direct current (HVDC) connections. From an economic-engineering standpoint, our results indicate that these connections are the most economic way to manage the integration of large-scale offshore wind resources, and that nodal pricing is most likely to determine the locales for future investment to eliminate congestion. We conclude with a description of the model's potential limitations.
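The contrast between uniform and nodal pricing that the study turns on can be seen in a two-node toy example (hypothetical costs and capacities; this is not the ELMOD model): with a congested line, the wind-rich node stays cheap while the load center pays the local marginal cost.

```python
def nodal_prices(demand_a, demand_b, cheap_cost, exp_cost, cheap_cap, line_cap):
    """Two-node toy: a cheap generator at node A, an expensive one at node B,
    one line A->B with limited capacity. The marginal cost of serving one
    extra MW at each node gives the nodal (locational) price."""
    # A serves itself from the cheap plant and exports to B within limits
    export = min(line_cap, cheap_cap - demand_a, demand_b)
    price_a = cheap_cost  # extra demand at A is met by the cheap plant
    # extra demand at B: cheap import if line and plant have headroom,
    # otherwise the local expensive plant sets the price
    headroom = export < line_cap and export < cheap_cap - demand_a
    price_b = cheap_cost if headroom else exp_cost
    return price_a, price_b

# Congested 30 MW line: nodal prices diverge
print(nodal_prices(demand_a=50, demand_b=100, cheap_cost=20,
                   exp_cost=60, cheap_cap=200, line_cap=30))   # → (20, 60)
```

Under uniform pricing both nodes would see one price and the congestion signal at node B would be lost, which is the argument for nodal pricing guiding investment.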

  1. Heterogeneous porous media permeability field characterization from fluid displacement data; Integration de donnees de deplacements de fluides dans la caracterisation de milieux poreux heterogenes

    Energy Technology Data Exchange (ETDEWEB)

    Kretz, V.

    2002-11-01

    The prediction of oil recovery or pollutant dispersion requires an accurate knowledge of the permeability field distribution. Available data are usually measurements in well bores and, in recent years, 4D-seismic data (seismic mappings repeated in time). Such measurements make it possible to track the evolution of fluid displacement fronts. The purpose of the thesis is to evaluate the possibility of determining permeability fields from fluid displacement measurements in heterogeneous porous media. At the laboratory scale, experimental studies are performed on a physical model and through numerical simulations. The system uses blocks of granular materials whose individual geometries and permeabilities are controlled. The fluid displacements are detected acoustically. The key parameters of the study are the size and spatial correlation of the permeability heterogeneity distribution, and the influence of viscosity and gravity contrasts between the injected and displaced fluids. The inverse problem - evaluating the permeability field from the evolution of concentration fronts - is then approached. At the reservoir scale, the work focuses mainly on the integration of 4D-seismic data into inversion programs on a 3D synthetic case. Particular attention is given to the calculation of gradients, in order to obtain complementary information about the sensitivity of the data. The information provided by 4D-seismic data consists of maps showing the vertical average of oil saturation or the presence of gas. The purpose is to integrate this qualitative information into the inversion process and to evaluate its impact on reservoir characterization. Comparative studies - with and without 4D-seismic data - are carried out on a synthetic case. (author)

  2. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the creation of such large-scale virtual environments still remains time-consuming, largely manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  3. Micro-scale heterogeneity in water temperature | Dallas | Water SA

    African Journals Online (AJOL)

    Micro-scale heterogeneity in water temperature was examined in 6 upland sites in the Western Cape, South Africa. Hourly water temperature data converted to daily data showed that greatest differences were apparent in daily maximum temperatures between shallow- and deep-water biotopes during the warmest period of ...

  4. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS Framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface, or can be accessed directly through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring
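The WMS layers mentioned above are served through standard OGC GetMap requests. A sketch of how such a request URL is assembled (the endpoint and layer name below are hypothetical, not this system's actual service):

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, width=800, height=600,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL using the standard OGC
    query parameters. `base` and `layer` are placeholders."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "t2m_mean",
                     (40.0, 60.0, 55.0, 90.0))
print(url)
```

Any WMS-capable client (OpenLayers included) issues requests of exactly this shape, which is what allows heterogeneous backends to be combined in a single map.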

  5. Identifying and quantifying heterogeneity in high content analysis: application of heterogeneity indices to drug discovery.

    Directory of Open Access Journals (Sweden)

    Albert H Gough

    Full Text Available One of the greatest challenges in biomedical research, drug discovery and diagnostics is understanding how seemingly identical cells can respond differently to perturbagens including drugs for disease treatment. Although heterogeneity has become an accepted characteristic of a population of cells, in drug discovery it is not routinely evaluated or reported. The standard practice for cell-based, high content assays has been to assume a normal distribution and to report a well-to-well average value with a standard deviation. To address this important issue we sought to define a method that could be readily implemented to identify, quantify and characterize heterogeneity in cellular and small organism assays to guide decisions during drug discovery and experimental cell/tissue profiling. Our study revealed that heterogeneity can be effectively identified and quantified with three indices that indicate diversity, non-normality and percent outliers. The indices were evaluated using the induction and inhibition of STAT3 activation in five cell lines where the systems response including sample preparation and instrument performance were well characterized and controlled. These heterogeneity indices provide a standardized method that can easily be integrated into small and large scale screening or profiling projects to guide interpretation of the biology, as well as the development of therapeutics and diagnostics. Understanding the heterogeneity in the response to perturbagens will become a critical factor in designing strategies for the development of therapeutics including targeted polypharmacology.
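The paper's exact index definitions are not reproduced here; the following stand-ins (interquartile spread for diversity, excess kurtosis for non-normality, and the 1.5 × IQR rule for percent outliers) illustrate the idea of summarizing a well's cell-level distribution with three numbers:

```python
import statistics

def heterogeneity_indices(values):
    """Three illustrative heterogeneity indices (stand-ins for the paper's
    diversity, non-normality and percent-outlier indices)."""
    q1, q2, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    # diversity: spread of the bulk of the distribution, normalized
    diversity = iqr / (abs(q2) + iqr) if (abs(q2) + iqr) else 0.0
    mu, sd = statistics.fmean(values), statistics.pstdev(values)
    # non-normality: |excess kurtosis|, 0 for a Gaussian, large for
    # bimodal or heavy-tailed per-cell responses
    non_normality = abs(sum(((v - mu) / sd) ** 4 for v in values)
                        / len(values) - 3) if sd else 0.0
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    pct_outliers = 100 * sum(1 for v in values if v < lo or v > hi) / len(values)
    return diversity, non_normality, pct_outliers

# A bimodal 'responder / non-responder' well versus a uniform one
print(heterogeneity_indices([0.0] * 50 + [10.0] * 50))
print(heterogeneity_indices([5.0] * 100))
```

The point of such indices is that a well-level mean and standard deviation would report the bimodal and uniform wells as similar, while the indices separate them.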

  6. Impacts of physical and chemical aquifer heterogeneity on basin-scale solute transport: Vulnerability of deep groundwater to arsenic contamination in Bangladesh

    Science.gov (United States)

    Michael, Holly A.; Khan, Mahfuzur R.

    2016-12-01

    Aquifer heterogeneity presents a primary challenge in predicting the movement of solutes in groundwater systems. The problem is particularly difficult on very large scales, across which permeability, chemical properties, and pumping rates may vary by many orders of magnitude and data are often sparse. An example is the fluvio-deltaic aquifer system of Bangladesh, where naturally-occurring arsenic (As) exists over tens of thousands of square kilometers in shallow groundwater. Millions of people in As-affected regions rely on deep (≥150 m) groundwater as a safe source of drinking water. The sustainability of this resource has been evaluated with models using effective properties appropriate for a basin-scale contamination problem, but the extent to which preferential flow affects the timescale of downward migration of As-contaminated shallow groundwater is unknown. Here we embed detailed, heterogeneous representations of hydraulic conductivity (K), pumping rates, and sorptive properties (Kd) within a basin-scale numerical groundwater flow and solute transport model to evaluate their effects on vulnerability and deviations from simulations with homogeneous representations in two areas with different flow systems. Advective particle tracking shows that heterogeneity in K does not affect average travel times from shallow zones to 150 m depth, but the travel times of the fastest 10% of particles decreases by a factor of ∼2. Pumping distributions do not strongly affect travel times if irrigation remains shallow, but increases in the deep pumping rate substantially reduce travel times. Simulation of advective-dispersive transport with sorption shows that deep groundwater is protected from contamination over a sustainable timeframe (>1000 y) if the spatial distribution of Kd is uniform. However, if only low-K sediments sorb As, 30% of the aquifer is not protected. Results indicate that sustainable management strategies in the Bengal Basin should consider impacts of both
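The headline result (heterogeneity leaving the mean travel time unchanged while the fastest decile arrives roughly twice as fast) can be reproduced qualitatively with synthetic lognormal travel times. The parameters below are chosen for illustration only, not fitted to the Bengal Basin model:

```python
import random, statistics

random.seed(1)
# mus chosen so both ensembles share the same mean, exp(mu + sigma^2 / 2):
# a wider sigma (more 'preferential flow') with a compensating lower mu
homog = [random.lognormvariate(5.00, 0.2) for _ in range(10000)]
heterog = [random.lognormvariate(4.70, 0.8) for _ in range(10000)]

p10 = lambda xs: statistics.quantiles(xs, n=10)[0]  # fastest-decile threshold
print(round(statistics.fmean(homog)), round(statistics.fmean(heterog)))
print(round(p10(homog)), round(p10(heterog)))
```

The means agree, but the 10th-percentile travel time of the heterogeneous ensemble is far shorter, mirroring why average-based vulnerability assessments can understate early breakthrough.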

  7. Considerations regarding system engineering in large scale projects with heterogeneous contexts

    Science.gov (United States)

    Cremonini, A.; Caiazzo, M.; Hayden, D.; Labate, M. G.; Oulgin, R.; Santander-Vela, J.

    2016-08-01

    In this paper we would like to share some considerations and lessons learned based on our direct experience in systems engineering on the SKA project, with emphasis on the personal experience of the first author. This is a very wide and ambitious program, which involves several stakeholders with an unprecedented level of heterogeneity in cultural backgrounds, technological heritage, multidisciplinary interplay, motivations and competences. The role of the leading author is to amalgamate efforts in order to deliver the "MID telescope", and in that role he has often discovered that systems engineering means far more than a disciplined set of processes.

  8. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and for Pm = 0.2 on 10243 grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to small-scale magnetic field. The peak of these energy transfers move towards lower wavenumbers as dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from large-scale velocity field to large-scale magnetic field. We observe forward U2U and B2B energy flux, similar to SSD.
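The shell decomposition underlying such diagnostics can be illustrated on a 1-D signal: take the spectrum, sum spectral energy over wavenumber shells, and see where injected energy lands. This is a naive DFT sketch of the binning step only, not the MHD transfer-function analysis itself:

```python
import cmath, math

def shell_energies(u, shell_width=2):
    """Naive DFT of a 1-D signal with spectral energy summed over
    wavenumber shells of the given width (a toy analogue of the shell
    decompositions used in turbulence energy-transfer studies)."""
    n = len(u)
    spec = []
    for k in range(n // 2 + 1):
        uk = sum(u[j] * cmath.exp(-2j * math.pi * k * j / n)
                 for j in range(n)) / n
        spec.append(abs(uk) ** 2)
    shells = {}
    for k, e in enumerate(spec):
        shells[k // shell_width] = shells.get(k // shell_width, 0.0) + e
    return shells

# Energy injected at wavenumber k = 5 shows up in shell 5 // 2 = 2
u = [math.cos(2 * math.pi * 5 * j / 64) for j in range(64)]
shells = shell_energies(u)
print(max(shells, key=shells.get))   # → 2
```

Tracking how such shell energies change in time, and attributing the change to pairs of shells, is what the shell-to-shell transfer analysis in the abstract does for the full 3-D velocity and magnetic fields.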

  9. Delta-Connected Cascaded H-Bridge Multilevel Converters for Large-Scale Photovoltaic Grid Integration

    DEFF Research Database (Denmark)

    Yu, Yifan; Konstantinou, Georgios; Townsend, Christopher D.

    2017-01-01

    The cascaded H-bridge (CHB) converter is becoming a promising candidate for use in next generation large-scale photovoltaic (PV) power plants. However, solar power generation in the three converter phase-legs can be significantly unbalanced, especially in a large geographically-dispersed plant....... The power imbalance between the three phases defines a limit for the injection of balanced three-phase currents to the grid. This paper quantifies the performance of, and experimentally confirms, the recently proposed delta-connected CHB converter for PV applications as an alternative configuration...... for large-scale PV power plants. The required voltage and current overrating for the converter is analytically developed and compared against the star-connected counterpart. It is shown that the delta-connected CHB converter extends the balancing capabilities of the star-connected CHB and can accommodate...

  10. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Science.gov (United States)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years, as well as the advantages it offers, through a specific application. The presented case study illustrates how the high production slopes of a mine exceeding depths of 100-120 m were successfully mined at average displacement rates of 10-20 mm/day, approaching a slow to moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series data correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.

  11. The integration of novel diagnostics techniques for multi-scale monitoring of large civil infrastructures

    Directory of Open Access Journals (Sweden)

    F. Soldovieri

    2008-11-01

    Full Text Available In recent years, structural monitoring of large infrastructures (buildings, dams, bridges or, more generally, man-made structures) has attracted increasing attention due to the growing interest in safety and security issues and in risk assessment through early detection. In this framework, the aim of the paper is to introduce a new integrated approach which combines two sensing techniques acting on different spatial and temporal scales. The first is a distributed optical fiber sensor based on the Brillouin scattering phenomenon, which allows spatially and temporally continuous monitoring of the structure with a "low" spatial resolution (of the order of a meter). The second technique is based on the use of Ground Penetrating Radar (GPR), which can provide detailed images of the inner status of the structure (with a spatial resolution of less than tens of centimetres), but does not allow continuous temporal monitoring. The paper describes the features of these two techniques and provides experimental results from preliminary test cases.

  12. Algorithmic Foundation of Spectral Rarefaction for Measuring Satellite Imagery Heterogeneity at Multiple Spatial Scales

    Science.gov (United States)

    Rocchini, Duccio

    2009-01-01

    Measuring heterogeneity in satellite imagery is an important task. Most measures of spectral diversity have been based on Shannon information theory. However, this approach does not inherently address different scales, ranging from local (hereafter referred to as alpha diversity) to global scales (gamma diversity). The aim of this paper is to propose a method for measuring spectral heterogeneity at multiple scales based on rarefaction curves. An algorithmic solution of rarefaction applied to image pixel values (Digital Numbers, DNs) is provided and discussed. PMID:22389600
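Rarefaction applied to pixel values can be sketched directly from the classical Hurlbert formulation, treating distinct DNs as "species". The toy patches below are invented for illustration, not taken from the paper:

```python
from math import comb
from collections import Counter

def rarefaction(dns, n):
    """Expected number of distinct digital numbers (DNs) in a random
    subsample of n pixels (the classical Hurlbert/Sanders estimator):
    E[S_n] = sum_i (1 - C(N - N_i, n) / C(N, n))."""
    counts = Counter(dns)
    total = sum(counts.values())
    return sum(1 - comb(total - c, n) / comb(total, n)
               for c in counts.values())

# Toy patches: an even DN mix accumulates diversity faster than a skewed one
even = [1] * 25 + [2] * 25 + [3] * 25 + [4] * 25
skewed = [1] * 85 + [2] * 5 + [3] * 5 + [4] * 5
print(round(rarefaction(even, 10), 2), round(rarefaction(skewed, 10), 2))
```

Evaluating the curve per image tile gives an alpha-like diversity, and over the mosaicked image a gamma-like diversity, which is the multi-scale reading the paper advocates.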

  13. Dynamic heterogeneity: a framework to promote ecological integration and hypothesis generation in urban systems

    Science.gov (United States)

    S. T. A. Pickett; M. L. Cadenasso; E. J. Rosi-Marshall; Ken Belt; P. M. Groffman; Morgan Grove; E. G. Irwin; S. S. Kaushal; S. L. LaDeau; C. H. Nilon; C. M. Swan; P. S. Warren

    2016-01-01

    Urban areas are understood to be extraordinarily spatially heterogeneous. Spatial heterogeneity, and its causes, consequences, and changes, are central to ecological science. The social sciences and urban design and planning professions also include spatial heterogeneity as a key concern. However, urban ecology, as a pursuit that integrates across these disciplines,...

  14. Scale dependent drivers of wild bee diversity in tropical heterogeneous agricultural landscapes.

    Science.gov (United States)

    Basu, Parthiba; Parui, Arpan Kumar; Chatterjee, Soumik; Dutta, Aditi; Chakraborty, Pushan; Roberts, Stuart; Smith, Barbara

    2016-10-01

    Factors associated with agricultural intensification, for example the loss of seminatural vegetation and pesticide use, have been shown to adversely affect the bee community. These factors may impact the bee community differently at different landscape scales. The scale dependency is expected to be more pronounced in heterogeneous landscapes. However, the scale-dependent response of the bee community to drivers of its decline is relatively understudied, especially in the tropics where the agricultural landscape is often heterogeneous. This study looked at the effects of agricultural intensification on bee diversity at patch and landscape scales in a tropical agricultural landscape. Wild bees were sampled using 12 permanent pan trap stations. Patch and landscape characteristics were measured within a 100 m (patch scale) and a 500 m (landscape scale) radius of pan trap stations. Information on pesticide input was obtained from farmer surveys. Data on vegetation cover, productivity, and percentage of agricultural and fallow land (FL) were collected using satellite imagery. Intensive areas in a bee-site network were less specialized in terms of resources to attract rare bee species, while the less intensive areas, which supported more rare species, were more vulnerable to disturbance. A combination of patch quality and diversity as well as pesticide use regulates species diversity at the landscape scale (500 m), whereas pesticide quantity drove diversity at the patch scale (100 m). At the landscape scale, specialization of each site in terms of resources for bees increased with increasing patch diversity and FL, while at the patch scale specialization declined with increased pesticide use. Bee functional groups responded differentially to landscape characteristics as well as pesticide use. Wood-nesting bees were negatively affected by the number of pesticides used, but other bee functional groups were not sensitive to pesticides. Synthesis and Applications : Different factors

  15. Integrated calibration of a 3D attitude sensor in large-scale metrology

    International Nuclear Information System (INIS)

    Gao, Yang; Lin, Jiarui; Yang, Linghui; Zhu, Jigui; Muelaner, Jody; Keogh, Patrick

    2017-01-01

    A novel calibration method is presented for a multi-sensor fusion system in large-scale metrology, which improves the calibration efficiency and reliability. The attitude sensor is composed of a pinhole prism, a converging lens, an area-array camera and a biaxial inclinometer. A mathematical model is established to determine its 3D attitude relative to a cooperative total station by using two vector observations from the imaging system and the inclinometer. There are two areas of unknown parameters in the measurement model that should be calibrated: the intrinsic parameters of the imaging model, and the transformation matrix between the camera and the inclinometer. An integrated calibration method using a three-axis rotary table and a total station is proposed. A single mounting position of the attitude sensor on the rotary table is sufficient to solve for all parameters of the measurement model. A correction technique for the reference laser beam of the total station is also presented to remove the need for accurate positioning of the sensor on the rotary table. Experimental verification has proved the practicality and accuracy of this calibration method. Results show that the mean deviations of attitude angles using the proposed method are less than 0.01°. (paper)
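Determining attitude from two vector observations (here, one from the imaging system and one from the inclinometer) is classically done with the TRIAD construction. The sketch below is the generic textbook algorithm, not the paper's calibrated measurement model:

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def norm(a):
    m = math.sqrt(sum(x * x for x in a))
    return tuple(x / m for x in a)

def triad(v1_body, v2_body, v1_ref, v2_ref):
    """Classical TRIAD: build an orthonormal triad from each vector pair;
    the attitude matrix A maps reference-frame vectors into the body frame."""
    def frame(a, b):
        t1 = norm(a)
        t2 = norm(cross(a, b))
        return (t1, t2, cross(t1, t2))
    B, R = frame(v1_body, v2_body), frame(v1_ref, v2_ref)
    # A = B^T R, with the triad vectors as the rows of B and R
    return [[sum(B[k][i] * R[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# 90-degree rotation about z: body observations are rotated reference vectors
A = triad((0, 1, 0), (-1, 0, 0), (1, 0, 0), (0, 1, 0))
print([[round(x) for x in row] for row in A])   # → [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
```

The calibration described in the abstract is what supplies accurate body-frame observations (camera intrinsics, camera-inclinometer transformation) to feed such an attitude solution.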

  16. Spatial heterogeneity and scale-dependent habitat selection for two sympatric raptors in mixed-grass prairie.

    Science.gov (United States)

    Atuo, Fidelis Akunke; O'Connell, Timothy John

    2017-08-01

    Sympatric predators are predicted to partition resources, especially under conditions of food limitation. Spatial heterogeneity that influences prey availability might play an important role in the scales at which potential competitors select habitat. We assessed potential mechanisms for coexistence by examining the role of heterogeneity in resource partitioning between sympatric raptors overwintering in the southern Great Plains. We conducted surveys for wintering Red-tailed hawk ( Buteo jamaicensis ) and Northern Harrier ( Circus cyanea ) at two state wildlife management areas in Oklahoma, USA. We used information from repeated distance sampling to project use locations in a GIS. We applied resource selection functions to model habitat selection at three scales and analyzed for niche partitioning using the outlying mean index. Habitat selection of the two predators was mediated by spatial heterogeneity. The two predators demonstrated significant fine-scale discrimination in habitat selection in homogeneous landscapes, but were more sympatric in heterogeneous landscapes. Red-tailed hawk used a variety of cover types in heterogeneous landscapes but specialized on riparian forest in homogeneous landscapes. Northern Harrier specialized on upland grasslands in homogeneous landscapes but selected more cover types in heterogeneous landscapes. Our study supports the growing body of evidence that landscapes can affect animal behaviors. In the system we studied, larger patches of primary land cover types were associated with greater allopatry in habitat selection between two potentially competing predators. Heterogeneity within the scale of raptor home ranges was associated with greater sympatry in use and less specialization in land cover types selected.

  17. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  18. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA

    Directory of Open Access Journals (Sweden)

    Anirban Nandi

    2014-01-01

    Full Text Available Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D. In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA. It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.
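The core Six Sigma arithmetic such an integration relies on is defects per million opportunities (DPMO) and the corresponding sigma level. The batch QC numbers below are hypothetical, purely to show the calculation:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities):
    """Defects per million opportunities, the basic Six Sigma metric."""
    return defects / (units * opportunities) * 1_000_000

def sigma_level(d):
    """Short-term sigma level with the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - d / 1_000_000) + 1.5

# Hypothetical fermentation-line QC: 27 defects over 1500 batches,
# 4 defect opportunities per batch (illustrative numbers only)
d = dpmo(defects=27, units=1500, opportunities=4)
print(round(d), round(sigma_level(d), 2))
```

Six Sigma performance corresponds to 3.4 DPMO; the toy line above sits near 4.1 sigma, the kind of gap a DMAIC project would target.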

  19. How to deal with negative power price spikes?-Flexible voluntary curtailment agreements for large-scale integration of wind

    International Nuclear Information System (INIS)

    Brandstaett, Christine; Brunekreeft, Gert; Jahnke, Katy

    2011-01-01

    For the large-scale integration of electricity from renewable energy sources (RES-E), the German system seems to be reaching its limits. In 2009, the electricity wholesale market experienced seriously negative prices at times of high wind and low demand. The feed-in system in Germany consists of a fixed feed-in price, a take-off obligation and a RES priority rule, and in practice only very restrictive use of RES-E curtailment. Exactly the latter is the problem. We argue that the overall performance of the system would improve substantially by lifting the restrictions on the use of voluntary curtailment agreements, while retaining the priority rule as such. Since generators of RES-E can only improve under this system reform, investment conditions improve, leading to higher installed RES-E capacity. This in turn implies that reduced wind output due to curtailment can actually be offset by higher wind output in all periods in which there is no problem. - Highlights: → We examine the large-scale integration of electricity from renewable sources (RES-E) into the German energy market. → Seriously negative prices at the wholesale market suggest that market design could be improved. → We argue that allowing flexible use of voluntary curtailment agreements (VCA), while keeping the priority feed-in rule, would increase the total system's efficiency. → Improved investment conditions due to flexible use of VCAs leading to higher installed RES-E capacity could offset the reduced wind output and would not impede climate policy goals.

  20. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850°C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100°C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)
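For a sense of scale, the electrolysis pathway can be bounded with back-of-envelope arithmetic. The efficiency and the electrical output dedicated to hydrogen below are assumptions for illustration; only the 39.4 kWh/kg figure (hydrogen's higher heating value) is a physical constant:

```python
# Back-of-envelope scale for nuclear-powered electrolysis
HHV_KWH_PER_KG = 39.4   # higher heating value of hydrogen, a physical constant
efficiency = 0.90       # assumed high-temperature electrolyser efficiency (HHV basis)
reactor_mw_e = 300      # assumed electrical output dedicated to hydrogen

kg_per_hour = reactor_mw_e * 1000 * efficiency / HHV_KWH_PER_KG
tonnes_per_day = kg_per_hour * 24 / 1000
print(round(tonnes_per_day))   # → 164
```

Hundreds of tonnes per day from a single dedicated unit is what makes the nuclear route a candidate for genuinely large-scale hydrogen production.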

  1. Atypical language laterality is associated with large-scale disruption of network integration in children with intractable focal epilepsy.

    Science.gov (United States)

    Ibrahim, George M; Morgan, Benjamin R; Doesburg, Sam M; Taylor, Margot J; Pang, Elizabeth W; Donner, Elizabeth; Go, Cristina Y; Rutka, James T; Snead, O Carter

    2015-04-01

    Epilepsy is associated with disruption of integration in distributed networks, together with altered localization for functions such as expressive language. The relation between atypical network connectivity and altered localization is unknown. In the current study we tested whether atypical expressive language laterality was associated with the alteration of large-scale network integration in children with medically-intractable localization-related epilepsy (LRE). Twenty-three right-handed children (age range 8-17) with medically-intractable LRE performed a verb generation task in fMRI. Language network activation was identified and the Laterality index (LI) was calculated within the pars triangularis and pars opercularis. Resting-state data from the same cohort were subjected to independent component analysis. Dual regression was used to identify associations between resting-state integration and LI values. Higher positive values of the LI, indicating typical language localization were associated with stronger functional integration of various networks including the default mode network (DMN). The normally symmetric resting-state networks showed a pattern of lateralized connectivity mirroring that of language function. The association between atypical language localization and network integration implies a widespread disruption of neural network development. These findings may inform the interpretation of localization studies by providing novel insights into reorganization of neural networks in epilepsy. Copyright © 2015 Elsevier Ltd. All rights reserved.
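The laterality index used in such studies is conventionally computed from suprathreshold voxel counts in homologous left/right regions of interest. A minimal sketch (the voxel counts below are invented for illustration, not the study's data):

```python
def laterality_index(left_voxels, right_voxels):
    """Standard fMRI laterality index: LI = (L - R) / (L + R), where L and R
    count suprathreshold voxels in homologous left/right ROIs. +1 means fully
    left-lateralized (typical for language), -1 fully right-lateralized."""
    return (left_voxels - right_voxels) / (left_voxels + right_voxels)

print(laterality_index(420, 120))   # typical, left-dominant language
print(laterality_index(90, 310))    # atypical, right-dominant language
```

In the study's framing, lower (more negative) LI values within the pars triangularis and pars opercularis are the "atypical" localizations that covary with weaker resting-state network integration.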

  2. Large Scale Integration of Renewable Power Sources into the Vietnamese Power System

    Science.gov (United States)

    Kies, Alexander; Schyska, Bruno; Thanh Viet, Dinh; von Bremen, Lueder; Heinemann, Detlev; Schramm, Stefan

    2017-04-01

    The Vietnamese power system is expected to expand considerably in upcoming decades. Installed power capacities are projected to grow from 39 GW in 2015 to 129.5 GW by 2030. Installed wind power capacities are expected to grow to 6 GW (0.8 GW in 2015) and solar power capacities to 12 GW (0.85 GW in 2015). This goes hand in hand with an increase in renewable penetration in the power mix from 1.3% from wind and photovoltaics (PV) in 2015 to 5.4% by 2030. The overall potential for wind power in Vietnam is estimated to be around 24 GW. Moreover, the up-scaling of renewable energy sources was formulated as one of the prioritized targets of the Vietnamese government in the National Power Development Plan VII. In this work, we investigate the transition of the Vietnamese power system towards high shares of renewables. For this purpose, we jointly optimise the expansion of renewable generation facilities for wind and PV, and the transmission grid, within renewable build-up pathways until 2030 and beyond. To simulate the Vietnamese power system and its generation from renewable sources, we use highly spatially and temporally resolved historical weather and load data and the open source modelling toolbox Python for Power System Analysis (PyPSA). We show that the highest potential of renewable generation for wind and PV is found in southern Vietnam and discuss the resulting need for transmission grid extensions depending on the optimal pathway. Furthermore, we show that the smoothing effect of wind power has several considerable benefits and that the Vietnamese hydro power potential can be used efficiently to provide balancing opportunities. This work is part of the R&D project "Analysis of the Large Scale Integration of Renewable Power into the Future Vietnamese Power System" (GIZ, 2016-2018).
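Penetration percentages like those quoted above follow from installed capacity, capacity factor, and annual demand; a back-of-the-envelope sketch, in which the capacity factors and the 2030 demand figure are illustrative assumptions rather than values from the study:

```python
# Back-of-the-envelope renewable energy share from installed capacity.
# Capacity factors and the demand figure are illustrative assumptions.
HOURS_PER_YEAR = 8760

def annual_energy_twh(capacity_gw, capacity_factor):
    """Annual energy yield in TWh for a given installed capacity."""
    return capacity_gw * capacity_factor * HOURS_PER_YEAR / 1000.0

wind_2030 = annual_energy_twh(6.0, 0.25)   # 6 GW wind by 2030
pv_2030 = annual_energy_twh(12.0, 0.16)    # 12 GW PV by 2030
demand_2030_twh = 572.0                    # hypothetical 2030 demand
share = (wind_2030 + pv_2030) / demand_2030_twh
```

With these assumed inputs the wind+PV share lands in the few-percent range, consistent with the order of magnitude cited.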

  3. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  4. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    Abstract The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to, LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, the LINCS Data Registry (LDR). LINCS data served on the LDP contain extensive metadata and curated annotations. We highlight the features of the LDP user interface, which is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  5. Bioprocess scale-up/down as integrative enabling technology: from fluid mechanics to systems biology and beyond.

    Science.gov (United States)

    Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk

    2017-09-01

    Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in the agrofood, environment, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus sorely needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, the framework still needs some refinements, such as better integration of gas-liquid flows in CFD and accounting for intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
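The "cell lifeline" idea can be illustrated with a toy agent-based sketch in which a cell hops between a substrate-rich feed zone and a substrate-poor bulk zone of a large bioreactor; the zone names and switching probability below are invented for illustration, not taken from the cited framework:

```python
import random

def simulate_lifeline(steps, p_switch=0.3, seed=0):
    """Toy two-compartment bioreactor: a cell hops between a substrate-rich
    'feed' zone and a substrate-poor 'bulk' zone; the sequence of zones it
    visits is its lifeline (the environmental history it experiences)."""
    rng = random.Random(seed)
    zone = "feed"
    lifeline = [zone]
    for _ in range(steps - 1):
        if rng.random() < p_switch:
            zone = "bulk" if zone == "feed" else "feed"
        lifeline.append(zone)
    return lifeline

line = simulate_lifeline(50)
# fraction of the lifeline spent starved, a crude stress indicator
starvation_fraction = line.count("bulk") / len(line)
```

In the full framework, CFD supplies the transition statistics and a metabolic model reacts to each lifeline; here both are reduced to a coin flip to show the bookkeeping.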

  6. Physical heterogeneity control on effective mineral dissolution rates

    Science.gov (United States)

    Jung, Heewon; Navarre-Sitchler, Alexis

    2018-04-01

    Hydrologic heterogeneity may be an important factor contributing to the discrepancy between laboratory- and field-measured dissolution rates, but the governing factors influencing mineral dissolution rates among various representations of physical heterogeneity remain poorly understood. Here, we present multiple reactive transport simulations of anorthite dissolution in 2D latticed random permeability fields and link the information from local grid-scale (1 cm or 4 m) dissolution rates to domain-scale (1 m or 400 m) effective dissolution rates measured by the flux-weighted average of an ensemble of flow paths. We compare results of homogeneous models to heterogeneous models with different structure and layered permeability distributions within the model domain. Chemistry is simplified to a single dissolving primary mineral (anorthite) distributed homogeneously throughout the domain and a single secondary mineral (kaolinite) that is allowed to dissolve or precipitate. Results show that increasing size of the correlation structure (i.e. long integral scales) and high variance in the permeability distribution are two important factors inducing a reduction in effective mineral dissolution rates compared to homogeneous permeability domains. Larger correlation structures produce larger zones of low permeability where diffusion is an important transport mechanism. Due to the increased residence time under slow diffusive transport, the saturation state of a solute with respect to a reacting mineral approaches equilibrium and reduces the reaction rate. High variance in the permeability distribution favors the development of large low-permeability zones that intensify the reduction in mixing and in the effective dissolution rate. However, the degree of reduction in effective dissolution rate observed in 1 m × 1 m domains is too small. Equilibrium conditions reduce the effective dissolution rate by increasing the saturation state. However, in large domains where less- or non-reactive zones develop, higher
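The domain-scale effective rate described above is a flux-weighted average over an ensemble of flow paths; a minimal sketch of that averaging:

```python
import numpy as np

def effective_rate(fluxes, local_rates):
    """Flux-weighted average dissolution rate over an ensemble of flow
    paths: paths carrying more water weigh more heavily in the
    domain-scale (effective) rate."""
    q = np.asarray(fluxes, dtype=float)
    r = np.asarray(local_rates, dtype=float)
    return float(np.sum(q * r) / np.sum(q))

# a slow, near-equilibrium path (low flux, low local rate) barely
# moves the average, which stays close to the fast-path rate
r_eff = effective_rate([10.0, 1.0, 0.1], [1.0, 0.3, 0.01])
```

This is why low-permeability zones must be both large and numerous before they pull the effective rate well below the homogeneous-domain value.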

  7. Distributed and hierarchical control techniques for large-scale power plant systems

    International Nuclear Information System (INIS)

    Raju, G.V.S.; Kisner, R.A.

    1985-08-01

    In large-scale systems, integrated and coordinated control functions are required to maximize plant availability, to allow maneuverability through various power levels, and to meet externally imposed regulatory limitations. Nuclear power plants are large-scale systems. Prime subsystems are those that contribute directly to the behavior of the plant's ultimate output. The prime subsystems in a nuclear power plant include reactor, primary and intermediate heat transport, steam generator, turbine generator, and feedwater system. This paper describes and discusses the continuous-variable control system developed to supervise prime plant subsystems for optimal control and coordination

  8. Upscaling of Constitutive Relations In Unsaturated Heterogeneous Porous Media

    International Nuclear Information System (INIS)

    Liu, H. H.; Bodvarsson, G. S.

    2001-01-01

    When numerical models are used for modeling field-scale flow and transport processes in the subsurface, the problem of "upscaling" arises. Typical scales, corresponding to spatial resolutions of subsurface heterogeneity in numerical models, are generally much larger than the measurement scale of the parameters and physical processes involved. The upscaling problem is, then, one of assigning parameters at the gridblock scale based on parameter values measured at smaller scales. The focus of this study is to develop an approach to determine large-scale (upscaled) constitutive relations (relationships among relative permeability, capillary pressure and saturation) from small-scale measurements for porous media, for a range of air entry values typical of the tuff matrix in the unsaturated zone of Yucca Mountain. For porous media with large air entry values, capillary forces play a key role in determining the spatial water distribution at large scales. Therefore, an approximately uniform capillary pressure exists even at a large gridblock scale under steady-state flow conditions. Based on this reasoning, we developed formulations that relate upscaled constitutive relations to those measured at the core scale. Numerical experiments with stochastically generated heterogeneous porous media were used to evaluate the upscaling formulations
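The capillary-equilibrium argument implies that, at a uniform capillary pressure, the upscaled (gridblock) saturation is simply the average of the local saturations; a sketch using the van Genuchten retention model with a heterogeneous air-entry parameter (all parameter values are illustrative, not the study's):

```python
import numpy as np

def van_genuchten_saturation(pc, alpha, n):
    """Effective saturation S_e(Pc) for the van Genuchten retention model,
    with m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * pc) ** n) ** (-m)

def upscaled_saturation(pc, alphas, n=2.0):
    """Under a uniform (equilibrated) capillary pressure, the gridblock
    saturation is the arithmetic mean of the local saturations."""
    return float(np.mean(van_genuchten_saturation(pc, np.asarray(alphas), n)))

# heterogeneous air-entry parameters (1/m), uniform Pc of 2 m of water
alphas = [0.05, 0.1, 0.2, 0.4]
s_block = upscaled_saturation(2.0, alphas)
```

Evaluating the averaged curve over a range of Pc values yields the upscaled retention relation; an upscaled relative permeability follows analogously with an appropriate (e.g. flux-based) average.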

  9. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done so far by SINTEF Energy Research shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series

  10. Perspectives on Clinical Informatics: Integrating Large-Scale Clinical, Genomic, and Health Information for Clinical Care

    Directory of Open Access Journals (Sweden)

    In Young Choi

    2013-12-01

    The advances in electronic medical records (EMRs) and bioinformatics (BI) represent two significant trends in healthcare. The widespread adoption of EMR systems and the completion of the Human Genome Project spurred the development of technologies for data acquisition, analysis, and visualization in two different domains. The massive amount of data from both the clinical and biology domains is expected to enable personalized, preventive, and predictive healthcare services in the near future. The integrated use of EMR and BI data needs to consider four key informatics areas: data modeling, analytics, standardization, and privacy. Bioclinical data warehouses integrating heterogeneous patient-related clinical or omics data should be considered. The representative standardization effort by the Clinical Bioinformatics Ontology (CBO) aims to provide uniquely identified concepts to include molecular pathology terminologies. Since individual genome data are easily used to predict current and future health status, different safeguards to ensure confidentiality should be considered. In this paper, we focused on the informatics aspects of integrating the EMR community and BI community by identifying opportunities, challenges, and approaches to provide the best possible care service for our patients and the population.

  11. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system, and feedback is fed back into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration approaching the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. The authors present a brief overview of the online system structure, its components and the large scale integration tests and their results

  12. The necessity of and policy suggestions for implementing a limited number of large scale, fully integrated CCS demonstrations in China

    International Nuclear Information System (INIS)

    Li Zheng; Zhang Dongjie; Ma Linwei; West, Logan; Ni Weidou

    2011-01-01

    CCS is seen as an important and strategic technology option for China to reduce its CO2 emissions, and has received tremendous attention both around the world and in China. Scholars are divided on the role CCS should play, making the future of CCS in China highly uncertain. This paper presents the overall circumstances for CCS development in China, including the threats and opportunities for large-scale deployment of CCS, the initial barriers and advantages that China currently possesses, as well as the current progress of CCS demonstration in China. The paper proposes the implementation of a limited number of large-scale, fully integrated CCS demonstration projects and explains the potential benefits that could be garnered. The problems with China's current CCS demonstration work are analyzed, and some targeted policies are proposed based on those observations. These policy suggestions can effectively solve these problems, help China realize the benefits of CCS demonstration sooner, and make a great contribution to China's CO2 reduction mission. - Highlights: → We analyze the overall circumstances for CCS development in China in detail. → China can garner multiple benefits by conducting several large, integrated CCS demos. → We present the current progress in CCS demonstration in China in detail. → Some problems exist with China's current CCS demonstration work. → Some focused policies are suggested to improve CCS demonstration in China.

  13. Large Scale System Safety Integration for Human Rated Space Vehicles

    Science.gov (United States)

    Massie, Michael J.

    2005-12-01

    Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to take a look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual/organization involved in a project has different levels of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, when one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common grounds between all parties to achieve

  14. Use of large-scale multi-configuration EMI measurements to characterize heterogeneous subsurface structures and their impact on crop productivity

    Science.gov (United States)

    Brogi, Cosimo; Huisman, Johan Alexander; Kaufmann, Manuela Sarah; von Hebel, Christian; van der Kruk, Jan; Vereecken, Harry

    2017-04-01

    Soil subsurface structures can play a key role in crop performance, especially during water stress periods. Geophysical techniques like electromagnetic induction (EMI) have been shown to be capable of providing information about dominant shallow subsurface features. However, previous work with EMI has typically not reached beyond the field scale. The objective of this study is to use large-scale multi-configuration EMI to characterize patterns of soil structural organization (layering and texture) and the associated impact on crop vegetation at the km2 scale. For this, we carried out an intensive measurement campaign and collected high-spatial-resolution multi-configuration EMI data on an agricultural area of approx. 1 km2 (102 ha) near Selhausen (North Rhine-Westphalia, Germany) with a maximum depth of investigation of around 2.5 m. We measured using two EMI instruments simultaneously with a total of nine coil configurations. The instruments were placed inside polyethylene sleds that were pulled by an all-terrain vehicle along parallel lines with a spacing of 2 to 2.5 m. The driving speed was between 5 and 7 km/h and we used a 0.2 Hz sampling frequency to obtain an in-line resolution of approximately 0.3 m. The survey area consists of almost 50 different fields managed in different ways. The EMI measurements were collected between April and December 2016 within a few days after the harvest of each field. After data acquisition, EMI data were automatically filtered, temperature corrected, and interpolated onto a common grid. The resulting EMI maps allowed us to identify three main areas with different subsurface heterogeneities. The differences between these areas are likely related to the late Quaternary geological history (Pleistocene and Holocene) of the area, which resulted in spatially variable soil texture and layering with a strong impact on spatio-temporal soil water content variability. The high resolution surveys also allowed us to identify small scale
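The post-processing chain described (filtering, temperature correction, interpolation onto a common grid) can be sketched with SciPy; the filter threshold and temperature factor below are placeholders, not the values used in the campaign:

```python
import numpy as np
from scipy.interpolate import griddata

def grid_emi(x, y, eca, grid_x, grid_y, temp_factor=1.0, eca_max=100.0):
    """Apply a placeholder temperature correction, drop implausible
    apparent-conductivity (ECa) readings, and interpolate the survey
    points onto a common grid (NaN outside the data's convex hull)."""
    eca = np.asarray(eca, dtype=float) * temp_factor
    keep = (eca > 0) & (eca < eca_max)              # crude despiking filter
    pts = np.column_stack([np.asarray(x)[keep], np.asarray(y)[keep]])
    return griddata(pts, eca[keep], (grid_x, grid_y), method="linear")

# synthetic survey: 200 random measurement points over a 100 m x 100 m field
rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 200)
y = rng.uniform(0, 100, 200)
eca = 20 + 5 * np.sin(x / 10.0)                     # mS/m, smooth trend
gx, gy = np.meshgrid(np.linspace(10, 90, 20), np.linspace(10, 90, 20))
eca_map = grid_emi(x, y, eca, gx, gy)
```

In a real workflow, each of the nine coil configurations would be gridded separately before joint interpretation.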

  15. Structured approaches to large-scale systems: Variational integrators for interconnected Lagrange-Dirac systems and structured model reduction on Lie groups

    Science.gov (United States)

    Parks, Helen Frances

    This dissertation presents two projects related to the structured integration of large-scale mechanical systems. Structured integration uses the considerable differential geometric structure inherent in mechanical motion to inform the design of numerical integration schemes. This process improves the qualitative properties of simulations and becomes especially valuable as a measure of accuracy over long-time simulations, in which traditional Gronwall accuracy estimates lose their meaning. Often, structured integration schemes replicate continuous symmetries and their associated conservation laws at the discrete level. Such is the case for variational integrators, which discretely replicate the process of deriving equations of motion from variational principles. This results in the conservation of momenta associated to symmetries in the discrete system and conservation of a symplectic form when applicable. In the case of Lagrange-Dirac systems, variational integrators preserve a discrete analogue of the Dirac structure preserved in the continuous flow. In the first project of this thesis, we extend Dirac variational integrators to accommodate interconnected systems. We hope this work will find use in the fields of control, where a controlled system can be thought of as a "plant" system joined to its controller, and in the approach to very large systems, where modular modeling may prove easier than monolithically modeling the entire system. The second project of the thesis considers a different approach to large systems. Given a detailed model of the full system, can we reduce it to a more computationally efficient model without losing essential geometric structures in the system? Asked without the reference to structure, this is the essential question of the field of model reduction. The answer there has been a resounding yes, with Proper Orthogonal Decomposition (POD) with snapshots rising as one of the most successful methods. Our project builds on previous work

  16. Cascade of chromosomal rearrangements caused by a heterogeneous T-DNA integration supports the double-stranded break repair model for T-DNA integration.

    Science.gov (United States)

    Hu, Yufei; Chen, Zhiyu; Zhuang, Chuxiong; Huang, Jilei

    2017-06-01

    Transferred DNA (T-DNA) from Agrobacterium tumefaciens can be integrated into the plant genome. The double-stranded break repair (DSBR) pathway is a major model for T-DNA integration. From this model, we expect that the two ends of a T-DNA molecule would invade a single DNA double-stranded break (DSB) or independent DSBs in the plant genome. We call the latter phenomenon heterogeneous T-DNA integration, which has never been observed. In this work, we demonstrated it in an Arabidopsis T-DNA insertion mutant, seb19. To resolve the chromosomal structural changes caused by T-DNA integration at both the nucleotide and chromosome levels, we performed inverse PCR, genome resequencing, fluorescence in situ hybridization and linkage analysis. We found that, in seb19, a single T-DNA connected two different chromosomal loci and caused complex chromosomal rearrangements. The specific break-junction pattern in seb19 is consistent with heterogeneous T-DNA integration but not with recombination between two T-DNA insertions. We demonstrated that, in seb19, heterogeneous T-DNA integration evoked a cascade of incorrect repair of seven DSBs on chromosomes 4 and 5, and then produced translocation, inversion, duplication and deletion. Heterogeneous T-DNA integration supports the DSBR model and suggests that the two ends of a T-DNA molecule can be integrated into the plant genome independently. Our results also show a new origin of chromosomal abnormalities. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  17. Large-scale modeling of condition-specific gene regulatory networks by information integration and inference.

    Science.gov (United States)

    Ellwanger, Daniel Christian; Leonhardt, Jörn Florian; Mewes, Hans-Werner

    2014-12-01

    Understanding how regulatory networks globally coordinate the response of a cell to changing conditions, such as perturbations by shifting environments, is an elementary challenge in systems biology that has yet to be met. Genome-wide gene expression measurements are high dimensional, as these reflect the condition-specific interplay of thousands of cellular components. The integration of prior biological knowledge into the modeling process of systems-wide gene regulation enables the large-scale interpretation of gene expression signals in the context of known regulatory relations. We developed COGERE (http://mips.helmholtz-muenchen.de/cogere), a method for the inference of condition-specific gene regulatory networks in human and mouse. We integrated existing knowledge of regulatory interactions from multiple sources into a comprehensive model of prior information. COGERE infers condition-specific regulation by evaluating the mutual dependency between regulator (transcription factor or miRNA) and target gene expression using prior information. This dependency is scored by the non-parametric, nonlinear correlation coefficient η² (eta squared), derived from a two-way analysis of variance. We show that COGERE significantly outperforms alternative methods in predicting condition-specific gene regulatory networks on simulated data sets. Furthermore, by inferring the cancer-specific gene regulatory network from the NCI-60 expression study, we demonstrate the utility of COGERE to promote hypothesis-driven clinical research. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
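The η² score is the classical variance-explained ratio from an analysis of variance; a minimal one-way sketch of the underlying computation (the paper itself uses a two-way ANOVA):

```python
import numpy as np

def eta_squared(groups):
    """eta^2 = SS_between / SS_total: the fraction of target-gene
    expression variance explained by grouping the samples, e.g. by
    binned regulator expression."""
    all_vals = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand = all_vals.mean()
    ss_total = float(np.sum((all_vals - grand) ** 2))
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    return ss_between / ss_total

# a cleanly separated dependency yields eta^2 close to 1
low = [1.0, 1.1, 0.9]
high = [5.0, 5.2, 4.8]
e2 = eta_squared([low, high])
```

Because it is based on group means rather than a fitted line, η² captures nonlinear and non-monotonic dependencies that a Pearson correlation would miss.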

  18. Generative Adversarial Networks Based Heterogeneous Data Integration and Its Application for Intelligent Power Distribution and Utilization

    Directory of Open Access Journals (Sweden)

    Yuanpeng Tan

    2018-01-01

    Heterogeneous characteristics of big data systems for intelligent power distribution and utilization have become more and more prominent, which brings new challenges for traditional data analysis technologies and restricts the comprehensive management of distribution network assets. In order to solve the problem that heterogeneous data resources of power distribution systems are difficult to utilize effectively, a novel generative adversarial networks (GANs) based heterogeneous data integration method for intelligent power distribution and utilization is proposed. In the proposed method, GANs theory is introduced to expand the distribution of complete data samples. Then, a so-called peak clustering algorithm is proposed to realize a finite open coverage of the expanded sample space and to repair incomplete samples, eliminating the heterogeneous characteristics. Finally, in order to realize the integration of the heterogeneous data for intelligent power distribution and utilization, the well-trained discriminator model of the GANs is employed to check the restored data samples. Simulation experiments verified the validity and stability of the proposed heterogeneous data integration method, which provides a novel perspective for further data quality management of power distribution systems.
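The "peak clustering" step is not specified in detail in the abstract; a sketch in the spirit of density-peak clustering, where cluster centers are points that combine high local density with a large distance to any denser point (all algorithmic details here are assumptions for illustration):

```python
import numpy as np

def density_peaks_centers(points, dc=1.0, n_centers=2):
    """Density-peak-style center selection: each point gets a local
    density rho (Gaussian kernel of width dc) and a distance delta to
    the nearest point of higher density; centers score high on both."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    rho = np.exp(-(d / dc) ** 2).sum(axis=1) - 1.0  # exclude self-term
    delta = np.empty(len(pts))
    for i in range(len(pts)):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i, higher].min() if len(higher) else d[i].max()
    return np.argsort(rho * delta)[-n_centers:]     # indices of likely centers

# two well-separated blobs: the selected centers should land one per blob
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
                  rng.normal(5.0, 0.3, (20, 2))])
centers = density_peaks_centers(data, dc=1.0, n_centers=2)
```

Once centers are chosen, remaining (including incomplete) samples can be assigned to the cluster of their nearest denser neighbour, which is where sample repair would hook in.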

  19. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  20. Evaluation of scaling concepts for integral system test facilities

    International Nuclear Information System (INIS)

    Condie, K.G.; Larson, T.K.; Davis, C.B.

    1987-01-01

    A study was conducted by EG&G Idaho, Inc., to identify and technically evaluate potential concepts which will allow the U.S. Nuclear Regulatory Commission to maintain the capability to conduct future integral, thermal-hydraulic facility experiments of interest to light water reactor safety. This paper summarizes the methodology used in the study and presents rankings for each facility concept relative to its ability to simulate phenomena identified as important in selected reactor transients in Babcock and Wilcox and Westinghouse large pressurized water reactors. Established scaling methodologies are used to develop potential concepts for scaled integral thermal-hydraulic experiment facilities. Concepts selected included: full height, full pressure water; reduced height, reduced pressure water; reduced height, full pressure water; one-tenth linear, full pressure water; and reduced height, full scaled pressure Freon. Results from this study suggest that a facility capable of operating at typical reactor operating conditions will scale most phenomena reasonably well. Local heat transfer phenomena are best scaled by the full height facility, while the reduced height facilities provide better scaling where multi-dimensional phenomena are considered important. Although many phenomena in facilities using Freon or water at nontypical pressure will scale reasonably well, those phenomena which are heavily dependent on quality can be distorted. Furthermore, relating data produced in facilities operating with nontypical fluids or at nontypical pressures to large plants will be a difficult and time-consuming process

  1. Large-scale Wind Power integration in a Hydro-Thermal Power Market

    OpenAIRE

    Trøtscher, Thomas

    2007-01-01

    This master thesis describes a quadratic programming model used to calculate the spot prices in an efficient multi-area power market. The model has been adapted to Northern Europe, with focus on Denmark West and the integration of large quantities of wind power. In the model, demand and supply of electricity are equated, at an hourly time resolution, to find the spot price in each area. Historical load values are used to represent demand, which is assumed to be completely inelastic. Supply i...

  2. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Facing similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  3. A CMOS-compatible large-scale monolithic integration of heterogeneous multi-sensors on flexible silicon for IoT applications

    KAUST Repository

    Nassar, Joanna M.

    2017-02-07

    We report CMOS-technology-enabled fabrication and system-level integration of a flexible bulk silicon (100) based multi-sensor platform which can simultaneously sense pressure, temperature, strain, and humidity under various physical deformations. We also show an advanced wearable version for monitoring body vitals, which can enable advanced healthcare for IoT applications.

  4. A CMOS-compatible large-scale monolithic integration of heterogeneous multi-sensors on flexible silicon for IoT applications

    KAUST Repository

    Nassar, Joanna M.; Sevilla, Galo T.; Velling, Seneca J.; Cordero, Marlon D.; Hussain, Muhammad Mustafa

    2017-01-01

    We report CMOS-technology-enabled fabrication and system-level integration of a flexible bulk silicon (100) based multi-sensor platform which can simultaneously sense pressure, temperature, strain, and humidity under various physical deformations. We also show an advanced wearable version for monitoring body vitals, which can enable advanced healthcare for IoT applications.

  5. Understanding as Integration of Heterogeneous Representations

    Science.gov (United States)

    Martínez, Sergio F.

    2014-03-01

    The search for understanding is a major aim of science. Traditionally, understanding has been undervalued in the philosophy of science because of its psychological underpinnings; nowadays, however, it is widely recognized that epistemology cannot be divorced from psychology as sharply as traditional epistemology required. This eliminates the main obstacle to giving scientific understanding due attention in the philosophy of science. My aim in this paper is to describe an account of scientific understanding as an emergent feature of our mastery of different (causal) explanatory frameworks, a mastery achieved through scientific practices. Different practices lead to different kinds of representations. Such representations are often heterogeneous. The integration of such representations constitutes understanding.

  6. MiSTIC, an integrated platform for the analysis of heterogeneity in large tumour transcriptome datasets.

    Science.gov (United States)

    Lemieux, Sebastien; Sargeant, Tobias; Laperrière, David; Ismail, Houssam; Boucher, Geneviève; Rozendaal, Marieke; Lavallée, Vincent-Philippe; Ashton-Beaucage, Dariel; Wilhelm, Brian; Hébert, Josée; Hilton, Douglas J; Mader, Sylvie; Sauvageau, Guy

    2017-07-27

    Genome-wide transcriptome profiling has enabled non-supervised classification of tumours, revealing different sub-groups characterized by specific gene expression features. However, the biological significance of these subtypes remains for the most part unclear. We describe herein an interactive platform, Minimum Spanning Trees Inferred Clustering (MiSTIC), that integrates the direct visualization and comparison of the gene correlation structure between datasets, the analysis of the molecular causes underlying co-variations in gene expression in cancer samples, and the clinical annotation of tumour sets defined by the combined expression of selected biomarkers. We have used MiSTIC to highlight the roles of specific transcription factors in breast cancer subtype specification, to compare the aspects of tumour heterogeneity targeted by different prognostic signatures, and to highlight biomarker interactions in AML. A version of MiSTIC preloaded with datasets described herein can be accessed through a public web server (http://mistic.iric.ca); in addition, the MiSTIC software package can be obtained (github.com/iric-soft/MiSTIC) for local use with personalized datasets. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. Polymer-based 2D/3D wafer level heterogeneous integration for SSL module

    NARCIS (Netherlands)

    Yuan, C.; Wei, J.; Ye, H.; Koh, S.; Harianto, S.; Nieuwenhof, M.A. van den; Zhang, G.Q.

    2012-01-01

    This paper demonstrates the heterogeneous integration of a solid state lighting (SSL) module, including the light source (LED) and driver/control components. Such integration has been realized by polymer-based reconfigured wafer-level packaging technologies, and the structure has been prototyped and

  8. The large-scale integration of wind generation: Impacts on price, reliability and dispatchable conventional suppliers

    International Nuclear Information System (INIS)

    MacCormack, John; Hollis, Aidan; Zareipour, Hamidreza; Rosehart, William

    2010-01-01

    This work examines the effects of large-scale integration of wind-powered electricity generation in a deregulated energy-only market on loads (in terms of electricity prices and supply reliability) and on dispatchable conventional power suppliers. Hourly models of wind generation time series, load, and the resulting residual demand are created. From these, a non-chronological residual demand duration curve is developed and combined with a probabilistic model of dispatchable conventional generator availability, a model of an energy-only market with a price cap, and a model of generator costs and dispatch behavior. A number of simulations are performed to evaluate the effect on electricity prices, the overall reliability of supply, the ability of a dominant supplier acting strategically to profitably withhold supplies, and the fixed-cost recovery of dispatchable conventional power suppliers at different levels of wind generation penetration. Medium- and long-term responses of the market and/or regulator are discussed.

  9. Heterogeneous Gossip

    Science.gov (United States)

    Frey, Davide; Guerraoui, Rachid; Kermarrec, Anne-Marie; Koldehofe, Boris; Mogensen, Martin; Monod, Maxime; Quéma, Vivien

    Gossip-based information dissemination protocols are considered easy to deploy, scalable and resilient to network dynamics. Load-balancing is inherent in these protocols as the dissemination work is evenly spread among all nodes. Yet, large-scale distributed systems are usually heterogeneous with respect to network capabilities such as bandwidth. In practice, a blind load-balancing strategy might significantly hamper the performance of the gossip dissemination.

  10. Automated Integration of Dedicated Hardwired IP Cores in Heterogeneous MPSoCs Designed with ESPAM

    Directory of Open Access Journals (Sweden)

    Ed Deprettere

    2008-06-01

    This paper presents a methodology and techniques for the automated integration of dedicated hardwired (HW) IP cores into heterogeneous multiprocessor systems. We propose an IP core integration approach based on the generation of an HW module that consists of a wrapper around a predefined IP core. This approach has been implemented in a tool called ESPAM for automated multiprocessor system design, programming, and implementation. In order to preserve the high performance of the integrated IP cores, the structure of the IP core wrapper is devised in a way that adequately represents and efficiently implements the main characteristics of the formal model of computation, namely Kahn process networks (KPN), which we use as the underlying programming model in ESPAM. We present details about the structure of the HW module, the supported types of IP cores, and the minimum interfaces these IP cores have to provide in order to allow automated integration into heterogeneous multiprocessor systems generated by ESPAM. The ESPAM design flow, the multiprocessor platforms we consider, and the underlying programming (KPN) model are introduced as well. Furthermore, we demonstrate the efficiency of our approach by applying our methodology and the ESPAM tool to automatically generate, implement, and program heterogeneous multiprocessor systems that integrate dedicated IP cores and execute real-life applications.
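    The Kahn process network model underlying the wrapper design can be illustrated with a minimal software sketch (an illustration of KPN semantics only, not of ESPAM or its hardware wrappers): processes communicate exclusively through FIFO channels and block on reads, which makes the network's output deterministic regardless of scheduling. The three-stage pipeline and all names below are invented for illustration.

    ```python
    # Minimal Kahn-process-network sketch: three processes (producer,
    # scaler, collector) connected by FIFO channels. Reads block, writes
    # never block, so the result is independent of thread scheduling.
    import threading
    import queue

    def producer(out_ch, n):
        for i in range(n):
            out_ch.put(i)
        out_ch.put(None)                      # end-of-stream token

    def scaler(in_ch, out_ch, factor):
        while (tok := in_ch.get()) is not None:
            out_ch.put(tok * factor)          # pure function of its input stream
        out_ch.put(None)

    def collector(in_ch, sink):
        while (tok := in_ch.get()) is not None:
            sink.append(tok)

    def run_kpn(n, factor):
        c1, c2, sink = queue.Queue(), queue.Queue(), []
        threads = [
            threading.Thread(target=producer, args=(c1, n)),
            threading.Thread(target=scaler, args=(c1, c2, factor)),
            threading.Thread(target=collector, args=(c2, sink)),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return sink

    # run_kpn(5, 2) deterministically yields [0, 2, 4, 6, 8]
    ```

    The determinism property is what lets a tool map each process to a processor or an IP-core wrapper independently: as long as the FIFO interfaces are respected, the functional behavior of the system is fixed.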

  11. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, design improvements for further cost reduction and establishment of the plant concept have been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity, and thermal hydraulics which were identified in the previous fiscal year were examined, and the plant concept was modified accordingly. Furthermore, fundamental specifications of the main systems and components were set, and the economics were evaluated. In addition, as the interim evaluation of the candidate concepts of the FBR fuel cycle is to be conducted, the cost effectiveness and achievability of the development goal were evaluated, and the data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has the prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  12. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can demonstrate its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability, and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has the prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  13. Optimization and large scale computation of an entropy-based moment closure

    Science.gov (United States)

    Kristopher Garrett, C.; Hauck, Cory; Hill, Judith

    2015-12-01

    We present computational advances and results in the implementation of an entropy-based moment closure, MN, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as PN, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the MN algorithm that do not appear for the PN algorithm. We also observe that in weak scaling tests, the ratio of time to solution of MN to PN decreases.

  14. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it successfully judge by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science. These are: 1) external control, 2) the organizational form, and 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  15. Scale-dependent effects of a heterogeneous landscape on genetic differentiation in the Central American squirrel monkey (Saimiri oerstedii).

    Directory of Open Access Journals (Sweden)

    Mary E Blair

    Landscape genetic studies offer a fine-scale understanding of how habitat heterogeneity influences population genetic structure. We examined population genetic structure and conducted a landscape genetic analysis for the endangered Central American squirrel monkey (Saimiri oerstedii), which lives in the fragmented, human-modified habitats of the Central Pacific region of Costa Rica. We analyzed non-invasively collected fecal samples from 244 individuals from 14 groups for 16 microsatellite markers. We found two geographically separate genetic clusters in the Central Pacific region, with evidence of recent gene flow among them. We also found significant differentiation among groups of S. o. citrinellus using pairwise F(ST) comparisons. These groups are in fragments of secondary forest separated by unsuitable "matrix" habitats such as cattle pasture, commercial African oil palm plantations, and human residential areas. We used an individual-based landscape genetic approach to measure spatial patterns of genetic variance while taking into account landscape heterogeneity. We found that large, commercial oil palm plantations represent moderate barriers to gene flow between populations, but cattle pastures, rivers, and residential areas do not. However, the influence of oil palm plantations on genetic variance was diminished when we restricted analyses to within-population pairs, suggesting that their effect is scale-dependent and manifests during longer dispersal events among populations. We show that when landscape genetic methods are applied rigorously and at the right scale, they are sensitive enough to track population processes even in species with long, overlapping generations such as primates. Thus landscape genetic approaches are extremely valuable for the conservation management of a diverse array of endangered species in heterogeneous, human-modified habitats.
Our results also stress the importance of explicitly considering the heterogeneity of

  16. The Integrated Use of DMSP-OLS Nighttime Light and MODIS Data for Monitoring Large-Scale Impervious Surface Dynamics: A Case Study in the Yangtze River Delta

    Directory of Open Access Journals (Sweden)

    Zhenfeng Shao

    2014-09-01

    The timely and reliable estimation of imperviousness is essential for the scientific understanding of human-Earth interactions. Due to its unique capacity for capturing artificial light luminosity and its long-term data record, the Defense Meteorological Satellite Program (DMSP) Operational Line-scan System (OLS) nighttime light (NTL) imagery offers an appealing opportunity for continuously characterizing impervious surface area (ISA) at regional and continental scales. Although different levels of success have been achieved, critical challenges remain. ISA estimates generated from DMSP-OLS NTL alone suffer from limitations due to systemic defects of the sensor. Moreover, the majority of existing methodologies seldom consider spatial heterogeneity, which is a key issue in coarse-resolution imagery applications. In this study, we propose a novel method for multi-temporal ISA estimation. The method is based on a linear regression model between the sub-pixel ISA fraction and a multi-source index built from the integrated use of DMSP-OLS NTL and MODIS NDVI. In contrast with traditional regression analysis, we incorporate spatial information into the regression model to obtain spatially adaptive coefficients at the per-pixel level. To produce multi-temporal ISA maps from a mono-temporal reference dataset, temporally stable samples were extracted for model training and validation. We tested the proposed method in the Yangtze River Delta and generated annual ISA fraction maps for the decade 2000-2009. According to our assessments, the proposed method exhibits substantial improvements over the standard linear regression model and provides a feasible way to monitor large-scale impervious surface dynamics.
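    The core regression step can be sketched under explicit assumptions: the multi-source index is taken here to be a VANUI-style combination, (1 - NDVI) times normalized NTL, and a single global least-squares fit is shown rather than the paper's spatially adaptive per-pixel coefficients. The arrays and the helper name are synthetic and illustrative, not the paper's data or code.

    ```python
    # Sketch of an NTL/NDVI -> ISA-fraction regression (global fit only;
    # the paper's method makes the coefficients spatially adaptive).
    # The index form (1 - NDVI) * NTL_norm is an assumed VANUI-style
    # combination; all data below are synthetic.
    import numpy as np

    def fit_isa_regression(ntl, ndvi, isa_reference):
        ntl_norm = ntl / ntl.max()                  # scale DN values to [0, 1]
        index = (1.0 - ndvi) * ntl_norm             # vegetation-adjusted NTL index
        slope, intercept = np.polyfit(index.ravel(), isa_reference.ravel(), 1)
        predicted = np.clip(slope * index + intercept, 0.0, 1.0)  # valid fractions
        return slope, intercept, predicted

    # Synthetic scene: ISA fraction is a known linear function of the index.
    rng = np.random.default_rng(42)
    ntl = rng.uniform(0.0, 63.0, size=(50, 50))     # DMSP-OLS DN range is 0-63
    ndvi = rng.uniform(0.0, 0.9, size=(50, 50))
    true_index = (1.0 - ndvi) * (ntl / ntl.max())
    isa = 0.8 * true_index + 0.05                   # synthetic "reference" fractions
    slope, intercept, pred = fit_isa_regression(ntl, ndvi, isa)
    ```

    On this noise-free synthetic scene the fit recovers the generating coefficients; with real reference data the residuals, and the spatial structure of the coefficients, are where the paper's spatially adaptive extension matters.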

  17. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks

    Directory of Open Access Journals (Sweden)

    Raja Jurdak

    2008-11-01

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  18. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks.

    Science.gov (United States)

    Jurdak, Raja; Nafaa, Abdelhamid; Barbirato, Alessio

    2008-11-24

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  19. Electricity prices, large-scale renewable integration, and policy implications

    International Nuclear Information System (INIS)

    Kyritsis, Evangelos; Andersson, Jonas; Serletis, Apostolos

    2017-01-01

    This paper investigates the effects of intermittent solar and wind power generation on electricity price formation in Germany. We use daily data from 2010 to 2015, a period with profound modifications in the German electricity market, the most notable being the rapid integration of photovoltaic and wind power sources, as well as the phasing out of nuclear energy. In the context of a GARCH-in-Mean model, we show that both solar and wind power Granger cause electricity prices, that solar power generation reduces the volatility of electricity prices by scaling down the use of peak-load power plants, and that wind power generation increases the volatility of electricity prices by challenging electricity market flexibility. - Highlights: • We model the impact of solar and wind power generation on day-ahead electricity prices. • We discuss the different nature of renewables in relation to market design. • We explore the impact of renewables on the distributional properties of electricity prices. • Solar and wind reduce electricity prices but affect price volatility in the opposite way. • Solar decreases the probability of electricity price spikes, while wind increases it.
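    The GARCH-in-Mean setup described above can be illustrated with a small simulation (a generic sketch, not the paper's estimated model): renewable generation enters the conditional-variance equation as an exogenous regressor, so a positive coefficient on wind raises price volatility while a negative coefficient on solar would dampen it. All coefficients and the wind pattern below are invented for illustration.

    ```python
    # Generic GARCH(1,1)-in-mean simulation with an exogenous wind term in
    # the variance equation (illustrative coefficients, not estimates):
    #   r_t    = mu + lam * sig2_t + eps_t,   eps_t ~ N(0, sig2_t)
    #   sig2_t = omega + alpha * eps_{t-1}^2 + beta * sig2_{t-1} + g_w * wind_t
    import numpy as np

    def simulate_garch_in_mean(wind, mu=0.1, lam=0.05, omega=0.1,
                               alpha=0.1, beta=0.8, g_w=1.0, seed=0):
        rng = np.random.default_rng(seed)
        n = len(wind)
        sig2 = np.empty(n)
        r = np.empty(n)
        eps_prev = 0.0
        sig2_prev = omega / (1.0 - alpha - beta)   # start at the base level
        for t in range(n):
            sig2[t] = omega + alpha * eps_prev**2 + beta * sig2_prev + g_w * wind[t]
            eps = rng.normal(0.0, np.sqrt(sig2[t]))
            r[t] = mu + lam * sig2[t] + eps        # in-mean term: volatility feeds price
            eps_prev, sig2_prev = eps, sig2[t]
        return r, sig2

    # Calm first half, windy second half: conditional variance rises with wind.
    wind = np.concatenate([np.zeros(500), np.ones(500)])
    prices, variance = simulate_garch_in_mean(wind)
    ```

    Estimation on real day-ahead prices (as in the paper) would replace this simulation with maximum-likelihood fitting of the same equations; the sketch only shows how an exogenous renewable regressor shifts the conditional variance.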

  20. Assessing heterogeneity in soil nitrogen cycling: a plot-scale approach

    Science.gov (United States)

    Peter Baas; Jacqueline E. Mohan; David Markewitz; Jennifer D. Knoepp

    2014-01-01

    The high level of spatial and temporal heterogeneity in soil N cycling processes hinders our ability to develop an ecosystem-wide understanding of this cycle. This study examined how incorporating an intensive assessment of spatial variability for soil moisture, C, nutrients, and soil texture can better explain ecosystem N cycling at the plot scale. Five sites...

  1. Fine scale heterogeneity in the Earth's upper mantle - observation and interpretation

    DEFF Research Database (Denmark)

    Thybo, Hans

    2014-01-01

    can be correlated to main plate tectonic features, such as oceanic spreading centres, continental rift zones and subducting slabs. Much seismological mantle research is now concentrated on imaging fine scale heterogeneity, which may be detected and imaged with high-resolution seismic data with dense...

  2. Cascading Dynamics of Heterogenous Scale-Free Networks with Recovery Mechanism

    Directory of Open Access Journals (Sweden)

    Shudong Li

    2013-01-01

    In network security, how to deploy efficient response methods against cascading failures of complex networks is very important. In this paper, considering the highest-load attack (HL) and random attack (RA) on a single edge, we define five weighting strategies for assigning external resources to recover edges from cascading failures in heterogeneous scale-free (SF) networks. The influence of external resources, the tolerance parameter, and the different weighting strategies on SF networks under cascading failures is investigated carefully. We find that, under HL attack, the fourth weighting method can more effectively improve the overall robustness of SF networks, control the spreading velocity, and contain the outbreak of cascading failures than the other methods. Moreover, the third method is optimal if only the local structure of the SF network is known, and uniform assignment is the worst. Simulations of the real-world autonomous-system-level Internet have also supported our findings. The results are useful for deploying efficient response strategies against emergent accidents and for controlling cascading failures in real-world networks.

  3. Statistical characterization of Earth’s heterogeneities from seismic scattering

    Science.gov (United States)

    Zheng, Y.; Wu, R.

    2009-12-01

    The distortion of a teleseismic wavefront carries information about the heterogeneities through which the wave propagates, and it is manifested as logarithmic amplitude (logA) and phase fluctuations of the direct P wave recorded by a seismic network. By cross-correlating the fluctuations (e.g., logA-logA or phase-phase), we obtain coherence functions, which depend on the spatial lags between stations and on the incidence angles of the incoming waves. We have mathematically related the depth-dependent heterogeneity spectrum to the observable coherence functions using seismic scattering theory. We will show that our method has sharp depth resolution. Using HiNet seismic network data from Japan, we have inverted power spectra for two depth ranges, ~0-120 km and below ~120 km depth. The coherence functions formed by different groups of stations, or by different groups of earthquakes at different back azimuths, are similar. This demonstrates that the method is statistically stable and that the inhomogeneities are statistically stationary. In both depth intervals, the spectral amplitude decays from large scales to small scales in a power-law fashion, with exceptions at ~50 km for the logA data. Due to the spatial spacing of the seismometers, only information from length scales of 15 km to 200 km is inverted. However, our scattering method provides new information on small to intermediate scales that are comparable to the scales of recycled materials, and is thus complementary to global seismic tomography, which reveals mainly large-scale heterogeneities on the order of ~1000 km. The small-scale heterogeneities revealed here are not likely of purely thermal origin. Therefore, the length scale and strength of heterogeneities as a function of depth may provide important constraints on the mechanical mixing of various components in mantle convection.

  4. European wind integration study (EWIS). Towards a successful integration of large scale wind power into European electricity grids. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Winter, W.

    2010-03-15

    Large capacities of wind generators have already been installed and are operating in Germany (26 GW) and Spain (16 GW). Installations that are similarly significant in proportion to system size are also established in Denmark (3.3 GW), the All Island Power System of Ireland and Northern Ireland (1.5 GW), and Portugal (3.4 GW). Many other countries expect significant growth in wind generation, such that the total currently installed capacity in Europe of 68 GW is expected to at least double by 2015. Yet further increases can be expected in order to achieve Europe's 2020 targets for renewable energy. The scale of this development poses big challenges for wind generation developers in terms of obtaining suitable sites, delivering large construction projects, and financing the associated investments from their operations. Such developments also impact the networks, and it was to address the immediate transmission-related challenges that the European Wind Integration Study (EWIS) was initiated by Transmission System Operators (TSOs), with the objective of ensuring the most effective integration of large-scale wind generation into Europe's transmission networks and electricity system. The challenges anticipated and addressed include: 1) how to efficiently accommodate wind generation when markets and transmission access arrangements have evolved for the needs of traditional controllable generation; 2) how to ensure supplies remain secure as wind varies (establishing the required backup/reserves for low-wind days and wind forecast errors, as well as managing network congestion in windy conditions); 3) how to maintain the quality and reliability of supplies given the new generation characteristics; 4) how to achieve efficient network costs by suitable design and operation of network connections, the deeper infrastructure including offshore connections, and cross-border interconnections. EWIS has focused on the immediate network related challenges by analysing detailed

  5. The ENIGMA Consortium : large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, Paul M.; Stein, Jason L.; Medland, Sarah E.; Hibar, Derrek P.; Vasquez, Alejandro Arias; Renteria, Miguel E.; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J.; Martin, Nicholas G.; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C.; Andreassen, Ole A.; Apostolova, Liana G.; Appel, Katja; Armstrong, Nicola J.; Aribisala, Benjamin; Bastin, Mark E.; Bauer, Michael; Bearden, Carrie E.; Bergmann, Orjan; Binder, Elisabeth B.; Blangero, John; Bockholt, Henry J.; Boen, Erlend; Bois, Catherine; Boomsma, Dorret I.; Booth, Tom; Bowman, Ian J.; Bralten, Janita; Brouwer, Rachel M.; Brunner, Han G.; Brohawn, David G.; Buckner, Randy L.; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R.; Calhoun, Vince D.; Hartman, Catharina A.; Hoekstra, Pieter J.; Penninx, Brenda W.; Schmaal, Lianne; van Tol, Marie-Jose

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  6. The ENIGMA Consortium: large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    Thompson, Paul M.; Stein, Jason L.; Medland, Sarah E.; Hibar, Derrek P.; Vasquez, Alejandro Arias; Renteria, Miguel E.; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J.; Martin, Nicholas G.; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C.; Andreassen, Ole A.; Apostolova, Liana G.; Appel, Katja; Armstrong, Nicola J.; Aribisala, Benjamin; Bastin, Mark E.; Bauer, Michael; Bearden, Carrie E.; Bergmann, Orjan; Binder, Elisabeth B.; Blangero, John; Bockholt, Henry J.; Bøen, Erlend; Bois, Catherine; Boomsma, Dorret I.; Booth, Tom; Bowman, Ian J.; Bralten, Janita; Brouwer, Rachel M.; Brunner, Han G.; Brohawn, David G.; Buckner, Randy L.; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R.; Calhoun, Vince D.; Cannon, Dara M.; Cantor, Rita M.; Carless, Melanie A.; Caseras, Xavier; Cavalleri, Gianpiero L.; Chakravarty, M. Mallar; Chang, Kiki D.; Ching, Christopher R. K.; Christoforou, Andrea; Cichon, Sven; Clark, Vincent P.; Conrod, Patricia; Coppola, Giovanni; Crespo-Facorro, Benedicto; Curran, Joanne E.; Czisch, Michael; Deary, Ian J.; de Geus, Eco J. C.; den Braber, Anouk; Delvecchio, Giuseppe; Depondt, Chantal; de Haan, Lieuwe; de Zubicaray, Greig I.; Dima, Danai; Dimitrova, Rali; Djurovic, Srdjan; Dong, Hongwei; Donohoe, Gary; Duggirala, Ravindranath; Dyer, Thomas D.; Ehrlich, Stefan; Ekman, Carl Johan; Elvsåshagen, Torbjørn; Emsell, Louise; Erk, Susanne; Espeseth, Thomas; Fagerness, Jesen; Fears, Scott; Fedko, Iryna; Fernández, Guillén; Fisher, Simon E.; Foroud, Tatiana; Fox, Peter T.; Francks, Clyde; Frangou, Sophia; Frey, Eva Maria; Frodl, Thomas; Frouin, Vincent; Garavan, Hugh; Giddaluru, Sudheer; Glahn, David C.; Godlewska, Beata; Goldstein, Rita Z.; Gollub, Randy L.; Grabe, Hans J.; Grimm, Oliver; Gruber, Oliver; Guadalupe, Tulio; Gur, Raquel E.; Gur, Ruben C.; Göring, Harald H. 
H.; Hagenaars, Saskia; Hajek, Tomas; Hall, Geoffrey B.; Hall, Jeremy; Hardy, John; Hartman, Catharina A.; Hass, Johanna; Hatton, Sean N.; Haukvik, Unn K.; Hegenscheid, Katrin; Heinz, Andreas; Hickie, Ian B.; Ho, Beng-Choon; Hoehn, David; Hoekstra, Pieter J.; Hollinshead, Marisa; Holmes, Avram J.; Homuth, Georg; Hoogman, Martine; Hong, L. Elliot; Hosten, Norbert; Hottenga, Jouke-Jan; Hulshoff Pol, Hilleke E.; Hwang, Kristy S.; Jack, Clifford R.; Jenkinson, Mark; Johnston, Caroline; Jönsson, Erik G.; Kahn, René S.; Kasperaviciute, Dalia; Kelly, Sinead; Kim, Sungeun; Kochunov, Peter; Koenders, Laura; Krämer, Bernd; Kwok, John B. J.; Lagopoulos, Jim; Laje, Gonzalo; Landen, Mikael; Landman, Bennett A.; Lauriello, John; Lawrie, Stephen M.; Lee, Phil H.; Le Hellard, Stephanie; Lemaître, Herve; Leonardo, Cassandra D.; Li, Chiang-Shan; Liberg, Benny; Liewald, David C.; Liu, Xinmin; Lopez, Lorna M.; Loth, Eva; Lourdusamy, Anbarasu; Luciano, Michelle; Macciardi, Fabio; Machielsen, Marise W. J.; Macqueen, Glenda M.; Malt, Ulrik F.; Mandl, René; Manoach, Dara S.; Martinot, Jean-Luc; Matarin, Mar; Mather, Karen A.; Mattheisen, Manuel; Mattingsdal, Morten; Meyer-Lindenberg, Andreas; McDonald, Colm; McIntosh, Andrew M.; McMahon, Francis J.; McMahon, Katie L.; Meisenzahl, Eva; Melle, Ingrid; Milaneschi, Yuri; Mohnke, Sebastian; Montgomery, Grant W.; Morris, Derek W.; Moses, Eric K.; Mueller, Bryon A.; Muñoz Maniega, Susana; Mühleisen, Thomas W.; Müller-Myhsok, Bertram; Mwangi, Benson; Nauck, Matthias; Nho, Kwangsik; Nichols, Thomas E.; Nilsson, Lars-Göran; Nugent, Allison C.; Nyberg, Lars; Olvera, Rene L.; Oosterlaan, Jaap; Ophoff, Roel A.; Pandolfo, Massimo; Papalampropoulou-Tsiridou, Melina; Papmeyer, Martina; Paus, Tomas; Pausova, Zdenka; Pearlson, Godfrey D.; Penninx, Brenda W.; Peterson, Charles P.; Pfennig, Andrea; Phillips, Mary; Pike, G. 
Bruce; Poline, Jean-Baptiste; Potkin, Steven G.; Pütz, Benno; Ramasamy, Adaikalavan; Rasmussen, Jerod; Rietschel, Marcella; Rijpkema, Mark; Risacher, Shannon L.; Roffman, Joshua L.; Roiz-Santiañez, Roberto; Romanczuk-Seiferth, Nina; Rose, Emma J.; Royle, Natalie A.; Rujescu, Dan; Ryten, Mina; Sachdev, Perminder S.; Salami, Alireza; Satterthwaite, Theodore D.; Savitz, Jonathan; Saykin, Andrew J.; Scanlon, Cathy; Schmaal, Lianne; Schnack, Hugo G.; Schork, Andrew J.; Schulz, S. Charles; Schür, Remmelt; Seidman, Larry; Shen, Li; Shoemaker, Jody M.; Simmons, Andrew; Sisodiya, Sanjay M.; Smith, Colin; Smoller, Jordan W.; Soares, Jair C.; Sponheim, Scott R.; Sprooten, Emma; Starr, John M.; Steen, Vidar M.; Strakowski, Stephen; Strike, Lachlan; Sussmann, Jessika; Sämann, Philipp G.; Teumer, Alexander; Toga, Arthur W.; Tordesillas-Gutierrez, Diana; Trabzuni, Daniah; Trost, Sarah; Turner, Jessica; van den Heuvel, Martijn; van der Wee, Nic J.; van Eijk, Kristel; van Erp, Theo G. M.; van Haren, Neeltje E. M.; van 't Ent, Dennis; van Tol, Marie-Jose; Valdés Hernández, Maria C.; Veltman, Dick J.; Versace, Amelia; Völzke, Henry; Walker, Robert; Walter, Henrik; Wang, Lei; Wardlaw, Joanna M.; Weale, Michael E.; Weiner, Michael W.; Wen, Wei; Westlye, Lars T.; Whalley, Heather C.; Whelan, Christopher D.; White, Tonya; Winkler, Anderson M.; Wittfeld, Katharina; Woldehawariat, Girma; Wolf, Christiane; Zilles, David; Zwiers, Marcel P.; Thalamuthu, Anbupalam; Schofield, Peter R.; Freimer, Nelson B.; Lawrence, Natalia S.; Drevets, Wayne

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience,

  7. The ENIGMA Consortium: Large-scale collaborative analyses of neuroimaging and genetic data

    NARCIS (Netherlands)

    P.M. Thompson (Paul); J.L. Stein; S.E. Medland (Sarah Elizabeth); D.P. Hibar (Derrek); A.A. Vásquez (Arias); M.E. Rentería (Miguel); R. Toro (Roberto); N. Jahanshad (Neda); G. Schumann (Gunter); B. Franke (Barbara); M.J. Wright (Margaret); N.G. Martin (Nicholas); I. Agartz (Ingrid); M. Alda (Martin); S. Alhusaini (Saud); L. Almasy (Laura); K. Alpert (Kathryn); N.C. Andreasen; O.A. Andreassen (Ole); L.G. Apostolova (Liana); K. Appel (Katja); N.J. Armstrong (Nicola); B. Aribisala (Benjamin); M.E. Bastin (Mark); M. Bauer (Michael); C.E. Bearden (Carrie); Ø. Bergmann (Ørjan); E.B. Binder (Elisabeth); J. Blangero (John); H.J. Bockholt; E. Bøen (Erlend); M. Bois (Monique); D.I. Boomsma (Dorret); T. Booth (Tom); I.J. Bowman (Ian); L.B.C. Bralten (Linda); R.M. Brouwer (Rachel); H.G. Brunner; D.G. Brohawn (David); M. Buckner; J.K. Buitelaar (Jan); K. Bulayeva (Kazima); J. Bustillo; V.D. Calhoun (Vince); D.M. Cannon (Dara); R.M. Cantor; M.A. Carless (Melanie); X. Caseras (Xavier); G. Cavalleri (Gianpiero); M.M. Chakravarty (M. Mallar); K.D. Chang (Kiki); C.R.K. Ching (Christopher); A. Christoforou (Andrea); S. Cichon (Sven); V.P. Clark; P. Conrod (Patricia); D. Coppola (Domenico); B. Crespo-Facorro (Benedicto); J.E. Curran (Joanne); M. Czisch (Michael); I.J. Deary (Ian); E.J.C. de Geus (Eco); A. den Braber (Anouk); G. Delvecchio (Giuseppe); C. Depondt (Chantal); L. de Haan (Lieuwe); G.I. de Zubicaray (Greig); D. Dima (Danai); R. Dimitrova (Rali); S. Djurovic (Srdjan); H. Dong (Hongwei); D.J. Donohoe (Dennis); A. Duggirala (Aparna); M.D. Dyer (Matthew); S.M. Ehrlich (Stefan); C.J. Ekman (Carl Johan); T. Elvsåshagen (Torbjørn); L. Emsell (Louise); S. Erk; T. Espeseth (Thomas); J. Fagerness (Jesen); S. Fears (Scott); I. Fedko (Iryna); G. Fernandez (Guillén); S.E. Fisher (Simon); T. Foroud (Tatiana); P.T. Fox (Peter); C. Francks (Clyde); S. Frangou (Sophia); E.M. Frey (Eva Maria); T. Frodl (Thomas); V. Frouin (Vincent); H. Garavan (Hugh); S. Giddaluru (Sudheer); D.C. 
Glahn (David); B. Godlewska (Beata); R.Z. Goldstein (Rita); R.L. Gollub (Randy); H.J. Grabe (Hans Jörgen); O. Grimm (Oliver); O. Gruber (Oliver); T. Guadalupe (Tulio); R.E. Gur (Raquel); R.C. Gur (Ruben); H.H.H. Göring (Harald); S. Hagenaars (Saskia); T. Hajek (Tomas); G.B. Hall (Garry); J. Hall (Jeremy); J. Hardy (John); C.A. Hartman (Catharina); J. Hass (Johanna); W. Hatton; U.K. Haukvik (Unn); K. Hegenscheid (Katrin); J. Heinz (Judith); I.B. Hickie (Ian); B.C. Ho (Beng ); D. Hoehn (David); P.J. Hoekstra (Pieter); M. Hollinshead (Marisa); A.J. Holmes (Avram); G. Homuth (Georg); M. Hoogman (Martine); L.E. Hong (L.Elliot); N. Hosten (Norbert); J.J. Hottenga (Jouke Jan); H.E. Hulshoff Pol (Hilleke); K.S. Hwang (Kristy); C.R. Jack Jr. (Clifford); S. Jenkinson (Sarah); C. Johnston; E.G. Jönsson (Erik); R.S. Kahn (René); D. Kasperaviciute (Dalia); S. Kelly (Steve); S. Kim (Shinseog); P. Kochunov (Peter); L. Koenders (Laura); B. Krämer (Bernd); J.B.J. Kwok (John); J. Lagopoulos (Jim); G. Laje (Gonzalo); M. Landén (Mikael); B.A. Landman (Bennett); J. Lauriello; S. Lawrie (Stephen); P.H. Lee (Phil); S. Le Hellard (Stephanie); H. Lemaître (Herve); C.D. Leonardo (Cassandra); C.-S. Li (Chiang-shan); B. Liberg (Benny); D.C. Liewald (David C.); X. Liu (Xinmin); L.M. Lopez (Lorna); E. Loth (Eva); A. Lourdusamy (Anbarasu); M. Luciano (Michelle); F. MacCiardi (Fabio); M.W.J. Machielsen (Marise); G.M. MacQueen (Glenda); U.F. Malt (Ulrik); R. Mandl (René); D.S. Manoach (Dara); J.-L. Martinot (Jean-Luc); M. Matarin (Mar); R. Mather; M. Mattheisen (Manuel); M. Mattingsdal (Morten); A. Meyer-Lindenberg; C. McDonald (Colm); A.M. McIntosh (Andrew); F.J. Mcmahon (Francis J); K.L. Mcmahon (Katie); E. Meisenzahl (Eva); I. Melle (Ingrid); Y. Milaneschi (Yuri); S. Mohnke (Sebastian); G.W. Montgomery (Grant); D.W. Morris (Derek W); E.K. Moses (Eric); B.A. Mueller (Bryon ); S. Muñoz Maniega (Susana); T.W. Mühleisen (Thomas); B. Müller-Myhsok (Bertram); B. Mwangi (Benson); M. 
Nauck (Matthias); K. Nho (Kwangsik); T.E. Nichols (Thomas); L.G. Nilsson; A.C. Nugent (Allison); L. Nyberg (Lisa); R.L. Olvera (Rene); J. Oosterlaan (Jaap); R.A. Ophoff (Roel); M. Pandolfo (Massimo); M. Papalampropoulou-Tsiridou (Melina); M. Papmeyer (Martina); T. Paus (Tomas); Z. Pausova (Zdenka); G. Pearlson (Godfrey); B.W.J.H. Penninx (Brenda); C.P. Peterson (Charles); A. Pfennig (Andrea); M. Phillips (Mary); G.B. Pike (G Bruce); J.B. Poline (Jean Baptiste); S.G. Potkin (Steven); B. Pütz (Benno); A. Ramasamy (Adaikalavan); J. Rasmussen (Jerod); M. Rietschel (Marcella); M. Rijpkema (Mark); S.L. Risacher (Shannon); J.L. Roffman (Joshua); R. Roiz-Santiañez (Roberto); N. Romanczuk-Seiferth (Nina); E.J. Rose (Emma); N.A. Royle (Natalie); D. Rujescu (Dan); M. Ryten (Mina); P.S. Sachdev (Perminder); A. Salami (Alireza); T.D. Satterthwaite (Theodore); J. Savitz (Jonathan); A.J. Saykin (Andrew); C. Scanlon (Cathy); L. Schmaal (Lianne); H. Schnack (Hugo); N.J. Schork (Nicholas); S.C. Schulz (S.Charles); R. Schür (Remmelt); L.J. Seidman (Larry); L. Shen (Li); L. Shoemaker (Lawrence); A. Simmons (Andrew); S.M. Sisodiya (Sanjay); C. Smith (Colin); J.W. Smoller; J.C. Soares (Jair); S.R. Sponheim (Scott); R. Sprooten (Roy); J.M. Starr (John); V.M. Steen (Vidar); S. Strakowski (Stephen); L.T. Strike (Lachlan); J. Sussmann (Jessika); P.G. Sämann (Philipp); A. Teumer (Alexander); A.W. Toga (Arthur); D. Tordesillas-Gutierrez (Diana); D. Trabzuni (Danyah); S. Trost (Sarah); J. Turner (Jessica); M. van den Heuvel (Martijn); N.J. van der Wee (Nic); K.R. van Eijk (Kristel); T.G.M. van Erp (Theo G.); N.E.M. van Haren (Neeltje E.); D. van 't Ent (Dennis); M.J.D. van Tol (Marie-José); M.C. Valdés Hernández (Maria); D.J. Veltman (Dick); A. Versace (Amelia); H. Völzke (Henry); R. Walker (Robert); H.J. Walter (Henrik); L. Wang (Lei); J.M. Wardlaw (J.); M.E. Weale (Michael); M.W. Weiner (Michael); W. Wen (Wei); L.T. Westlye (Lars); H.C. Whalley (Heather); C.D. Whelan (Christopher); T.J.H. 
White (Tonya); A.M. Winkler (Anderson); K. Wittfeld (Katharina); G. Woldehawariat (Girma); A. Björnsson (Asgeir); D. Zilles (David); M.P. Zwiers (Marcel); A. Thalamuthu (Anbupalam); J.R. Almeida (Jorge); C.J. Schofield (Christopher); N.B. Freimer (Nelson); N.S. Lawrence (Natalia); D.A. Drevets (Douglas)

    2014-01-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in

  8. The challenge of integrating large scale wind power

    Energy Technology Data Exchange (ETDEWEB)

    Kryszak, B.

    2007-07-01

    The support of renewable energy sources is one of the key issues in current energy policies. The paper presents aspects of the integration of wind power into the electric power system from the perspective of a Transmission System Operator (TSO). Technical, operational and market aspects related to the integration of more than 8000 MW of installed wind power into the Transmission Network of Vattenfall Europe Transmission are discussed, and experiences with the transmission of wind power, wind power prediction, balancing of wind power, power production behaviour and fluctuations are reported. Moreover, issues of wind power integration at the European level are discussed against the background of a wind power study. (auth)

  9. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
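The decomposition described in this abstract, where a centralized balancing problem is split into local subproblems coordinated by an aggregator, can be illustrated with a toy price-based scheme. The quadratic cost model, the step size, and all numbers below are illustrative assumptions, not the formulation used in the cited work.

```python
# Toy price-based decomposition for power balancing: an aggregator
# broadcasts a price, each unit solves a small local problem, and the
# price is adjusted until total production meets demand.

def local_response(a, b, price, p_min, p_max):
    """Each unit minimizes its quadratic cost a*p^2 + b*p minus revenue
    price*p, giving p* = (price - b) / (2a), clipped to capacity limits."""
    p = (price - b) / (2.0 * a)
    return max(p_min, min(p_max, p))

def balance(units, demand, price=0.0, step=0.05, iters=2000):
    """Aggregator raises the price when production falls short of demand
    and lowers it when production exceeds demand (subgradient update)."""
    total = 0.0
    for _ in range(iters):
        total = sum(local_response(a, b, price, lo, hi) for a, b, lo, hi in units)
        price += step * (demand - total)
    return price, total

units = [(0.5, 1.0, 0.0, 10.0), (0.2, 2.0, 0.0, 10.0)]  # (a, b, p_min, p_max)
price, total = balance(units, demand=8.0)
```

Because each unit only sees the price, the per-unit computation is independent and can run locally, which is what makes the approach scale to many units.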

  10. Integrating an agent-based model into a large-scale hydrological model for evaluating drought management in California

    Science.gov (United States)

    Sheffield, J.; He, X.; Wada, Y.; Burek, P.; Kahil, M.; Wood, E. F.; Oppenheimer, M.

    2017-12-01

    California has endured record-breaking drought since winter 2011 and will likely experience more severe and persistent drought in the coming decades under a changing climate. At the same time, human water management practices can also affect drought frequency and intensity, which underscores the importance of human behaviour in effective drought adaptation and mitigation. Currently, although a few large-scale hydrological and water resources models (e.g., PCR-GLOBWB) consider human water use and management practices (e.g., irrigation, reservoir operation, groundwater pumping), none of them includes the dynamic feedback between local human behaviors/decisions and the natural hydrological system. It is, therefore, vital to integrate social and behavioral dimensions into current hydrological modeling frameworks. This study applies the agent-based modeling (ABM) approach and couples it with a large-scale hydrological model (i.e., Community Water Model, CWatM) in order to achieve a balanced representation of social, environmental and economic factors and a more realistic representation of the bi-directional interactions and feedbacks in coupled human and natural systems. We focus on drought management in California and consider two types of agents, (groups of) farmers and state management authorities, whose objectives are assumed to be maximizing net crop profit and maintaining sufficient water supply, respectively. Farmers' behaviors are linked with local agricultural practices such as cropping patterns and deficit irrigation. More precisely, farmers' decisions are incorporated into CWatM across different time scales in terms of daily irrigation amount, seasonal/annual decisions on crop types and irrigated area, as well as long-term investment in irrigation infrastructure.
This simulation-based optimization framework is further applied by performing different sets of scenarios to investigate and evaluate the effectiveness
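The two agent types described above, farmers maximizing crop profit under an allocation set by a management authority, can be sketched in a minimal agent-based loop. The profit function, the equal-split allocation rule, and every number below are illustrative assumptions, not parameters of CWatM or the cited study.

```python
# Minimal agent-based sketch: an authority allocates scarce water,
# then each farmer agent chooses an irrigation depth to maximize profit.

def crop_profit(irrigation, price=1.0, water_cost=0.3):
    """Toy concave yield response: profit rises with water, then saturates."""
    yield_ = irrigation * (2.0 - irrigation)      # peaks at irrigation = 1.0
    return price * max(yield_, 0.0) - water_cost * irrigation

def farmer_decision(allocation):
    """Each farmer picks the irrigation depth (up to its allocation)
    that maximizes profit, scanning a coarse grid of options."""
    options = [allocation * i / 100.0 for i in range(101)]
    return max(options, key=crop_profit)

def authority_allocation(supply, n_farmers):
    """The authority splits the drought-limited supply equally."""
    return supply / n_farmers

supply, n = 3.0, 4                            # seasonal supply, 4 farmers
alloc = authority_allocation(supply, n)       # 0.75 per farmer
decisions = [farmer_decision(alloc) for _ in range(n)]
total_use = sum(decisions)
```

In a coupled setup, `total_use` would feed back into the hydrological model's water balance, and next season's `supply` would in turn constrain the agents, which is the bi-directional feedback the abstract emphasizes.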

  11. A multi-scale computational scheme for anisotropic hydro-mechanical couplings in saturated heterogeneous porous media

    NARCIS (Netherlands)

    Mercatoris, B.C.N.; Massart, T.J.; Sluys, L.J.

    2013-01-01

    This contribution discusses a coupled two-scale framework for hydro-mechanical problems in saturated heterogeneous porous geomaterials. The heterogeneous nature of such materials can lead to an anisotropy of the hydro-mechanical couplings and non-linear effects. Based on an assumed model of the

  12. Scale-up of miscible flood processes for heterogeneous reservoirs. 1993 annual report

    Energy Technology Data Exchange (ETDEWEB)

    Orr, F.M. Jr.

    1994-05-01

    Progress is reported for a comprehensive investigation of the scaling behavior of gas injection processes in heterogeneous reservoirs. The interplay of phase behavior, viscous fingering, gravity segregation, capillary imbibition and drainage, and reservoir heterogeneity is examined in a series of simulations and experiments. Compositional and first-contact miscible simulations of viscous fingering and gravity segregation are compared to show that the two techniques can give very different results. Two-dimensional and three-dimensional flows in which gravity segregation and viscous fingering interact are also analyzed. The simulations show that 2D and 3D flows can differ significantly. A comparison of analytical solutions for three-component two-phase flow with experimental results for oil/water/alcohol systems is reported. While the experiments and theory show reasonable agreement, some differences remain to be explained. The scaling behavior of the interaction of gravity segregation and capillary forces is investigated through simulations and through scaling arguments based on analysis of the differential equations. The simulations show that standard approaches do not agree well with results of low-IFT displacements. The scaling analyses, however, reveal flow regimes where capillary, gravity, or viscous forces dominate the flow.

  13. Large-Scale Environment Properties of Narrow-Line Seyfert 1 Galaxies at z < 0.4

    Energy Technology Data Exchange (ETDEWEB)

    Järvelä, Emilia [Metsähovi Radio Observatory, Aalto University, Espoo (Finland); Department of Electronics and Nanoengineering, Aalto University, Espoo (Finland); Lähteenmäki, A. [Metsähovi Radio Observatory, Aalto University, Espoo (Finland); Department of Electronics and Nanoengineering, Aalto University, Espoo (Finland); Tartu Observatory, Tõravere (Estonia); Lietzen, H., E-mail: emilia.jarvela@aalto.fi [Tartu Observatory, Tõravere (Estonia)

    2017-11-30

    The large-scale environment is believed to affect the evolution and intrinsic properties of galaxies. It offers a new perspective on narrow-line Seyfert 1 (NLS1) galaxies, which have not been extensively studied in this context before. We study a large and diverse sample of 960 NLS1 galaxies using a luminosity-density field constructed from the Sloan Digital Sky Survey. We investigate how the large-scale environment is connected to the properties of NLS1 galaxies, especially their radio loudness. Furthermore, we compare the large-scale environment properties of NLS1 galaxies with those of other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to shed light on their possible relations. In general, NLS1 galaxies reside in less dense large-scale environments than any of our comparison samples, thus supporting their young age. The average luminosity-density of NLS1 sources and their distribution across different luminosity-density regions are significantly different compared to BLS1 galaxies. This contradicts the simple orientation-based unification of NLS1 and BLS1 galaxies, and weakens the hypothesis that BLS1 galaxies are the parent population of NLS1 galaxies. The large-scale environment density also has an impact on the intrinsic properties of NLS1 galaxies: the radio loudness increases with increasing luminosity-density. However, our results suggest that the NLS1 population is indeed heterogeneous, and that a considerable fraction of them are misclassified. We support the suggestion that the traditional classification based on radio loudness should be replaced with a division into jetted and non-jetted sources.
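The radio loudness used to divide such samples is conventionally a flux ratio with a fixed threshold; the common convention is R = S_radio / S_optical with R > 10 marking a source as radio-loud, although the bands and threshold vary between studies. The snippet below is a sketch of that conventional criterion, not the exact definition used in this particular paper.

```python
# Conventional radio-loudness parameter: the ratio of radio to optical
# flux density, with R > 10 taken as "radio-loud". Band choices and the
# threshold differ between studies; the values here are illustrative.

def radio_loudness(s_radio_mjy, s_optical_mjy):
    """Ratio of radio to optical flux density (same units cancel out)."""
    return s_radio_mjy / s_optical_mjy

def classify(r, threshold=10.0):
    return "radio-loud" if r > threshold else "radio-quiet"

r = radio_loudness(s_radio_mjy=25.0, s_optical_mjy=1.0)   # R = 25
label = classify(r)                                        # "radio-loud"
```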

  14. A data-driven analysis of energy balance closure across FLUXNET research sites: The role of landscape scale heterogeneity

    DEFF Research Database (Denmark)

    Stoy, Paul C.; Mauder, Matthias; Foken, Thomas

    2013-01-01

    The energy balance at most surface-atmosphere flux research sites remains unclosed. The mechanisms underlying the discrepancy between measured energy inputs and outputs across the global FLUXNET tower network are still under debate. Recent reviews have identified exchange processes and turbulent motions at large spatial and temporal scales in heterogeneous landscapes as the primary cause of the lack of energy balance closure at some intensively-researched sites, while unmeasured storage terms cannot be ruled out as a dominant contributor to the lack of energy balance closure at many other sites … approached 1. These results suggest that landscape-level heterogeneity in vegetation and topography cannot be ignored as a contributor to incomplete energy balance closure at the flux network level, although net radiation measurements, biological energy assimilation, unmeasured storage terms …

  15. Quantitative multi-scale analysis of mineral distributions and fractal pore structures for a heterogeneous Junger Basin shale

    International Nuclear Information System (INIS)

    Wang, Y.D.; Ren, Y.Q.; Hu, T.; Deng, B.; Xiao, T.Q.; Liu, K.Y.; Yang, Y.S.

    2016-01-01

    Three-dimensional (3D) characterization of shales has recently attracted wide attention in relation to the growing importance of shale oil and gas. Obtaining a complete 3D compositional distribution of shale has proven to be challenging due to its multi-scale characteristics. A combined multi-energy X-ray micro-CT technique and data-constrained modelling (DCM) approach has been used to quantitatively investigate, by sub-sampling, the multi-scale mineral and porosity distributions of a heterogeneous shale from the Junger Basin, northwestern China. The 3D sub-resolution structures of minerals and pores in the samples are quantitatively obtained as partial volume fraction distributions, with colours representing compositions. The shale sub-samples from two areas have different physical structures for minerals and pores, with the dominant minerals being feldspar and dolomite, respectively. Significant heterogeneities have been observed in the analysis. The sub-voxel-sized pores form large interconnected clusters with fractal structures. The fractal dimensions of the largest clusters for the two sub-samples were quantitatively calculated and found to be 2.34 and 2.86, respectively. The results are relevant to quantitative modelling of gas transport in shale reservoirs.
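Fractal dimensions of pore clusters like the 2.34 and 2.86 reported above are typically estimated by box counting: cover the cluster with boxes of side s and fit the slope of log N(s) against log(1/s). The following is a minimal pure-Python sketch of that estimator (the cited study works on 3D CT voxel data; the filled 2D grid here is only a sanity check, whose dimension should come out close to 2).

```python
# Box-counting estimate of fractal dimension for a set of occupied
# voxel/pixel coordinates.
import math

def box_count_dimension(points, sizes):
    """Fit the slope of log N(s) vs log(1/s), where N(s) is the number
    of boxes of side s needed to cover the point set."""
    logs, logn = [], []
    for s in sizes:
        boxes = {tuple(int(c // s) for c in p) for p in points}
        logs.append(math.log(1.0 / s))
        logn.append(math.log(len(boxes)))
    # least-squares slope of logn against logs
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logn) / n
    return (sum((x - mx) * (y - my) for x, y in zip(logs, logn))
            / sum((x - mx) ** 2 for x in logs))

# Sanity check: a completely filled 2D grid has dimension 2.
grid = [(i, j) for i in range(64) for j in range(64)]
dim = box_count_dimension(grid, sizes=[1, 2, 4, 8])
```

For 3D pore clusters the same function applies with 3-tuples of voxel coordinates; a space-filling cluster approaches dimension 3, and sparser, more tortuous clusters give lower values.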

  16. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    … intrusion. In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional-scale saltwater intrusion in order to analyse and quantify … the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent and to predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale … parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic …

  17. Vertically Integrated Models for Carbon Storage Modeling in Heterogeneous Domains

    Science.gov (United States)

    Bandilla, K.; Celia, M. A.

    2017-12-01

    Numerical modeling is an essential tool for studying the impacts of geologic carbon storage (GCS). Injection of carbon dioxide (CO2) into deep saline aquifers leads to multi-phase flow (injected CO2 and resident brine), which can be described by a set of three-dimensional governing equations, including the mass-balance equation, volumetric flux equations (modified Darcy), and constitutive equations. This is the modeling approach on which commonly used reservoir simulators such as TOUGH2 are based. Due to the large density difference between CO2 and brine, GCS models can often be simplified by assuming buoyant segregation and integrating the three-dimensional governing equations in the vertical direction. The integration leads to a set of two-dimensional equations coupled with reconstruction operators for vertical profiles of saturation and pressure. Vertically-integrated approaches have been shown to give results comparable in quality to those of three-dimensional reservoir simulators when applied to realistic CO2 injection sites such as the upper sand wedge at the Sleipner site. However, vertically-integrated approaches usually rely on homogeneous properties over the thickness of a geologic layer. Here, we investigate the impact of general (vertical and horizontal) heterogeneity in intrinsic permeability, relative permeability functions, and capillary pressure functions. We consider formations involving complex fluvial deposition environments and compare the performance of vertically-integrated models to full three-dimensional models for a set of hypothetical test cases consisting of high-permeability channels (streams) embedded in a low-permeability background (floodplains). The domains are randomly generated assuming that stream channels can be represented by sinusoidal waves in plan view and by parabolas in the streams' cross-sections. Stream parameters such as width, thickness and wavelength are based on values found at the Ketzin site in Germany. Results from the
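The reconstruction operators mentioned above can be illustrated for the simplest sharp-interface case: the 2D equations evolve a vertically integrated saturation, and a vertical profile is recovered by assuming the buoyant CO2 fills the top of the layer at a fixed mobile saturation. The residual-saturation handling and all numbers below are simplified assumptions for illustration, not the operators used in the cited study.

```python
# Sharp-interface reconstruction sketch: recover a vertical CO2
# saturation profile from the vertically integrated saturation S_bar.

def plume_thickness(S_bar, H, s_wr=0.2):
    """Invert S_bar = h * (1 - s_wr) / H for the plume thickness h,
    capped at the layer thickness H."""
    return min(H, S_bar * H / (1.0 - s_wr))

def saturation_profile(S_bar, H, n_layers=10, s_wr=0.2):
    """CO2 occupies the top of the layer: saturation 1 - s_wr above the
    interface, 0 below (z measured downward from the caprock)."""
    h = plume_thickness(S_bar, H, s_wr)
    dz = H / n_layers
    return [(1.0 - s_wr) if (k + 0.5) * dz < h else 0.0
            for k in range(n_layers)]

# A column with integrated saturation 0.24 in a 10 m layer:
profile = saturation_profile(S_bar=0.24, H=10.0, n_layers=10)
```

Integrating the reconstructed profile back over the column returns the original S_bar, which is the consistency property the 2D/3D coupling relies on.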

  18. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    With a rapidly growing human population, it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of next-generation sequencing strategies to study plant-pathogen interactions has provided, and will continue to provide, unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting-edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large-scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability, either in the production of stably transformed lines expressing RNA interference molecules or through foliar applications of double-stranded RNA.

  19. Impact of Subsurface Heterogeneities on nano-Scale Zero Valent Iron Transport

    Science.gov (United States)

    Krol, M. M.; Sleep, B. E.; O'Carroll, D. M.

    2011-12-01

    Nano-scale zero valent iron (nZVI) has been applied as a remediation technology at sites contaminated with chlorinated compounds and heavy metals. Although laboratory studies have demonstrated high reactivity for the degradation of target contaminants, the success of nZVI in the field has been limited due to poor subsurface mobility. When injected into the subsurface, nZVI tends to aggregate and be retained by subsurface soils. As such, nZVI suspensions need to be stabilized for increased mobility. However, even with stabilization, soil heterogeneities can still lead to non-uniform nZVI transport, resulting in poor distribution and consequently decreased degradation of target compounds. Understanding how nZVI transport is affected by subsurface heterogeneities can aid in improving the technology. This can be done using a numerical model that simulates nZVI transport. In this study CompSim, a finite difference groundwater model, is used to simulate the movement of nZVI in a two-dimensional domain. CompSim has been shown in previous studies to accurately predict nZVI movement in the subsurface, and is used here to examine the impact of soil heterogeneity on nZVI transport. This work also explores the impact of different viscosities of the injected nZVI suspensions (corresponding to different stabilizing polymers) and of injection rates on nZVI mobility. Analysis metrics include travel time, travel distance, and average nZVI concentrations. Improving our understanding of the influence of soil heterogeneity on nZVI transport will lead to improved field-scale implementation and, potentially, to more effective remediation of contaminated sites.

  20. A middleware-based platform for the integration of bioinformatic services

    Directory of Open Access Journals (Sweden)

    Guzmán Llambías

    2015-08-01

    Performing bioinformatics experiments involves intensive access to distributed services and information resources through the Internet. Although existing tools facilitate the implementation of workflow-oriented applications, they lack capabilities to integrate services beyond low-scale applications, particularly services with heterogeneous interaction patterns and at larger scale. This is particularly required to enable large-scale distributed processing of the biological data generated by massive sequencing technologies. On the other hand, such integration mechanisms are provided by middleware products like Enterprise Service Buses (ESBs), which enable the integration of distributed systems following a Service-Oriented Architecture. This paper proposes an integration platform, based on enterprise middleware, to integrate bioinformatics services. It presents a multi-level reference architecture and focuses on ESB-based mechanisms to provide asynchronous communications, event-based interactions and data transformation capabilities. The paper presents a formal specification of the platform using the Event-B model.
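Two of the ESB capabilities this abstract highlights, event-based (publish/subscribe) interaction and message transformation, can be sketched with a minimal in-process event bus. The class, topic names, and message shapes below are invented for illustration; a real deployment would sit on an ESB product rather than an in-process dictionary.

```python
# Minimal publish/subscribe bus with a transformation step, sketching
# the event-based interaction pattern an ESB provides.
from collections import defaultdict

class MiniBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

bus = MiniBus()
received = []

# A transformer service normalizes raw sequence records, then re-publishes
# them on a "clean" topic that downstream consumers subscribe to.
bus.subscribe("sequences.raw",
              lambda msg: bus.publish("sequences.clean",
                                      {"id": msg["id"], "seq": msg["seq"].upper()}))
bus.subscribe("sequences.clean", received.append)

bus.publish("sequences.raw", {"id": "r1", "seq": "acgt"})
```

The key property is decoupling: the producer of `sequences.raw` knows nothing about the transformer or the final consumer, which is what lets heterogeneous services be composed at scale.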

  1. Large scale renewable power generation advances in technologies for generation, transmission and storage

    CERN Document Server

    Hossain, Jahangir

    2014-01-01

    This book focuses on the issues of integrating large-scale renewable power generation into existing grids. It includes a new protection technique for renewable generators, along with an overview of the current status of the smart grid.

  2. The impact of continuous integration on other software development practices: a large-scale empirical study

    NARCIS (Netherlands)

    Zhao, Y.; Serebrenik, A.; Zhou, Y.; Filkov, V.; Vasilescu, B.N.

    2017-01-01

    Continuous Integration (CI) has become a disruptive innovation in software development: with proper tool support and adoption, positive effects have been demonstrated for pull request throughput and scaling up of project sizes. As with any other innovation, adopting CI implies adapting existing practices

  3. Heterogeneity and Scaling in Geologic Media: Applications to Transport in the Vadose and Saturated Zones

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Stephen R.

    2003-06-01

    Heterogeneity and Scaling in Geologic Media: Applications to Transport in the Vadose and Saturated Zones Stephen Brown, Gregory Boitnott, and Martin Smith New England Research In rocks and soils, the bulk geophysical and transport properties of the matrix and of fracture systems are determined by the juxtaposition of geometric features at many length scales. For sedimentary materials the length scales are: the pore scale (irregularities in grain surface roughness and cementation), the scale of grain packing faults (and the resulting correlated porosity structures), the scale dominated by sorting or winnowing due to depositional processes, and the scale of geomorphology at the time of deposition. We are studying the heterogeneity and anisotropy in geometry, permeability, and geophysical response from the pore (microscopic), laboratory (mesoscopic), and backyard field (macroscopic) scales. In turn these data are being described and synthesized for development of mathematical models. Eventually, we will perform parameter studies to explore these models in the context of transport in the vadose and saturated zones. We have developed a multi-probe physical properties scanner which allows for the mapping of geophysical properties on a slabbed sample or core. This device allows for detailed study of heterogeneity at those length scales most difficult to quantify using standard field and laboratory practices. The measurement head consists of a variety of probes designed to make local measurements of various properties, including: gas permeability, acoustic velocities (compressional and shear), complex electrical impedance (4 electrode, wide frequency coverage), and ultrasonic reflection (ultrasonic impedance and permeability). We can thus routinely generate detailed geophysical maps of a particular sample. With the exception of the acoustic velocity, we are testing and modifying these probes as necessary for use on soil samples. As a baseline study we have been

  4. Modified stress intensity factor as a crack growth parameter applicable under large scale yielding conditions

    International Nuclear Information System (INIS)

    Yasuoka, Tetsuo; Mizutani, Yoshihiro; Todoroki, Akira

    2014-01-01

    High-temperature water stress corrosion cracking has high tensile stress sensitivity, and its growth rate has been evaluated using the stress intensity factor, which is a linear fracture mechanics parameter. Stress corrosion cracking mainly occurs and propagates around welded metals or heat-affected zones. These regions have complex residual stress distributions and yield strength distributions because of input heat effects. The authors previously reported that the stress intensity factor becomes inapplicable when steep residual stress distributions or yield strength distributions occur along the crack propagation path, because small-scale yielding conditions are violated around those distributions. Here, when the stress intensity factor is modified by considering these distributions, the modified stress intensity factor may be used for crack growth evaluation under large-scale yielding. The authors previously proposed a modified stress intensity factor incorporating the stress distribution or yield strength distribution in front of the crack using the rate of change of the stress intensity factor and yield strength. However, the applicable range of the modified stress intensity factor for large-scale yielding was not clarified. In this study, the range was analytically investigated by comparison with the J-integral solution. A three-point bending specimen with a parallel surface crack was adopted as the analytical model, and the stress intensity factor, modified stress intensity factor and equivalent stress intensity factor derived from the J-integral were calculated and compared under large-scale yielding conditions. The modified stress intensity factor was closer to the equivalent stress intensity factor than the stress intensity factor. If deviation from the J-integral solution is acceptable up to 2%, the modified stress intensity factor is applicable up to 30% of the J-integral limit, while the stress intensity factor is applicable up to 10%. These results showed that
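The baseline quantity being modified in this record, the linear-elastic stress intensity factor, can be illustrated with a short numerical sketch. This uses the Brown-Srawley geometry-factor fit for a single edge crack under pure bending, a standard handbook approximation assumed here for illustration; it is not the authors' modified factor, and the function name and example dimensions are hypothetical.

```python
import math

def sif_edge_crack_bending(sigma_b, a, W):
    """K_I for a single edge crack under pure bending.

    Brown-Srawley polynomial fit for the geometry factor, valid for a/W <= 0.6.
    sigma_b: outer-fiber bending stress [Pa], a: crack depth [m], W: width [m].
    """
    r = a / W
    Y = 1.122 - 1.40 * r + 7.33 * r**2 - 13.08 * r**3 + 14.0 * r**4
    return Y * sigma_b * math.sqrt(math.pi * a)

# Hypothetical example: 100 MPa bending stress, 2 mm crack in a 10 mm section.
K = sif_edge_crack_bending(100e6, 0.002, 0.010)  # Pa*sqrt(m)
print(round(K / 1e6, 1), "MPa*sqrt(m)")
```

The authors' modified factor additionally folds in the residual stress and yield strength distributions ahead of the crack, which this linear-elastic baseline does not attempt.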

  5. On the renormalization of the effective field theory of large scale structures

    International Nuclear Information System (INIS)

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory of Large Scale Structures successfully addresses all three issues. Here we focus on the third one and show explicitly that the terms induced by integrating out short scales, neglected in SPT, have exactly the right scale dependence to cancel all UV-divergences at one loop, and this should hold at all loops. A particularly clear example is an Einstein-de Sitter universe with no-scale initial conditions P_in ∼ k^n. After renormalizing the theory, we use self-similarity to derive a very simple result for the final power spectrum for any n, excluding two-loop corrections and higher. We show how the relative importance of different corrections depends on n. For n ∼ −1.5, relevant for our universe, pressure and dissipative corrections are more important than the two-loop corrections.

  6. On the renormalization of the effective field theory of large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Pajer, Enrico [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Zaldarriaga, Matias, E-mail: enrico.pajer@gmail.com, E-mail: matiasz@ias.edu [Institute for Advanced Study, Princeton, NJ 08544 (United States)

    2013-08-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory of Large Scale Structures successfully addresses all three issues. Here we focus on the third one and show explicitly that the terms induced by integrating out short scales, neglected in SPT, have exactly the right scale dependence to cancel all UV-divergences at one loop, and this should hold at all loops. A particularly clear example is an Einstein-de Sitter universe with no-scale initial conditions P_in ∼ k^n. After renormalizing the theory, we use self-similarity to derive a very simple result for the final power spectrum for any n, excluding two-loop corrections and higher. We show how the relative importance of different corrections depends on n. For n ∼ −1.5, relevant for our universe, pressure and dissipative corrections are more important than the two-loop corrections.

  7. Iterative resonance self-shielding methods using resonance integral table in heterogeneous transport lattice calculations

    International Nuclear Information System (INIS)

    Hong, Ser Gi; Kim, Kang-Seog

    2011-01-01

    This paper describes iteration methods that use resonance integral tables to estimate effective resonance cross sections in heterogeneous transport lattice calculations. These methods were devised to avoid the effort of converting resonance integral tables into subgroup data for use in the physical subgroup method. Since they use resonance integral tables directly rather than subgroup data, they do not incur the error of converting resonance integrals into subgroup data. The effective resonance cross sections are estimated iteratively for each resonance nuclide through heterogeneous fixed-source calculations over the whole problem domain to obtain the background cross sections. These methods have been implemented in the transport lattice code KARMA, which uses the method of characteristics (MOC) to solve the transport equation. The computational results show that these iteration methods are quite promising for practical transport lattice calculations.

  8. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  9. Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing

    Science.gov (United States)

    Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.

    2016-04-01

    We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances and use a large-scale and high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with full width at half maximum (FWHM) of ~15 nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance can shift as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. The use of real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform without sophisticated optical instrumentation to realize label-free surface plasmon resonance (SPR) sensing.

  10. The effects of habitat connectivity and regional heterogeneity on artificial pond metacommunities.

    Science.gov (United States)

    Pedruski, Michael T; Arnott, Shelley E

    2011-05-01

    Habitat connectivity and regional heterogeneity represent two factors likely to affect biodiversity across different spatial scales. We performed a 3 × 2 factorial design experiment to investigate the effects of connectivity, heterogeneity, and their interaction on artificial pond communities of freshwater invertebrates at the local (α), among-community (β), and regional (γ) scales. Despite expectations that the effects of connectivity would depend on levels of regional heterogeneity, no significant interactions were found for any diversity index investigated at any spatial scale. While observed responses of biodiversity to connectivity and heterogeneity depended to some extent on the diversity index and spatial partitioning formula used, the general pattern shows that these factors largely act at the β scale, as opposed to the α or γ scales. We conclude that the major role of connectivity in aquatic invertebrate communities is to act as a homogenizing force with relatively little effect on diversity at the α or γ levels. Conversely, heterogeneity acts as a force maintaining differences between communities.
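The α, β and γ scales discussed in this record can be made concrete with a small worked example. This is a minimal sketch using species richness and Whittaker's multiplicative partition (β = γ/α); the abstract notes the results depend on the index and partitioning formula chosen, so this is one illustrative choice, and the community matrix is invented.

```python
import numpy as np

# Rows: local communities (e.g., artificial ponds); columns: taxa; entries: abundances.
communities = np.array([
    [5, 0, 2, 0],
    [0, 3, 2, 1],
    [4, 0, 0, 1],
])

alpha = (communities > 0).sum(axis=1).mean()      # mean local (alpha) richness
gamma = int((communities.sum(axis=0) > 0).sum())  # regional (gamma) richness
beta = gamma / alpha                              # Whittaker's multiplicative beta

print(alpha, gamma, round(beta, 2))
```

Connectivity acting as a homogenizing force would push local communities toward the same composition, driving β toward 1 while leaving γ largely unchanged, which is the pattern the record reports.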

  11. Large-scale inverse model analyses employing fast randomized data reduction

    Science.gov (United States)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
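The "sketching" idea described in this record can be illustrated on a toy linear inverse problem. This is a minimal sketch assuming a Gaussian sketching matrix and a linear forward model; it is not the RGA/PCGA implementation in MADS, and all names (G, S, m) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: d = G m + noise, with far more observations
# than parameters (the regime the record targets).
n_obs, n_par = 20000, 50
G = rng.standard_normal((n_obs, n_par))
m_true = rng.standard_normal(n_par)
d = G @ m_true + 0.01 * rng.standard_normal(n_obs)

# Gaussian sketching matrix S compresses n_obs observations down to k << n_obs.
k = 400
S = rng.standard_normal((k, n_obs)) / np.sqrt(k)

# Solve the sketched least-squares problem min ||S d - S G m||^2 instead of
# the full one; the solve now scales with k rather than n_obs.
m_est, *_ = np.linalg.lstsq(S @ G, S @ d, rcond=None)

rel_err = np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true)
print(rel_err)  # small: the sketch preserves the information content
```

The design point is that k is chosen to match the information content of the data, not its raw size, which is why the memory and compute costs stop growing with the number of observations.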

  12. Detecting Local Drivers of Fire Cycle Heterogeneity in Boreal Forests: A Scale Issue

    Directory of Open Access Journals (Sweden)

    Annie Claude Bélisle

    2016-07-01

    Severe crown fires are determining disturbances for the composition and structure of boreal forests in North America. Fire cycle (FC) associations with continental climate gradients are well known, but smaller scale controls remain poorly documented. Using a time since fire map (time scale of 300 years), the study aims to assess the relative contributions of local and regional controls on FC and to describe the relationship between FC heterogeneity and vegetation patterns. The study area, located in boreal eastern North America, was partitioned into watersheds according to five scales going from local (3 km²) to landscape (2800 km²) scales. Using survival analysis, we observed that dry surficial deposits and hydrography density better predict FC when measured at the local scale, while terrain complexity and slope position perform better when measured at the middle and landscape scales. The most parsimonious model was selected according to the Akaike information criterion to predict FC throughout the study area. We detected two FC zones, one short (159 years) and one long (303 years), with specific age structures and tree compositions. We argue that the local heterogeneity of the fire regime contributes to ecosystem diversity and must be considered in ecosystem management.

  13. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  14. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
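The scaling exponent underlying this record's analysis comes from Detrended Fluctuation Analysis. A minimal order-1 DFA can be sketched as follows; this is a generic implementation applied to a synthetic uncorrelated series (for which the exponent should be near 0.5), not the authors' genome pipeline, which additionally tracks local variations of the exponent along the sequence.

```python
import numpy as np

def dfa_exponent(x, scales):
    """Order-1 DFA: slope of log F(s) versus log s over the given window sizes."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        sq_res = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear trend
            sq_res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq_res)))     # fluctuation function F(s)
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(10000)             # uncorrelated series: exponent ~ 0.5
scales = [16, 32, 64, 128, 256]
alpha = dfa_exponent(white, scales)
print(round(alpha, 2))
```

Long-range correlated sequences, such as the isochore-like regions the record describes, would instead yield an exponent well above 0.5 at the corresponding scales.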

  15. Ultra-wideband WDM VCSEL arrays by lateral heterogeneous integration

    Science.gov (United States)

    Geske, Jon

    Advancements in heterogeneous integration are a driving factor in the development of ever more sophisticated and functional electronic and photonic devices. Such advancements will merge the optical and electronic capabilities of different material systems onto a common integrated device platform. This thesis presents a new lateral heterogeneous integration technology called nonplanar wafer bonding. The technique is capable of integrating multiple dissimilar semiconductor device structures on the surface of a substrate in a single wafer bond step, leaving different integrated device structures adjacent to each other on the wafer surface. Material characterization and numerical simulations confirm that the material quality is not compromised during the process. Nonplanar wafer bonding is used to fabricate ultra-wideband wavelength division multiplexed (WDM) vertical-cavity surface-emitting laser (VCSEL) arrays. The optically-pumped VCSEL arrays span 140 nm from 1470 to 1610 nm, a record wavelength span for devices operating in this wavelength range. The array uses eight wavelength channels to span the 140 nm with all channels separated by precisely 20 nm. All channels in the array operate single mode to at least 65°C with output power uniformity of +/- 1 dB. The ultra-wideband WDM VCSEL arrays are a significant first step toward the development of a single-chip source for optical networks based on coarse WDM (CWDM), a low-cost alternative to traditional dense WDM. The CWDM VCSEL arrays make use of fully-oxidized distributed Bragg reflectors (DBRs) to provide the wideband reflectivity required for optical feedback and lasing across 140 nm. In addition, a novel optically-pumped active region design is presented. It is demonstrated, with an analytical model and experimental results, that the new active-region design significantly improves the carrier uniformity in the quantum wells and results in a 50% lasing threshold reduction and a 20°C improvement in the peak

  16. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  17. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. Particularly, we advocate that the scale of image retrieval systems should be significantly increased to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  18. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with large number of high capacity nodes and transmission links, and shared by a large number of users...

  19. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    Science.gov (United States)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.

    2015-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approx. 70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approx. 10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  20. Integration of heterogeneous molecular networks to unravel gene-regulation in Mycobacterium tuberculosis

    NARCIS (Netherlands)

    Dam, van J.C.J.; Schaap, P.J.; Martins dos Santos, V.A.P.; Suarez Diez, M.

    2014-01-01

    Background: Different methods have been developed to infer regulatory networks from heterogeneous omics datasets and to construct co-expression networks. Each algorithm produces different networks and efforts have been devoted to automatically integrate them into consensus sets. However each

  1. Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale

    Science.gov (United States)

    R Patrick Bixler; Shawn Johnson; Kirk Emerson; Tina Nabatchi; Melly Reuling; Charles Curtin; Michele Romolini; Morgan Grove

    2016-01-01

    The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local-scale conservation activities with broad-scale goals. This requires an understanding of the governance options...

  2. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  3. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time is spent on management and maintenance. The nodes of a large-scale cluster system easily fall into disorder, and with thousands of nodes housed in large machine rooms, managers can easily confuse one machine with another. How can accurate management be carried out effectively on a large-scale cluster system? This article introduces ELFms for large-scale cluster systems and proposes its use to realize automatic management of such systems. (authors)

  4. Heterogeneously integrated silicon photonics for the mid-infrared and spectroscopic sensing.

    Science.gov (United States)

    Chen, Yu; Lin, Hongtao; Hu, Juejun; Li, Mo

    2014-07-22

    Besides being the foundational material for microelectronics, crystalline silicon has long been used for the production of infrared lenses and mirrors. More recently, silicon has become the key material to achieve large-scale integration of photonic devices for on-chip optical interconnect and signal processing. For optics, silicon has significant advantages: it offers a very high refractive index and is highly transparent in the spectral range from 1.2 to 8 μm. To fully exploit silicon’s superior performance in a remarkably broad range and to enable new optoelectronic functionalities, here we describe a general method to integrate silicon photonic devices on arbitrary foreign substrates. In particular, we apply the technique to integrate silicon microring resonators on mid-infrared compatible substrates for operation in the mid-infrared. These high-performance mid-infrared optical resonators are utilized to demonstrate, for the first time, on-chip cavity-enhanced mid-infrared spectroscopic analysis of organic chemicals with a limit of detection of less than 0.1 ng.

  5. Surface fluxes in heterogeneous landscape

    Energy Technology Data Exchange (ETDEWEB)

    Bay Hasager, C

    1997-01-01

    The surface fluxes in homogeneous landscapes are calculated by similarity scaling principles. The methodology is well established. In heterogeneous landscapes with spatial changes in the micro-scale range, i.e. from 100 m to 10 km, advective effects are significant. The present work focuses on these effects in an agricultural countryside typical for the midlatitudes. Meteorological and satellite data from a highly heterogeneous landscape in the Rhine Valley, Germany were collected in the large-scale field experiment TRACT (Transport of pollutants over complex terrain) in 1992. Classified satellite images, Landsat TM and ERS SAR, are used as the basis for roughness maps. The roughnesses were measured at meteorological masts in the various cover classes and assigned pixel by pixel to the images. The roughness maps are aggregated, i.e. spatially averaged, into so-called effective roughness lengths. This calculation is performed by a micro-scale aggregation model. The model solves the linearized atmospheric flow equations by a numerical (Fast Fourier Transform) method. This model also calculates maps of friction velocity and momentum flux pixel-wise in heterogeneous landscapes. It is indicated how the aggregation methodology can be used to calculate the heat fluxes based on the relevant satellite data, i.e. temperature and soil moisture information. (au) 10 tabs., 49 ills., 223 refs.
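The record's aggregation model solves linearized flow equations with an FFT; as a much simpler stand-in, the logarithmic (geometric-mean) average often used as a first approximation for an effective roughness length can be sketched as follows. The cover classes and z0 values below are invented for illustration and are not taken from the TRACT dataset.

```python
import numpy as np

# Pixel-wise roughness lengths [m] for a small heterogeneous scene,
# as might come from a classified satellite cover map.
z0_map = np.array([
    [0.03, 0.03, 0.50],   # grass, grass, forest
    [0.10, 0.50, 0.50],   # crops, forest, forest
])

# Simple aggregation: effective z0 from the area-averaged logarithm
# (geometric mean), a common first approximation.
z0_eff = np.exp(np.mean(np.log(z0_map)))
print(round(z0_eff, 3))
```

A flow-resolving aggregation like the one in the record weights each pixel by its local stress rather than averaging log z0 uniformly, so the two approaches generally differ over strongly heterogeneous terrain.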

  6. Scale-up of miscible flood processes for heterogeneous reservoirs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Orr, F.M. Jr.

    1996-04-01

    Results of a wide-ranging investigation of the scaling of gas injection processes are reported. The research examines how the physical mechanisms at work during a gas injection project interact to determine process performance. In particular, the authors examine: the interactions of equilibrium phase behavior and two-phase flow that determine local displacement efficiency and minimum miscibility pressure, the combined effects of viscous fingering, gravity segregation and heterogeneity that control sweep efficiency in 2- and 3-dimensional porous media, the use of streamtube/streamline methods to create a very efficient simulation technique for multiphase compositional displacements, the scaling of viscous, capillary and gravity forces for heterogeneous reservoirs, and the effects of thin films and spreading behavior on three-phase flow. The following key results are documented: rigorous procedures for determination of minimum miscibility pressure (MMP) or minimum miscibility enrichment (MME) have been developed for multicomponent systems; the complex dependence of MMPs for nitrogen/methane floods on oil and injection gas composition observed experimentally is explained for the first time; the presence of layer-like heterogeneities strongly influences the interplay of gravity segregation and viscous fingering, as viscous fingers adapt to preferential flow paths and low permeability layers restrict vertical flow; streamtube/streamline simulation techniques are demonstrated for a variety of injection processes in 2 and 3 dimensions; quantitative scaling estimates for the transitions from capillary-dominated to gravity-dominated to viscous-dominated flows are reported; experimental results are given that demonstrate that high pressure CO₂ can be used to generate low IFT gravity drainage in fractured reservoirs if fractures are suitably connected; and the effect of wetting and spreading behavior on three-phase flow is described. 209 refs.

  7. Sensitivity of local air quality to the interplay between small- and large-scale circulations: a large-eddy simulation study

    Science.gov (United States)

    Wolf-Grosse, Tobias; Esau, Igor; Reuder, Joachim

    2017-06-01

    Street-level urban air pollution is a challenging concern for modern urban societies. Pollution dispersion models assume that the concentrations decrease monotonically with rising wind speed. This convenient assumption breaks down when applied to flows with local recirculations such as those found in topographically complex coastal areas. This study looks at a practically important and sufficiently common case of air pollution in a coastal valley city. Here, the observed concentrations are determined by the interaction between large-scale topographically forced and local-scale breeze-like recirculations. Analysis of a long observational dataset in Bergen, Norway, revealed that the most extreme cases of recurring wintertime air pollution episodes were accompanied by increased large-scale wind speeds above the valley. Contrary to the theoretical assumption and intuitive expectations, the maximum NO2 concentrations were not found for the lowest 10 m ERA-Interim wind speeds but in situations with wind speeds of 3 m s-1. To explain this phenomenon, we investigated empirical relationships between the large-scale forcing and the local wind and air quality parameters. We conducted 16 large-eddy simulation (LES) experiments with the Parallelised Large-Eddy Simulation Model (PALM) for atmospheric and oceanic flows. The LES accounted for the realistic relief and coastal configuration as well as for the large-scale forcing and local surface condition heterogeneity in Bergen. They revealed that emerging local breeze-like circulations strongly enhance the urban ventilation and dispersion of the air pollutants in situations with weak large-scale winds. Slightly stronger large-scale winds, however, can counteract these local recirculations, leading to enhanced surface air stagnation. Furthermore, this study looks at the concrete impact of the relative configuration of warmer water bodies in the city and the major transport corridor. We found that a relatively small local water

  8. Sensitivity of local air quality to the interplay between small- and large-scale circulations: a large-eddy simulation study

    Directory of Open Access Journals (Sweden)

    T. Wolf-Grosse

    2017-06-01

    Full Text Available Street-level urban air pollution is a challenging concern for modern urban societies. Pollution dispersion models assume that the concentrations decrease monotonically with rising wind speed. This convenient assumption breaks down when applied to flows with local recirculations such as those found in topographically complex coastal areas. This study looks at a practically important and sufficiently common case of air pollution in a coastal valley city. Here, the observed concentrations are determined by the interaction between large-scale topographically forced and local-scale breeze-like recirculations. Analysis of a long observational dataset in Bergen, Norway, revealed that the most extreme cases of recurring wintertime air pollution episodes were accompanied by increased large-scale wind speeds above the valley. Contrary to the theoretical assumption and intuitive expectations, the maximum NO2 concentrations were not found for the lowest 10 m ERA-Interim wind speeds but in situations with wind speeds of 3 m s−1. To explain this phenomenon, we investigated empirical relationships between the large-scale forcing and the local wind and air quality parameters. We conducted 16 large-eddy simulation (LES) experiments with the Parallelised Large-Eddy Simulation Model (PALM) for atmospheric and oceanic flows. The LES accounted for the realistic relief and coastal configuration as well as for the large-scale forcing and local surface condition heterogeneity in Bergen. They revealed that emerging local breeze-like circulations strongly enhance the urban ventilation and dispersion of the air pollutants in situations with weak large-scale winds. Slightly stronger large-scale winds, however, can counteract these local recirculations, leading to enhanced surface air stagnation. Furthermore, this study looks at the concrete impact of the relative configuration of warmer water bodies in the city and the major transport corridor. We found that a

  9. Hartle-Hawking wave function and large-scale power suppression of CMB*

    Directory of Open Access Journals (Sweden)

    Yeom Dong-han

    2018-01-01

    Full Text Available In this presentation, we first describe the Hartle-Hawking wave function in the Euclidean path integral approach. After introducing perturbations to the background instanton solution, following the formalism developed by Halliwell-Hawking and Laflamme, one can obtain the scale-invariant power spectrum on small scales. We further emphasize that the Hartle-Hawking wave function can explain the large-scale power suppression by choosing suitable potential parameters; this will be a possible window to confirm or falsify models of quantum cosmology. Finally, we comment on possible future applications, e.g., Euclidean wormholes, which can result in distinct signatures in the power spectrum.

  10. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  11. The use of cosmic muons in detecting heterogeneities in large volumes

    International Nuclear Information System (INIS)

    Grabski, V.; Reche, R.; Alfaro, R.; Belmont-Moreno, E.; Martinez-Davalos, A.; Sandoval, A.; Menchaca-Rocha, A.

    2008-01-01

    The muon intensity attenuation method to detect heterogeneities in large matter volumes is analyzed. Approximate analytical expressions to estimate the collection time and the signal-to-noise ratio are proposed and validated by Monte Carlo simulations. Important parameters, including the point spread function and the coordinate reconstruction uncertainty, are also estimated using Monte Carlo simulations.
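
    The collection-time scaling behind such estimates can be sketched with generic Poisson counting statistics. This is a standard back-of-envelope argument, not the paper's actual expressions, and the flux, area, and contrast numbers below are illustrative assumptions:

```python
def collection_time(flux_cm2_s, area_cm2, contrast, target_snr):
    """Counting-statistics sketch: with N = flux * area * t detected
    muons, Poisson noise gives SNR = contrast * sqrt(N) for a
    fractional intensity change `contrast`, so the required exposure
    is t = SNR^2 / (contrast^2 * rate)."""
    rate = flux_cm2_s * area_cm2              # expected muons per second
    return target_snr**2 / (contrast**2 * rate)

# Illustrative numbers: ~1e-2 muons/(cm^2 s) vertical flux, a 1 m^2
# detector, a 5% intensity contrast, detected at 3 sigma:
t = collection_time(1e-2, 1e4, 0.05, 3.0)     # exposure time in seconds
```

    Note the quadratic cost of contrast: halving the density contrast of the heterogeneity quadruples the required exposure.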

  12. A Novel CPU/GPU Simulation Environment for Large-Scale Biologically-Realistic Neural Modeling

    Directory of Open Access Journals (Sweden)

    Roger V Hoang

    2013-10-01

    Full Text Available Computational Neuroscience is an emerging field that provides unique opportunities to study complex brain structures through realistic neural simulations. However, as biological details are added to models, the execution time for the simulation becomes longer. Graphics Processing Units (GPUs) are now being utilized to accelerate simulations due to their ability to perform computations in parallel. As such, they have shown significant improvement in execution time compared to Central Processing Units (CPUs). Most neural simulators utilize either multiple CPUs or a single GPU for better performance, but still show limitations in execution time when biological details are not sacrificed. Therefore, we present a novel CPU/GPU simulation environment for large-scale biological networks, the NeoCortical Simulator version 6 (NCS6). NCS6 is a free, open-source, parallelizable, and scalable simulator, designed to run on clusters of multiple machines, potentially with high performance computing devices in each of them. It has built-in leaky-integrate-and-fire (LIF) and Izhikevich (IZH) neuron models, but users also have the capability to design their own plug-in interface for different neuron types as desired. NCS6 is currently able to simulate one million cells and 100 million synapses in quasi real time by distributing data across these heterogeneous clusters of CPUs and GPUs.
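
    A leaky integrate-and-fire model of the kind NCS6 builds in can be sketched in a few lines. This is a generic textbook LIF with forward-Euler integration and illustrative parameter values, not NCS6's implementation:

```python
import numpy as np

def simulate_lif(i_ext, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Forward-Euler leaky integrate-and-fire neuron:
    tau * dV/dt = -(V - v_rest) + R_m * I(t),
    with a spike and reset whenever V crosses threshold."""
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        v += dt / tau * (-(v - v_rest) + r_m * i)
        if v >= v_thresh:                 # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                   # reset membrane potential
        trace.append(v)
    return np.array(trace), spikes

# A constant 2 nA input for 100 ms drives the neuron above threshold repeatedly.
trace, spikes = simulate_lif(np.full(1000, 2e-9))
```

    Simulators like NCS6 evaluate millions of such state updates per timestep, which is why the per-neuron arithmetic maps so well onto GPUs.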

  13. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  14. Parallel Framework for Dimensionality Reduction of Large-Scale Datasets

    Directory of Open Access Journals (Sweden)

    Sai Kiranmayee Samudrala

    2015-01-01

    Full Text Available Dimensionality reduction refers to a set of mathematical techniques used to reduce complexity of the original high-dimensional data, while preserving its selected properties. Improvements in simulation strategies and experimental data collection methods are resulting in a deluge of heterogeneous and high-dimensional data, which often makes dimensionality reduction the only viable way to gain qualitative and quantitative understanding of the data. However, existing dimensionality reduction software often does not scale to datasets arising in real-life applications, which may consist of thousands of points with millions of dimensions. In this paper, we propose a parallel framework for dimensionality reduction of large-scale data. We identify key components underlying the spectral dimensionality reduction techniques, and propose their efficient parallel implementation. We show that the resulting framework can be used to process datasets consisting of millions of points when executed on a 16,000-core cluster, which is beyond the reach of currently available methods. To further demonstrate applicability of our framework we perform dimensionality reduction of 75,000 images representing morphology evolution during manufacturing of organic solar cells in order to identify how processing parameters affect morphology evolution.
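
    The spectral pipeline such frameworks parallelize (neighborhood graph, graph Laplacian, eigendecomposition) can be sketched serially for small data. This is a minimal Laplacian-eigenmaps sketch with dense linear algebra, offered only as an illustration of the class of methods; the paper's parallel implementation and specific techniques differ:

```python
import numpy as np

def laplacian_eigenmaps(x, n_neighbors=10, n_components=2):
    """Minimal spectral embedding: symmetric kNN graph -> unnormalized
    graph Laplacian -> bottom non-trivial eigenvectors. Dense O(n^2)
    operations, so usable only for small n."""
    n = x.shape[0]
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    w = np.zeros((n, n))
    for i in range(n):                                    # symmetric kNN graph
        for j in np.argsort(d2[i])[1:n_neighbors + 1]:    # skip self at index 0
            w[i, j] = w[j, i] = 1.0
    lap = np.diag(w.sum(1)) - w                           # unnormalized Laplacian
    vals, vecs = np.linalg.eigh(lap)
    return vecs[:, 1:n_components + 1]                    # drop constant eigenvector

rng = np.random.default_rng(0)
emb = laplacian_eigenmaps(rng.normal(size=(60, 5)))       # 60 points, 5 dims -> 2 dims
```

    The three stages (graph construction, Laplacian assembly, eigensolve) are exactly the components a parallel framework must distribute when n reaches millions of points.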

  15. Engineering large-scale agent-based systems with consensus

    Science.gov (United States)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  16. PathlinesExplorer — Image-based exploration of large-scale pathline fields

    KAUST Repository

    Nagoor, Omniah H.

    2015-10-25

    PathlinesExplorer is a novel image-based tool, which has been designed to visualize large scale pathline fields on a single computer [7]. PathlinesExplorer integrates explorable images (EI) technique [4] with order-independent transparency (OIT) method [2]. What makes this method different is that it allows users to handle large data on a single workstation. Although it is a view-dependent method, PathlinesExplorer combines both exploration and modification of visual aspects without re-accessing the original huge data. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method, it is possible to filter, color-code, and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
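
    The per-pixel list and blending idea can be sketched on the CPU. This is a simplified serial Python analogue of the GPU per-pixel linked-list OIT technique described above; real implementations build these lists in fragment shaders and blend in a resolve pass:

```python
from collections import defaultdict

def composite_fragments(fragments):
    """CPU sketch of per-pixel fragment lists for order-independent
    transparency: gather every fragment hitting a pixel, sort
    back-to-front by depth, then alpha-blend. `fragments` is an
    iterable of (x, y, depth, (r, g, b, a)) tuples."""
    buckets = defaultdict(list)
    for x, y, depth, rgba in fragments:
        buckets[(x, y)].append((depth, rgba))
    image = {}
    for pixel, frags in buckets.items():
        r = g = b = 0.0
        for _, (fr, fg, fb, fa) in sorted(frags, reverse=True):  # far to near
            r = fr * fa + r * (1.0 - fa)
            g = fg * fa + g * (1.0 - fa)
            b = fb * fa + b * (1.0 - fa)
        image[pixel] = (r, g, b)
    return image

# Two half-transparent fragments on one pixel: a far red one, a near blue one.
img = composite_fragments([(0, 0, 0.8, (1, 0, 0, 0.5)),
                           (0, 0, 0.2, (0, 0, 1, 0.5))])
```

    Because each pixel keeps its full fragment list, filtering or recoloring pathline segments only requires re-resolving the lists, not re-reading the original flow data.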

  17. Atomic-scale structural signature of dynamic heterogeneities in metallic liquids

    Science.gov (United States)

    Pasturel, Alain; Jakse, Noel

    2017-08-01

    With sufficiently high cooling rates, liquids will cross their equilibrium melting temperatures and can be maintained in a metastable undercooled state before solidifying. Studies of undercooled liquids reveal several intriguing dynamic phenomena, and because explicit connections between liquid structure and liquid dynamics are difficult to identify, it remains a major challenge to capture the underlying structural link to these phenomena. Ab initio molecular dynamics (AIMD) simulations are especially powerful in providing atomic-scale details otherwise not accessible in experiments. Through an AIMD-based study of Cr additions in Al-based liquids, we evidence for the first time a close relationship between the decoupling of component diffusion and the emergence of dynamic heterogeneities in the undercooling regime. In addition, we demonstrate that the origin of both phenomena is related to a structural heterogeneity caused by a strong interplay between chemical short-range order (CSRO) and local fivefold topology (ISRO) at the short-range scale in the liquid phase that develops into an icosahedral-based medium-range order (IMRO) upon undercooling. Finally, our findings reveal that this structural signature is also captured in the temperature dependence of partial pair-distribution functions, which opens up the route to more elaborate experimental studies.

  18. A continuous time random walk model for Darcy-scale anomalous transport in heterogeneous porous media.

    Science.gov (United States)

    Comolli, Alessandro; Hakoun, Vivien; Dentz, Marco

    2017-04-01

    Achieving the understanding of the process of solute transport in heterogeneous porous media is of crucial importance for several environmental and social purposes, ranging from aquifer contamination and remediation to risk assessment in nuclear waste repositories. The complexity of this aim is mainly ascribable to the heterogeneity of natural media, which can be observed at all the scales of interest, from pore scale to catchment scale. In fact, the intrinsic heterogeneity of porous media is responsible for the emergence of the well-known non-Fickian footprints of transport, including heavy-tailed breakthrough curves, non-Gaussian spatial density profiles and the non-linear growth of the mean squared displacement. Several studies investigated the processes through which heterogeneity impacts the transport properties, which include local modifications to the advective-dispersive motion of solutes, mass exchanges between some mobile and immobile phases (e.g. sorption/desorption reactions or diffusion into the solid matrix) and spatial correlation of the flow field. In the last decades, the continuous time random walk (CTRW) model has often been used to describe solute transport in heterogeneous conditions and to quantify the impact of point heterogeneity, spatial correlation and mass transfer on the average transport properties [1]. Open issues regarding this approach are the possibility to relate measurable properties of the medium to the parameters of the model, as well as its capability to provide predictive information. In a recent work [2] the authors have shed new light on the relationship between Lagrangian and Eulerian dynamics as well as on their evolution from arbitrary initial conditions. On the basis of these results, we derive a CTRW model for the description of Darcy-scale transport in d-dimensional media characterized by spatially random permeability fields. The CTRW approach models particle velocities as a spatial Markov process, which is
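
    The basic CTRW mechanism behind such anomalous-transport models can be sketched as follows. This is a generic uncoupled CTRW with heavy-tailed waiting times, the textbook route to non-Fickian (sublinear) MSD growth; the authors' correlated spatial-Markov velocity model is more elaborate and is not reproduced here:

```python
import numpy as np

def ctrw(n_particles, t_end, beta=0.7, dx=1.0, seed=0):
    """Minimal continuous time random walk: unit-length unbiased jumps
    separated by heavy-tailed Pareto waiting times (exponent beta < 1,
    so the mean waiting time diverges and transport is anomalous)."""
    rng = np.random.default_rng(seed)
    pos = np.zeros(n_particles)
    t = np.zeros(n_particles)
    active = np.ones(n_particles, dtype=bool)
    while active.any():
        wait = rng.pareto(beta, n_particles) + 1.0   # waiting times >= 1
        t = np.where(active, t + wait, t)
        active = active & (t < t_end)                # walkers whose wait ended in time...
        step = rng.choice([-dx, dx], n_particles)
        pos = np.where(active, pos + step, pos)      # ...take an unbiased jump
    return pos

positions = ctrw(2000, t_end=200.0)
msd = (positions ** 2).mean()        # grows sublinearly with t_end for beta < 1
```

    Replacing the independent waiting times with a Markov chain over velocity classes is what turns this generic walk into the spatially correlated model the abstract describes.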

  19. Stochastic description of heterogeneities of permeability within groundwater flow models

    International Nuclear Information System (INIS)

    Cacas, M.C.; Lachassagne, P.; Ledoux, E.; Marsily, G. de

    1991-01-01

    In order to model radionuclide migration in the geosphere realistically at the field scale, the hydrogeologist needs to be able to simulate groundwater flow in heterogeneous media. Heterogeneity of the medium can be described using a stochastic approach, that affects the way in which a flow model is formulated. In this paper, we discuss the problems that we have encountered in modelling both continuous and fractured media. The stochastic approach leads to a methodology that enables local measurements of permeability to be integrated into a model which gives a good prediction of groundwater flow on a regional scale. 5 Figs.; 8 Refs
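
    A minimal stochastic description of permeability heterogeneity in a continuous medium can be sketched as a correlated log-normal field. This is a standard geostatistical construction assumed here for illustration; the covariance model, dimensionality, and parameter values are not taken from the paper:

```python
import numpy as np

def lognormal_permeability(n, dx, mean_log_k=-12.0, sigma=1.0,
                           corr_len=10.0, seed=0):
    """1-D Gaussian log-permeability field with exponential covariance,
    synthesized by Cholesky factorization of the covariance matrix
    (adequate for small grids; all parameter values illustrative)."""
    rng = np.random.default_rng(seed)
    x = np.arange(n) * dx
    # exponential covariance with integral scale corr_len
    cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    log_k = mean_log_k + np.linalg.cholesky(cov) @ rng.normal(size=n)
    return np.exp(log_k)

k = lognormal_permeability(200, dx=1.0)   # one realization of K(x)
```

    Conditioning such realizations on local permeability measurements is what lets point data inform a regional-scale flow model, as the abstract describes.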

  20. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  1. Large-scale integration of off-shore wind power and regulation strategies of cogeneration plants in the Danish electricity system

    DEFF Research Database (Denmark)

    Østergaard, Poul Alberg

    2005-01-01

    The article analyses how the amount of small-scale CHP plants and heat pumps and the regulation strategies of these affect the quantity of off-shore wind power that may be integrated into the Danish electricity supply.

  2. Argentinean integrated small reactor design and scale economy analysis of integrated reactor

    International Nuclear Information System (INIS)

    Florido, P. C.; Bergallo, J. E.; Ishida, M. V.

    2000-01-01

    This paper describes the design of CAREM, the Argentinean integrated small reactor project, and the results of a scale economy analysis of integrated reactors. The CAREM project consists of the development, design and construction of a small nuclear power plant. CAREM is an advanced reactor conceived with new generation design solutions and building on the large experience accumulated in the safe operation of Light Water Reactors. CAREM is an indirect cycle reactor with some distinctive and characteristic features that greatly simplify the reactor and also contribute to a high level of safety: integrated primary cooling system, self-pressurization, primary cooling by natural circulation and safety systems relying on passive features. For a fully coupled economic evaluation of integrated reactors done by the IREP (Integrated Reactor Evaluation Program) code transferred to the IAEA, CAREM has been used as a reference point. The results show that integrated reactors become competitive at powers larger than 200 MWe with the cheapest Argentinean electricity option. Due to the reactor pressure vessel construction limit, low-pressure-drop steam generators are used to reach a power output of 200 MWe with natural circulation. For forced circulation, 300 MWe can be achieved. (author)

  3. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction.

  4. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low...... of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences...

  5. Integration of large-scale heat pumps in the district heating systems of Greater Copenhagen

    DEFF Research Database (Denmark)

    Bach, Bjarne; Werling, Jesper; Ommen, Torben Schmidt

    2016-01-01

    This study analyses the technical and private economic aspects of integrating a large capacity of electric driven HP (heat pumps) in the Greater Copenhagen DH (district heating) system, which is an example of a state-of-the-art large district heating system with many consumers and suppliers....... The analysis was based on using the energy model Balmorel to determine the optimum dispatch of HPs in the system. The potential heat sources in Copenhagen for use in HPs were determined based on data related to temperatures, flows, and hydrography at different locations, while respecting technical constraints...

  6. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  7. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations however further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  8. Can Large Scale Land Acquisition for Agro-Development in Indonesia be Managed Sustainably?

    NARCIS (Netherlands)

    Obidzinski, K.; Takahashi, I.; Dermawan, A.; Komarudin, H.; Andrianto, A.

    2013-01-01

    This paper explores the impacts of large scale land acquisition for agro-development by analyzing the Merauke Integrated Food and Energy Estate (MIFEE) in Indonesia. It also examines the potential for MIFEE to meet sustainability requirements under RSPO, ISPO, and FSC. The plantation development

  9. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such, sometimes inherently unstable, molecules it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which requires innovative solutions. In order to maximize yield, process efficiency, facility and equipment utilization, we have developed, scaled-up and successfully implemented a new integrated manufacturing platform in commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  10. Predicting Protein Function via Semantic Integration of Multiple Networks.

    Science.gov (United States)

    Yu, Guoxian; Fu, Guangyuan; Wang, Jun; Zhu, Hailong

    2016-01-01

    Determining the biological functions of proteins is one of the key challenges in the post-genomic era. The rapidly accumulating large volumes of proteomic and genomic data drive the development of computational models for automatically predicting protein function at large scale. Recent approaches focus on integrating multiple heterogeneous data sources, and they often get better results than methods that use a single data source alone. In this paper, we investigate how to integrate multiple biological data sources with the biological knowledge, i.e., Gene Ontology (GO), for protein function prediction. We propose a method, called SimNet, to Semantically integrate multiple functional association Networks derived from heterogeneous data sources. SimNet firstly utilizes GO annotations of proteins to capture the semantic similarity between proteins and introduces a semantic kernel based on the similarity. Next, SimNet constructs a composite network, obtained as a weighted summation of individual networks, and aligns the network with the kernel to get the weights assigned to individual networks. Then, it applies a network-based classifier on the composite network to predict protein function. Experimental results on heterogeneous proteomic data sources of Yeast, Human, Mouse, and Fly show that SimNet not only achieves better (or comparable) results than other related competitive approaches, but also takes much less time. The Matlab codes of SimNet are available at https://sites.google.com/site/guoxian85/simnet.
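
    The weighted-summation step can be illustrated with a toy alignment-based weighting. This is a hypothetical sketch: the Frobenius-cosine weighting and the name `combine_networks` are assumptions for illustration, not SimNet's actual kernel-alignment optimization:

```python
import numpy as np

def combine_networks(nets, kernel):
    """Weight each functional association network by its normalized
    Frobenius alignment with a semantic similarity kernel, then form
    the weighted sum as the composite network."""
    align = np.array([(n * kernel).sum() /
                      (np.linalg.norm(n) * np.linalg.norm(kernel))
                      for n in nets])
    w = np.clip(align, 0.0, None)
    w = w / w.sum()                          # normalized network weights
    composite = sum(wi * n for wi, n in zip(w, nets))
    return composite, w

# A network identical to the kernel should outweigh an unrelated random one.
rng = np.random.default_rng(1)
kern = rng.random((20, 20)); kern = (kern + kern.T) / 2
noise = rng.random((20, 20)); noise = (noise + noise.T) / 2
composite, w = combine_networks([kern, noise], kernel=kern)
```

    The composite network then feeds any standard network-based function-prediction classifier.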

  11. Finite-Time Stability of Large-Scale Systems with Interval Time-Varying Delay in Interconnection

    Directory of Open Access Journals (Sweden)

    T. La-inchua

    2017-01-01

    Full Text Available We investigate finite-time stability of a class of nonlinear large-scale systems with interval time-varying delays in the interconnection. The time-delay functions are continuous but not necessarily differentiable. Based on Lyapunov stability theory and a new integral bounding technique, finite-time stability criteria for large-scale systems with interval time-varying delays in the interconnection are derived. The criteria are delay-dependent and are given in terms of linear matrix inequalities, which can be solved by various available algorithms. Numerical examples are given to illustrate the effectiveness of the proposed method.
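The delay-dependent LMI criteria in the paper require a semidefinite programming solver; as a minimal illustration of the underlying Lyapunov machinery only, the sketch below (hypothetical matrices, delay-free case) tests asymptotic stability of dx/dt = Ax by solving the Lyapunov equation A P + P Aᵀ = −I and checking that P is positive definite.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_stable(A, tol=1e-9):
    """Check asymptotic stability of dx/dt = A x: solve the Lyapunov
    equation A P + P A^T = -I; A is Hurwitz iff the solution P is
    symmetric positive definite."""
    P = solve_continuous_lyapunov(A, -np.eye(A.shape[0]))
    eigs = np.linalg.eigvalsh((P + P.T) / 2.0)  # symmetrize for safety
    return bool(np.all(eigs > tol))

# Hypothetical system matrices for illustration
A_stable = np.array([[-1.0, 0.5], [0.0, -2.0]])    # eigenvalues -1, -2
A_unstable = np.array([[0.5, 0.0], [1.0, -1.0]])   # eigenvalue +0.5
```

The full criteria in the paper add delay-dependent terms to the Lyapunov functional, but the feasibility test has the same structure.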

  12. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that, in support of its ability to improve the overall economic cost-effectiveness and flexibility of the Danish energy system, the financially feasible integration of large-scale heat pumps (HP with existing combined heat and power (CHP plants is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using a ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gasses as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in The Nordic Power Exchange (Nord Pool towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit with half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a possibly financially feasible alternative to CHP operation, rather than a supplement to CHP unit operation. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HP with lower than half of the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit’s flue gasses for both concurrent (CHP unit and HP unit and independent operation (HP unit only. For independent operation, the recovered heat is required to be stored.
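The sensitivity to heat-source temperature enters through the COP: the marginal heat production cost is the electricity price divided by the COP, so a warmer source (higher COP) directly lowers the cost per unit of heat. A minimal sketch with hypothetical prices and COP values (not figures from the study):

```python
def heat_cost_per_mwh(elec_price, cop):
    # Producing 1 MWh of heat consumes 1/COP MWh of electricity,
    # so the marginal heat cost scales inversely with the COP.
    return elec_price / cop

# Hypothetical figures: spot electricity at 60 EUR/MWh_e; a colder
# ground source gives COP ~3.0, a warmer flue-gas condensate source
# gives COP ~4.5.
ground_source = heat_cost_per_mwh(60.0, 3.0)   # EUR per MWh of heat
flue_gas_source = heat_cost_per_mwh(60.0, 4.5)
```

With these assumed numbers the flue-gas source cuts the marginal heat cost by a third, illustrating why the heat-source choice dominates the financial feasibility.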

  13. A large-scale perspective on stress-induced alterations in resting-state networks

    Science.gov (United States)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how that effect relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. To characterize this change, we employed statistical enrichment analysis, identifying anatomical structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change, between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight into stress-induced neural modulations and their relation to subjective experience.
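Statistical enrichment of changed connections between a pair of anatomical structures can be framed as a one-sided hypergeometric test: is the structure pair over-represented among the altered parcel-pairs? The sketch below uses hypothetical counts and is not the authors' exact pipeline.

```python
from scipy.stats import hypergeom

def enrichment_pvalue(total_pairs, structure_pairs, changed_pairs, overlap):
    """One-sided hypergeometric tail P(X >= overlap): probability of
    seeing at least `overlap` structure-linking pairs among the
    `changed_pairs` drawn from `total_pairs`, of which
    `structure_pairs` connect the two structures of interest."""
    return hypergeom.sf(overlap - 1, total_pairs, structure_pairs,
                        changed_pairs)

# Hypothetical numbers: 10,000 parcel-pairs overall, 200 of which link
# structures A and B; 490 pairs changed under stress, 30 of them A-B.
p = enrichment_pvalue(10000, 200, 490, 30)
```

With the expected overlap at roughly 9.8 pairs under the null, observing 30 yields a very small p-value, i.e. a significant enrichment.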

  14. An integrated assessment of a large-scale biodiesel production in Italy: Killing several birds with one stone?

    International Nuclear Information System (INIS)

    Russi, Daniela

    2008-01-01

    Biofuels are often presented as a contribution towards solving the problems related to our strong dependency on fossil fuels, i.e., the greenhouse effect, energy dependency and urban pollution, besides being a way to support rural development. In this paper, an integrated assessment approach is employed to discuss the social desirability of large-scale biodiesel production in Italy, taking into account social, environmental and economic factors. The conclusion is that the advantages in terms of reduction of greenhouse gas emissions, energy dependency and urban pollution would be very modest. These small benefits would not be enough to offset the huge costs in terms of land requirement: if the target of the European Directive 2003/30/EC were reached (5.75% of the energy used for transport by 2010), the equivalent of about one-third of the Italian agricultural land would be needed. The consequences would be a considerable increase in food imports and large environmental impacts in the agricultural phase. Also, since biodiesel must be de-taxed to make it competitive with oil-derived diesel, Italian energy tax revenues would be reduced. In the end, rural development remains the only sound reason to promote biodiesel, but even for this objective other strategies, such as supporting organic agriculture, look more advisable. (author)

  15. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  16. Role of medium heterogeneity and viscosity contrast in miscible flow regimes and mixing zone growth: A computational pore-scale approach

    Science.gov (United States)

    Afshari, Saied; Hejazi, S. Hossein; Kantzas, Apostolos

    2018-05-01

    Miscible displacement of fluids in porous media is often characterized by the scaling of the mixing zone length with displacement time. Depending on the viscosity contrast of fluids, the scaling law varies between the square root relationship, a sign for dispersive transport regime during stable displacement, and the linear relationship, which represents the viscous fingering regime during an unstable displacement. The presence of heterogeneities in a porous medium significantly affects the scaling behavior of the mixing length as it interacts with the viscosity contrast to control the mixing of fluids in the pore space. In this study, the dynamics of the flow and transport during both unit and adverse viscosity ratio miscible displacements are investigated in heterogeneous packings of circular grains using pore-scale numerical simulations. The pore-scale heterogeneity level is characterized by the variations of the grain diameter and velocity field. The growth of mixing length is employed to identify the nature of the miscible transport regime at different viscosity ratios and heterogeneity levels. It is shown that as the viscosity ratio increases to higher adverse values, the scaling law of mixing length gradually shifts from dispersive to fingering nature up to a certain viscosity ratio and remains almost the same afterwards. In heterogeneous media, the mixing length scaling law is observed to be generally governed by the variations of the velocity field rather than the grain size. Furthermore, the normalization of mixing length temporal plots with respect to the governing parameters of viscosity ratio, heterogeneity, medium length, and medium aspect ratio is performed. The results indicate that mixing length scales exponentially with log-viscosity ratio and grain size standard deviation while the impact of aspect ratio is insignificant. 
For stable flows, mixing length scales with the square root of medium length, whereas it changes linearly with length during
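The transport regime can be diagnosed by fitting the slope of log mixing length versus log time: a slope near 0.5 indicates dispersive growth (stable displacement) and a slope near 1 indicates fingering. A sketch with synthetic growth curves (the prefactors are arbitrary):

```python
import numpy as np

def scaling_exponent(t, L):
    # Slope of log(L) vs log(t): ~0.5 for dispersive growth,
    # ~1.0 for linear (viscous fingering) growth.
    slope, _intercept = np.polyfit(np.log(t), np.log(L), 1)
    return slope

t = np.linspace(1.0, 100.0, 50)
dispersive = 2.0 * np.sqrt(t)   # L ~ t^0.5, stable displacement
fingering = 0.3 * t             # L ~ t^1,   unstable displacement
```

In practice one fits over the late-time window, after the initial diffusive transient, and the exponent interpolates between the two limits as viscosity ratio and heterogeneity vary.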

  17. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  18. Service Oriented Integration of Distributed Heterogeneous IT Systems in Production Engineering Using Information Standards and Linked Data

    Directory of Open Access Journals (Sweden)

    Navid Shariat Zadeh

    2017-01-01

    Full Text Available While the design of production systems based on digital models brings benefits, the communication of models comes with challenges, since models typically reside in a heterogeneous IT environment using different syntax and semantics. Coping with heterogeneity requires a smart integration strategy. One main paradigm for integrating data and IT systems is to deploy information standards. In particular, ISO 10303 STEP has been endorsed as a suitable standard for exchanging a wide variety of product manufacturing data. On the other hand, service-oriented tool integration solutions are progressively being adopted for the integration of data and IT tools, especially with the emergence of Open Services for Lifecycle Collaboration, whose focus is on the linking of data from heterogeneous software tools. In practice, a combination of these approaches is needed to facilitate the integration process. Hence, the aim of this paper is to investigate the applications of these approaches and the principles behind them, and to find criteria for where to use which approach. In addition, we explore the synergy between them and consequently suggest an approach based on their combination. A systematic approach is also suggested to identify the required levels of integration and their corresponding approaches, exemplified in a typical IT system architecture in Production Engineering.

  19. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tends to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.

  20. Analysis of Surface Heterogeneity Effects with Mesoscale Terrestrial Modeling Platforms

    Science.gov (United States)

    Simmer, C.

    2015-12-01

    An improved understanding of the full variability in the weather and climate system is crucial for reducing the uncertainty in weather forecasting and climate prediction, and to aid policy makers in developing adaptation and mitigation strategies. A yet unknown part of the uncertainty in predictions from numerical models is caused by the neglect of non-resolved land surface heterogeneity and sub-surface dynamics, and their potential impact on the state of the atmosphere. At the same time, mesoscale numerical models using finer horizontal grid resolution [O(1) km] can suffer from inconsistencies and neglected scale-dependencies in ABL parameterizations and from non-resolved effects of integrated surface-subsurface lateral flow at this scale. Our present knowledge suggests large-eddy simulation (LES) as an eventual solution to overcome the inadequacy of the physical parameterizations of the atmosphere in this transition scale, yet we are constrained by computational resources, memory management, and big-data handling when using LES for regional domains. For the present, there is a need for scale-aware parameterizations not only in the atmosphere but also in the land surface and subsurface model components. In this study, we use the recently developed Terrestrial Systems Modeling Platform (TerrSysMP) as a numerical tool to analyze the uncertainty in the simulation of surface exchange fluxes and boundary layer circulations at grid resolutions of the order of 1 km, and explore the sensitivity of the atmospheric boundary layer evolution and convective rainfall processes to land surface heterogeneity.

  1. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    Science.gov (United States)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of the individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis; (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
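Step (2) above, statistical experimentation and approximation, typically replaces an expensive subsystem analysis with a cheap surrogate such as a least-squares response surface. A minimal one-variable sketch with a hypothetical response function (real applications extend the design matrix to cross terms of many design variables):

```python
import numpy as np

def fit_quadratic_surrogate(X, y):
    """Least-squares fit of y ~ c0 + c1*x + c2*x^2 for a single design
    variable, standing in for an expensive subsystem analysis."""
    A = np.vander(X, 3, increasing=True)  # columns: 1, x, x^2
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Hypothetical "expensive" subsystem response sampled at design points
X = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 1.0 + 2.0 * X - 0.5 * X**2  # true underlying response
c = fit_quadratic_surrogate(X, y)
```

During design exploration the surrogate is then queried thousands of times in place of the full subsystem model, which is what makes the concurrent system/subsystem iteration tractable.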

  2. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

    The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should progress with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, and on an unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce inspection efforts. NMCC has been studying the following measures for large scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near-real-time shipper/receiver difference monitoring; (3) a near-real-time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large-scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; and (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)

  3. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    In recent decades, heterogeneous materials have been extensively used in various industries such as aerospace, defense and automotive due to their desirable specific properties and excellent capability of accumulating damage. Despite their wide use, there are numerous challenges associated with the application of these materials. One of the main challenges is the lack of accurate tools to predict the initiation, progression and final failure of these materials under various thermomechanical loading conditions. Although failure is usually treated at the macro- and meso-scale level, the initiation and growth of failure is a complex phenomenon acting across multiple scales. The objective of this work is to enable the mechanics of structure genome (MSG) and its companion code SwiftComp to analyze the initial failure (also called static failure), progressive failure, and fatigue failure of heterogeneous materials using a micromechanics approach. The initial failure is evaluated at each numerical integration point using pointwise and nonlocal approaches for each constituent of the heterogeneous material. The effects of imperfect interfaces among constituents of heterogeneous materials are also investigated using a linear traction-displacement model. Moreover, the progressive and fatigue damage analyses are conducted using a continuum damage mechanics (CDM) approach. Various failure criteria are applied at a material point to analyze progressive damage in each constituent. The constitutive equation of a damaged material is formulated based on a consistent irreversible thermodynamics approach. The overall tangent modulus of uncoupled elastoplastic damage for negligible back-stress effect is derived. The initiation of plasticity and damage in each constituent is evaluated at each numerical integration point using a nonlocal approach. The accumulated plastic strain and anisotropic damage evolution variables are iteratively solved using an incremental algorithm.
The damage analyses
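A minimal sketch of the CDM ingredients described above, using a hypothetical linear softening law (not the dissertation's actual model): an irreversible scalar damage variable d in [0, 1] grows with strain beyond a threshold and degrades the effective stress as sigma = (1 - d) E eps.

```python
def update_damage(strain, strain_0=0.001, strain_f=0.01, d_prev=0.0):
    """Scalar damage with a hypothetical linear softening law:
    d = 0 below the threshold strain_0, d = 1 at the failure strain
    strain_f. Damage is irreversible, so d never decreases."""
    if strain <= strain_0:
        d = 0.0
    else:
        d = min(1.0, (strain - strain_0) / (strain_f - strain_0))
    return max(d, d_prev)  # enforce irreversibility

def damaged_stress(E, strain, d):
    # Effective-stress concept: the damaged stiffness is (1 - d) * E.
    return (1.0 - d) * E * strain
```

In an incremental analysis, `update_damage` would be called at each integration point per load step, with `d_prev` carried over from the previous converged state.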

  4. Steffensen's Integral Inequality on Time Scales

    Directory of Open Access Journals (Sweden)

    Ozkan Umut Mutlu

    2007-01-01

    Full Text Available We establish generalizations of Steffensen's integral inequality on time scales via the diamond- dynamic integral, which is defined as a linear combination of the delta and nabla integrals.
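For reference, the classical (real-line) form of Steffensen's inequality that the paper generalizes states the following; the diamond-alpha dynamic integral version reduces to it when the time scale is the real numbers.

```latex
\text{If } f \text{ is nonincreasing on } [a,b],\quad
0 \le g(t) \le 1 \text{ on } [a,b],\quad
\lambda = \int_a^b g(t)\,dt,
\text{ then }
\int_{b-\lambda}^{b} f(t)\,dt
\;\le\; \int_a^b f(t)\,g(t)\,dt
\;\le\; \int_a^{a+\lambda} f(t)\,dt .
```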

  5. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package 'ATLAS' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  6. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  7. An integrated system for large scale scanning of nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Bozza, Cristiano, E-mail: kryss@sa.infn.it [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); D’Ambrosio, Nicola [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); De Lellis, Giovanni [University of Napoli and INFN, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); De Serio, Marilisa [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Di Capua, Francesco [INFN Napoli, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Crescenzo, Antonia [University of Napoli and INFN, Complesso Universitario di Monte Sant'Angelo, via Cintia Ed. G, Napoli 80126 (Italy); Di Ferdinando, Donato [INFN Bologna, viale B. Pichat 6/2, Bologna 40127 (Italy); Di Marco, Natalia [Laboratori Nazionali del Gran Sasso, S.S. 17 BIS km 18.910, Assergi (AQ) 67010 (Italy); Esposito, Luigi Salvatore [Laboratori Nazionali del Gran Sasso, now at CERN, Geneva (Switzerland); Fini, Rosa Anna [INFN Bari, via E. Orabona 4, Bari 70125 (Italy); Giacomelli, Giorgio [University of Bologna and INFN, viale B. Pichat 6/2, Bologna 40127 (Italy); Grella, Giuseppe [University of Salerno and INFN, via Ponte Don Melillo, Fisciano 84084 (Italy); Ieva, Michela [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); Kose, Umut [INFN Padova, via Marzolo 8, Padova (PD) 35131 (Italy); Longhin, Andrea; Mauri, Nicoletta [INFN Laboratori Nazionali di Frascati, via E. Fermi 40, Frascati (RM) 00044 (Italy); Medinaceli, Eduardo [University of Padova and INFN, via Marzolo 8, Padova (PD) 35131 (Italy); Monacelli, Piero [University of L'Aquila and INFN, via Vetoio Loc. Coppito, L'Aquila (AQ) 67100 (Italy); Muciaccia, Maria Teresa; Pastore, Alessandra [University of Bari and INFN, via E. Orabona 4, Bari 70125 (Italy); and others

    2013-03-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m^2 to tens of m^2, acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational Data Base system is the backbone of the whole infrastructure, storing the data themselves and not only catalogues of data files, as in common practice, being a unique case in high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  8. An integrated system for large scale scanning of nuclear emulsions

    International Nuclear Information System (INIS)

    Bozza, Cristiano; D’Ambrosio, Nicola; De Lellis, Giovanni; De Serio, Marilisa; Di Capua, Francesco; Di Crescenzo, Antonia; Di Ferdinando, Donato; Di Marco, Natalia; Esposito, Luigi Salvatore; Fini, Rosa Anna; Giacomelli, Giorgio; Grella, Giuseppe; Ieva, Michela; Kose, Umut; Longhin, Andrea; Mauri, Nicoletta; Medinaceli, Eduardo; Monacelli, Piero; Muciaccia, Maria Teresa; Pastore, Alessandra

    2013-01-01

    The European Scanning System, developed to analyse nuclear emulsions at high speed, has been completed with the development of a high level software infrastructure to automate and support large-scale emulsion scanning. In one year, an average installation is capable of performing data-taking and online analysis on a total surface ranging from a few m^2 to tens of m^2, acquiring many billions of tracks, corresponding to several TB. This paper focuses on the procedures that have been implemented and on their impact on physics measurements. The system proved robust, reliable, fault-tolerant and user-friendly, and seldom needs assistance. A dedicated relational Data Base system is the backbone of the whole infrastructure, storing the data themselves and not only catalogues of data files, as in common practice, being a unique case in high-energy physics DAQ systems. The logical organisation of the system is described and a summary is given of the physics measurements that are readily available by automated processing.

  9. InterMine: a flexible data warehouse system for the integration and analysis of heterogeneous biological data.

    Science.gov (United States)

    Smith, Richard N; Aleksic, Jelena; Butano, Daniela; Carr, Adrian; Contrino, Sergio; Hu, Fengyuan; Lyne, Mike; Lyne, Rachel; Kalderimis, Alex; Rutherford, Kim; Stepan, Radek; Sullivan, Julie; Wakeling, Matthew; Watkins, Xavier; Micklem, Gos

    2012-12-01

    InterMine is an open-source data warehouse system that facilitates the building of databases with complex data integration requirements and a need for a fast customizable query facility. Using InterMine, large biological databases can be created from a range of heterogeneous data sources, and the extensible data model allows for easy integration of new data types. The analysis tools include a flexible query builder, genomic region search and a library of 'widgets' performing various statistical analyses. The results can be exported in many commonly used formats. InterMine is a fully extensible framework where developers can add new tools and functionality. Additionally, there is a comprehensive set of web services, for which client libraries are provided in five commonly used programming languages. Freely available from http://www.intermine.org under the LGPL license. g.micklem@gen.cam.ac.uk Supplementary data are available at Bioinformatics online.

  10. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    International Nuclear Information System (INIS)

    Williams, Paul T.; Yin, Shengjun; Klasky, Hilda B.; Bass, Bennett Richard

    2011-01-01

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite

  11. Studies on improvement of tomato productivity in a large-scale greenhouse: Prediction of tomato yield based on integrated solar radiation

    International Nuclear Information System (INIS)

    Hisaeda, K.; Nishina, H.

    2007-01-01

    As there are currently many large-scale production facilities that have contracts with large retailing companies, accurate prediction of yield is necessary. The present study developed a method to predict tomato yield accurately using data on the outside solar radiation. The study was conducted in a Venlo-type greenhouse (29,568 m^2) at Sera Farm Co., Ltd. in Sera-cho, Hiroshima prefecture. The cultivar used for this experiment was a plum tomato. Sowing took place on July 18, planting on August 30, and harvesting started on October 9, 2002. The planting density was 2.5 plants m^-2. From the analysis of the correlation between the weekly tomato yield and the integrated solar radiation for the period from October 7 to July 28 (43 weeks), the highest correlation (r = 0.518) was observed between the weekly tomato yield and the solar radiation integrated from seven to one weeks before harvest. Further correlation analysis was conducted for the 25-week period from December 8 to May 26, during which the effects of growth stage and air temperature were considered to be relatively small. The results showed the highest correlation (r = 0.730) between the weekly tomato yield and the solar radiation integrated from eight to one weeks before harvest. The tomato yield occasionally needed to be adjusted at Sera Farm. Consequently, the correlation between the three-week moving average of tomato yield and the integrated solar radiation was calculated. The highest correlation was again obtained for the period from eight to one weeks before harvest (r = 0.860). This study therefore showed that it is possible to predict the tomato yield (y: kg.m^-2.week^-1) from the solar radiation integrated from eight to one weeks before harvest (x: MJ.m^-2) using the equation y = 7.50 x 10^-6 x + 0.148 (r^2 = 0.740).
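The reported regression can be applied directly; the function below implements the equation from the abstract, with y the weekly yield in kg per m^2 and x the solar radiation (MJ per m^2) integrated from eight to one weeks before harvest. The example input value is arbitrary.

```python
def predict_weekly_yield(x_integrated_mj_per_m2):
    """Weekly tomato yield (kg m^-2 week^-1) predicted from solar
    radiation integrated over eight to one weeks before harvest,
    using the regression reported in the study (r^2 = 0.740)."""
    return 7.50e-6 * x_integrated_mj_per_m2 + 0.148

# Example: 10,000 MJ m^-2 of integrated radiation (arbitrary input)
y = predict_weekly_yield(10000.0)
```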

  12. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  13. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  14. 2 μm wavelength range InP-based type-II quantum well photodiodes heterogeneously integrated on silicon photonic integrated circuits.

    Science.gov (United States)

    Wang, Ruijun; Sprengel, Stephan; Muneeb, Muhammad; Boehm, Gerhard; Baets, Roel; Amann, Markus-Christian; Roelkens, Gunther

    2015-10-05

    The heterogeneous integration of InP-based type-II quantum well photodiodes on silicon photonic integrated circuits for the 2 µm wavelength range is presented. A responsivity of 1.2 A/W at a wavelength of 2.32 µm and 0.6 A/W at 2.4 µm wavelength is demonstrated. The photodiodes have a dark current of 12 nA at -0.5 V at room temperature. The absorbing active region of the integrated photodiodes consists of six periods of a "W"-shaped quantum well, also allowing for laser integration on the same platform.

  15. A Normalization-Free and Nonparametric Method Sharpens Large-Scale Transcriptome Analysis and Reveals Common Gene Alteration Patterns in Cancers.

    Science.gov (United States)

    Li, Qi-Gang; He, Yong-Han; Wu, Huan; Yang, Cui-Ping; Pu, Shao-Yan; Fan, Song-Qing; Jiang, Li-Ping; Shen, Qiu-Shuo; Wang, Xiao-Xiong; Chen, Xiao-Qiong; Yu, Qin; Li, Ying; Sun, Chang; Wang, Xiangting; Zhou, Jumin; Li, Hai-Peng; Chen, Yong-Bin; Kong, Qing-Peng

    2017-01-01

    Heterogeneity in transcriptional data hampers the identification of differentially expressed genes (DEGs) and understanding of cancer, essentially because current methods rely on cross-sample normalization and/or distribution assumptions, both sensitive to heterogeneous values. Here, we developed a new method, Cross-Value Association Analysis (CVAA), which overcomes the limitation and is more robust to heterogeneous data than the other methods. Applying CVAA to a more complex pan-cancer dataset containing 5,540 transcriptomes discovered numerous new DEGs and many previously rarely explored pathways/processes; some of them were validated, both in vitro and in vivo, to be crucial in tumorigenesis, e.g., alcohol metabolism (ADH1B), chromosome remodeling (NCAPH) and complement system (Adipsin). Together, we present a sharper tool to navigate large-scale expression data and gain new mechanistic insights into tumorigenesis.

  16. Assessment of the integrity of degraded steam generator tube by the use of heterogeneous finite element method

    International Nuclear Information System (INIS)

    Duan, X.; Kozluk, M.; Pagan, S.; Mills, B.

    2006-01-01

    3-D. This two-scale (micro-macro) model takes into account the heterogeneous microstructural distribution and the consequential scatter in the mechanical properties. These inhomogeneities are then explicitly incorporated into a large deformation finite element program. In this work, the application of this HFEM to the assessment of integrity of degraded steam generator tubes is illustrated. The HFEM is validated by comparing the predicted failure modes and failure pressure with experimental tests for the tubes with uniformly thinned circumferential defects and various axial defects. The Taguchi experimental design method is then applied to prioritize the influencing parameters that affect the integrity of degraded steam generator tubes such as the defect length, depth, morphology, position and the number of defects, and internal pressure. (author)

  17. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research so far shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  18. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of this project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.

  19. Modeling fine-scale geological heterogeneity-examples of sand lenses in tills

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Comunian, Alessandro; Oriani, Fabio

    2013-01-01

    Sand lenses at various spatial scales are recognized to add heterogeneity to glacial sediments. They have high hydraulic conductivities relative to the surrounding till matrix and may affect the advective transport of water and contaminants in clayey till settings. Sand lenses were investigated on till outcrops, producing binary images of geological cross-sections capturing the size, shape and distribution of individual features. Sand lenses occur as elongated, anisotropic geobodies that vary in size and extent and show strong non-stationary patterns on section images that hamper subsequent simulation. Transition probability (TP) and multiple-point statistics (MPS) were employed to simulate sand lens heterogeneity. We used one cross-section to parameterize the spatial correlation and a second, parallel section as a reference that allowed testing the quality...

  20. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment...

  1. The transport sector's potential contribution to the flexibility in the power sector required by large-scale wind power integration

    DEFF Research Database (Denmark)

    Nørgård, Per Bromand; Lund, H.; Mathiesen, B.V.

    2007-01-01

    In 2006, the Danish Society of Engineers developed a visionary plan for the Danish energy system in 2030. The paper presents and qualifies selected parts of the analyses, illustrating the transport sector's potential to contribute to the flexibility in the power sector that is necessary for large-scale integration of renewable energy in the power system, in particular wind power. In the plan, 20 % of the road transport is based on electricity and 20 % on biofuels. This, together with other initiatives, allows for up to 55-60 % wind power penetration in the power system. A fleet of 0.5 million electric vehicles in Denmark in 2030 connected to the grid 50 % of the time represents an aggregated flexible power capacity of 1-1.5 GW and an energy capacity of 10-150 GWh.

  2. A Sensor Middleware for integration of heterogeneous medical devices.

    Science.gov (United States)

    Brito, M; Vale, L; Carvalho, P; Henriques, J

    2010-01-01

    In this paper, the architecture of a modular, service-oriented, Sensor Middleware for data acquisition and processing is presented. The described solution was developed with the purpose of solving two increasingly relevant problems in the context of modern pHealth systems: i) to aggregate a number of heterogeneous, off-the-shelf, devices from which clinical measurements can be acquired and ii) to provide access and integration with an 802.15.4 network of wearable sensors. The modular nature of the Middleware provides the means to easily integrate pre-processing algorithms into processing pipelines, as well as new drivers for adding support for new sensor devices or communication technologies. Tests performed with both real and artificially generated data streams show that the presented solution is suitable for use both in a Windows PC or a Windows Mobile PDA with minimal overhead.

  3. Characterizing hydrogeologic heterogeneity using lithologic data

    International Nuclear Information System (INIS)

    Flach, G.P.; Hamm, L.L.; Harris, M.K.; Thayer, P.A.; Haselow, J.S.; Smits, A.D.

    1995-01-01

    Large-scale (> 1 m) variability in hydraulic conductivity is usually the main influence on field-scale groundwater flow patterns and dispersive transport. Sediment lithologic descriptions and geophysical logs typically offer finer spatial resolution, and therefore more potential information about site-scale heterogeneity, than other site characterization data. In this study, a technique for generating a heterogeneous, three-dimensional hydraulic conductivity field from sediment lithologic descriptions is presented. The approach involves creating a three-dimensional, fine-scale representation of mud (silt + clay) percentage using a stratified interpolation algorithm. Mud percentage is then translated into horizontal and vertical conductivity using direct correlations derived from measured data and inverse groundwater flow modeling. Lastly, the fine-scale conductivity fields are averaged to create a coarser grid for use in groundwater flow and transport modeling. The approach is demonstrated using a finite-element groundwater flow model of a Savannah River Site solid radioactive and hazardous waste burial ground. Hydrostratigraphic units in the area consist of fluvial, deltaic, and shallow marine sand, mud and calcareous sediment that exhibit abrupt facies changes over short distances
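
    The two-step approach (translate a fine-scale mud-percentage field into conductivity, then average to a coarser grid) can be sketched as follows. The log-linear correlation coefficients and the averaging rules below are placeholder assumptions for illustration; the study derived its correlations from measured data and inverse flow modeling:

```python
import numpy as np

# Sketch of the lithology-to-conductivity workflow. The correlation below is
# an assumed placeholder, not the correlation derived in the study.

def mud_to_log10_kh(mud_pct):
    """Assumed log-linear mud%-to-conductivity correlation (log10 cm/s)."""
    return -2.0 - 0.03 * mud_pct

rng = np.random.default_rng(0)
mud = rng.uniform(0.0, 100.0, size=(8, 8, 8))   # fine-scale mud % field
kh = 10.0 ** mud_to_log10_kh(mud)               # fine-scale conductivity

# Upscale the 8x8x8 fine block to one coarse cell: arithmetic mean for
# horizontal flow (layers in parallel), harmonic mean for vertical flow
# (layers in series), a common averaging choice, not necessarily the paper's.
kh_coarse = kh.mean()
kv_coarse = kh.size / np.sum(1.0 / kh)

print(kh_coarse, kv_coarse)  # harmonic mean never exceeds arithmetic mean
```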

  4. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  5. Economics of intermittent renewable energy sources: four essays on large-scale integration into European power systems

    International Nuclear Information System (INIS)

    Henriot, Arthur

    2014-01-01

    This thesis centres on issues of economic efficiency originating from the large-scale development of intermittent renewable energy sources (RES) in Europe. The flexible resources that are necessary to cope with their specificities (variability, low predictability, site specificity) are already known, but adequate signals are required to foster efficient operation and investment in these resources. A first question is to what extent intermittent RES can remain out of the market at times when they are the main driver of investment and operation in power systems. A second question is whether the current market design is adapted to their specificities. These two questions are tackled in four distinct contributions. The first chapter is a critical literature review. This analysis introduces and confronts two (often implicit) paradigms for RES integration. It then identifies and discusses a set of evolutions required to develop a market design adapted to the large-scale development of RES, such as new definitions of the products exchanged and reorganisation of the sequence of electricity markets. In the second chapter, an analytical model is used to assess the potential of intra-day markets as a flexibility provider to intermittent RES with low production predictability. This study highlights and demonstrates how the potential of intra-day markets is heavily dependent on the evolution of the forecast errors. The third chapter focuses on the benefits of curtailing the production by intermittent RES, as a tool to smooth out their variability and reduce overall generation costs. Another analytical model is employed to anatomise the relationship between these benefits and a set of pivotal parameters. Special attention is also paid to the allocation of these benefits between the different stakeholders. In the fourth chapter, a numerical simulation is used to evaluate the ability of the European transmission system operators to tackle the investment wave required in order to

  6. A modular approach to large-scale design optimization of aerospace systems

    Science.gov (United States)

    Hwang, John T.

    Gradient-based optimization and the adjoint method form a synergistic combination that enables the efficient solution of large-scale optimization problems. Though the gradient-based approach struggles with non-smooth or multi-modal problems, the capability to efficiently optimize up to tens of thousands of design variables provides a valuable design tool for exploring complex tradeoffs and finding unintuitive designs. However, the widespread adoption of gradient-based optimization is limited by the implementation challenges for computing derivatives efficiently and accurately, particularly in multidisciplinary and shape design problems. This thesis addresses these difficulties in two ways. First, to deal with the heterogeneity and integration challenges of multidisciplinary problems, this thesis presents a computational modeling framework that solves multidisciplinary systems and computes their derivatives in a semi-automated fashion. This framework is built upon a new mathematical formulation developed in this thesis that expresses any computational model as a system of algebraic equations and unifies all methods for computing derivatives using a single equation. The framework is applied to two engineering problems: the optimization of a nanosatellite with 7 disciplines and over 25,000 design variables; and simultaneous allocation and mission optimization for commercial aircraft involving 330 design variables, 12 of which are integer variables handled using the branch-and-bound method. In both cases, the framework makes large-scale optimization possible by reducing the implementation effort and code complexity. The second half of this thesis presents a differentiable parametrization of aircraft geometries and structures for high-fidelity shape optimization. Existing geometry parametrizations are not differentiable, or they are limited in the types of shape changes they allow. This is addressed by a novel parametrization that smoothly interpolates aircraft

  7. Modeling Transport of Cesium in Grimsel Granodiorite With Micrometer Scale Heterogeneities and Dynamic Update of Kd

    Science.gov (United States)

    Voutilainen, Mikko; Kekäläinen, Pekka; Siitari-Kauppi, Marja; Sardini, Paul; Muuri, Eveliina; Timonen, Jussi; Martin, Andrew

    2017-11-01

    Transport and retardation of cesium in Grimsel granodiorite taking into account heterogeneity of mineral and pore structure was studied using rock samples overcored from an in situ diffusion test at the Grimsel Test Site. The field test was part of the Long-Term Diffusion (LTD) project designed to characterize retardation properties (diffusion and distribution coefficients) under in situ conditions. Results of the LTD experiment for cesium showed that in-diffusion profiles and spatial concentration distributions were strongly influenced by the heterogeneous pore structure and mineral distribution. In order to study the effect of heterogeneity on the in-diffusion profile and spatial concentration distribution, a Time Domain Random Walk (TDRW) method was applied along with a feature for modeling chemical sorption in geological materials. A heterogeneous mineral structure of Grimsel granodiorite was constructed using X-ray microcomputed tomography (X-μCT) and the map was linked to previous results for mineral specific porosities and distribution coefficients (Kd) that were determined using C-14-PMMA autoradiography and batch sorption experiments, respectively. After this the heterogeneous structure contains information on local porosity and Kd in 3-D. It was found that the heterogeneity of the mineral structure on the micrometer scale affects significantly the diffusion and sorption of cesium in Grimsel granodiorite at the centimeter scale. Furthermore, the modeled in-diffusion profiles and spatial concentration distributions show similar shape and pattern to those from the LTD experiment. It was concluded that the use of detailed structure characterization and quantitative data on heterogeneity can significantly improve the interpretation and evaluation of transport experiments.
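
    A minimal 1-D sketch of the TDRW idea with cell-by-cell retardation follows (all parameter values and the lattice setup are assumptions for illustration, not the authors' implementation):

```python
import random

# Minimal 1-D Time Domain Random Walk sketch with heterogeneous sorption
# (illustrative only). A particle takes fixed spatial steps; the transit time
# in each cell is exponentially distributed with a mean scaled by the local
# retardation factor R = 1 + rho_b*Kd/porosity. All values are assumptions.

random.seed(1)
dx, D, rho_b = 1e-3, 1e-10, 2650.0   # step (m), diffusivity (m²/s), bulk density (kg/m³)
n_cells = 20
porosity = [random.uniform(0.005, 0.02) for _ in range(n_cells)]
kd = [random.uniform(1e-4, 1e-3) for _ in range(n_cells)]   # local Kd (m³/kg)

def travel_time():
    """Time for one particle to diffuse from cell 0 past the last cell."""
    pos, t = 0, 0.0
    while pos < n_cells:
        R = 1.0 + rho_b * kd[pos] / porosity[pos]   # local retardation
        tau = R * dx * dx / (2.0 * D)               # mean transit time in cell
        t += random.expovariate(1.0 / tau)          # exponential waiting time
        pos = max(pos + random.choice((-1, 1)), 0)  # unbiased jump, reflecting inlet
    return t

times = [travel_time() for _ in range(20)]
print(sum(times) / len(times))   # mean breakthrough time (s)
```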

  8. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  9. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    Science.gov (United States)

    Mackey, Aaron J; Pearson, William R

    2004-10-01

    Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. They are essential for the management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes how relational databases can improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It covers the installation and use of a simple protein sequence database, seqdb_demo, which serves as the basis for the other protocols; these include basic use of the database to generate a novel sequence library subset, extending seqdb_demo to store sequence similarity search results, and using various kinds of stored search results to address aspects of comparative genomic analysis.
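
    The subset-library idea can be sketched with any relational database; a minimal example using SQLite follows (the schema and all names are illustrative, not the actual seqdb_demo schema):

```python
import sqlite3

# Sketch of the subset-library idea behind seqdb_demo: store sequences with
# annotations in a relational table, then carve out a taxon-restricted subset
# for a focused similarity search. Schema and names are illustrative.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE protein (acc TEXT PRIMARY KEY, taxon TEXT, seq TEXT)")
con.executemany("INSERT INTO protein VALUES (?, ?, ?)", [
    ("P1", "E. coli", "MKT"),
    ("P2", "H. sapiens", "MEEPQ"),
    ("P3", "E. coli", "MSRL"),
])

# Subset library: restrict the search space to a taxon likely to contain
# homologs, improving the statistical significance of subsequent searches.
subset = [acc for (acc,) in con.execute(
    "SELECT acc FROM protein WHERE taxon = ? ORDER BY acc", ("E. coli",))]
print(subset)  # → ['P1', 'P3']
```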

  10. Bursting and large-scale intermittency in turbulent convection with differential rotation

    International Nuclear Information System (INIS)

    Garcia, O.E.; Bian, N.H.

    2003-01-01

    The tilting mechanism, which generates differential rotation in two-dimensional turbulent convection, is shown to produce relaxation oscillations in the mean flow energy integral and bursts in the global fluctuation level, akin to Lotka-Volterra oscillations. The basic reason for such behavior is the unidirectional and conservative transfer of kinetic energy from the fluctuating motions to the mean component of the flows, and its dissipation at large scales. Results from numerical simulations further demonstrate the intimate relation between these low-frequency modulations and the large-scale intermittency of convective turbulence, as manifested by exponential tails in single-point probability distribution functions. Moreover, the spatio-temporal evolution of convective structures illustrates the mechanism triggering avalanche events in the transport process. The latter involves the overlap of delocalized mixing regions when the barrier to transport, produced by the mean component of the flow, transiently disappears
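
    The analogy to Lotka-Volterra dynamics can be illustrated with a toy two-variable model in which the fluctuation energy E plays the prey and the mean-flow energy U the predator (the equations and parameters are illustrative only, not the paper's model):

```python
# Toy Lotka-Volterra analogy (illustrative only): the fluctuation energy E
# grows at rate g and is suppressed by the mean-flow energy U, which in turn
# is fed conservatively by E and damped by large-scale dissipation mu.
# Parameters and initial conditions are arbitrary.

def step(E, U, dt, g=1.0, mu=0.5):
    dE = (g - U) * E     # fluctuations grow, suppressed by the mean flow
    dU = (E - mu) * U    # mean flow fed by fluctuations, then dissipated
    return E + dt * dE, U + dt * dU

E, U, history = 0.1, 0.1, []
for _ in range(40000):                # forward-Euler integration, t = 0..40
    E, U = step(E, U, dt=1e-3)
    history.append(E)

# The fluctuation level bursts and collapses quasi-periodically
print(min(history), max(history))
```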

  11. Large-scale image-based profiling of single-cell phenotypes in arrayed CRISPR-Cas9 gene perturbation screens.

    Science.gov (United States)

    de Groot, Reinoud; Lüthi, Joel; Lindsay, Helen; Holtackers, René; Pelkmans, Lucas

    2018-01-23

    High-content imaging using automated microscopy and computer vision allows multivariate profiling of single-cell phenotypes. Here, we present methods for the application of the CRISPR-Cas9 system in large-scale, image-based, gene perturbation experiments. We show that CRISPR-Cas9-mediated gene perturbation can be achieved in human tissue culture cells in a timeframe that is compatible with image-based phenotyping. We developed a pipeline to construct a large-scale arrayed library of 2,281 sequence-verified CRISPR-Cas9 targeting plasmids and profiled this library for genes affecting cellular morphology and the subcellular localization of components of the nuclear pore complex (NPC). We conceived a machine-learning method that harnesses genetic heterogeneity to score gene perturbations and identify phenotypically perturbed cells for in-depth characterization of gene perturbation effects. This approach enables genome-scale image-based multivariate gene perturbation profiling using CRISPR-Cas9. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  12. Large-scale wind power integration and wholesale electricity trading benefits: Estimation via an ex post approach

    International Nuclear Information System (INIS)

    Gil, Hugo A.; Gomez-Quiles, Catalina; Riquelme, Jesus

    2012-01-01

    The integration of large-scale wind power has brought about a series of challenges to the power industry, but at the same time a number of benefits are being realized. Among those, the ability of wind power to cause a decline in the electricity market prices has been recognized. In quantifying this effect, some models used in recent years are based on simulations of the market supply-side and the price clearing process. The accuracy of the estimates depend on the quality of the input data, the veracity of the adopted scenarios and the rigorousness of the solution technique. In this work, a series of econometric techniques based on actual ex post wind power and electricity price data are implemented for the estimation of the impact of region-wide wind power integration on the local electricity market clearing prices and the trading savings that stem from this effect. The model is applied to the case of Spain, where the estimated savings are compared against actual credit and bonus expenses to ratepayers. The implications and extent of these results for current and future renewable energy policy-making are discussed. - Highlights: ► Wholesale electricity market trading benefits by wind power are quantified. ► Actual wind power forecast-based bids and electricity price data from Spain are used. ► Different econometric tools are used and compared for improved estimation accuracy. ► Estimated benefits outweigh current credit overhead paid to wind farms in Spain. ► An economically efficient benefit surplus allocation framework is proposed.

  13. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed with the large testing machine ZZ 8000 (maximum load 80 MN) at the SKODA WORKS. Results are presented from testing the material resistance to non-ductile fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  14. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a heavy workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site': housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values; starting up large-scale production would imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  15. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  16. Heterogeneous patterns enhancing static and dynamic texture classification

    International Nuclear Information System (INIS)

    Silva, Núbia Rosa da; Martinez Bruno, Odemir

    2013-01-01

    Some mixtures, such as colloids like milk, blood, and gelatin, appear homogeneous to the naked eye; however, observing them at the nanoscale reveals the heterogeneity of their components. The same phenomenon can occur in pattern recognition, where it is possible to see heterogeneous patterns in texture images. However, current methods of texture analysis cannot adequately describe such heterogeneous patterns. Common methods analyse the image information in a global way, taking all its features in an integrated manner. Multi-scale analysis examines the patterns at different scales, but still preserves the homogeneous analysis. On the other hand, various methods use textons to represent the texture, breaking texture down into its smallest unit. To tackle this problem, we propose a method to identify texture patterns that are not as small as textons at distinct scales, enhancing the separability among different types of texture. We find sub-patterns of texture according to the scale and then group similar patterns for a more refined analysis. Tests were performed on four static texture databases and one dynamic one. Results show that our method provides a better classification rate than conventional approaches for both static and dynamic textures.

  17. Towards a uniform and large-scale deposition of MoS2 nanosheets via sulfurization of ultra-thin Mo-based solid films.

    Science.gov (United States)

    Vangelista, Silvia; Cinquanta, Eugenio; Martella, Christian; Alia, Mario; Longo, Massimo; Lamperti, Alessio; Mantovan, Roberto; Basset, Francesco Basso; Pezzoli, Fabio; Molle, Alessandro

    2016-04-29

    Large-scale integration of MoS2 in electronic devices requires the development of reliable and cost-effective deposition processes, leading to uniform MoS2 layers on a wafer scale. Here we report on the detailed study of the heterogeneous vapor-solid reaction between a pre-deposited molybdenum solid film and sulfur vapor, thus resulting in a controlled growth of MoS2 films onto SiO2/Si substrates with a tunable thickness and cm²-scale uniformity. Based on Raman spectroscopy and photoluminescence, we show that the degree of crystallinity in the MoS2 layers is dictated by the deposition temperature and thickness. In particular, the MoS2 structural disorder observed at low temperature (<750 °C) and low thickness (two layers) evolves to a more ordered crystalline structure at high temperature (1000 °C) and high thickness (four layers). From an atomic force microscopy investigation prior to and after sulfurization, this parametrical dependence is associated with the inherent granularity of the MoS2 nanosheet that is inherited from the pristine morphology of the pre-deposited Mo film. This work paves the way to a closer control of the synthesis of wafer-scale and atomically thin MoS2, potentially extendable to other transition metal dichalcogenides and hence targeting massive and high-volume production for electronic device manufacturing.

  18. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly and have offered limited retrieval functionality. Commercial software integration strategies to unify distributed data and knowledge sources are still lacking. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy-to-perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous databases and communication protocols.

  19. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    From physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  20. The influence of small-scale interlayer heterogeneity on DDT removal efficiency for flushing technology

    Science.gov (United States)

    Wang, Xingwei; Chen, Jiajun

    2017-06-01

    With the aim of investigating the influence of small-scale interlayer heterogeneity on DDT removal efficiency, batch tests including surfactant-stabilized foam flushing and surfactant solution flushing were carried out. Two man-made heterogeneous patterns consisting of coarse and fine quartz sand were designed to reveal the influencing mechanism, and the removal mechanism and its corresponding contributions under foam flushing were quantified. Compared with surfactant solution flushing, the DDT removal efficiency of surfactant-stabilized foam flushing increased by 9.47% and 11.28% under heterogeneous patterns 1 and 2, respectively. For foam flushing, the DDT removal contributions of improved sweep efficiency were 40.82% and 45.98% for heterogeneous patterns 1 and 2, and the contributions of dissolving capacity were 59.18% and 54.02%, respectively. The dissolving capacity for DDT thus played the major role in removal efficiency by foam flushing under laboratory conditions, and the removal gained from the significantly improved sweep efficiency exceeded the removal lost to the weaker solubilizing ability of the foam film relative to solution flushing. The results also indicated that the difference in DDT removal efficiency between the two heterogeneous patterns decreased as the contribution of improved sweep efficiency increased. This suggests that foam flushing can reduce the disturbance caused by interlayer heterogeneity when remediating DDT-contaminated heterogeneous media.

  1. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  2. Factors Affecting the Rate of Penetration of Large-Scale Electricity Technologies: The Case of Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    James R. McFarland; Howard J. Herzog

    2007-05-14

    This project falls under the Technology Innovation and Diffusion topic of the Integrated Assessment of Climate Change Research Program. The objective was to better understand the critical variables that affect the rate of penetration of large-scale electricity technologies in order to improve their representation in integrated assessment models. We conducted this research in six integrated tasks. In our first two tasks, we identified potential factors that affect penetration rates through discussions with modeling groups and through case studies of historical precedent. In the next three tasks, we investigated in detail three potential sets of critical factors: industrial conditions, resource conditions, and regulatory/environmental considerations. Research to assess the significance and relative importance of these factors involved the development of a microeconomic, system dynamics model of the US electric power sector. Finally, we implemented the penetration rate models in an integrated assessment model. While the focus of this effort is on carbon capture and sequestration technologies, much of the work will be applicable to other large-scale energy conversion technologies.

  3. Large scale grid integration of renewable energy sources

    CERN Document Server

    Moreno-Munoz, Antonio

    2017-01-01

    This book presents comprehensive coverage of the means to integrate renewable power, namely wind and solar power. It looks at new approaches to meet the challenges, such as increasing interconnection capacity among geographical areas, hybridisation of different distributed energy resources and building up demand response capabilities.

  4. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    Science.gov (United States)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on soil and water ecosystems, endangering appropriate ecosystem functioning. Unsaturated soil transport processes play a key role in soil-water system functioning, as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater, and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify, since they are affected by huge variability of the governing properties at different space-time scales and by the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies and techniques will be given and illustrated for solving large-scale soil and water management problems. This will also allow identifying research needs in the interdisciplinary domain of modelling and monitoring, and improving the integration of unsaturated zone science in solving soil and water management issues. A focus will be given to examples of large-scale soil and water management problems in Europe.

  5. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size, such as in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  6. Incorporating the Impacts of Small Scale Rock Heterogeneity into Models of Flow and Trapping in Target UK CO2 Storage Systems

    Science.gov (United States)

    Jackson, S. J.; Reynolds, C.; Krevor, S. C.

    2017-12-01

    Predictions of the flow behaviour and storage capacity of CO2 in subsurface reservoirs depend on accurate modelling of multiphase flow and trapping. A number of studies have shown that small-scale rock heterogeneities have a significant impact on CO2 flow propagating to larger scales. The need to simulate flow in heterogeneous reservoir systems has led to the development of numerical upscaling techniques that are widely used in industry. Less well understood, however, is the best approach for incorporating laboratory characterisations of small-scale heterogeneities into models. At small scales, heterogeneity in the capillary pressure characteristic function becomes significant. We present a digital rock workflow that combines core-flood experiments with numerical simulations to characterise sub-core-scale capillary pressure heterogeneities within rock cores from several target UK storage reservoirs - the Bunter, Captain and Ormskirk sandstone formations. Measured intrinsic properties (permeability, capillary pressure, relative permeability) and 3D saturation maps from steady-state core-flood experiments were the primary inputs used to construct a 3D digital rock model in CMG IMEX. We used vertical end-point scaling to iteratively update the voxel-by-voxel capillary pressure curves from the average MICP curve, with each iteration more closely predicting the experimental saturations and pressure drops. Once characterised, the digital rock cores were used to predict equivalent flow functions, such as relative permeability and residual trapping, across the range of flow conditions estimated to prevail in the CO2 storage reservoirs. In the case of the Captain sandstone, rock cores were characterised across an entire 100 m vertical transect of the reservoir. This allowed analysis of the upscaled impact of small-scale heterogeneity on flow and trapping. Figure 1 shows the varying degree to which heterogeneity impacted flow depending on the capillary number in the

  7. Temporal and spatial heterogeneity in lacustrine δ13CDIC and δ18ODO signatures in a large mid-latitude temperate lake

    Directory of Open Access Journals (Sweden)

    Jane DRUMMOND

    2010-08-01

    Modelling limnetic carbon processes is necessary for accurate global carbon models, and stable isotope analysis can provide additional insight into carbon flow pathways. This research examined the spatial and temporal complexity of carbon cycling in a large temperate lake. Dissolved inorganic carbon (DIC) is utilised by photosynthetic organisms, and dissolved oxygen (DO) is used by heterotrophic organisms during respiration. Thus, the spatial heterogeneity in the pelagic metabolic balance in Loch Lomond, Scotland was investigated using a combined natural-abundance isotope technique. The isotopic signatures of dissolved inorganic carbon (δ13CDIC) and dissolved oxygen (δ18ODO) were measured concurrently on four different dates between November 2004 and September 2005. We measured isotopic variation over small and large spatial scales, in both horizontal distance and depth. δ13CDIC and δ18ODO changed over a seasonal cycle, becoming concurrently more positive (negative) in the summer (winter) months, responding to increased photosynthetic and respiratory rates, respectively. With increasing depth, δ13CDIC became more negative and δ18ODO more positive, reflecting the shift to a respiration-dominated system. The horizontal distribution of δ13CDIC and δ18ODO in the epilimnion was heterogeneous. In general, the south basin had the most positive δ13CDIC, which became more negative with increasing latitude, except in winter when the opposite pattern was observed. Areas of local variation were often observed near inflows. Clearly δ13CDIC and δ18ODO can show large spatial heterogeneity as a result of varying metabolic balance coupled with inflow proximity, and thus single-point sampling to extrapolate whole-lake metabolic patterns can result in error when modelling large lake systems. Whilst we advise caution when using single-point representation, we also show that this combined isotopic approach has potential to assist in constructing detailed lake carbon models.

  8. Numerical calculations on heterogeneity of groundwater flow

    International Nuclear Information System (INIS)

    Follin, S.

    1992-01-01

    The upscaling of model parameters is a key issue in many research fields concerned with parameter heterogeneity. The upscaling process allows for fewer model blocks and relaxes the numerical problems caused by high contrasts in the hydraulic conductivity. The trade-offs depend on the objective, but the general drawback is an increasing uncertainty about representativeness. The present study deals with numerical calculations of heterogeneity of groundwater flow and solute transport in hypothetical blocks of fractured hard rock at a '3 m scale' and addresses both conceptual and practical problems in numerical simulation. Evidence that the hydraulic conductivity (K) of the rock mass between major fracture zones is highly heterogeneous at a 3 m scale is provided by a large number of field investigations. The present study uses the documented heterogeneity and investigates flow and transport in a two-dimensional stochastic continuum characterized by a variance in Y = ln(K) of σ_Y^2 = 16, corresponding to about 12 log10 cycles in K. The study considers anisotropy, channelling, non-Fickian and Fickian transport, and conditional simulation. The major conclusions are: * heterogeneity gives rise to anisotropy in the upscaling process, * the choice of support scale is crucial for the modelling of solute transport. As a consequence of the obtained results, a two-dimensional stochastic discontinuum model is presented, which provides a tool for linking stochastic continuum models to discrete fracture network models. (au) (14 figs., 136 refs.)
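    A minimal sketch of the kind of heterogeneous input field such a stochastic continuum study starts from, assuming a Gaussian-filtered white-noise log-conductivity field rescaled to the documented variance σ_Y^2 = 16 (the grid size and correlation length below are illustrative, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128               # grid cells per side (illustrative block discretization)
corr = 8              # correlation length in cells (assumed)

# Smooth white noise with a Gaussian spectral filter -> spatially correlated field
white = rng.standard_normal((n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
kernel = np.exp(-2 * (np.pi * corr) ** 2 * (kx ** 2 + ky ** 2))
Y = np.real(np.fft.ifft2(np.fft.fft2(white) * kernel))

# Rescale to the documented log-conductivity variance sigma_Y^2 = 16
Y = (Y - Y.mean()) / Y.std() * 4.0
K = np.exp(Y)         # highly heterogeneous hydraulic conductivity field
```

    With σ_Y = 4, the ±3σ range of ln(K) already spans roughly 24 natural-log units, i.e. about 10-12 log10 cycles in K, which is why upscaling and the choice of support scale matter so much here.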

  9. Mapping multi-scale vascular plant richness in a forest landscape with integrated LiDAR and hyperspectral remote-sensing.

    Science.gov (United States)

    Hakkenberg, C R; Zhu, K; Peet, R K; Song, C

    2018-02-01

    The central role of floristic diversity in maintaining habitat integrity and ecosystem function has propelled efforts to map and monitor its distribution across forest landscapes. While biodiversity studies have traditionally relied largely on ground-based observations, the immensity of the task of generating accurate, repeatable, and spatially-continuous data on biodiversity patterns at large scales has stimulated the development of remote-sensing methods for scaling up from field plot measurements. One such approach is through integrated LiDAR and hyperspectral remote-sensing. However, despite their efficiencies in cost and effort, LiDAR-hyperspectral sensors are still highly constrained in structurally- and taxonomically-heterogeneous forests - especially when species' cover is smaller than the image resolution, intertwined with neighboring taxa, or otherwise obscured by overlapping canopy strata. In light of these challenges, this study goes beyond the remote characterization of upper canopy diversity to instead model total vascular plant species richness in a continuous-cover North Carolina Piedmont forest landscape. We focus on two related, but parallel, tasks. First, we demonstrate an application of predictive biodiversity mapping, using nonparametric models trained with spatially-nested field plots and aerial LiDAR-hyperspectral data, to predict spatially-explicit landscape patterns in floristic diversity across seven spatial scales between 0.01 and 900 m². Second, we employ bivariate parametric models to test the significance of individual, remotely-sensed predictors of plant richness to determine how parameter estimates vary with scale. Cross-validated results indicate that predictive models were able to account for 15-70% of variance in plant richness, with LiDAR-derived estimates of topography and forest structural complexity, as well as spectral variance in hyperspectral imagery, explaining the largest portion of variance in diversity levels.
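    The predictive-mapping step, training a nonparametric model on plot-level remote-sensing predictors to predict richness, can be sketched with synthetic data; the predictors, the data-generating model, and the simple k-nearest-neighbour regressor below are all invented stand-ins for the paper's models:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical plots: 3 remotely sensed predictors (e.g. elevation, canopy
# height variability, spectral variance) and observed plot richness
X = rng.uniform(0, 1, (120, 3))
y = 5 + 30 * X[:, 1] + 10 * X[:, 2] + rng.normal(0, 2, 120)

def knn_predict(Xtr, ytr, Xte, k=5):
    # Simple nonparametric regressor: average richness of the k nearest plots
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]
    return ytr[idx].mean(axis=1)

# Hold-out evaluation, mimicking the paper's cross-validated variance explained
pred = knn_predict(X[:100], y[:100], X[100:])
ss_res = ((y[100:] - pred) ** 2).sum()
ss_tot = ((y[100:] - y[100:].mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
```

    The real study fits such models independently at each of the seven plot scales, which is what lets it ask how predictor importance changes with scale.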

  10. Numerical Investigation of Earthquake Nucleation on a Laboratory-Scale Heterogeneous Fault with Rate-and-State Friction

    Science.gov (United States)

    Higgins, N.; Lapusta, N.

    2014-12-01

    Many large earthquakes on natural faults are preceded by smaller events, often termed foreshocks, that occur close in time and space to the larger event that follows. Understanding the origin of such events is important for understanding earthquake physics. Unique laboratory experiments of earthquake nucleation in a meter-scale slab of granite (McLaskey and Kilgore, 2013; McLaskey et al., 2014) demonstrate that sample-scale nucleation processes are also accompanied by much smaller seismic events. One potential explanation for these foreshocks is that they occur on small asperities - or bumps - on the fault interface, which may also be the locations of smaller critical nucleation size. We explore this possibility through 3D numerical simulations of a heterogeneous 2D fault embedded in a homogeneous elastic half-space, in an attempt to qualitatively reproduce the laboratory observations of foreshocks. In our model, the simulated fault interface is governed by rate-and-state friction with laboratory-relevant frictional properties, fault loading, and fault size. To create favorable locations for foreshocks, the fault surface heterogeneity is represented as patches of increased normal stress, decreased characteristic slip distance L, or both. Our simulation results indicate that one can create a rate-and-state model of the experimental observations. Models with a combination of higher normal stress and lower L at the patches are closest to matching the laboratory observations of foreshocks in moment magnitude, source size, and stress drop. In particular, we find that, when the local compression is increased, foreshocks can occur on patches that are smaller than theoretical critical nucleation size estimates. The additional inclusion of lower L for these patches helps to keep stress drops within the range observed in experiments, and is compatible with the asperity model of foreshock sources, since one would expect more compressed spots to be smoother (and hence have

  11. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    Science.gov (United States)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodeled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, representing a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
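    The core risk calculation, overlaying modelled water depths on a gridded population dataset, reduces to a raster intersection once both layers share a grid. A minimal sketch with synthetic grids and an assumed 0.5 m inundation threshold (real workflows must first resample the ~90 m hazard grid onto the 5 m population grid):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical co-registered rasters standing in for the real datasets
depth = rng.exponential(0.3, (100, 100))      # modelled water depth per cell, m
population = rng.poisson(2.0, (100, 100))     # persons per cell

flooded = depth > 0.5                          # inundation threshold (assumed)
exposed = int(population[flooded].sum())       # people in flooded cells
share = exposed / population.sum()             # fraction of population at risk
```

    Repeating this over depth grids for several return periods yields the exposure-frequency curve from which expected annual affected population is integrated.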

  12. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    Science.gov (United States)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large-scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large-scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large-scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to ±70°, latitudes that are more accessible than the poles for landing.

  13. High-resolution observations of combustion in heterogeneous surface fuels

    Science.gov (United States)

    E. Louise Loudermilk; Gary L. Achtemeier; Joseph J. O' Brien; J. Kevin Hiers; Benjamin S. Hornsby

    2014-01-01

    In ecosystems with frequent surface fires, fire and fuel heterogeneity at relevant scales have been largely ignored. This could be because complete burns give an impression of homogeneity, or due to the difficulty in capturing fine-scale variation in fuel characteristics and fire behaviour. Fire movement between patches of fuel can have implications for modelling fire...

  14. Large Scale Integration of Carbon Nanotubes in Microsystems

    DEFF Research Database (Denmark)

    Gjerde, Kjetil

    2007-01-01

    Carbon nanotubes have many properties that could be exploited in combination with traditional microsystems, in particular their superior mechanical and electrical properties. In this work, methods for large-scale integration of carbon nanotubes into microsystems are investigated, with a view to their application as mechan...

  15. Integration of Heterogeneous Information Sources into a Knowledge Resource Management System for Lifelong Learning

    NARCIS (Netherlands)

    Demidova, Elena; Ternier, Stefaan; Olmedilla, Daniel; Duval, Erik; Dicerto, Michele; Stefanov, Krassen; Sacristán, Naiara

    2007-01-01

    Demidova, E., Ternier, S., Olmedilla, D., Duval, E., Dicerto, M., Stefanov, K., et al. (2007). Integration of Heterogeneous Information Sources into a Knowledge Resource Management System for Lifelong Learning. TENCompetence Workshop on Service Oriented Approaches and Lifelong Competence Development

  16. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  17. Heterogeneous Biomedical Database Integration Using a Hybrid Strategy: A p53 Cancer Research Database

    Directory of Open Access Journals (Sweden)

    Vadim Y. Bichutskiy

    2006-01-01

    Complex problems in life science research give rise to multidisciplinary collaboration, and hence to the need for heterogeneous database integration. The tumor suppressor p53 is mutated in close to 50% of human cancers, and a small drug-like molecule with the ability to restore native function to cancerous p53 mutants is a long-held medical goal of cancer treatment. The Cancer Research DataBase (CRDB) was designed in support of a project to find such small molecules. As a cancer informatics project, the CRDB involved small molecule data, computational docking results, functional assays, and protein structure data. As an example of the hybrid strategy for data integration, it combined the mediation and data warehousing approaches. This paper uses the CRDB to illustrate the hybrid strategy as a viable approach to heterogeneous data integration in biomedicine, and provides a design method for those considering similar systems. More efficient data sharing implies increased productivity and, hopefully, improved chances of success in cancer research. (Code and database schemas are freely downloadable, http://www.igb.uci.edu/research/research.html.)

  18. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  19. A framework for integrating heterogeneous clinical data for a disease area into a central data warehouse.

    Science.gov (United States)

    Karmen, Christian; Ganzinger, Matthias; Kohl, Christian D; Firnkorn, Daniel; Knaup-Gregori, Petra

    2014-01-01

    Structured collection of clinical facts is a common approach in clinical research. Especially in the analysis of rare diseases it is often necessary to aggregate study data from several sites in order to achieve a statistically significant cohort size. In this paper we describe a framework for integrating heterogeneous clinical data into a central register. This enables site-spanning queries for the occurrence of specific clinical facts and thus supports clinical research. The framework consists of three sequential steps, starting with a formal data harmonization process, followed by data transformation methods, and ending with integration into a central data warehouse. We implemented reusable software templates based on best practices from several projects integrating heterogeneous clinical data. Used as a guideline, our methods can increase the efficiency and quality of future data integration projects by reducing both the implementation and the project management effort.
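The three sequential steps can be sketched end to end. Everything here is invented for illustration (the site formats, the shared data dictionary, and the register schema are not the paper's): (1) harmonize site-specific field names, (2) transform values to common units, (3) load into a central warehouse that supports site-spanning queries.

```python
import sqlite3

HARMONIZATION = {           # step 1: per-site mapping to a shared data dictionary
    "site_a": {"pat": "patient_id", "dx": "diagnosis", "wt_kg": "weight_kg"},
    "site_b": {"id": "patient_id", "diag": "diagnosis", "wt_lb": "weight_kg"},
}

def transform(site, record):
    """Step 2: rename fields, then harmonize values (here: pounds -> kg)."""
    out = {HARMONIZATION[site][k]: v for k, v in record.items()}
    if site == "site_b":
        out["weight_kg"] = round(out["weight_kg"] * 0.45359237, 1)
    return out

warehouse = sqlite3.connect(":memory:")        # step 3: central register
warehouse.execute(
    "CREATE TABLE register (patient_id TEXT, diagnosis TEXT, weight_kg REAL)")

for site, rec in [("site_a", {"pat": "A-1", "dx": "E85", "wt_kg": 71.0}),
                  ("site_b", {"id": "B-9", "diag": "E85", "wt_lb": 154.0})]:
    row = transform(site, rec)
    warehouse.execute("INSERT INTO register VALUES (?, ?, ?)",
                      (row["patient_id"], row["diagnosis"], row["weight_kg"]))

# Site-spanning query: all patients with a given diagnosis, regardless of site.
print(warehouse.execute(
    "SELECT patient_id, weight_kg FROM register WHERE diagnosis = 'E85'").fetchall())
```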

  20. Fractionally Integrated Flux model and Scaling Laws in Weather and Climate

    Science.gov (United States)

    Schertzer, Daniel; Lovejoy, Shaun

    2013-04-01

    The Fractionally Integrated Flux model (FIF) has been extensively used to model intermittent observables, like the velocity field, by defining them with the help of a fractional integration of a conservative (i.e. strictly scale invariant) flux, such as the turbulent energy flux. It indeed corresponds to a well-defined modelling that yields the observed scaling laws. Generalised Scale Invariance (GSI) enables FIF to deal with anisotropic fractional integrations and has been rather successful in defining and modelling a unique regime of scaling anisotropic turbulence up to planetary scales. This turbulence has an effective dimension of 23/9=2.55... instead of the classical hypothesised 2D and 3D turbulent regimes, respectively for large and small spatial scales. It therefore theoretically eliminates an implausible "dimension transition" between these two regimes and the resulting requirement of a turbulent energy "mesoscale gap", for which the empirical evidence has been increasingly called into question. More recently, GSI-FIF was used to analyse climate, therefore at much larger time scales. Indeed, the 23/9-dimensional regime necessarily breaks up at the outer spatial scales. The corresponding transition range, which can be called "macroweather", seems to have many interesting properties; e.g., it rather corresponds to a fractional differentiation in time with a roughly flat frequency spectrum. Furthermore, this transition opens the possibility of scaling space-time climate fluctuations at much larger time scales, with a much stronger scaling anisotropy between time and space. Lovejoy, S. and D. Schertzer (2013). The Weather and Climate: Emergent Laws and Multifractal Cascades. Cambridge Press (in press). Schertzer, D. et al. (1997). Fractals 5(3): 427-471. Schertzer, D. and S. Lovejoy (2011). International Journal of Bifurcation and Chaos 21(12): 3417-3456.

  1. Laboratory-scale in situ bioremediation in heterogeneous porous media: biokinetics-limited scenario.

    Science.gov (United States)

    Song, Xin; Hong, Eunyoung; Seagren, Eric A

    2014-03-01

    Subsurface heterogeneities influence interfacial mass-transfer processes and affect the application of in situ bioremediation by impacting the availability of substrates to the microorganisms. However, for difficult-to-degrade compounds, and/or cases with inhibitory biodegradation conditions, slow biokinetics may also limit the overall bioremediation rate, or be as limiting as mass-transfer processes. In this work, a quantitative framework based on a set of dimensionless coefficients was used to capture the effects of the competing interfacial and biokinetic processes and define the overall rate-limiting process. An integrated numerical modeling and experimental approach was used to evaluate application of the quantitative framework for a scenario in which slow biokinetics limited the overall bioremediation rate of a polycyclic aromatic hydrocarbon (naphthalene). Numerical modeling was conducted to simulate the groundwater flow and naphthalene transport and verify the system parameters, which were used in the quantitative framework application. The experiments examined the movement and biodegradation of naphthalene in a saturated, heterogeneous intermediate-scale flow cell with two layers of contrasting hydraulic conductivities. These experiments were conducted in two phases: Phase I, simulating an inhibited slow biodegradation; and Phase II, simulating an engineered bioremediation, in which two engineered perturbations to the system were selected to examine their ability to enhance the slow in situ biodegradation rate. In the first perturbation, nitrogen and phosphorus in excess of the required stoichiometric amounts were spiked into the influent solution to mimic a common remedial action taken in the field. The results showed that this perturbation had a moderate positive impact, consistent with slow biokinetics being the overall rate-limiting process. 
However, the second perturbation, which was to
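The screening logic behind such a dimensionless framework can be sketched generically: compare a Damkohler-type number (process rate times residence time) for each competing process, and flag the smallest as the overall rate limiter. The parameter values and the specific coefficient definitions below are illustrative, not those of the cited study.

```python
def damkohler(rate_per_day, length_m, velocity_m_per_day):
    """Da = k * L / v: process rate relative to advective throughput."""
    return rate_per_day * length_m / velocity_m_per_day

L, v = 2.0, 0.5                      # flow-cell length (m), seepage velocity (m/d)
processes = {
    "interlayer mass transfer": damkohler(0.8, L, v),      # hypothetical rates
    "biodegradation (inhibited)": damkohler(0.05, L, v),
}
limiting = min(processes, key=processes.get)   # smallest Da = slowest process
print(processes, "->", limiting)
```

With these invented numbers the biokinetic Damkohler number is far below one while mass transfer keeps pace with advection, reproducing the paper's biokinetics-limited scenario.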

  2. Flow of miscible and immiscible hydrocarbons in heterogeneous porous media

    Energy Technology Data Exchange (ETDEWEB)

    Butts, M.B.

    1996-12-31

    A series of large-scale two-dimensional physical model studies has been carried out in order to better understand and predict the multiphase flow of hydrocarbon contaminants and the release of the water-soluble fraction of such contaminants into the groundwater stream. The detailed measurements of the fluid saturations within the bulk hydrocarbon plume as well as the aqueous concentrations recorded downstream should provide a useful data set for testing and improving numerical models of both multiphase flow and transport. Predictions of a numerical model of immiscible multiphase flow developed in the petroleum industry were found to compare favourably with the observed oil plume for the case of an immiscible oil spill. Nevertheless, subtle layering within the experimental flume altered the long-term development of the oil plume in a manner not predicted by the numerical model. A stochastic model for three-dimensional, two-phase incompressible flow in heterogeneous soil and rock formations is developed. Analytical solutions for the resulting stochastic differential equations are derived for asymptotic flows using a perturbation approach. These solutions were used to derive general expressions for the large-scale (effective) properties for large-scale two-phase flow in porous media. An important observation from this analysis is that general large-scale flow in heterogeneous soils cannot be predicted on the basis of simple averages of the soil hydraulic properties alone. The large-scale capillary pressure saturation relation is evaluated for imbibition into a wet soil or rock formation. (EG) 194 refs.
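The closing observation, that large-scale flow in heterogeneous soils cannot be predicted from simple averages of the hydraulic properties alone, can be made concrete with the textbook single-phase case of layered media: the effective conductivity is the arithmetic mean of the layer values for flow along the layers but the harmonic mean for flow across them, so no single "simple average" works for both directions. The layer values below are arbitrary illustrative numbers.

```python
import numpy as np

K = np.array([1e-3, 1e-5, 1e-4, 1e-6])        # layer conductivities (m/s)

K_parallel = K.mean()                          # flow along the layering
K_perpendicular = len(K) / np.sum(1.0 / K)     # flow across the layering

print(K_parallel, K_perpendicular)             # differ by orders of magnitude
```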

  3. Flow of miscible and immiscible hydrocarbons in heterogeneous porous media

    Energy Technology Data Exchange (ETDEWEB)

    Butts, M B

    1997-12-31

    A series of large-scale two-dimensional physical model studies has been carried out in order to better understand and predict the multiphase flow of hydrocarbon contaminants and the release of the water-soluble fraction of such contaminants into the groundwater stream. The detailed measurements of the fluid saturations within the bulk hydrocarbon plume as well as the aqueous concentrations recorded downstream should provide a useful data set for testing and improving numerical models of both multiphase flow and transport. Predictions of a numerical model of immiscible multiphase flow developed in the petroleum industry were found to compare favourably with the observed oil plume for the case of an immiscible oil spill. Nevertheless, subtle layering within the experimental flume altered the long-term development of the oil plume in a manner not predicted by the numerical model. A stochastic model for three-dimensional, two-phase incompressible flow in heterogeneous soil and rock formations is developed. Analytical solutions for the resulting stochastic differential equations are derived for asymptotic flows using a perturbation approach. These solutions were used to derive general expressions for the large-scale (effective) properties for large-scale two-phase flow in porous media. An important observation from this analysis is that general large-scale flow in heterogeneous soils cannot be predicted on the basis of simple averages of the soil hydraulic properties alone. The large-scale capillary pressure saturation relation is evaluated for imbibition into a wet soil or rock formation. (EG) 194 refs.

  4. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  5. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    Science.gov (United States)

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946
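The EKF step can be sketched with a toy scalar system, assuming an invented one-state water balance (soil moisture s) whose nonlinear evapotranspiration ET(s) = et_max·s² plays the role of the satellite-retrieved observation; the actual WEP-L/SEBS setup is of course far richer than this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
et_max, precip, dt = 5.0, 2.0, 0.1       # invented model parameters

def model(s):        # forecast model: moisture gains precipitation, loses ET
    return s + dt * (precip - et_max * s**2)

def h(s):            # observation operator: ET retrieval
    return et_max * s**2

s_est, P = 0.3, 0.05     # state estimate and its error variance
Q, R = 1e-4, 0.04        # model and observation error variances
truth = 0.55

for _ in range(20):
    truth = model(truth)
    # Forecast step, with linearized dynamics F = d(model)/ds at the estimate.
    F = 1.0 - dt * 2.0 * et_max * s_est
    s_est = model(s_est)
    P = F * P * F + Q
    # Analysis step, with linearized observation H = dh/ds (this is the "E" in EKF).
    y = h(truth) + rng.normal(0.0, R**0.5)   # noisy synthetic retrieval
    H = 2.0 * et_max * s_est
    K = P * H / (H * P * H + R)
    s_est = s_est + K * (y - h(s_est))
    P = (1.0 - K * H) * P

print(round(s_est, 3), round(truth, 3))   # estimate tracks the synthetic truth
```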

  6. Scaling Analysis of Ocean Surface Turbulent Heterogeneities from Satellite Remote Sensing: Use of 2D Structure Functions.

    Directory of Open Access Journals (Sweden)

    P R Renosh

    Full Text Available Satellite remote sensing observations allow the ocean surface to be sampled synoptically over large spatio-temporal scales. The images provided from visible and thermal infrared satellite observations are widely used in physical, biological, and ecological oceanography. The present work proposes a method to understand the multi-scaling properties of satellite products such as the Chlorophyll-a (Chl-a) and the Sea Surface Temperature (SST), rarely studied. The specific objective of this study is to show how the small scale heterogeneities of satellite images can be characterised using tools borrowed from the field of turbulence. For that purpose, we show how the structure function, which is classically used in the framework of scaling time-series analysis, can also be used in 2D. The main advantage of this method is that it can be applied to images which have missing data. Based on both simulated and real images, we demonstrate that coarse-graining (CG) of a gradient modulus transform of the original image does not provide correct scaling exponents. We show, using a fractional Brownian simulation in 2D, that the structure function (SF) can be used with randomly sampled couples of points, and verify that one million couples of points provide enough statistics.
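The randomly-sampled-pairs estimator can be sketched directly: S_q(r) = ⟨|f(x1) − f(x2)|^q⟩ over pixel pairs binned by separation r, skipping pairs that hit missing data. The test field below is a synthetic smooth surface with 20% of pixels masked, not a real fBm or Chl-a image.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 128
x, y = np.meshgrid(np.arange(n), np.arange(n))
field = np.sin(2 * np.pi * x / n) + np.sin(2 * np.pi * y / n)
field[rng.random((n, n)) < 0.2] = np.nan          # 20% missing data

def structure_function(f, n_pairs=200_000, q=2, n_bins=12):
    """Order-q structure function from random pixel pairs, NaN-tolerant."""
    m = f.shape[0]
    p1 = rng.integers(0, m, size=(n_pairs, 2))
    p2 = rng.integers(0, m, size=(n_pairs, 2))
    d1 = f[p1[:, 0], p1[:, 1]]
    d2 = f[p2[:, 0], p2[:, 1]]
    r = np.hypot(*(p1 - p2).T)                    # pair separations
    keep = ~np.isnan(d1) & ~np.isnan(d2) & (r > 0)
    incr = np.abs(d1[keep] - d2[keep]) ** q
    r = r[keep]
    bins = np.logspace(0, np.log10(m / 2), n_bins + 1)
    idx = np.digitize(r, bins)
    centers = np.sqrt(bins[:-1] * bins[1:])
    sf = np.array([incr[idx == i + 1].mean() if np.any(idx == i + 1) else np.nan
                   for i in range(n_bins)])
    return centers, sf

r_c, sf = structure_function(field)
print(np.round(sf, 3))   # increments grow with separation for a correlated field
```

Because each pair is evaluated independently, missing pixels only shrink the sample rather than breaking the estimator, which is the property the abstract highlights.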

  7. Research on precision grinding technology of large scale and ultra thin optics

    Science.gov (United States)

    Zhou, Lian; Wei, Qiancai; Li, Jie; Chen, Xianhua; Zhang, Qinghua

    2018-03-01

    The flatness and parallelism error of large scale and ultra thin optics have an important influence on the subsequent polishing efficiency and accuracy. In order to realize the high precision grinding of these elements, a low deformation vacuum chuck was designed first, which was used for clamping the optics with high supporting rigidity over the full aperture. Then the optics was planar ground under vacuum adsorption. After machining, the vacuum system was turned off. The form error of the optics was measured on-machine using a displacement sensor after elastic restitution. The flatness was then converged to high accuracy by compensation machining, whose trajectories were generated from the measurement result. To achieve high parallelism, the optics was turned over and compensation-ground using the form error of the vacuum chuck. Finally, a grinding experiment on large scale and ultra thin fused silica optics with an aperture of 430mm×430mm×10mm was performed. The best P-V flatness of the optics was below 3 μm, and parallelism was below 3″. This machining technique has been applied in batch grinding of large scale and ultra thin optics.
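The compensation step can be sketched in miniature: invert the on-machine measured form error into a corrective depth-of-cut map so the next pass removes more material where the surface is high. The grid size and the quadratic error map below are invented for illustration; a real controller would also fold in tool wear and machine kinematics.

```python
import numpy as np

n = 64
xx, yy = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
surface_um = 1.5 * (xx**2 + yy**2)          # measured form error (micrometres)

pv_before = surface_um.max() - surface_um.min()   # peak-to-valley flatness

# Corrective map: grind every point down to the lowest measured point.
correction_um = surface_um - surface_um.min()
residual = surface_um - correction_um             # ideal outcome of the pass

print(round(pv_before, 2), round(residual.max() - residual.min(), 2))
```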

  8. Large-scale network dynamics of beta-band oscillations underlie auditory perceptual decision-making

    Directory of Open Access Journals (Sweden)

    Mohsen Alavash

    2017-06-01

    Full Text Available Perceptual decisions vary in the speed at which we make them. Evidence suggests that translating sensory information into perceptual decisions relies on distributed interacting neural populations, with decision speed hinging on power modulations of the neural oscillations. Yet the dependence of perceptual decisions on the large-scale network organization of coupled neural oscillations has remained elusive. We measured magnetoencephalographic signals in human listeners who judged acoustic stimuli composed of carefully titrated clouds of tone sweeps. These stimuli were used in two task contexts, in which the participants judged the overall pitch or direction of the tone sweeps. We traced the large-scale network dynamics of the source-projected neural oscillations on a trial-by-trial basis using power-envelope correlations and graph-theoretical network discovery. In both tasks, faster decisions were predicted by higher segregation and lower integration of coupled beta-band (∼16–28 Hz) oscillations. We also uncovered the brain network states that promoted faster decisions in either lower-order auditory or higher-order control brain areas. Specifically, decision speed in judging the tone sweep direction critically relied on the nodal network configurations of anterior temporal, cingulate, and middle frontal cortices. Our findings suggest that global network communication during perceptual decision-making is implemented in the human brain by large-scale couplings between beta-band neural oscillations. The speed at which we make perceptual decisions varies. This translation of sensory information into perceptual decisions hinges on dynamic changes in neural oscillatory activity. However, the large-scale neural-network embodiment supporting perceptual decision-making is unclear. We addressed this question by examining two auditory perceptual decision-making tasks. Using graph-theoretical network discovery, we traced the large-scale network
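The two graph-level quantities contrasted above can be sketched on a binarized coupling matrix: segregation via the mean clustering coefficient, integration via global efficiency (the mean inverse shortest path length). The 6-node toy network of two triangles joined by one edge is invented for illustration; the study computed such metrics on source-projected power-envelope correlation networks.

```python
import numpy as np

A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], float)

def clustering(A):
    """Segregation: mean local clustering coefficient."""
    n = len(A); c = []
    for i in range(n):
        nb = np.flatnonzero(A[i])
        k = len(nb)
        if k < 2:
            c.append(0.0); continue
        links = A[np.ix_(nb, nb)].sum() / 2       # edges among neighbours
        c.append(2 * links / (k * (k - 1)))
    return float(np.mean(c))

def global_efficiency(A):
    """Integration: mean inverse shortest path length."""
    n = len(A)
    D = np.where(A > 0, 1.0, np.inf)
    np.fill_diagonal(D, 0.0)
    for k in range(n):                            # Floyd-Warshall shortest paths
        D = np.minimum(D, D[:, k, None] + D[None, k, :])
    inv = 1.0 / D[~np.eye(n, dtype=bool)]
    return float(inv.mean())

print(round(clustering(A), 3), round(global_efficiency(A), 3))
```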

  9. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Science.gov (United States)

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches
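Step (i), cohort rebalancing, can be sketched with simple random oversampling of the minority class with replacement; the study's actual rebalancing methods are more elaborate, and the 9:1 labels and features below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.array([0] * 90 + [1] * 10)            # 9:1 imbalanced cohort labels
X = rng.normal(size=(100, 3))                # synthetic feature matrix

minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=80, replace=True)   # resample up to 90
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])

print(np.bincount(y_bal))   # both classes now have 90 members
```

Oversampling before cross-validation must be done inside each training fold, otherwise duplicated minority cases leak into the test folds and inflate accuracy.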

  10. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    Full Text Available A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model

  11. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi–site data by using within–year recaptures to provide a covariate of between–year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004) illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands where a re–introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap–happiness” effects. As the data are based on resightings such trap–happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may have had an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004) and Doligez et al. (2004) give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  12. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability, and several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its ...

  13. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  14. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  15. Characterizing permafrost active layer dynamics and sensitivity to landscape spatial heterogeneity in Alaska

    Science.gov (United States)

    Yi, Yonghong; Kimball, John S.; Chen, Richard H.; Moghaddam, Mahta; Reichle, Rolf H.; Mishra, Umakant; Zona, Donatella; Oechel, Walter C.

    2018-01-01

    An important feature of the Arctic is large spatial heterogeneity in active layer conditions, which is generally poorly represented by global models and can lead to large uncertainties in predicting regional ecosystem responses and climate feedbacks. In this study, we developed a spatially integrated modeling and analysis framework combining field observations, local-scale (~50 m resolution) active layer thickness (ALT) and soil moisture maps derived from low-frequency (L + P-band) airborne radar measurements, and global satellite environmental observations to investigate the ALT sensitivity to recent climate trends and landscape heterogeneity in Alaska. Modeled ALT results show good correspondence with in situ measurements in higher-permafrost-probability (PP ≥ 70 %) areas (n = 33; R = 0.60; mean bias = 1.58 cm; RMSE = 20.32 cm), but with larger uncertainty in sporadic and discontinuous permafrost areas. The model results also reveal widespread ALT deepening since 2001, with smaller ALT increases in northern Alaska (mean trend = 0.32 ± 1.18 cm yr^-1) and much larger increases (> 3 cm yr^-1) across interior and southern Alaska. The positive ALT trend coincides with regional warming and a longer snow-free season (R = 0.60 ± 0.32). A spatially integrated analysis of the radar retrievals and model sensitivity simulations demonstrated that uncertainty in the spatial and vertical distribution of soil organic carbon (SOC) was the largest factor affecting modeled ALT accuracy, while soil moisture played a secondary role. Potential improvements in characterizing SOC heterogeneity, including better spatial sampling of soil conditions and advances in remote sensing of SOC and soil moisture, will enable more accurate predictions of active layer conditions and refinement of the modeling framework across a larger domain.

  16. The method of arbitrarily large moments to calculate single scale processes in quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Bluemlein, Johannes [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Schneider, Carsten [Johannes Kepler Univ., Linz (Austria). Research Inst. for Symbolic Computation (RISC)

    2017-01-15

    We devise a new method to calculate a large number of Mellin moments of single scale quantities using the systems of differential and/or difference equations obtained by integration-by-parts identities between the corresponding Feynman integrals of loop corrections to physical quantities. These scalar quantities have a much simpler mathematical structure than the complete quantity. A sufficiently large set of moments may even allow the analytic reconstruction of the whole quantity considered, which holds in the case of first-order factorizing systems. In any case, this otherwise completely analytic method may be used to derive highly precise numerical representations in general.
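The moment-recurrence idea can be illustrated with a deliberately trivial scalar quantity, f(x) = 1 − x, whose Mellin moments M(N) = ∫₀¹ x^(N−1) f(x) dx satisfy the first-order difference equation (N+2)·M(N+1) = N·M(N). From the single input M(1) = 1/2, arbitrarily many exact moments follow by recursion and can be checked against the closed form M(N) = 1/(N(N+1)). (Real Feynman-integral systems are of course coupled and of higher order; this toy only shows the mechanics.)

```python
from fractions import Fraction

def moments(n_max):
    """Generate exact Mellin moments of f(x) = 1 - x by recursion."""
    M = {1: Fraction(1, 2)}                  # M(1) = integral of (1 - x) dx
    for N in range(1, n_max):
        M[N + 1] = M[N] * N / (N + 2)        # difference equation
    return M

M = moments(50)
# Cross-check every recursed moment against the closed form 1/(N(N+1)).
assert all(M[N] == Fraction(1, N * (N + 1)) for N in M)
print(M[50])
```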

  17. WAMS Based Intelligent Operation and Control of Modern Power System with large Scale Renewable Energy Penetration

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain

    security limits. Under such a scenario, progressive displacement of conventional generation by wind generation is expected to eventually lead to a complex power system with the least presence of central power plants. Consequently the support from conventional power plants is expected to reach its all-time low ... system voltage control responsibility from conventional power plants to wind turbines. With increased wind penetration and displaced conventional central power plants, dynamic voltage security has been identified as one of the challenging issues for large scale wind integration. To address the dynamic security issue, a WAMS based systematic voltage control scheme for large scale wind integrated power systems has been proposed. Along with the optimal reactive power compensation, the proposed scheme considers voltage support from wind farms (equipped with voltage support functionality) and refurbished ...

  18. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

    A joint research project was carried out in the EU Fifth Framework Programme, concerning hydrogen risk in nuclear power plants. The goals were: firstly, to create a new database of results on hydrogen combustion experiments in the slow to turbulent combustion regimes; secondly, to validate the partners' CFD and lumped-parameter codes on the experimental data, and to evaluate suitable parameter sets for application calculations; thirdly, to conduct a benchmark exercise by applying the codes to the full-scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners' activities. Significant progress has been made in the experimental area, where test series in medium- and large-scale facilities have been carried out with the focus on specific effects of scale, multi-compartment geometry, heat losses and venting. The data were used for the validation of the partners' CFD and lumped-parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full-scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise allow deep insight into the quality, capabilities and limits of the CFD and lumped-parameter tools currently in use at various research laboratories.

  19. Overview of medium heterogeneity and transport processes

    International Nuclear Information System (INIS)

    Tsang, Y.; Tsang, C.F.

    1993-11-01

    Medium heterogeneity can have a significant impact on the behavior of solute transport. Tracer breakthrough curves from transport in a heterogeneous medium are distinctly different from those in a homogeneous porous medium. Usually the shape of the breakthrough curves is highly non-symmetrical, with a fast rise at early times and a very long tail at late times, and often they consist of multiple peaks. Moreover, unlike transport in a homogeneous medium, where the same transport parameters describe the entire medium, transport through heterogeneous media gives rise to breakthrough curves with strong spatial dependence. These inherent characteristics of transport in heterogeneous media present a special challenge to the performance assessment of a potential high-level nuclear waste repository with respect to the possible release of radionuclides to the accessible environment. Since an inherently desirable site characteristic for a waste repository is that flow and transport should be slow, transport measurements in site characterization efforts will necessarily be spatially small and temporally short compared to the scales relevant to performance assessment predictions. In this paper we discuss the role of medium heterogeneity in site characterization and performance assessment. Our discussion is based on a specific example of a 3D heterogeneous stochastic model of a site generally similar to Aespoe Island, the site of the Hard Rock Laboratory in Southern Sweden. For our study, alternative 3D stochastic fields of hydraulic conductivities conditioned on ''point'' measurements are generated. Results of stochastic flow and transport simulations are used to address (1) the relationship of tracer breakthrough to the structure of heterogeneity, and (2) the inference from small-scale testing results to large-scale and long-term predictions.
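    A toy random-walk sketch (ours, not from the paper) reproduces the qualitative signature described above: particles crossing cells with randomly varying conductivity produce a breakthrough distribution with a fast rise and a long late-time tail, unlike the symmetric curve expected for a homogeneous medium.

```python
import random
import statistics

def breakthrough_times(n_particles=2000, n_cells=20, sigma=1.5, seed=0):
    # Each particle crosses n_cells cells whose velocities are lognormal,
    # a crude stand-in for a heterogeneous conductivity field; the arrival
    # time is the sum of the cell residence times.
    rng = random.Random(seed)
    return [sum(1.0 / rng.lognormvariate(0.0, sigma) for _ in range(n_cells))
            for _ in range(n_particles)]

times = breakthrough_times()
# A long late-time tail pulls the mean well above the median.
print(statistics.mean(times) > statistics.median(times))  # True
```

    Setting sigma to 0 recovers a single deterministic arrival time, the homogeneous limit.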

  20. The role of large‐scale heat pumps for short term integration of renewable energy

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Blarke, Morten; Hansen, Kenneth

    2011-01-01

    technologies is focusing on natural working fluid hydrocarbons, ammonia, and carbon dioxide. Large-scale heat pumps are crucial for integrating 50% wind power as anticipated to be installed in Denmark in 2020, along with other measures. Also in the longer term heat pumps can contribute to the minimization...... savings with increased wind power and may additionally lead to economic savings in the range of 1,500-1,700 MDKK in total in the period until 2020. Furthermore, the energy system efficiency may be increased due to large heat pumps replacing boiler production. Finally data sheets for large-scale ammonium......In this report the role of large-scale heat pumps in a future energy system with increased renewable energy is presented. The main concepts for large heat pumps in district heating systems are outlined along with the development for heat pump refrigerants. The development of future heat pump...

  1. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  2. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  3. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin]

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
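    A minimal sketch of the "sample" step in "reduce then sample" (the sampler and the Gaussian toy posterior are our assumptions, not the project's code): once the expensive forward model inside the log-posterior is replaced by a cheap reduced-order surrogate, a plain Metropolis chain becomes affordable.

```python
import math
import random

def metropolis(log_post, x0, n_steps, step, seed=0):
    # Random-walk Metropolis. In a "reduce then sample" setting, log_post
    # evaluates a cheap reduced-order surrogate instead of the full-order
    # forward model, so each of the n_steps is inexpensive.
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lq = log_post(y)
        if math.log(rng.random()) < lq - lp:   # accept/reject
            x, lp = y, lq
        chain.append(x)
    return chain

# Toy 1-D posterior: standard-normal shape centred at 2.
chain = metropolis(lambda x: -0.5 * (x - 2.0) ** 2, 0.0, 20000, 1.0)
posterior_mean = sum(chain) / len(chain)
print(abs(posterior_mean - 2.0) < 0.2)  # True: the chain recovers the mean
```

    The "sample then reduce" alternative would instead keep the full model in log_post and use gradient/Hessian structure to propose better moves, so far fewer steps are needed.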

  4. Oak Ridge Bio-surveillance Toolkit (ORBiT): Integrating Big-Data Analytics with Visual Analysis for Public Health Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Ramanathan, Arvind [ORNL]; Pullum, Laura L [ORNL]; Steed, Chad A [ORNL]; Chennubhotla, Chakra [University of Pittsburgh School of Medicine, Pittsburgh PA]; Quinn, Shannon [University of Pittsburgh School of Medicine, Pittsburgh PA]

    2013-01-01

    In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease susceptible regions.
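    One concrete (hypothetical, not ORBiT's actual pipeline) example of estimating a disease-spread parameter from a near-real-time feed is fitting the early exponential growth rate by regressing log case counts on time:

```python
import math

def estimate_growth_rate(cases):
    # Least-squares slope of log(cases) against time: a crude estimate of
    # the epidemic growth rate r from a case-count feed. In a simple SIR
    # early-growth approximation, r = beta - gamma, so this estimate can
    # parameterize a disease-spread model.
    logs = [math.log(c) for c in cases]
    n = len(logs)
    t_mean = (n - 1) / 2.0
    y_mean = sum(logs) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(logs))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

cases = [10, 15, 22, 33, 50, 74]              # synthetic feed, ~1.5x per step
print(round(estimate_growth_rate(cases), 2))  # 0.4, close to ln(1.5)
```

    In practice the feed would be noisier and the estimate would be updated as new counts arrive.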

  5. Electricity network limitations on large-scale deployment of wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Fairbairn, R.J.

    1999-07-01

    This report sought to identify limitations on the large-scale deployment of wind energy in the UK. A description of the existing electricity supply system in England, Scotland and Wales is given, and operational aspects of the integrated electricity networks, licence conditions, types of wind turbine generators, and the scope for deployment of wind energy in the UK are addressed. Technical limitations and the technical criteria stipulated by the Distribution and Grid Codes, the effects of system losses, and commercial issues are reviewed. Potential solutions to the technical limitations are proposed, and recommendations are outlined.

  6. A Large Scale Problem Based Learning inter-European Student Satellite Construction Project

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Alminde, Lars; Bisgaard, Morten

    2006-01-01

    that electronic communication technology was vital within the project. Additionally, the SSETI EXPRESS project implied the following problems: it did not fit a standard semester - 18 months for the satellite project compared to 5/6 months for a “normal” semester project; difficulties in integrating the tasks......A LARGE SCALE PROBLEM BASED LEARNING INTER-EUROPEAN STUDENT SATELLITE CONSTRUCTION PROJECT This paper describes the pedagogical outcome of a large scale PBL experiment. ESA (European Space Agency) Education Office launched in January 2004 an ambitious project: let students from all over Europe build.... The satellite was successfully launched on October 27th 2005 (http://www.express.space.aau.dk). The project was a student-driven project with student project responsibility, adding a lot of international experience and project management skills to the outcome of a more traditional one-semester, single-group...

  7. Density functional theory in surface science and heterogeneous catalysis

    DEFF Research Database (Denmark)

    Nørskov, Jens Kehlet; Scheffler, M.; Toulhoat, H.

    2006-01-01

    Solid surfaces are used extensively as catalysts throughout the chemical industry, in the energy sector, and in environmental protection. Recently, density functional theory has started providing new insight into the atomic-scale mechanisms of heterogeneous catalysis, helping to interpret the large...

  8. A service platform architecture design towards a light integration of heterogeneous systems in the wellbeing domain.

    Science.gov (United States)

    Yang, Yaojin; Ahtinen, Aino; Lahteenmaki, Jaakko; Nyman, Petri; Paajanen, Henrik; Peltoniemi, Teijo; Quiroz, Carlos

    2007-01-01

    System integration is one of the major challenges in building wellbeing- or healthcare-related information systems. In this paper, we share our experiences in designing a service platform, the Nuadu service platform, for providing integrated services in occupational health promotion and health risk management through two heterogeneous systems. Our design aims for a light integration covering the layers from data through service up to presentation, while maintaining the integrity of the underlying systems.

  9. Full-Scale Continuous Mini-Reactor Setup for Heterogeneous Grignard Alkylation of a Pharmaceutical Intermediate

    DEFF Research Database (Denmark)

    Pedersen, Michael Jønch; Holm, Thomas; Rahbek, Jesper P.

    2013-01-01

    A reactor setup consisting of two reactors in series has been implemented for a full-scale, heterogeneous Grignard alkylation. Solutions pass from a small filter reactor into a static mixer reactor with multiple side entries, thus combining continuous stirred tank reactor (CSTR) and plug flow...

  10. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  11. Applicability of laboratory data to large scale tests under dynamic loading conditions

    International Nuclear Information System (INIS)

    Kussmaul, K.; Klenk, A.

    1993-01-01

    The analysis of dynamic loading and subsequent fracture must be based on reliable data for loading and deformation history. This paper describes an investigation to examine the applicability of parameters determined by means of small-scale laboratory tests to large-scale tests. The following steps were carried out: (1) Determination of crack initiation by means of strain gauges applied in the crack tip field of compact tension specimens. (2) Determination of dynamic crack resistance curves of CT specimens using a modified key-curve technique. The key curves are determined by dynamic finite element analyses. (3) Determination of strain-rate-dependent stress-strain relationships for the finite element simulation of small-scale and large-scale tests. (4) Analysis of the loading history for small-scale tests with the aid of experimental data and finite element calculations. (5) Testing of dynamically loaded tensile specimens taken as strips from ferritic steel pipes with thicknesses of 13 mm and 18 mm, respectively. The strips contained slits and surface cracks. (6) Fracture mechanics analyses of the above-mentioned tests and of wide plate tests. The wide plates (960×608×40 mm³) had been tested in a propellant-driven 12 MN dynamic testing facility. For calculating the fracture mechanics parameters of both tests, a dynamic finite element simulation considering the dynamic material behaviour was employed. The finite element analyses showed good agreement with the simulated tests. This agreement made it possible to obtain critical J-integral values. Generally, the results of the large-scale tests were conservative. 19 refs., 20 figs., 4 tabs

  12. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  13. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Miles Miller

    Full Text Available Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in

  14. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Science.gov (United States)

    Miller, Miles; Hafner, Marc; Sontag, Eduardo; Davidsohn, Noah; Subramanian, Sairam; Purnick, Priscilla E M; Lauffenburger, Douglas; Weiss, Ron

    2012-01-01

    Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in isolation, and
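    A toy illustration (our construction, not the paper's reaction/diffusion model) of why heterogeneity can aid homeostasis: cells with identical oscillator periods stay synchronized and produce large population-level swings, while diverse periods desynchronize and flatten the summed tissue-level output.

```python
import math
import statistics

def population_output(periods, t_max=200.0, dt=0.5):
    # Mean output of a population of sinusoidal "oscillator modules",
    # one per cell, each with its own period.
    steps = int(t_max / dt)
    return [statistics.fmean(math.sin(2 * math.pi * (i * dt) / p)
                             for p in periods)
            for i in range(steps)]

sync = population_output([24.0] * 50)                            # identical cells
hetero = population_output([20.0 + 0.2 * k for k in range(50)])  # diverse cells
# The synchronized tissue output fluctuates far more than the heterogeneous one.
print(statistics.pstdev(sync) > statistics.pstdev(hetero))  # True
```

    This is the sense in which population-level synchronization, often considered a robust feature, is detrimental here: the heterogeneous population averages out individual oscillations into a steady aggregate.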

  15. Reactive solute transport in physically and chemically heterogeneous porous media with multimodal reactive mineral facies: the Lagrangian approach.

    Science.gov (United States)

    Soltanian, Mohamad Reza; Ritzi, Robert W; Dai, Zhenxue; Huang, Chao Cheng

    2015-03-01

    Physical and chemical heterogeneities have a large impact on reactive transport in porous media. Examples of heterogeneous attributes affecting reactive mass transport are the hydraulic conductivity (K), and the equilibrium sorption distribution coefficient (Kd). This paper uses the Deng et al. (2013) conceptual model for multimodal reactive mineral facies and a Lagrangian-based stochastic theory in order to analyze the reactive solute dispersion in three-dimensional anisotropic heterogeneous porous media with hierarchical organization of reactive minerals. An example based on real field data is used to illustrate the time evolution trends of reactive solute dispersion. The results show that the correlation between the hydraulic conductivity and the equilibrium sorption distribution coefficient does have a significant effect on reactive solute dispersion. The anisotropy ratio does not have a significant effect on reactive solute dispersion. Furthermore, through a sensitivity analysis we investigate the impact of changing the mean, variance, and integral scale of K and Kd on reactive solute dispersion. Copyright © 2014 Elsevier Ltd. All rights reserved.
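    The qualitative effect of the K-Kd correlation can be seen in a toy Lagrangian particle sketch (ours; the paper's analysis is analytical and three-dimensional): when conductivity and sorption are positively correlated, fast cells retard the solute more, so arrival times spread less than under negative correlation.

```python
import math
import random
import statistics

def arrival_time_variance(rho, n_particles=3000, n_cells=30, seed=0):
    # Particles cross cells with correlated lognormal conductivity K
    # (velocity ~ K) and sorption coefficient Kd (retardation R = 1 + Kd).
    # rho is the correlation between ln K and ln Kd.
    rng = random.Random(seed)
    times = []
    for _ in range(n_particles):
        t = 0.0
        for _ in range(n_cells):
            z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
            z_k = z1
            z_d = rho * z1 + math.sqrt(1 - rho ** 2) * z2
            v = math.exp(0.5 * z_k)            # velocity proportional to K
            r_fac = 1.0 + math.exp(0.5 * z_d)  # retardation 1 + Kd
            t += r_fac / v                     # retarded residence time
        times.append(t)
    return statistics.pvariance(times)

# Positive K-Kd correlation narrows the arrival-time distribution.
print(arrival_time_variance(0.9) < arrival_time_variance(-0.9))  # True
```

    The variance of arrival times is one proxy for the reactive solute dispersion discussed in the abstract.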

  16. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  17. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
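    A synthetic-signal sketch (our construction, not the hot-wire analysis itself) of the decomposition described above: low-pass filter the signal to obtain the large-scale motion, then check that the small-scale envelope rises and falls with it, i.e. amplitude modulation.

```python
import math
import random
import statistics

def moving_average(x, w):
    # Crude low-pass filter used to split large and small scales.
    half = w // 2
    return [statistics.fmean(x[max(0, i - half):i + half + 1])
            for i in range(len(x))]

def pearson(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Synthetic velocity signal: small-scale noise whose amplitude is
# modulated by a large-scale wave.
rng = random.Random(1)
n = 4000
large = [math.sin(2 * math.pi * i / 1000.0) for i in range(n)]
u = [large[i] + 0.3 * (1 + 0.6 * large[i]) * rng.gauss(0, 1) for i in range(n)]

ls = moving_average(u, 201)                      # large-scale component
hs = [u[i] - ls[i] for i in range(n)]            # small-scale residual
env = moving_average([abs(v) for v in hs], 201)  # small-scale envelope
r = pearson(ls, env)
print(r > 0.5)  # True: the envelope tracks the large-scale motion
```

    In the unmodulated case (constant noise amplitude) the correlation would hover near zero.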

  18. Effects of vegetation heterogeneity and surface topography on spatial scaling of net primary productivity

    Science.gov (United States)

    Chen, J. M.; Chen, X.; Ju, W.

    2013-07-01

    Due to the heterogeneous nature of the land surface, spatial scaling is an inevitable issue in the development of land models coupled with low-resolution Earth system models (ESMs) for predicting land-atmosphere interactions and carbon-climate feedbacks. In this study, a simple spatial scaling algorithm is developed to correct errors in net primary productivity (NPP) estimates made at a coarse spatial resolution based on sub-pixel information of vegetation heterogeneity and surface topography. An eco-hydrological model BEPS-TerrainLab, which considers both vegetation and topographical effects on the vertical and lateral water flows and the carbon cycle, is used to simulate NPP at 30 m and 1 km resolutions for a 5700 km² watershed with an elevation range from 518 m to 3767 m in the Qinling Mountain, Shanxi Province, China. Assuming that the NPP simulated at 30 m resolution represents the reality and that at 1 km resolution is subject to errors due to sub-pixel heterogeneity, a spatial scaling index (SSI) is developed to correct the coarse resolution NPP values pixel by pixel. The agreement between the NPP values at these two resolutions is improved considerably from R² = 0.782 to R² = 0.884 after the correction. The mean bias error (MBE) in NPP modelled at the 1 km resolution is reduced from 14.8 g C m⁻² yr⁻¹ to 4.8 g C m⁻² yr⁻¹ in comparison with NPP modelled at 30 m resolution, where the mean NPP is 668 g C m⁻² yr⁻¹. The range of spatial variations of NPP at 30 m resolution is larger than that at 1 km resolution. Land cover fraction is the most important vegetation factor to be considered in NPP spatial scaling, and slope is the most important topographical factor for NPP spatial scaling especially in mountainous areas, because of its influence on the lateral water redistribution, affecting water table, soil moisture and plant growth. Other factors including leaf area index (LAI) and elevation have small and additive effects on improving the spatial scaling
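    The bias-correction bookkeeping can be sketched as follows (a schematic, hypothetical form of the SSI; the paper derives it from land-cover fraction, slope, LAI and elevation rather than from the reference field itself):

```python
import statistics

def mean_bias_error(modelled, reference):
    # MBE = mean(modelled - reference), the metric used above to compare
    # 1 km modelled NPP against the 30 m reference simulation.
    return statistics.fmean(m - r for m, r in zip(modelled, reference))

def apply_ssi(coarse_npp, ssi):
    # Pixel-by-pixel correction: each coarse pixel is scaled by its
    # spatial scaling index.
    return [npp * s for npp, s in zip(coarse_npp, ssi)]

reference = [700.0, 620.0, 680.0]  # 30 m aggregated NPP (g C / m^2 / yr)
coarse = [720.0, 640.0, 690.0]     # 1 km modelled NPP, biased high
ssi = [r / c for r, c in zip(reference, coarse)]  # idealized SSI values

print(round(mean_bias_error(coarse, reference), 1))                  # 16.7
print(abs(mean_bias_error(apply_ssi(coarse, ssi), reference)) < 1e-9)  # True
```

    A real SSI derived from sub-pixel land-surface attributes would only partially close the bias, as in the reported drop from 14.8 to 4.8 g C m⁻² yr⁻¹.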

  19. Effects of vegetation heterogeneity and surface topography on spatial scaling of net primary productivity

    Directory of Open Access Journals (Sweden)

    J. M. Chen

    2013-07-01

    Full Text Available Due to the heterogeneous nature of the land surface, spatial scaling is an inevitable issue in the development of land models coupled with low-resolution Earth system models (ESMs) for predicting land-atmosphere interactions and carbon-climate feedbacks. In this study, a simple spatial scaling algorithm is developed to correct errors in net primary productivity (NPP) estimates made at a coarse spatial resolution based on sub-pixel information of vegetation heterogeneity and surface topography. An eco-hydrological model BEPS-TerrainLab, which considers both vegetation and topographical effects on the vertical and lateral water flows and the carbon cycle, is used to simulate NPP at 30 m and 1 km resolutions for a 5700 km² watershed with an elevation range from 518 m to 3767 m in the Qinling Mountain, Shanxi Province, China. Assuming that the NPP simulated at 30 m resolution represents the reality and that at 1 km resolution is subject to errors due to sub-pixel heterogeneity, a spatial scaling index (SSI) is developed to correct the coarse resolution NPP values pixel by pixel. The agreement between the NPP values at these two resolutions is improved considerably from R² = 0.782 to R² = 0.884 after the correction. The mean bias error (MBE) in NPP modelled at the 1 km resolution is reduced from 14.8 g C m⁻² yr⁻¹ to 4.8 g C m⁻² yr⁻¹ in comparison with NPP modelled at 30 m resolution, where the mean NPP is 668 g C m⁻² yr⁻¹. The range of spatial variations of NPP at 30 m resolution is larger than that at 1 km resolution. Land cover fraction is the most important vegetation factor to be considered in NPP spatial scaling, and slope is the most important topographical factor for NPP spatial scaling especially in mountainous areas, because of its influence on the lateral water redistribution, affecting water table, soil moisture and plant growth. Other factors including leaf area index (LAI) and elevation have small and additive effects on improving

  20. Locating inefficient links in a large-scale transportation network

    Science.gov (United States)

    Sun, Li; Liu, Like; Xu, Zhongzhi; Jie, Yang; Wei, Dong; Wang, Pu

    2015-02-01

    Based on data from a geographical information system (GIS) and daily commuting origin-destination (OD) matrices, we estimated the distribution of traffic flow in the San Francisco road network and studied Braess's paradox in a large-scale transportation network with realistic travel demand. We measured the variation of the total travel time ΔT when a road segment is closed, and found that |ΔT| follows a power-law distribution if ΔT ≠ 0. This implies that most roads have a negligible effect on the efficiency of the road network, while the failure of a few crucial links would result in severe travel delays, and closure of a few inefficient links would counter-intuitively reduce travel costs considerably. Generating three theoretical networks, we discovered that the heterogeneously distributed travel demand may be the origin of the observed power-law distributions of |ΔT|. Finally, a genetic algorithm was used to pinpoint inefficient link clusters in the road network. We found that closing specific road clusters would further improve the transportation efficiency.
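    The link-closure measurement can be sketched on a toy network (ours; the study used the San Francisco network with OD-based flows and congestion effects, which this fixed-cost shortest-path sketch ignores, so it cannot exhibit Braess's paradox, only the crucial-versus-negligible split):

```python
import heapq

def shortest_time(adj, src, dst):
    # Dijkstra over adj = {node: [(neighbour, travel_time), ...]}.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def without_edge(adj, u, v):
    # Copy of the network with directed edge (u, v) closed.
    return {a: [(b, w) for b, w in nbrs if (a, b) != (u, v)]
            for a, nbrs in adj.items()}

adj = {"A": [("B", 1.0), ("C", 4.0)],
       "B": [("C", 1.0), ("D", 5.0)],
       "C": [("D", 1.0)],
       "D": []}
base = shortest_time(adj, "A", "D")  # A-B-C-D = 3.0
for u, v in [("A", "B"), ("B", "C"), ("A", "C")]:
    dT = shortest_time(without_edge(adj, u, v), "A", "D") - base
    print((u, v), dT)  # closing A-B or B-C delays travel; A-C has no effect
```

    Summing such ΔT values over all OD pairs, under congestion-dependent travel times, gives the quantity whose distribution the study examines.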

  1. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Gruchalla, Kenny M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Brunhart-Lupo, Nicholas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Potter, Kristin C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clyne, John [National Center for Atmospheric Research (NCAR)]

    2017-12-04

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed to be the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while giving the user control over where data loss, and thus reduced accuracy, occurs in the analysis. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
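
    The core idea, keeping more wavelet coefficients in salient blocks than in contextual ones, can be sketched with a one-level Haar transform. This is a minimal illustration, not the NREL implementation; block sizes, keep-fractions, and the transform level are assumptions, and reconstruction (the inverse transform) is omitted.

    ```python
    import numpy as np

    def haar2d(block):
        """One-level 2-D Haar transform of a square block with even side length."""
        a = (block[0::2] + block[1::2]) / 2.0   # row averages
        d = (block[0::2] - block[1::2]) / 2.0   # row details
        ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
        lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
        hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
        hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
        return np.block([[ll, lh], [hl, hh]])

    def compress_block(block, keep_fraction):
        """Zero out all but the largest-magnitude Haar coefficients."""
        coeffs = haar2d(block)
        flat = np.abs(coeffs).ravel()
        k = max(1, int(keep_fraction * flat.size))
        threshold = np.partition(flat, -k)[-k]  # k-th largest magnitude
        return np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)

    rng = np.random.default_rng(1)
    field = rng.normal(size=(8, 8))
    salient = compress_block(field, keep_fraction=0.5)    # e.g. wake region
    context = compress_block(field, keep_fraction=0.05)   # e.g. upper atmosphere
    ```

    Storing only the nonzero coefficients per block (sparsely encoded) is what yields the block-by-block, saliency-dependent data reduction described above.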

  2. Viscous fingering with permeability heterogeneity

    International Nuclear Information System (INIS)

    Tan, C.; Homsy, G.M.

    1992-01-01

    Viscous fingering in miscible displacements in the presence of permeability heterogeneities is studied using two-dimensional simulations. The heterogeneities are modeled as stationary random functions of space with finite correlation scale. Both the variance and scale of the heterogeneities are varied over modest ranges. It is found that the fingered zone grows linearly in time in a fashion analogous to that found in homogeneous media by Tan and Homsy [Phys. Fluids 31, 1330 (1988)], indicating a close coupling between viscous fingering on the one hand and flow through preferentially more permeable paths on the other. The growth rate of the mixing zone increases monotonically with the variance of the heterogeneity, as expected, but shows a maximum as the correlation scale is varied. The latter is explained as a "resonance" between the natural scale of fingers in homogeneous media and the correlation scale.
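
    One ingredient of such simulations is generating a stationary random field with prescribed variance and correlation scale. A common way to do this (assumed here; the abstract does not specify the generator) is to low-pass filter white noise in Fourier space and rescale:

    ```python
    import numpy as np

    def random_log_perm_field(n, correlation_scale, variance, seed=0):
        """Stationary random field on an n x n periodic grid: white noise
        smoothed by a Gaussian filter (cutting wavelengths shorter than the
        correlation scale), then rescaled to the target variance."""
        rng = np.random.default_rng(seed)
        noise = rng.normal(size=(n, n))
        kx = np.fft.fftfreq(n)[:, None]
        ky = np.fft.fftfreq(n)[None, :]
        filt = np.exp(-2.0 * (np.pi * correlation_scale) ** 2 * (kx**2 + ky**2))
        field = np.real(np.fft.ifft2(np.fft.fft2(noise) * filt))
        field -= field.mean()
        field *= np.sqrt(variance) / field.std()  # enforce target variance
        return field

    # Hypothetical parameters: 64 x 64 grid, correlation scale of 4 cells,
    # log-permeability variance 0.25.
    log_perm = random_log_perm_field(n=64, correlation_scale=4.0, variance=0.25)
    ```

    Sweeping `variance` and `correlation_scale` over modest ranges, as in the study, then amounts to regenerating the field and re-running the displacement simulation on it.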

  3. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method, termed the MODIS Scaling Approach (MSA), to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R2 ranging between 0.19 and 0.89, with an overall R2 of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
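
    The scaling step, mapping peak-season EVI to a within-pixel cropped fraction, can be sketched as a clipped linear ramp. The two EVI thresholds below are hypothetical placeholders for illustration, not the MSA's actual values.

    ```python
    import numpy as np

    EVI_NONCROP = 0.2   # at or below: 0% of the pixel cropped (assumed)
    EVI_FULLCROP = 0.6  # at or above: 100% of the pixel cropped (assumed)

    def percent_cropped(peak_evi):
        """Linearly scale peak-season EVI to a within-pixel cropped fraction,
        clipped to [0, 1]."""
        frac = (peak_evi - EVI_NONCROP) / (EVI_FULLCROP - EVI_NONCROP)
        return float(np.clip(frac, 0.0, 1.0))

    # Hypothetical peak-phenology EVI values for four MODIS pixels.
    peak_evi_values = [0.1, 0.3, 0.45, 0.7]
    cropped_fractions = [percent_cropped(v) for v in peak_evi_values]
    ```

    Summing `cropped_fraction * pixel_area` over all pixels flagged as cropped by the phenology test would then give the regional winter cropped area.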

  4. A mixed-methods study of system-level sustainability of evidence-based practices in 12 large-scale implementation initiatives.

    Science.gov (United States)

    Scudder, Ashley T; Taber-Thomas, Sarah M; Schaffner, Kristen; Pemberton, Joy R; Hunter, Leah; Herschell, Amy D

    2017-12-07

    In recent decades, evidence-based practices (EBPs) have been broadly promoted in community behavioural health systems in the United States of America, yet reported EBP penetration rates remain low. Determining how to systematically sustain EBPs in complex, multi-level service systems has important implications for public health. This study examined factors impacting the sustainability of parent-child interaction therapy (PCIT) in large-scale initiatives in order to identify potential predictors of sustainment. A mixed-methods approach to data collection was used. Qualitative interviews and quantitative surveys examining sustainability processes and outcomes were completed by participants from 12 large-scale initiatives. Sustainment strategies fell into nine categories, including infrastructure, training, marketing, integration and building partnerships. Strategies involving integration of PCIT into existing practices and quality monitoring predicted sustainment, while financing also emerged as a key factor. The reported factors and strategies impacting sustainability varied across initiatives; however, integration into existing practices, monitoring quality and financing appear central to high levels of sustainability of PCIT in community-based systems. More detailed examination of the progression of specific activities related to these strategies may aid in identifying priorities to include in strategic planning of future large-scale initiatives. ClinicalTrials.gov ID NCT02543359; Protocol number PRO12060529.

  5. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many public key infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI deployments.

  6. Prediction of Gas Injection Performance for Heterogeneous Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Blunt, Martin J.; Orr, Franklin M.

    1999-05-17

    This report describes research carried out in the Department of Petroleum Engineering at Stanford University from September 1997 to September 1998 under the second year of a three-year grant from the Department of Energy on the "Prediction of Gas Injection Performance for Heterogeneous Reservoirs." The research effort is an integrated study of the factors affecting gas injection, from the pore scale to the field scale, and involves theoretical analysis, laboratory experiments, and numerical simulation. The original proposal described research in four areas: (1) Pore-scale modeling of three-phase flow in porous media; (2) Laboratory experiments and analysis of factors influencing gas injection performance at the core scale, with an emphasis on the fundamentals of three-phase flow; (3) Benchmark simulations of gas injection at the field scale; and (4) Development of a streamline-based reservoir simulator. Each stage of the research is planned to provide input and insight into the next stage, such that at the end we should have an integrated understanding of the key factors affecting field-scale displacements.

  7. Modeling the large-scale effects of surface moisture heterogeneity on wetland carbon fluxes in the West Siberian Lowland

    Directory of Open Access Journals (Sweden)

    T. J. Bohn

    2013-10-01

    We used a process-based model to examine the role of spatial heterogeneity of surface and sub-surface water in the carbon budget of the wetlands of the West Siberian Lowland over the period 1948–2010. We found that, while surface heterogeneity (fractional saturated area) had little overall effect on estimates of the region's carbon fluxes, sub-surface heterogeneity (spatial variations in water table depth) played an important role in both the overall magnitude and the spatial distribution of those estimates. In particular, to reproduce the spatial pattern of CH4 emissions recorded by intensive in situ observations across the domain, in which very little CH4 is emitted north of 60° N, it was necessary to (a) account for CH4 emissions from unsaturated wetlands and (b) use spatially varying methane model parameters that reduced estimated CH4 emissions in the northern (permafrost) half of the domain (and/or account for lower CH4 emissions under inundated conditions). Our results suggest that previous estimates of the response of these wetlands to thawing permafrost may have overestimated future increases in methane emissions in the permafrost zone.
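
    Why sub-grid water table heterogeneity matters can be shown with a toy calculation: because CH4 emission responds nonlinearly to water table depth, the flux computed from the grid-mean water table differs from the mean of the sub-grid fluxes (Jensen's inequality). The exponential response curve and all numbers below are hypothetical, for illustration only.

    ```python
    import numpy as np

    def ch4_flux(wtd_cm):
        """Hypothetical CH4 flux response: deeper water table -> less CH4
        (exponential decay with depth below the surface, in cm)."""
        return 20.0 * np.exp(-wtd_cm / 10.0)

    # Sub-grid water table depths (cm) within one heterogeneous grid cell.
    sub_grid_wtd = np.array([0.0, 5.0, 20.0, 55.0])

    flux_from_mean = float(ch4_flux(sub_grid_wtd.mean()))    # ignores heterogeneity
    mean_of_fluxes = float(np.mean(ch4_flux(sub_grid_wtd)))  # resolves it
    ```

    For a convex response like this one, `mean_of_fluxes` exceeds `flux_from_mean`: a model driven only by the grid-mean water table would underestimate emissions from the wet fraction of the cell, which is the kind of error the sub-surface heterogeneity treatment above is designed to avoid.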

  8. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large-scale solar heating applications, with a focus on the Canadian climate and market. (author)

  9. Emerging large-scale solar heating applications