WorldWideScience

Sample records for monitor large-scale distribution

  1. Configuration monitoring tool for large-scale distributed computing

    International Nuclear Information System (INIS)

    Wu, Y.; Graham, G.; Lu, X.; Afaq, A.; Kim, B.J.; Fisk, I.

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing needs. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have a configuration monitor in place. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, is used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources.
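    To make the record-keeping idea concrete, here is a minimal Python sketch of a relational store that retains both current and historical configuration values. The schema, site names and version strings are illustrative assumptions, not the actual CMS tool (which is built on the Globus MDS and GSI):

```python
import sqlite3

class ConfigStore:
    """Minimal sketch: relational history of current and old configurations."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS config ("
            "id INTEGER PRIMARY KEY AUTOINCREMENT, "
            "site TEXT, key TEXT, value TEXT)"
        )

    def record(self, site, key, value):
        # Old rows are never overwritten, so the configuration history
        # of every site remains queryable.
        self.db.execute(
            "INSERT INTO config (site, key, value) VALUES (?, ?, ?)",
            (site, key, value),
        )
        self.db.commit()

    def current(self, site, key):
        row = self.db.execute(
            "SELECT value FROM config WHERE site=? AND key=? "
            "ORDER BY id DESC LIMIT 1",
            (site, key),
        ).fetchone()
        return row[0] if row else None

store = ConfigStore()
store.record("us-cms-tier1", "globus_version", "2.2")   # hypothetical values
store.record("us-cms-tier1", "globus_version", "2.4")
print(store.current("us-cms-tier1", "globus_version"))  # -> 2.4
```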

  2. Configuration monitoring tool for large-scale distributed computing

    CERN Document Server

    Wu, Y; Fisk, I; Graham, G; Kim, B J; Lü, X

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing needs. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have a configuration monitor in place. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, is used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources.

  3. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
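    The core of the filtering idea is that an event is dropped at the source unless some subscriber's predicate matches it, which is what reduces monitoring traffic. A minimal Python sketch, with event shapes and handler names invented for illustration rather than taken from the paper:

```python
from collections import defaultdict

class EventFilter:
    """Sketch of subscription-based event filtering (illustrative only)."""

    def __init__(self):
        # event type -> list of (predicate, handler) subscriptions
        self.subscriptions = defaultdict(list)

    def subscribe(self, etype, predicate, handler):
        self.subscriptions[etype].append((predicate, handler))

    def publish(self, event):
        # Events that match no subscription are discarded at the source,
        # so they never add to the monitoring traffic.
        for predicate, handler in self.subscriptions.get(event["type"], []):
            if predicate(event):
                handler(event)

f = EventFilter()
f.subscribe("cpu_load", lambda e: e["value"] > 0.9,
            lambda e: print("overload on", e["node"]))
f.publish({"type": "cpu_load", "node": "n17", "value": 0.95})  # handled
f.publish({"type": "cpu_load", "node": "n02", "value": 0.40})  # filtered out
```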

  4. Large scale distribution monitoring of FRP-OF based on BOTDR technique for infrastructures

    Science.gov (United States)

    Zhou, Zhi; He, Jianping; Yan, Kai; Ou, Jinping

    2007-04-01

    The BOTDA(R) sensing technique is considered one of the most practical solutions for instrumenting large-sized structures. However, a major obstacle remains for applying BOTDA(R) over large areas: the high cost and limited reliability of the sensing head, which are tied to sensor installation and survival. In this paper, we report a novel low-cost and highly reliable BOTDA(R) sensing head using an FRP (Fiber Reinforced Polymer)-bare optical fiber rebar, named BOTDA(R)-FRP-OF. We investigated the surface bonding and its mechanical strength by SEM and intensity experiments. Because the strain difference between the OF and the host matrix may result in measurement error, the strain transfer from host to OF has been theoretically studied. Furthermore, the strain and temperature sensing properties of GFRP-OFs at different gauge lengths were tested under different spatial and readout resolutions using a commercial BOTDA instrument. A dual FRP-OF temperature compensation method has also been proposed and analyzed. Finally, BOTDA(R)-OFs have been applied to the Tiyu west road civil structure at Guangzhou and to the Daqing Highway. This novel FRP-OF rebar shows both high strength and good sensing properties, and can be used in long-term SHM of civil infrastructures.
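    For readers unfamiliar with Brillouin sensing: the measured quantity is a frequency shift that responds linearly to both strain and temperature, so a loose, strain-free fiber can be used for temperature compensation, as the dual-FRP-OF method above proposes. A hedged Python sketch using typical textbook coefficients for silica fiber at 1550 nm (roughly 0.05 MHz per microstrain and 1 MHz per degree C), not the authors' calibrated values:

```python
# Typical coefficients for standard silica fiber at 1550 nm; real systems
# calibrate these per fiber (values here are assumptions, not the paper's).
C_STRAIN = 0.05  # MHz per microstrain
C_TEMP = 1.0     # MHz per degree C

def temperature_change_c(shift_loose_mhz):
    # The loose (strain-free) fiber responds to temperature only.
    return shift_loose_mhz / C_TEMP

def strain_microstrain(shift_sensing_mhz, shift_loose_mhz):
    # Subtracting the loose fiber's shift removes the thermal part
    # of the sensing fiber's response, leaving the strain contribution.
    return (shift_sensing_mhz - shift_loose_mhz) / C_STRAIN

print(temperature_change_c(10.0))      # -> 10.0 degrees C
print(strain_microstrain(60.0, 10.0))  # -> 1000.0 microstrain
```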

  5. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues

  6. Using Distributed Fiber Optic Sensing to Monitor Large Scale Permafrost Transitions: Preliminary Results from a Controlled Thaw Experiment

    Science.gov (United States)

    Ajo Franklin, J. B.; Wagner, A. M.; Lindsey, N.; Dou, S.; Bjella, K.; Daley, T. M.; Freifeld, B. M.; Ulrich, C.; Gelvin, A.; Morales, A.; James, S. R.; Saari, S.; Ekblaw, I.; Wood, T.; Robertson, M.; Martin, E. R.

    2016-12-01

    In a warming world, permafrost landscapes are being rapidly transformed by thaw, yielding surface subsidence and groundwater flow alteration. The same transformations pose a threat to arctic infrastructure and can induce catastrophic failure of the roads, runways, and pipelines on which human habitation depends. Scalable solutions to monitoring permafrost thaw dynamics are required both to quantitatively understand biogeochemical feedbacks and to protect built infrastructure from damage. Unfortunately, permafrost alteration happens over the time scale of climate change, years to decades, a decided challenge for testing new sensing technologies in a limited context. One solution is to engineer systems capable of rapidly thawing large permafrost units to allow short duration experiments targeting next-generation sensing approaches. We present preliminary results from a large-scale controlled permafrost thaw experiment designed to evaluate the utility of different geophysical approaches for tracking the cause, precursors, and early phases of thaw subsidence. We focus on the use of distributed fiber optic sensing for this challenge and deployed distributed temperature (DTS), strain (DSS), and acoustic (DAS) sensing systems in a 2D array to detect thaw signatures. A 10 x 15 x 1 m section of subsurface permafrost was heated using an array of 120 downhole heaters (60 W) at an experimental site near Fairbanks, AK. Ambient noise analysis of DAS datasets collected at the plot, coupled to shear wave inversion, was utilized to evaluate changes in shear wave velocity associated with heating and thaw. These measurements were confirmed by seismic surveys collected using a semi-permanent orbital seismic source activated on a daily basis. Fiber optic measurements were complemented by subsurface thermistor and thermocouple arrays, time-lapse total station surveys, LIDAR, secondary seismic measurements (geophone and broadband recordings), time-lapse ERT, borehole NMR, soil

  7. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology Enables practitioners to study distributed large scale dimensional metrology independently Includes specific examples of the development of new system prototypes

  8. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large-scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. At the national level, and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low readings, and because they know that some parts of the area are at least as contaminated as the area around Chernobyl was, some people are reluctant to go back home.

  9. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  10. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
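    Detrended Fluctuation Analysis computes the RMS fluctuation of an integrated, locally detrended series at several window sizes; the scaling exponent is the slope of log F(n) against log n. A compact generic DFA sketch with numpy (the paper computes local variations of this exponent along the sequence; this sketch shows only the global version):

```python
import numpy as np

def dfa_exponent(signal, scales):
    profile = np.cumsum(signal - np.mean(signal))  # integrate the series
    fluctuations = []
    for n in scales:
        n_windows = len(profile) // n
        windows = profile[: n_windows * n].reshape(n_windows, n)
        x = np.arange(n)
        # Remove a linear trend from each window; keep the RMS residual.
        rms = [np.sqrt(np.mean((w - np.polyval(np.polyfit(x, w, 1), x)) ** 2))
               for w in windows]
        fluctuations.append(np.mean(rms))
    # The scaling exponent is the slope of log F(n) versus log n.
    slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.normal(size=10_000)
print(dfa_exponent(white, [16, 32, 64, 128, 256]))  # ~0.5 for white noise
```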

  11. Glass badge dosimetry system for large scale personal monitoring

    International Nuclear Information System (INIS)

    Norimichi Juto

    2002-01-01

    A Glass Badge using a silver-activated phosphate glass dosemeter was specially developed for large-scale personal monitoring. Dosimetry systems, such as an automatic reader and a dose calculation algorithm, were developed as well to achieve reasonable personal monitoring. In large-scale personal monitoring, both precision of dosimetry and reliable handling of a large volume of personal data become very important. The silver-activated phosphate glass dosemeter has excellent basic characteristics for dosimetry, such as homogeneous and stable sensitivity, negligible fading and so on. The Glass Badge was designed to measure photons in the 10 keV - 10 MeV range, beta particles in the 300 keV - 3 MeV range, and neutrons in the 0.025 eV - 15 MeV range by an included SSNTD. The developed Glass Badge dosimetry system has not only these basic characteristics but also many features to maintain good precision in dosimetry and data handling. In this presentation, features of the Glass Badge dosimetry system and examples of practical personal monitoring systems will be presented. (Author)

  12. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
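    The report does not specify PHM's detection algorithm, but a rolling z-score over a subsystem metric is one simple way to flag the kind of performance fault described. A hedged sketch; the threshold and the throughput-like values below are illustrative assumptions:

```python
import statistics

def detect_anomaly(history, new_sample, threshold=3.0):
    """Flag new_sample if it deviates more than `threshold` standard
    deviations from the recent history of the metric (illustrative only)."""
    if len(history) < 10:
        return False  # not enough baseline yet
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    return stdev > 0 and abs(new_sample - mean) / stdev > threshold

# Hypothetical per-second throughput samples for one subsystem.
baseline = [100.2, 99.8, 101.1, 100.5, 99.9, 100.0, 100.7, 99.5, 100.3, 100.1]
print(detect_anomaly(baseline, 100.4))  # False: within normal variation
print(detect_anomaly(baseline, 73.0))   # True: degraded subsystem
```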

  13. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    International Nuclear Information System (INIS)

    Babik, Marian; Hook, Nicholas; Lansdale, Thomas Hector; Lenkes, Daniel; Siket, Miroslav; Waldron, Denis; Fedorko, Ivan

    2011-01-01

    At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires not only monitoring servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) in order to have a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  14. ANTITRUST ISSUES IN THE LARGE-SCALE FOOD DISTRIBUTION SECTOR

    Directory of Open Access Journals (Sweden)

    Enrico Adriano Raffaelli

    2014-12-01

    In light of the slow modernization of the Italian large-scale food distribution sector, of its fragmentation at the national level, of the significant role of cooperatives at the local level and of the alliances between food retail chains, the ICA (the Italian Competition Authority) has in recent years developed a strong interest in this sector. After having analyzed the peculiarities of the Italian large-scale food distribution sector, this article presents the recent approach taken by the ICA toward the main antitrust issues in this sector. In the analysis of such issues, mainly the contractual relations between the GDO retailers and their suppliers, the introduction of Article 62 of Law no. 27 of 24th March 2012 is crucial because, by facilitating and encouraging complaints by the interested parties, it should allow normal competitive dynamics to develop within the food distribution sector, where companies should be free to enter the market using the tools at their disposal, without undue restrictions.

  15. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with large number of high capacity nodes and transmission links, and shared by a large number of users...

  16. Distributed system for large-scale remote research

    International Nuclear Information System (INIS)

    Ueshima, Yutaka

    2002-01-01

    In advanced photon research, large-scale simulations and high-resolution observations are powerful tools. In numerical and real experiments, a real-time visualization and steering system is considered a promising method of data analysis. This approach is suited to analyses that are run once, or to low-cost experiments and simulations. In research on an unknown problem, however, the output data must be analyzed many times, because a conclusive analysis is difficult to reach in one pass. Consequently, output data should be filed so that they can be referenced and analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be a functionally distributed system. (author)

  17. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (now the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and to increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  18. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  19. Monitoring and Information Fusion for Search and Rescue Operations in Large-Scale Disasters

    National Research Council Canada - National Science Library

    Nardi, Daniele

    2002-01-01

    ... for information fusion with application to search-and-rescue and large scale disaster relief. The objective is to develop and to deploy tools to support the monitoring activities in an intervention caused by a large-scale disaster...

  20. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks

    Directory of Open Access Journals (Sweden)

    Guangwen Fan

    2015-09-01

    Temperature distribution is a critical indicator of the health condition of Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensor networks, high-temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, and local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications.

  1. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks.

    Science.gov (United States)

    Fan, Guangwen; Shen, Yu; Hao, Xiaowei; Yuan, Zongming; Zhou, Zhi

    2015-09-18

    Temperature distribution is a critical indicator of the health condition of Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensor networks, high-temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, and local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications.

  2. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 kW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications.
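    Since 1 Gy is 1 J/kg, the calorimetric estimate follows directly from the specific heat of water: dose is approximately c_p times the temperature rise. A worked Python example with illustrative temperatures, not facility data:

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K); 1 Gy = 1 J/kg

def absorbed_dose_gray(t_in_c, t_out_c):
    # Assumes all deposited beam energy appears as heat in the stream
    # and heat losses are negligible.
    return CP_WATER * (t_out_c - t_in_c)

# A 0.12 degree C rise across the beam corresponds to roughly 0.5 kGy:
print(f"{absorbed_dose_gray(20.00, 20.12):.1f} Gy")  # -> 502.3 Gy
```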

  3. DNSSM: A Large Scale Passive DNS Security Monitoring Framework

    OpenAIRE

    Marchal, Samuel; François, Jérôme; Wagner, Cynthia; State, Radu; Dulaunoy, Alexandre; Engel, Thomas; Festor, Olivier

    2012-01-01

    We present a monitoring approach and the supporting software architecture for passive DNS traffic. Monitoring DNS traffic can reveal essential network and system level activity profiles. Worm infected and botnet participating hosts can be identified and malicious backdoor communications can be detected. Any passive DNS monitoring solution needs to address several challenges that range from architectural approaches for dealing with large volumes of data up to specific D...

  4. Self managing monitoring for highly elastic large scale Cloud deployments

    OpenAIRE

    Ward, Jonathan Stuart; Barker, Adam David

    2014-01-01

    Infrastructure as a Service computing exhibits a number of properties which are not found in conventional server deployments. Elasticity is among the most significant of these properties, and it has wide-reaching implications for applications deployed in cloud-hosted VMs. Among the applications affected by elasticity is monitoring. In this paper we investigate the challenges of monitoring large cloud deployments and how these challenges differ from previous monitoring problems. In order to mee...

  5. Large scale air monitoring: Biological indicators versus air particulate matter

    International Nuclear Information System (INIS)

    Rossbach, M.; Jayasekera, R.; Kniewald, G.

    2000-01-01

    Biological indicator organisms have been widely used for monitoring and banking purposes for many years. Although the complexity of the interactions between bioorganisms and their environment is generally not easily comprehensible, environmental quality assessment using the bioindicator approach offers some convincing advantages compared to direct analysis of soil, water, or air. Direct measurement of air particulates is restricted to experienced laboratories with access to expensive sampling equipment. Additionally, the amount of material collected is generally just enough for one determination per sampling, so no multidimensional characterization may be possible. Further, fluctuations in air masses have a pronounced effect on the results from air filter sampling. Combining the integrating property of bioindicators with the worldwide availability and uniform matrix characteristics of air particulates, as a prerequisite for global monitoring of air pollution, will be discussed. A new approach for sampling urban dust using large-volume filtering devices installed in the air conditioners of large hotel buildings is assessed. A first experiment was initiated to collect air particulates (300 to 500 g each) from a number of hotels during a period of three to four months, by successive vacuum cleaning of used inlet filters from high-volume air conditioning installations, reflecting average concentrations per three months in different large cities. This approach is expected to be upgraded and applied for global monitoring. Highly positively correlated elements, such as K/S, Zn/P and the rare earth elements (REE), were found in lichen, and a significant negative correlation between Hg and Cu was observed in these samples. The ratio of concentrations of elements in dust and Usnea spp. is highest for Cr, Zn, and Fe (400-200) and lowest for elements such as Ca, Rb, and Sr (20-10). (author)

  6. Vibration Monitoring of a Large Scale Heavy Haul Railway Viaduct

    Directory of Open Access Journals (Sweden)

    Busatta Fulvio

    2015-01-01

    In South Africa, heavy haul railway transport was introduced in the mid-1970s for the Iron Ore and Coal Export lines. In recent decades, the expansion of existing mines, new markets and competition from iron ore- and coal-exporting countries have led owners and operators to progressively increase the capacity of the export lines. On the one hand, operational efficiencies have been improved; on the other hand, a significant increase in rail traffic has been experienced. Thus, bridges along the export lines are now crossed by heavier and longer trains with more frequent train passages than in the past. Increased train loading may lead to significant consequences for structures, such as dynamic amplification and reduced service life due to fatigue. Hence, dynamic assessment and monitoring of the structural condition of bridges under actual train loading are becoming more relevant to support decision-making processes. The paper presents the investigations carried out on the Olifants River Viaduct, a critical structure along the Iron Ore Export Line, in order to implement a state-of-the-art vibration monitoring system.

  7. Optimizing Distributed Machine Learning for Large Scale EEG Data Set

    Directory of Open Access Journals (Sweden)

    M Bilal Shaikh

    2017-06-01

    Distributed Machine Learning (DML) has gained more importance than ever in this era of Big Data. There are many challenges in scaling machine learning techniques on distributed platforms. When it comes to scalability, improving processor technology for high-level computation of data is at its limit; however, increasing the number of machine nodes and distributing data along with computation looks like a viable solution. Different frameworks and platforms are available to solve DML problems. These platforms provide automated random data distribution of datasets, which misses the power of user-defined intelligent data partitioning based on domain knowledge. We have conducted an empirical study which uses an EEG data set collected through the P300 Speller component of an ERP (Event Related Potential), which is widely used in BCI problems; it helps in translating the intention of a subject while performing any cognitive task. EEG data contains noise due to waves generated by other activities in the brain, which contaminates the true P300 Speller signal. Use of machine learning techniques could help in detecting errors made by the P300 Speller. We solve this classification problem by partitioning data into different chunks and preparing distributed models using an Elastic CV Classifier. To present a case of optimizing distributed machine learning, we propose an intelligent user-defined data partitioning approach that can improve the average accuracy of distributed machine learners. Our results show better average AUC as compared to the average AUC obtained after applying random data partitioning, which gives the user no control over data partitioning. It improves the average accuracy of the distributed learner due to the domain-specific intelligent partitioning by the user. Our customized approach achieves 0.66 AUC on individual sessions and 0.75 AUC on mixed sessions, whereas random/uncontrolled data distribution records 0.63 AUC.
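    A sketch of the comparison the study describes, using synthetic sessions in place of the EEG set and scikit-learn logistic regression standing in for the Elastic CV classifier; per-session chunks model the domain-informed partitioning, a shuffled split models the random one:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def make_session(shift, n=300):
    # Each synthetic "session" drifts by its own offset, loosely mimicking
    # session-to-session variation in EEG recordings.
    X = rng.normal(loc=shift, size=(n, 8))
    y = (X[:, 0] - shift + rng.normal(scale=1.0, size=n) > 0).astype(int)
    return X, y

sessions = [make_session(s) for s in (0.0, 1.5, 3.0)]
X_test, y_test = make_session(1.5)

def ensemble_auc(chunks):
    # One learner per data chunk; predicted probabilities are averaged
    # (a simple stand-in for the platform's aggregation step).
    probs = [
        LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X_test)[:, 1]
        for X, y in chunks
    ]
    return roc_auc_score(y_test, np.mean(probs, axis=0))

# Domain-informed partitioning: one chunk per session.
print("per-session AUC:", round(ensemble_auc(sessions), 3))

# Random partitioning: same rows, shuffled into three arbitrary chunks.
X_all = np.vstack([X for X, _ in sessions])
y_all = np.concatenate([y for _, y in sessions])
idx = rng.permutation(len(y_all))
random_chunks = [(X_all[p], y_all[p]) for p in np.array_split(idx, 3)]
print("random AUC:     ", round(ensemble_auc(random_chunks), 3))
```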

  8. MARVIN: Distributed reasoning over large-scale Semantic Web data

    NARCIS (Netherlands)

    Oren, E.; Kotoulas, S.; Anadiotis, G.; Siebes, R.M.; ten Teije, A.C.M.; van Harmelen, F.A.H.

    2009-01-01

    Many Semantic Web problems are difficult to solve through common divide-and-conquer strategies, since they are hard to partition. We present Marvin, a parallel and distributed platform for processing large amounts of RDF data, on a network of loosely coupled peers. We present our divide-conquer-swap

  9. Large-scale distribution of tritium in a commercial product

    International Nuclear Information System (INIS)

    Combs, F.; Doda, R.J.

    1979-01-01

    Tritium enters the environment from various sources, including nuclear reactor operations, weapons testing, natural production, and the manufacture, use and ultimate disposal of commercial products containing tritium. A recent commercial application of tritium in the United States of America involves the backlighting of liquid crystal displays (LCD) in digital electronic watches. These watches are distributed through normal commercial channels to the general public. One million curies (MCi) of tritium were distributed in 1977 in this product. This is a significant quantity of tritium compared with power reactor-produced tritium (3 MCi yearly) or with naturally produced tritium (6 MCi yearly). This is the single largest commercial application involving tritium to date. The final disposition of tritium from large quantities of this product, after its useful life, must be estimated by considering the means of disposal and the possibility of dispersal of tritium concurrent with disposal. The most likely method of final disposition of this product will be disposal in solid refuse; this includes burial in landfills and incineration. Burial in landfills will probably contain the tritium for its effective lifetime, whereas incineration will release all the tritium gas (as the oxide) to the atmosphere. The use and disposal of this product will be studied as part of an environmental study that is at present being prepared for the U.S. Nuclear Regulatory Commission. (author)

  10. Hierarchical fiber-optic-based sensing system: impact damage monitoring of large-scale CFRP structures

    International Nuclear Information System (INIS)

    Minakuchi, Shu; Banshoya, Hidehiko; Takeda, Nobuo; Tsukamoto, Haruka

    2011-01-01

    This study proposes a novel fiber-optic-based hierarchical sensing concept for monitoring randomly induced damage in large-scale composite structures. In a hierarchical system, several kinds of specialized devices are hierarchically combined to form a sensing network. Specifically, numerous three-dimensionally structured sensor devices are distributed throughout the whole structural area and connected with an optical fiber network through transducing mechanisms. The distributed devices detect damage, and the fiber-optic network gathers the damage signals and transmits the information to a measuring instrument. This study began by discussing the basic concept of a hierarchical sensing system through comparison with existing fiber-optic-based systems, and an impact damage detection system was then proposed to validate the new concept. The sensor devices were developed based on comparative vacuum monitoring (CVM), and Brillouin-based distributed strain measurement was utilized to identify damaged areas. Verification tests were conducted step-by-step, beginning with a basic test using a single sensor unit, and, finally, the proposed monitoring system was successfully verified using a carbon fiber reinforced plastic (CFRP) fuselage demonstrator. It was clearly confirmed that the hierarchical system has better repairability, higher robustness, and a wider monitorable area compared to existing systems.

  11. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  12. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    [Only fragments of the abstract survive extraction:] The disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord ... configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking ...

  13. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    Science.gov (United States)

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  14. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    The geotechnical challenges for safe slope design in large-scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large-scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to operate such pits safely. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  15. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, has led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface, and a web-interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potential on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit both users with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and comprehensive documentation are freely available.
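    Watchdog itself is Java-based with XML workflow definitions and pre-defined modules; the following Python sketch only illustrates the generic WMS pattern of executing mutually dependent steps in topological order, with placeholder task names borrowed from a typical RNA-seq workflow:

```python
import graphlib  # standard library topological sorter (Python 3.9+)

def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> set of prerequisite names."""
    for name in graphlib.TopologicalSorter(deps).static_order():
        print("running", name)
        tasks[name]()  # a real WMS would dispatch this to a compute node

# Placeholder steps loosely following an RNA-seq analysis (hypothetical names).
tasks = {name: (lambda: None) for name in
         ("trim_reads", "align", "count", "diff_expr")}
deps = {
    "align": {"trim_reads"},
    "count": {"align"},
    "diff_expr": {"count"},
}
run_workflow(tasks, deps)  # prints the four steps in dependency order
```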

  16. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such systems can improve a network's emergency response capabilities, mitigate the damage caused by cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of a large-scale network system is realized based on the divide-and-conquer approach. First, the topology of the large-scale network is divided into a number of small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology using an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
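    A hedged sketch of the divide-and-conquer layout idea, assuming the networkx library is available: a naive partition stands in for the MLkP/CR algorithm, each sub-network gets its own force-directed layout, and the parts are then placed on a coarse grid in place of the paper's force-analysis distribution step:

```python
import networkx as nx

def divide_and_conquer_layout(G, n_parts=4):
    nodes = list(G.nodes)
    parts = [nodes[i::n_parts] for i in range(n_parts)]  # naive partition
    pos = {}
    for k, part in enumerate(parts):
        sub = G.subgraph(part)
        sub_pos = nx.spring_layout(sub, seed=42)  # small-scale force layout
        ox, oy = divmod(k, 2)  # coarse grid placement of the sub-layouts
        for node, (x, y) in sub_pos.items():
            pos[node] = (x + 3 * ox, y + 3 * oy)
    return pos

G = nx.random_geometric_graph(80, 0.2, seed=1)
positions = divide_and_conquer_layout(G)
print(len(positions), "nodes placed")  # -> 80 nodes placed
```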

  17. Local, distributed topology control for large-scale wireless ad-hoc networks

    NARCIS (Netherlands)

    Nieberg, T.; Hurink, Johann L.

    In this document, topology control of a large-scale, wireless network by a distributed algorithm that uses only locally available information is presented. Topology control algorithms adjust the transmission power of wireless nodes to create a desired topology. The algorithm, named local power

  18. Aggregated Representation of Distribution Networks for Large-Scale Transmission Network Simulations

    DEFF Research Database (Denmark)

    Göksu, Ömer; Altin, Müfit; Sørensen, Poul Ejnar

    2014-01-01

    As a common practice in large-scale transmission network analysis, distribution networks have been represented as aggregated loads. However, with the increasing share of distributed generation, especially wind and solar power, in distribution networks, it has become necessary to include the distributed generation within those analyses. In this paper a practical methodology to obtain the aggregated behaviour of the distributed generation is proposed. The methodology, which is based on the use of the IEC standard wind turbine models, is applied to a benchmark distribution network via simulations.
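    In its simplest form, the aggregation replaces all generators on a feeder with one equivalent unit whose output is their sum, so the transmission-side bus sees only a net load. A minimal illustration with invented power values (the paper's methodology additionally captures dynamic behaviour via the IEC standard wind turbine models):

```python
def aggregate_feeder(load_mw, wind_mw, solar_mw):
    """Return (net_load_mw, total_dg_mw) as seen from the transmission side."""
    total_dg = sum(wind_mw) + sum(solar_mw)
    return load_mw - total_dg, total_dg

net, dg = aggregate_feeder(load_mw=42.0,               # hypothetical feeder
                           wind_mw=[3.1, 2.4, 4.0],
                           solar_mw=[1.2, 0.8])
print(f"equivalent DG: {dg:.1f} MW, net load at bus: {net:.1f} MW")
# -> equivalent DG: 11.5 MW, net load at bus: 30.5 MW
```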

  19. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  20. On a digital wireless impact-monitoring network for large-scale composite structures

    International Nuclear Information System (INIS)

    Yuan, Shenfang; Mei, Hanfei; Qiu, Lei; Ren, Yuanqiang

    2014-01-01

    Impact, which may occur during manufacture, service or maintenance, is one of the major concerns to be monitored throughout the lifetime of aircraft composite structures. Aiming at monitoring impacts online while minimizing the weight added to the aircraft to meet the strict limitations of aerospace engineering, this paper puts forward a new digital wireless network based on miniaturized wireless digital impact-monitoring nodes developed for large-scale composite structures. In addition to investigations on the design methods of the network architecture, time synchronization and implementation method, a conflict resolution method based on the feature parameters of digital sequences is first presented to address impact localization conflicts when several nodes are arranged close together. To verify the feasibility and stability of the wireless network, experiments are performed on a complex aircraft composite wing box and an unmanned aerial vehicle (UAV) composite wing. Experimental results show the successful design of the presented network. (paper)

  1. Large-Scale Ichthyoplankton and Water Mass Distribution along the South Brazil Shelf

    Science.gov (United States)

    de Macedo-Soares, Luis Carlos Pinto; Garcia, Carlos Alberto Eiras; Freire, Andrea Santarosa; Muelbert, José Henrique

    2014-01-01

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27′ and 34°51′S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients. PMID:24614798

  2. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    Directory of Open Access Journals (Sweden)

    Ezequiel M Marzinelli

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp cover at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° latitude apart along the east and west coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. The maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. This extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves.

  3. Large-scale ichthyoplankton and water mass distribution along the South Brazil Shelf.

    Directory of Open Access Journals (Sweden)

    Luis Carlos Pinto de Macedo-Soares

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27' and 34°51'S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients.

  4. Large-scale ichthyoplankton and water mass distribution along the South Brazil Shelf.

    Science.gov (United States)

    de Macedo-Soares, Luis Carlos Pinto; Garcia, Carlos Alberto Eiras; Freire, Andrea Santarosa; Muelbert, José Henrique

    2014-01-01

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27' and 34°51'S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients.

  5. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet, and to use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
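
    The platform itself is JavaScript-based; purely as an illustration of the queue idea described above (a relational database handing tasks to volunteer nodes and collecting results), here is a minimal Python/sqlite3 sketch with hypothetical table and column names.

    ```python
    # Sketch of a relational work queue for volunteer computing: tasks are
    # claimed atomically, results written back. Schema names are invented.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE tasks (
        id INTEGER PRIMARY KEY, payload TEXT,
        status TEXT DEFAULT 'pending', result TEXT)""")
    db.executemany("INSERT INTO tasks (payload) VALUES (?)",
                   [(f"subbasin-{i}",) for i in range(5)])

    def claim_task(conn):
        """Hand one pending task to a volunteer node inside a transaction."""
        with conn:
            row = conn.execute("SELECT id, payload FROM tasks "
                               "WHERE status = 'pending' LIMIT 1").fetchone()
            if row is None:
                return None
            conn.execute("UPDATE tasks SET status = 'running' WHERE id = ?",
                         (row[0],))
        return row

    def submit_result(conn, task_id, result):
        with conn:
            conn.execute("UPDATE tasks SET status = 'done', result = ? "
                         "WHERE id = ?", (result, task_id))

    task = claim_task(db)
    submit_result(db, task[0], "runoff=42.0")
    print(db.execute("SELECT * FROM tasks WHERE id = ?", (task[0],)).fetchone())
    ```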

  6. Mapping the distribution of the denitrifier community at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Bru, D.; Ramette, A.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31 500 km2 region of France, using a 16 × 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 740 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of denitrifiers at large scales can help close the artificial gap between the investigation of microbial processes and microbial community ecology, thereby facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.
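
    As a sketch of the geostatistical ingredient mentioned above, the following code computes an empirical semivariogram, the standard first step before kriging-style predictive mapping; the site coordinates and values are synthetic, not the Burgundy measurements.

    ```python
    # Empirical semivariogram gamma(h) = 0.5 * mean squared difference of
    # values for site pairs separated by distance h. Synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 180, size=(107, 2))                # site coords (km)
    z = np.sin(xy[:, 0] / 40.0) + rng.normal(0, 0.3, 107)  # spatially structured

    def semivariogram(xy, z, bins):
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        sq = 0.5 * (z[:, None] - z[None, :]) ** 2
        iu = np.triu_indices(len(z), k=1)                  # count each pair once
        d, sq = d[iu], sq[iu]
        idx = np.digitize(d, bins)
        return np.array([sq[idx == i].mean() for i in range(1, len(bins))])

    bins = np.linspace(0.0, 180.0, 10)
    gamma = semivariogram(xy, z, bins)
    print(gamma.round(3))  # gamma(h) rising to a sill indicates autocorrelation
    ```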

  7. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Science.gov (United States)

    Steiakakis, Chrysanthos; Agioutantis, Zacharias; Apostolou, Evangelia; Papavgeri, Georgia; Tripolitsiotis, Achilles

    2016-01-01

    The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden-to-ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions. This paper presents an integrated data management system which was developed over a number of years, and illustrates its advantages through a specific application. The presented case study shows how the high production slopes of a mine that exceed depths of 100-120 m were successfully mined with an average displacement rate of 10-20 mm/day, approaching a slow to moderate landslide velocity. Monitoring data of the past four years are included in the database and can be analyzed to produce valuable results. Time-series correlations of movements, precipitation records, etc. are evaluated and presented in this case study. The results can be used to successfully manage mine operations and ensure the safety of the mine and the workforce.
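
    A minimal example of the kind of time-series correlation the paper describes, here a lagged Pearson correlation between daily precipitation and slope displacement rate; the series are synthetic stand-ins for the mine's records.

    ```python
    # Lagged correlation between rainfall and displacement rate: which lag
    # (in days) best explains the movement response? Synthetic series.
    import numpy as np

    rng = np.random.default_rng(1)
    rain = rng.gamma(shape=0.5, scale=8.0, size=365)            # mm/day
    disp = 15 + 0.4 * np.convolve(rain, [0.1, 0.3, 0.6], "same") \
              + rng.normal(0, 1, 365)                           # mm/day

    def lagged_corr(x, y, max_lag):
        """Pearson correlation of y against x shifted by 0..max_lag days."""
        out = []
        for lag in range(max_lag + 1):
            xs, ys = x[: len(x) - lag], y[lag:]
            out.append(np.corrcoef(xs, ys)[0, 1])
        return np.array(out)

    r = lagged_corr(rain, disp, max_lag=10)
    print("best lag:", r.argmax(), "days, r =", r.max().round(2))
    ```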

  8. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogeneous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  9. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    Science.gov (United States)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to reconcile the locations of these deposits. Although these datasets disagree on how deposits are distributed on small scales, we show that most of them do agree on the large scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large scale deposit that has since been rendered locally heterogeneous by subsequent impacts. Furthermore, it shows that water ice may be available down to ±70°, latitudes that are more accessible than the poles for landing.

  10. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems

    OpenAIRE

    Abadi, Martín; Agarwal, Ashish; Barham, Paul; Brevdo, Eugene; Chen, Zhifeng; Citro, Craig; Corrado, Greg S.; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Goodfellow, Ian; Harp, Andrew; Irving, Geoffrey; Isard, Michael

    2016-01-01

    TensorFlow is an interface for expressing machine learning algorithms, and an implementation for executing such algorithms. A computation expressed using TensorFlow can be executed with little or no change on a wide variety of heterogeneous systems, ranging from mobile devices such as phones and tablets up to large-scale distributed systems of hundreds of machines and thousands of computational devices such as GPU cards. The system is flexible and can be used to express a wide variety of algo...
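
    A tiny TensorFlow (2.x) illustration of the portability claim made above: the same computation runs unchanged across devices, with the device string being the only thing one would swap between hosts.

    ```python
    # The same graph code executes on CPU, GPU, or TPU; only the device
    # placement changes. Minimal eager-mode example.
    import tensorflow as tf

    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[1.0], [1.0]])

    with tf.device("/CPU:0"):        # swap for "/GPU:0" on a GPU host
        y = tf.matmul(a, b)

    print(y.numpy())                 # [[3.], [7.]]
    ```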

  11. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Directory of Open Access Journals (Sweden)

    Frédéric Boivin

    Full Text Available Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs in respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals having expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; 2) a focus on species related to ecosystem services, on threatened species, and on invasive species. The only demographic characteristic that was related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  12. Can wide consultation help with setting priorities for large-scale biodiversity monitoring programs?

    Science.gov (United States)

    Boivin, Frédéric; Simard, Anouk; Peres-Neto, Pedro

    2014-01-01

    Climate and other global change phenomena affecting biodiversity require monitoring to track ecosystem changes and guide policy and management actions. Designing a biodiversity monitoring program is a difficult task that requires making decisions that often lack consensus due to budgetary constraints. As monitoring programs require long-term investment, they also require strong and continuing support from all interested parties. As such, stakeholder consultation is key to identifying priorities and making sound design decisions that have as much support as possible. Here, we present the results of a consultation conducted to serve as an aid for designing a large-scale biodiversity monitoring program for the province of Québec (Canada). The consultation took the form of a survey with 13 discrete choices involving tradeoffs in respect to design priorities and 10 demographic questions (e.g., age, profession). The survey was sent to thousands of individuals having expected interests and knowledge about biodiversity and was completed by 621 participants. Overall, consensuses were few and it appeared difficult to create a design fulfilling the priorities of the majority. Most participants wanted 1) a monitoring design covering the entire territory and focusing on natural habitats; 2) a focus on species related to ecosystem services, on threatened species, and on invasive species. The only demographic characteristic that was related to the type of prioritization was the declared level of knowledge in biodiversity (null to high), but even then the influence was quite small.

  13. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, and their performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression and gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the Tennessee Eastman (TE) benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
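
    As a hedged sketch of the static case described above (not the paper's exact test statistic), one can regress the KPI on process variables from fault-free data and flag samples whose residual exceeds a training-derived limit.

    ```python
    # Least-squares KPI monitoring sketch: fit on fault-free data, alarm on
    # residuals beyond a 3-sigma limit. All data are synthetic.
    import numpy as np

    rng = np.random.default_rng(2)
    X_train = rng.normal(size=(500, 4))                # process variables
    theta_true = np.array([1.0, -2.0, 0.5, 0.0])
    y_train = X_train @ theta_true + rng.normal(0, 0.1, 500)   # KPI

    theta, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
    resid = y_train - X_train @ theta
    threshold = 3.0 * resid.std()                      # simple 3-sigma limit

    X_new = rng.normal(size=(5, 4))
    y_new = X_new @ theta_true + 1.5                   # additive KPI fault
    alarms = np.abs(y_new - X_new @ theta) > threshold
    print(alarms)              # all True: the offset is detected
    ```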

  14. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    Science.gov (United States)

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach, with a speedup that scales quadratically with the number of partitions. D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds that of CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  15. Distributed weighted least-squares estimation with fast convergence for large-scale systems.

    Science.gov (United States)

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performance of the proposed methods.
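
    The following numpy sketch illustrates the gradient-type iteration behind such schemes: each sub-system owns one block of the measurement model and contributes a local correction term, and the step size is chosen from the spectrum to guarantee convergence. The loop is written centrally here, the problem data are synthetic, and the paper's scaling and preconditioning details are omitted.

    ```python
    # Iterative weighted least squares: x <- x + alpha * A^T W (b - A x).
    # Each sub-system i would compute its own term A_i^T w_i (b_i - A_i x).
    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.normal(size=(12, 4))             # 12 scalar measurements, 4 params
    x_true = rng.normal(size=4)
    W = np.diag(rng.uniform(0.5, 2.0, 12))   # measurement weights
    b = A @ x_true + rng.normal(0, 0.05, 12)

    H = A.T @ W @ A
    alpha = 1.0 / np.linalg.eigvalsh(H).max()   # step size ensuring convergence

    x = np.zeros(4)
    for _ in range(2000):
        x = x + alpha * A.T @ W @ (b - A @ x)   # sum of local contributions

    x_opt = np.linalg.solve(H, A.T @ W @ b)     # centralized optimum
    print(np.abs(x - x_opt).max())              # ~0: iterates converge
    ```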

  16. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirement of the files, providing load-balancing, and reducing delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.

  17. Large Scale Triboelectric Nanogenerator and Self-Powered Flexible Sensor for Human Sleep Monitoring

    Directory of Open Access Journals (Sweden)

    Xiaoheng Ding

    2018-05-01

    Full Text Available The triboelectric nanogenerator (TENG) and its application as a sensor is a popular research subject. There is demand for self-powered, flexible sensors with high sensitivity and high power output for the next generation of consumer electronics. In this study, a 300 mm × 300 mm carbon nanotube (CNT)-doped porous PDMS film was successfully fabricated, wherein the CNT influenced the micropore structure. A self-powered TENG tactile sensor was established according to triboelectric theory. The CNT-doped porous TENG showed a voltage output seven times higher than that of the undoped porous TENG and 16 times higher than that of the TENG with pure PDMS. The TENG successfully acquired human motion signals, breath signals, and heartbeat signals during a sleep monitoring experiment. The results presented here may provide an effective approach for fabricating large-scale and low-cost flexible TENG sensors.

  18. Development of solution monitoring software for enhanced safeguards at a large scale reprocessing facility

    Energy Technology Data Exchange (ETDEWEB)

    Van Handenhove, Carl; Breban, Domnica; Creusot, Christophe [International Atomic Energy Agency, Vienna (Austria); Dransart, Pascal; Dechamp, Luc [Joint Research Centre, European Commission, Ispra, Varese, (Italy); Jarde, Eric [Euriware, Equeurdreville (France)

    2011-12-15

    The implementation of an effective and efficient IAEA safeguards approach at large scale reprocessing facilities with large throughput and continuous flow of nuclear material requires the introduction of enhanced safeguards measures to provide added assurance about the absence of diversion of nuclear material and confirmation that the facility is operated as declared. One of the enhanced safeguards measures, a Solution Monitoring and Measurement System (SMMS), comprising data collection instruments, data transmission equipment and an advanced Solution Monitoring Software (SMS), is being implemented at a large scale reprocessing plant in Japan. SMS is designed as a tool to enable automatic calculations of volumes, densities and flow-rates in selected process vessels, including most of the vessels of the main nuclear material stream. This software also includes automatic features to support the inspectorate in verifying inventories and inventory changes. The software also enables one to analyze the flows of nuclear material within the process and over specified 'cycles' of operation, and to compare these with expected reference signatures, in order to provide assurance that the facility is being operated as declared. The configuration and parameterization work (especially the analytical and comparative work) for the implementation and configuration of the SMS has been carried out jointly by the IAEA, Euriware-France (the software developer) and the Joint Research Centre (JRC)-Ispra. This paper describes the main features of the SMS, including the principles underlying the automatic analysis functionalities. It then focuses on the collaborative work performed by the JRC-Ispra, Euriware and the IAEA for the parameterization of the software (vessels and cycles of operation), including the current status and future challenges.

  19. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention to networked control, system decomposition and distributed models show significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by the affinity propagation clustering algorithm; each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and the subsystem's controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
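
    A rough sketch of the two stages named above, using scikit-learn's affinity propagation for the output clustering and canonical correlation analysis for input screening; the data, and the use of a correlation-based precomputed affinity, are illustrative assumptions rather than the paper's exact procedure.

    ```python
    # Stage 1: cluster correlated controlled variables into subsystems.
    # Stage 2: rank candidate inputs for each subsystem by CCA score.
    import numpy as np
    from sklearn.cluster import AffinityPropagation
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(4)
    T = 300
    u = rng.normal(size=(T, 6))                    # candidate input variables
    y = np.column_stack([                          # controlled variables
        u[:, 0] + u[:, 1],
        u[:, 0] + u[:, 1] + rng.normal(0, 0.2, T),
        u[:, 4] - u[:, 5],
        u[:, 4] - u[:, 5] + rng.normal(0, 0.2, T),
    ])

    S = np.abs(np.corrcoef(y.T))                   # output-output similarity
    labels = AffinityPropagation(affinity="precomputed",
                                 random_state=0).fit_predict(S)

    for k in sorted(set(labels)):
        Yk = y[:, labels == k]
        scores = []
        for j in range(u.shape[1]):                # one candidate at a time
            cca = CCA(n_components=1).fit(u[:, [j]], Yk)
            xs, ys = cca.transform(u[:, [j]], Yk)
            scores.append(abs(np.corrcoef(xs[:, 0], ys[:, 0])[0, 1]))
        print("subsystem", k, "-> best inputs:",
              list(np.argsort(scores)[::-1][:2]))
    ```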

  20. Scalable and Fully Distributed Localization in Large-Scale Sensor Networks

    Directory of Open Access Journals (Sweden)

    Miao Jin

    2017-06-01

    Full Text Available This work proposes a novel connectivity-based localization algorithm, well suited for large-scale sensor networks with complex shapes and a non-uniform nodal distribution. In contrast to current state-of-the-art connectivity-based localization methods, the proposed algorithm is highly scalable, with linear computation and communication costs with respect to the size of the network, and fully distributed, in that each node only needs the information of its neighbors, without a cumbersome partitioning and merging process. The algorithm is theoretically guaranteed and numerically stable. Moreover, the algorithm can be readily extended to the localization of networks with one-hop transmission-range distance measurements, and the propagation of the measurement error at one sensor node is confined to a small area of the network around the node. Extensive simulations and comparisons with other methods under various representative network settings are carried out, showing the superior performance of the proposed algorithm.
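
    For orientation, here is what a basic connectivity-only localization looks like in the classical MDS-MAP style, with hop counts standing in for distances; this is a baseline illustrating the problem setting, not the paper's algorithm, which is distributed and far more scalable.

    ```python
    # MDS-MAP style sketch: hop counts approximate pairwise distances,
    # classical multidimensional scaling recovers a relative layout.
    import numpy as np
    from scipy.sparse.csgraph import shortest_path

    rng = np.random.default_rng(5)
    pts = rng.uniform(0, 10, size=(60, 2))            # true node positions
    r = 2.0                                           # radio range
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    adj = (d > 0) & (d < r)                           # connectivity graph

    hops = shortest_path(adj.astype(float), unweighted=True)
    assert np.isfinite(hops).all(), "graph must be connected"
    D = hops * r * 0.8                 # crude hop-to-distance calibration

    # classical MDS: double-center squared distances, top-2 eigenvectors
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    w, v = np.linalg.eigh(B)
    coords = v[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))
    print(coords[:3].round(2))         # relative map, up to rotation/shift
    ```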

  1. New Distributed Multipole Methods for Accurate Electrostatics for Large-Scale Biomolecular Simulations

    Science.gov (United States)

    Sagui, Celeste

    2006-03-01

    An accurate and numerically efficient treatment of electrostatics is essential for biomolecular simulations, as this stabilizes much of the delicate 3-d structure associated with biomolecules. Currently, force fields such as AMBER and CHARMM assign ``partial charges'' to every atom in a simulation in order to model the interatomic electrostatic forces, so that the calculation of the electrostatics rapidly becomes the computational bottleneck in large-scale simulations. There are two main issues associated with the current treatment of classical electrostatics: (i) how does one eliminate the artifacts associated with the point charges (e.g., the underdetermined nature of the current RESP fitting procedure for large, flexible molecules) used in the force fields in a physically meaningful way? (ii) how does one efficiently simulate the very costly long-range electrostatic interactions? Recently, we have dealt with both of these challenges as follows. In order to improve the description of the molecular electrostatic potentials (MEPs), a new distributed multipole analysis based on localized functions -- Wannier, Boys, and Edmiston-Ruedenberg -- was introduced, which allows for a first-principles calculation of the partial charges and multipoles. Through a suitable generalization of the particle mesh Ewald (PME) and multigrid methods, one can treat electrostatic multipoles all the way to hexadecapoles without prohibitive extra costs. The importance of these methods for large-scale simulations will be discussed, and exemplified by simulations of polarizable DNA models.

  2. Voltage stability issues in a distribution grid with large scale PV plant

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Alvaro Ruiz; Marinopoulos, Antonios; Reza, Muhamad; Srivastava, Kailash [ABB AB, Vaesteraas (Sweden). Corporate Research Center; Hertem, Dirk van [Katholieke Univ. Leuven, Heverlee (Belgium). ESAT-ELECTA

    2011-07-01

    Solar photovoltaics (PV) has become a competitive renewable energy source. The production of solar PV cells and panels has increased significantly, while costs have been reduced due to economies of scale and technological achievements in the field. At the same time, the increasing efficiency of PV power systems and high energy prices are expected to bring PV systems to grid parity in the coming decade. This is expected to further boost the large scale implementation of PV power plants (utility scale PV), and therefore the impact of such large scale PV plants on the power system needs to be studied. This paper investigates the voltage stability issues arising from the connection of a large PV power plant to the power grid. For this purpose, a 15 MW PV power plant connected to a distribution grid was modeled and simulated using DIgSILENT PowerFactory. Two scenarios were developed: in the first scenario, the active power injected into the grid by the PV power plant was varied and the resulting U-Q curve was analyzed. In the second scenario, the impact of connecting the PV power plant to different points in the grid, resulting in different connection strengths, was investigated. (orig.)
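
    The first scenario can be mimicked on a two-bus feeder, where the receiving-end voltage follows the standard quartic V^4 + (2(RP+XQ) - E^2)V^2 + (R^2+X^2)(P^2+Q^2) = 0; the sketch below sweeps the PV injection with illustrative parameters, not those of the studied network.

    ```python
    # Two-bus feeder voltage vs. PV injection: solve the quadratic in V^2
    # and keep the stable (high-voltage) root. Parameters are made up.
    import numpy as np

    E = 1.00            # sending-end voltage (p.u.)
    R, X = 0.02, 0.08   # feeder impedance (p.u.)

    def bus_voltage(P, Q):
        """P, Q > 0 means power drawn from the grid at the receiving bus,
        so PV injection enters with negative P."""
        a = 2 * (R * P + X * Q) - E ** 2
        c = (R ** 2 + X ** 2) * (P ** 2 + Q ** 2)
        v2 = np.roots([1.0, a, c])
        v2 = v2[np.isreal(v2) & (v2.real > 0)].real
        return np.sqrt(v2.max())

    for p_inj in (0.0, 0.5, 1.0, 1.5):   # PV active power injection (p.u.)
        print(f"P_inj={p_inj:.1f}  V={bus_voltage(-p_inj, 0.0):.4f}")
    ```

    The voltage rise with increasing injection is exactly the effect that makes the connection-point strength (second scenario) matter.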

  3. Large Scale Beam-beam Simulations for the CERN LHC using Distributed Computing

    CERN Document Server

    Herr, Werner; McIntosh, E; Schmidt, F

    2006-01-01

    We report on a large scale simulation of beam-beam effects for the CERN Large Hadron Collider (LHC). The stability of particles which experience head-on and long-range beam-beam effects was investigated for different optical configurations and machine imperfections. Covering the interesting parameter space required computing resources not available at CERN. The necessary resources were available through the LHC@home project, based on the BOINC platform. At present, this project makes more than 60000 hosts available for distributed computing. We shall discuss our experience using this system during a simulation campaign of more than six months and describe the tools and procedures necessary to ensure consistent results. The results from this extended study are presented and future plans are discussed.

  4. LARGE SCALE DISTRIBUTED PARAMETER MODEL OF MAIN MAGNET SYSTEM AND FREQUENCY DECOMPOSITION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    ZHANG,W.; MARNERIS, I.; SANDBERG, J.

    2007-06-25

    A large accelerator main magnet system consists of hundreds, even thousands, of dipole magnets. They are linked together under selected configurations to provide highly uniform dipole fields when powered. Distributed capacitance, insulation resistance, coil resistance, magnet inductance, and coupling inductance of upper and lower pancakes make each magnet a complex network. When all dipole magnets are chained together in a circle, they become a coupled pair of very high order complex ladder networks. In this study, a network of more than a thousand inductive, capacitive or resistive elements is used to model an actual system. The circuit is a large-scale network whose equivalent polynomial form is of several hundred degrees. Analysis of this high order circuit and simulation of the response of any or all components is often computationally infeasible. We present methods that use a frequency decomposition approach to effectively simulate and analyze the magnet configuration and power supply topologies.
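
    One standard way to realize such a frequency-domain analysis is to cascade per-cell two-port (ABCD) matrices and sweep frequency; the sketch below does this for a chain of identical series R-L / shunt C cells with made-up element values, not the actual magnet parameters.

    ```python
    # Input impedance of an N-cell R-L / C ladder via cascaded ABCD
    # matrices, swept over frequency. Element values are illustrative.
    import numpy as np

    L, Rs = 0.05, 0.01        # magnet inductance (H), coil resistance (ohm)
    Cg = 2e-8                 # distributed capacitance to ground (F)
    N = 100                   # number of magnets in the chain
    ZL = 1.0                  # termination resistance (ohm)

    freqs = np.logspace(1, 5, 200)
    zin = []
    for f in freqs:
        s = 2j * np.pi * f
        series = np.array([[1, Rs + s * L], [0, 1]])   # series R-L element
        shunt = np.array([[1, 0], [s * Cg, 1]])        # shunt C element
        M = np.linalg.matrix_power(series @ shunt, N)  # N identical cells
        A, B, C, D = M.ravel()
        zin.append((A * ZL + B) / (C * ZL + D))        # input impedance

    zin = np.array(zin)
    k = np.argmax(np.abs(zin))
    print(f"resonance near {freqs[k]:.0f} Hz, |Zin| = {abs(zin[k]):.1f} ohm")
    ```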

  5. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performance of the energy systems and to balance power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables.
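
    A toy version of the economic dispatch at the heart of EMPC can be written as a linear program over the prediction horizon; the sketch below uses scipy's linprog with invented costs, capacities and demand forecasts, ignoring the dynamics and uncertainty the thesis treats.

    ```python
    # One EMPC-style step: pick the cheapest feasible production schedule
    # that balances forecast demand in every period. All numbers invented.
    import numpy as np
    from scipy.optimize import linprog

    horizon = 4
    demand = np.array([5.0, 7.0, 6.0, 8.0])       # forecast load per period
    cost = np.array([10.0, 30.0])                 # per-unit cost of two units
    cap = np.array([6.0, 10.0])                   # unit capacities

    # decision vector: production of each unit in each period
    c = np.tile(cost, horizon)
    A_eq = np.zeros((horizon, 2 * horizon))
    for t in range(horizon):
        A_eq[t, 2 * t:2 * t + 2] = 1.0            # power balance per period
    bounds = [(0, cap[i % 2]) for i in range(2 * horizon)]

    res = linprog(c, A_eq=A_eq, b_eq=demand, bounds=bounds)
    print(res.x.reshape(horizon, 2))   # cheap unit runs at capacity first
    ```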

  6. Rucio - The next generation large scale distributed system for ATLAS Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Lassnig, M; Barisits, M; Vigne, R; Serfon, C; Stewart, G A; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address the ATLAS experiment scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 150 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on new technologies to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads.

  7. Use of large-scale acoustic monitoring to assess anthropogenic pressures on Orthoptera communities.

    Science.gov (United States)

    Penone, Caterina; Le Viol, Isabelle; Pellissier, Vincent; Julien, Jean-François; Bas, Yves; Kerbiriou, Christian

    2013-10-01

    Biodiversity monitoring at large spatial and temporal scales is greatly needed in the context of global changes. Although insects are a species-rich group and are important for ecosystem functioning, they have been largely neglected in conservation studies and policies, mainly due to technical and methodological constraints. Sound detection, a nondestructive method, is easily applied within a citizen-science framework and could be an interesting solution for insect monitoring. However, it has not yet been tested at a large scale. We assessed the value of a citizen-science program in which Orthoptera species (Tettigoniidae) were monitored acoustically along roads. We used Bayesian model-averaging analyses to test whether we could detect widely known patterns of anthropogenic effects on insects, such as the negative effects of urbanization or intensive agriculture on Orthoptera populations and communities. We also examined site-abundance correlations between years and estimated the biases in species detection to evaluate and improve the protocol. Urbanization and intensive agricultural landscapes negatively affected Orthoptera species richness, diversity, and abundance. This finding is consistent with results of previous studies of Orthoptera, vertebrates, carabids, and butterflies. The average mass of communities decreased as urbanization increased. The dispersal ability of communities increased as the percentage of agricultural land and, to a lesser extent, urban area increased. Despite changes in abundances over time, we found significant correlations between yearly abundances. We identified biases linked to the protocol (e.g., car speed or temperature) that can easily be accounted for in analyses. We argue that acoustic monitoring of Orthoptera along roads offers several advantages for assessing Orthoptera biodiversity at large spatial and temporal extents, particularly in a citizen-science framework. © 2013 Society for Conservation Biology.

  8. Top-k aggregation queries in large-scale distributed systems

    OpenAIRE

    Michel, Sebastian

    2007-01-01

    Distributed top-k query processing has recently become an essential functionality in a large number of emerging application classes, like Internet traffic monitoring and Peer-to-Peer Web search. This work addresses efficient algorithms for distributed top-k queries in wide-area networks where the index lists for the attribute values (or text terms) of a query are distributed across a number of data peers. More precisely, in this thesis, we make the following contributions: We present the fa...
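
    For context, the classical centralized baseline for such queries is Fagin's threshold algorithm (TA), sketched below on toy index lists; the thesis's distributed algorithms improve on this style of processing in wide-area settings.

    ```python
    # Threshold algorithm sketch: scan sorted per-attribute lists in
    # lockstep, complete scores by random access, stop when the k-th best
    # seen score beats the frontier bound. Index lists are toy data.
    import heapq

    lists = {                       # per-attribute lists, sorted by score desc
        "t1": [("d1", 0.9), ("d2", 0.8), ("d3", 0.1)],
        "t2": [("d2", 0.7), ("d3", 0.6), ("d1", 0.2)],
    }
    lookup = {a: dict(l) for a, l in lists.items()}   # random access

    def top_k(lists, lookup, k):
        seen, depth = {}, 0
        while True:
            frontier = 0.0
            for attr, lst in lists.items():
                if depth < len(lst):
                    doc, s = lst[depth]
                    frontier += s                     # threshold component
                    if doc not in seen:               # complete the score
                        seen[doc] = sum(lookup[a].get(doc, 0.0)
                                        for a in lists)
            best = heapq.nlargest(k, seen.items(), key=lambda kv: kv[1])
            if len(best) == k and best[-1][1] >= frontier:
                return best                           # k-th score beats bound
            depth += 1

    print(top_k(lists, lookup, k=2))   # [('d2', 1.5), ('d1', 1.1)]
    ```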

  9. Spatially-explicit estimation of geographical representation in large-scale species distribution datasets.

    Science.gov (United States)

    Kalwij, Jesse M; Robertson, Mark P; Ronk, Argo; Zobel, Martin; Pärtel, Meelis

    2014-01-01

    Much ecological research relies on existing multispecies distribution datasets. Such datasets, however, can vary considerably in quality, extent, resolution or taxonomic coverage. We provide a framework for a spatially-explicit evaluation of geographical representation within large-scale species distribution datasets, using the comparison of an occurrence atlas with a range atlas dataset as a working example. Specifically, we compared occurrence maps for 3773 taxa from the widely-used Atlas Florae Europaeae (AFE) with digitised range maps for 2049 taxa of the lesser-known Atlas of North European Vascular Plants. We calculated the level of agreement at a 50-km spatial resolution using average latitudinal and longitudinal species range, and area of occupancy. Agreement in species distribution was calculated and mapped using Jaccard similarity index and a reduced major axis (RMA) regression analysis of species richness between the entire atlases (5221 taxa in total) and between co-occurring species (601 taxa). We found no difference in distribution ranges or in the area of occupancy frequency distribution, indicating that atlases were sufficiently overlapping for a valid comparison. The similarity index map showed high levels of agreement for central, western, and northern Europe. The RMA regression confirmed that geographical representation of AFE was low in areas with a sparse data recording history (e.g., Russia, Belarus and the Ukraine). For co-occurring species in south-eastern Europe, however, the Atlas of North European Vascular Plants showed remarkably higher richness estimations. Geographical representation of atlas data can be much more heterogeneous than often assumed. Level of agreement between datasets can be used to evaluate geographical representation within datasets. Merging atlases into a single dataset is worthwhile in spite of methodological differences, and helps to fill gaps in our knowledge of species distribution ranges. Species distribution
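
    Both headline statistics of the comparison are easy to state in code: a Jaccard index over grid-cell occupancies and a reduced major axis (RMA) regression of per-cell species richness. The sketch below uses synthetic presence/absence grids, not the atlas data.

    ```python
    # Jaccard similarity of cell occupancies plus RMA regression of
    # per-band richness between two (synthetic) atlas grids.
    import numpy as np

    rng = np.random.default_rng(6)
    atlas_a = rng.random((40, 40)) < 0.30       # presence/absence per cell
    atlas_b = atlas_a ^ (rng.random((40, 40)) < 0.05)   # mostly agreeing copy

    jaccard = (atlas_a & atlas_b).sum() / (atlas_a | atlas_b).sum()

    rich_a = atlas_a.sum(axis=1).astype(float)  # richness per latitude band
    rich_b = atlas_b.sum(axis=1).astype(float)
    r = np.corrcoef(rich_a, rich_b)[0, 1]
    slope = np.sign(r) * rich_b.std() / rich_a.std()    # RMA slope
    intercept = rich_b.mean() - slope * rich_a.mean()

    print(f"Jaccard={jaccard:.2f}, RMA slope={slope:.2f}, "
          f"intercept={intercept:.2f}")
    ```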

  10. Distributed weighted least-squares estimation with fast convergence for large-scale systems

    Science.gov (United States)

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performance of the proposed methods. PMID:25641976

  11. Galaxies distribution in the universe: large-scale statistics and structures

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    1988-01-01

    This research thesis addresses the distribution of galaxies in the Universe, and more particularly large-scale statistics and structures. Based on an assessment of the main statistical techniques in use, the author outlines the need to develop additional tools to complement correlation functions in order to characterise the distribution. She introduces a new indicator: the probability that a volume randomly placed in the distribution is empty. This allows a characterisation of void properties at the scales studied (up to 10 h⁻¹ Mpc) in the Harvard-Smithsonian Center for Astrophysics Redshift Survey, or CfA catalog. A systematic analysis of the statistical properties of different sub-samples has then been performed with respect to size and location, luminosity class, and morphological type. This analysis is then extended to different scenarios of structure formation. A programme of radial velocity measurements based on observations allows the determination of possible relationships between apparent structures. The author also presents results of the search for southern extensions of the Perseus supercluster [fr]
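
    The void indicator has a simple Monte Carlo estimator: drop random test spheres and record the fraction containing no galaxies. The sketch below does this on a synthetic Poisson catalog and checks it against the Poisson expectation exp(-n 4/3 pi r^3); clustered data would show an excess of empty spheres.

    ```python
    # Void probability function P0(r) by Monte Carlo on a mock catalog.
    import numpy as np

    rng = np.random.default_rng(7)
    box = 100.0                                   # box side (h^-1 Mpc)
    gal = rng.uniform(0, box, size=(1000, 3))     # mock (Poisson) galaxies
    n = len(gal) / box ** 3                       # mean number density

    def void_probability(r, trials=2000):
        centers = rng.uniform(r, box - r, size=(trials, 3))
        d2 = ((gal[None, :, :] - centers[:, None, :]) ** 2).sum(axis=-1)
        return (d2 > r * r).all(axis=1).mean()    # fraction of empty spheres

    for r in (1.0, 2.0, 4.0):
        expected = np.exp(-n * 4.0 / 3.0 * np.pi * r ** 3)   # Poisson P0
        print(f"r={r}: P0={void_probability(r):.3f}  Poisson={expected:.3f}")
    ```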

  12. Large Scale Behavior and Droplet Size Distributions in Crude Oil Jets and Plumes

    Science.gov (United States)

    Katz, Joseph; Murphy, David; Morra, David

    2013-11-01

    The 2010 Deepwater Horizon blowout introduced several million barrels of crude oil into the Gulf of Mexico. Injected initially as a turbulent jet containing crude oil and gas, the spill caused the formation of a subsurface plume stretching for tens of miles. The behavior of such buoyant multiphase plumes depends on several factors, such as the oil droplet and bubble size distributions, current speed, and ambient stratification. While large droplets quickly rise to the surface, fine ones together with entrained seawater form intrusion layers. Many elements of the physics of droplet formation by an immiscible turbulent jet and their resulting size distribution have not been elucidated, but are known to be significantly influenced by the addition of dispersants, which vary the Weber number by orders of magnitude. We present experimental high-speed visualizations of turbulent jets of sweet petroleum crude oil (MC 252) premixed with Corexit 9500A dispersant at various dispersant-to-oil ratios. Observations were conducted in a 0.9 m × 0.9 m × 2.5 m towing tank, where the large-scale behavior of the jet, both stationary and towed at various speeds to simulate cross-flow, has been recorded at high speed. Preliminary data on oil droplet size and spatial distributions were also measured using a videoscope and pulsed light sheet. Sponsored by Gulf of Mexico Research Initiative (GoMRI).

  13. ALOS PALSAR Winter Coherence and Summer Intensities for Large Scale Forest Monitoring in Siberia

    Science.gov (United States)

    Thiel, Christian; Thiel, Carolin; Santoro, Maurizio; Schmullius, Christiane

    2008-11-01

    In this paper summer intensity and winter coherence images are used for large scale forest monitoring. The intensities (FBD HH/HV) were acquired during summer 2007 and feature the K&C intensity stripes [1]. The processing consisted of radiometric calibration, orthorectification, and topographic normalisation. The coherence has been estimated from interferometric pairs with 46-day repeat-pass intervals. The pairs were acquired during the winters 2006/2007 and 2007/2008. During both winters suitable weather conditions were reported. Interferometric processing consisted of SLC co-registration at sub-pixel level, common-band filtering in range and azimuth, and generation of a differential interferogram, which was used in the coherence estimation procedure based on adaptive estimation. All images were geocoded using SRTM data. The pixel size of the final SAR products is 50 m x 50 m. It has already been demonstrated that forest and non-forest can be clearly separated using PALSAR intensities and winter coherence [2]. By combining both data types hardly any overlap of the class signatures was detected, even though the analysis was conducted at pixel level and no speckle filter had been applied. Thus, the delineation of a forest cover mask could be executed operationally. The major hitch is the definition of a biomass threshold above which regrowing forest is distinguished as forest.
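
    The coherence referred to above is the windowed complex cross-correlation of two co-registered SLC images. A boxcar-window numpy/scipy sketch on synthetic data follows; the operational processing described in the record uses adaptive, not boxcar, estimation.

    ```python
    # Interferometric coherence estimate over a local boxcar window:
    # gamma = |<s1 s2*>| / sqrt(<|s1|^2> <|s2|^2>). Synthetic SLC data.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def boxcar(a, size):                     # windowed mean for complex data
        return uniform_filter(a.real, size) + 1j * uniform_filter(a.imag, size)

    rng = np.random.default_rng(8)
    shape = (200, 200)
    common = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    noise = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    s1 = common
    s2 = 0.8 * common + 0.6 * noise          # partially decorrelated pass

    num = boxcar(s1 * np.conj(s2), 5)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, 5) *
                  uniform_filter(np.abs(s2) ** 2, 5))
    coh = np.abs(num) / den
    print(coh.mean().round(2))               # ~0.8 for this mixing ratio
    ```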

  14. Distributed and Cooperative Link Scheduling for Large-Scale Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Swami Ananthram

    2007-01-01

    Full Text Available A distributed and cooperative link-scheduling (DCLS) algorithm is introduced for large-scale multihop wireless networks. With this algorithm, each and every active link in the network cooperatively calibrates its environment and converges to a desired link schedule for data transmissions within a time frame of multiple slots. This schedule is such that the entire network is partitioned into a set of interleaved subnetworks, where each subnetwork consists of concurrent cochannel links that are properly separated from each other. The desired spacing in each subnetwork can be controlled by a tuning parameter and the number of time slots specified for each frame. Following the DCLS algorithm, a distributed and cooperative power control (DCPC) algorithm can be applied to each subnetwork to ensure a desired data rate for each link with minimum network transmission power. As shown consistently by simulations, the DCLS algorithm along with a DCPC algorithm yields significant power savings. The power savings also imply an increased feasible region of averaged link data rates for the entire network.

  15. An optimal beam alignment method for large-scale distributed space surveillance radar system

    Science.gov (United States)

    Huang, Jian; Wang, Dongya; Xia, Shuangzhi

    2018-06-01

    Large-scale distributed space surveillance radar is very important ground-based equipment for maintaining a complete catalogue of Low Earth Orbit (LEO) space debris. However, because the sites of the distributed radar system are separated by thousands of kilometers, optimally aligning the Transmitting/Receiving (T/R) beams over a large volume of space using narrow beams poses a special and considerable technical challenge in the space surveillance area. Based on the common coordinate transformation model and the radar beam space model, we present a two-dimensional projection algorithm for the T/R beams using direction angles, which can visually describe and assess the beam alignment performance. Subsequently, optimal mathematical models for the orientation angle of the antenna array, the site location and the T/R beam coverage are constructed, and the beam alignment parameters are precisely solved. Finally, we conducted optimal beam alignment experiments based on the site parameters of the Air Force Space Surveillance System (AFSSS). The simulation results demonstrate the correctness and effectiveness of our novel method, which can significantly support the construction of LEO space debris surveillance equipment.

  16. Distributed and Cooperative Link Scheduling for Large-Scale Multihop Wireless Networks

    Directory of Open Access Journals (Sweden)

    Ananthram Swami

    2007-12-01

    Full Text Available A distributed and cooperative link-scheduling (DCLS) algorithm is introduced for large-scale multihop wireless networks. With this algorithm, each and every active link in the network cooperatively calibrates its environment and converges to a desired link schedule for data transmissions within a time frame of multiple slots. This schedule is such that the entire network is partitioned into a set of interleaved subnetworks, where each subnetwork consists of concurrent cochannel links that are properly separated from each other. The desired spacing in each subnetwork can be controlled by a tuning parameter and the number of time slots specified for each frame. Following the DCLS algorithm, a distributed and cooperative power control (DCPC) algorithm can be applied to each subnetwork to ensure a desired data rate for each link with minimum network transmission power. As shown consistently by simulations, the DCLS algorithm along with a DCPC algorithm yields significant power savings. The power savings also imply an increased feasible region of averaged link data rates for the entire network.

  17. Direct Satellite Data Acquisition and its Application for Large-scale Monitoring Projects in Russia

    Science.gov (United States)

    Gershenzon, O.

    2011-12-01

    ScanEx RDC created an infrastructure (a ground station network) to acquire and process remote sensing data from different satellites: Terra, Aqua, Landsat, IRS-P5/P6, SPOT 4/5, FORMOSAT-2, EROS A/B, RADARSAT-1/2, ENVISAT-1. It owns image archives from these satellites as well as from SPOT-2 and CARTOSAT-2. ScanEx RDC builds and delivers remote sensing ground stations (working with up to 15 satellites) and operates its own ground station network to acquire data for Russia and the surrounding territory. ScanEx stations are the basic component of departmental remote sensing data acquisition networks for different state authorities (Roshydromet, Ministry of Natural Resources, Emercom) and of university-based remote sensing data acquisition and processing centers in Russia and abroad. ScanEx performs large-scale projects in collaboration with government agencies to monitor forests, floods, fires, sea surface pollution, and ice conditions in Northern Russia. During 2010-2011 ScanEx conducted daily monitoring of wild fires in Russia, detecting and registering thermal anomalies using data from the Terra, Aqua, Landsat and SPOT satellites. Detailed SPOT 4/5 data is used to analyze burnt areas and to assess damage caused by fire. Satellite data, along with other information about the fire situation in Russia, was updated daily and published via a free-access Internet geoportal. A few projects were conducted together with environmental NGOs. The project "Satellite monitoring of Especially Protected Natural Areas of Russia" and its results visualization on a geoportal was conducted in cooperation with the NGO "Transparent World". The project's goal was to observe natural phenomena and economic activity, including illegal activity, by means of Earth remote sensing data. Monitoring is based on multi-temporal optical space imagery of different spatial resolution. Project results include detection of anthropogenic objects that appeared in the vicinity or even within the border of natural territories, that have never been

  18. Monitoring of large-scale federated data storage: XRootD and beyond

    International Nuclear Information System (INIS)

    Andreeva, J; Beche, A; Arias, D Diguez; Giordano, D; Saiz, P; Tuckett, D; Belov, S; Oleynik, D; Petrosyan, A; Tadel, M; Vukotic, I

    2014-01-01

    The computing models of the LHC experiments are gradually moving from hierarchical data models with centrally managed data pre-placement towards federated storage, which provides seamless access to data files independently of their location and dramatically improves recovery thanks to fail-over mechanisms. Construction of the data federations and understanding the impact of the new approach to data management on user analysis require complete and detailed monitoring. Monitoring functionality should cover the status of all components of the federated storage, measure data traffic and data access performance, and be able to detect any kind of inefficiency and provide hints for resource optimization and effective data distribution policy. Data mining of the collected monitoring data provides a deep insight into new usage patterns. In the WLCG context, there are several federations currently based on the XRootD technology. This paper will focus on monitoring for the ATLAS and CMS XRootD federations implemented in the Experiment Dashboard monitoring framework. Both federations consist of many dozens of sites accessed by many hundreds of clients and they continue to grow in size. Handling of the monitoring flow generated by these systems has to be well optimized in order to achieve the required performance. Furthermore, this paper demonstrates that the XRootD monitoring architecture is sufficiently generic to be easily adapted for other technologies, such as HTTP/WebDAV dynamic federations.

  19. A theoretical bilevel control scheme for power networks with large-scale penetration of distributed renewable resources

    DEFF Research Database (Denmark)

    Boroojeni, Kianoosh; Amini, M. Hadi; Nejadpak, Arash

    2016-01-01

    In this paper, we present a bilevel control framework to achieve a highly-reliable smart distribution network with large-scale penetration of distributed renewable resources (DRRs). We assume that the power distribution network consists of several residential/commercial communities. In the first ...

  20. Rucio – The next generation of large scale distributed system for ATLAS data management

    International Nuclear Information System (INIS)

    Garonne, V; Vigne, R; Stewart, G; Barisits, M; Eermann, T B; Lassnig, M; Serfon, C; Goossens, L; Nairz, A

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and 'Big Data' computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, specifically looking at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  1. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    Science.gov (United States)

    Garonne, V.; Vigne, R.; Stewart, G.; Barisits, M.; eermann, T. B.; Lassnig, M.; Serfon, C.; Goossens, L.; Nairz, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, specifically looking at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  2. A study of residence time distribution using radiotracer technique in the large scale plant facility

    Science.gov (United States)

    Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.

    2017-06-01

    As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which can provide fast, online and effective detection of plant problems, have been continually developed. One promising application of radiotracers for troubleshooting in a process plant is the analysis of the Residence Time Distribution (RTD). In this paper, a study of the RTD in a large scale plant facility using a radiotracer technique is presented. The objective of this work is to gain experience with RTD analysis using radiotracer techniques in a “larger than laboratory” scale plant setup, comparable to a real industrial application. The experiment was carried out at the sedimentation tank in the water treatment facility of the Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected for this work due to its chemical properties, its suitable half-life and its on-site availability. NH4Br in the form of an aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank were analysed and calculated from the measured data. The experience and knowledge attained from this study are important for extending this technique to industrial facilities in the future.
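
    The headline quantity is straightforward to compute from a measured tracer response curve: MRT = integral of t·C(t) over integral of C(t). A short numpy sketch with a synthetic detector curve standing in for the NaI counts:

    ```python
    # Mean residence time and RTD variance from a tracer response curve,
    # using simple rectangle-rule integration on a uniform time grid.
    import numpy as np

    t = np.linspace(0, 120, 481)            # minutes, uniform grid
    dt = t[1] - t[0]
    C = (t / 25.0) * np.exp(-t / 25.0)      # synthetic tracer response

    E = C / (C.sum() * dt)                  # normalized RTD curve E(t)
    mrt = (t * E).sum() * dt                # mean residence time
    var = ((t - mrt) ** 2 * E).sum() * dt   # spread of the RTD
    print(f"MRT = {mrt:.1f} min, variance = {var:.0f} min^2")
    ```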

  3. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Beermann, T; Goossens, L; Lassnig, M; Nairz, A; Stewart, GA; Vigne, V; Serfon, C

    2013-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will address these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how ATLAS central group and user activities will be managed. The Rucio design, and the technology it employs, is described...

  4. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    CERN Document Server

    Garonne, V; The ATLAS collaboration; Beermann, T; Goossens, L; Lassnig, M; Nairz, A; Stewart, GA; Vigne, V; Serfon, C

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 140 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will address these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how ATLAS central group and user activities will be managed. The Rucio design, and the technology it employs, is descr...

  5. Large-scale spatial distribution patterns of gastropod assemblages in rocky shores.

    Directory of Open Access Journals (Sweden)

    Patricia Miloslavich

    Full Text Available Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal patterns of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LMEs) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs), followed by the Trochidae and the Columbellidae (6 LMEs each). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. The highest diversity was found in the Mediterranean and in the Gulf of Alaska, while the highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study, which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results should encourage further work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages.

  6. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    Science.gov (United States)

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
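
    The core of DDR is replacing word counts with a similarity in embedding space: represent both the dictionary and the text span as the average of their word vectors, then take the cosine between the two averages. A minimal sketch with toy 3-dimensional vectors (a real application would load pretrained embeddings; the vectors below are invented for illustration):

      import numpy as np

      vectors = {                      # toy vectors; assumed, not from the paper
          "kind":     np.array([0.9, 0.1, 0.0]),
          "caring":   np.array([0.8, 0.2, 0.1]),
          "helped":   np.array([0.7, 0.3, 0.2]),
          "stranger": np.array([0.1, 0.9, 0.3]),
      }

      def ddr(words):
          """Average the vectors of in-vocabulary words."""
          return np.mean([vectors[w] for w in words if w in vectors], axis=0)

      def cosine(a, b):
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      dictionary = ["kind", "caring"]            # small, high-validity dictionary
      text = ["the", "stranger", "helped"]       # span of text to score
      print(round(cosine(ddr(dictionary), ddr(text)), 2))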

  7. Lichen elemental content bioindicators for air quality in upper Midwest, USA: A model for large-scale monitoring

    Science.gov (United States)

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Our development of lichen elemental bioindicators for a United States of America (USA) national monitoring program is a useful model for other large-scale programs. Concentrations of 20 elements were measured, validated, and analyzed for 203 samples of five common lichen species. Collections were made by trained non-specialists near 75 permanent plots and an expert...

  8. Development of lichen response indexes using a regional gradient modeling approach for large-scale monitoring of forests

    Science.gov (United States)

    Susan Will-Wolf; Peter Neitlich

    2010-01-01

    Development of a regional lichen gradient model from community data is a powerful tool to derive lichen indexes of response to environmental factors for large-scale and long-term monitoring of forest ecosystems. The Forest Inventory and Analysis (FIA) Program of the U.S. Department of Agriculture Forest Service includes lichens in its national inventory of forests of...

  9. Ecological niche modeling as a new paradigm for large-scale investigations of diversity and distribution of birds

    Science.gov (United States)

    A. Townsend Peterson; Daniel A. Kluza

    2005-01-01

    Large-scale assessments of the distribution and diversity of birds have been challenged by the need for a robust methodology for summarizing or predicting species' geographic distributions (e.g. Beard et al. 1999, Manel et al. 1999, Saveraid et al. 2001). Methodologies used in such studies have at times been inappropriate, or even more frequently limited in their...

  10. Application of a CFD based containment model to different large-scale hydrogen distribution experiments

    International Nuclear Information System (INIS)

    Visser, D.C.; Siccama, N.B.; Jayaraju, S.T.; Komen, E.M.J.

    2014-01-01

    Highlights:
    • A CFD based model developed in ANSYS-FLUENT for simulating the distribution of hydrogen in the containment of a nuclear power plant during a severe accident is validated against four large-scale experiments.
    • The successive formation and mixing of a stratified gas-layer in experiments performed in the THAI and PANDA facilities are predicted well by the CFD model.
    • The pressure evolution and related condensation rate during different mixed convection flow conditions in the TOSQAN facility are predicted well by the CFD model.
    • The results give confidence in the general applicability of the CFD model and model settings.

    Abstract: In the event of core degradation during a severe accident in water-cooled nuclear power plants (NPPs), large amounts of hydrogen are generated that may be released into the reactor containment. As the hydrogen mixes with the air in the containment, it can form a flammable mixture. Upon ignition it can damage relevant safety systems and put the integrity of the containment at risk. Despite the installation of mitigation measures, it has been recognized that the temporary existence of combustible or explosive gas clouds cannot be fully excluded during certain postulated accident scenarios. The distribution of hydrogen in the containment and mitigation of the risk are, therefore, important safety issues for NPPs. Complementary to lumped-parameter code modelling, Computational Fluid Dynamics (CFD) modelling is needed for the detailed assessment of the hydrogen risk in the containment and for the optimal design of hydrogen mitigation systems in order to reduce this risk as far as possible. The CFD model applied by NRG makes use of the well-developed basic features of the commercial CFD package ANSYS-FLUENT. This general-purpose CFD package is complemented with specific user-defined sub-models required to capture the relevant thermal-hydraulic phenomena in the containment during a severe accident as well as the effect of

  11. Application of a CFD based containment model to different large-scale hydrogen distribution experiments

    Energy Technology Data Exchange (ETDEWEB)

    Visser, D.C., E-mail: visser@nrg.eu; Siccama, N.B.; Jayaraju, S.T.; Komen, E.M.J.

    2014-10-15

    Highlights:
    • A CFD based model developed in ANSYS-FLUENT for simulating the distribution of hydrogen in the containment of a nuclear power plant during a severe accident is validated against four large-scale experiments.
    • The successive formation and mixing of a stratified gas-layer in experiments performed in the THAI and PANDA facilities are predicted well by the CFD model.
    • The pressure evolution and related condensation rate during different mixed convection flow conditions in the TOSQAN facility are predicted well by the CFD model.
    • The results give confidence in the general applicability of the CFD model and model settings.

    Abstract: In the event of core degradation during a severe accident in water-cooled nuclear power plants (NPPs), large amounts of hydrogen are generated that may be released into the reactor containment. As the hydrogen mixes with the air in the containment, it can form a flammable mixture. Upon ignition it can damage relevant safety systems and put the integrity of the containment at risk. Despite the installation of mitigation measures, it has been recognized that the temporary existence of combustible or explosive gas clouds cannot be fully excluded during certain postulated accident scenarios. The distribution of hydrogen in the containment and mitigation of the risk are, therefore, important safety issues for NPPs. Complementary to lumped-parameter code modelling, Computational Fluid Dynamics (CFD) modelling is needed for the detailed assessment of the hydrogen risk in the containment and for the optimal design of hydrogen mitigation systems in order to reduce this risk as far as possible. The CFD model applied by NRG makes use of the well-developed basic features of the commercial CFD package ANSYS-FLUENT. This general-purpose CFD package is complemented with specific user-defined sub-models required to capture the relevant thermal-hydraulic phenomena in the containment during a severe accident as well as the effect of

  12. Distributed and hierarchical control techniques for large-scale power plant systems

    International Nuclear Information System (INIS)

    Raju, G.V.S.; Kisner, R.A.

    1985-08-01

    In large-scale systems, integrated and coordinated control functions are required to maximize plant availability, to allow maneuverability through various power levels, and to meet externally imposed regulatory limitations. Nuclear power plants are large-scale systems. Prime subsystems are those that contribute directly to the behavior of the plant's ultimate output. The prime subsystems in a nuclear power plant include reactor, primary and intermediate heat transport, steam generator, turbine generator, and feedwater system. This paper describes and discusses the continuous-variable control system developed to supervise prime plant subsystems for optimal control and coordination

  13. Power monitoring and control for large scale projects: SKA, a case study

    Science.gov (United States)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage and management requirements and, above all, power demands. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for data processing to enable operational compliance with potentially strict power budgets. Reducing electricity costs and improving system-level power monitoring and the generation and management of electricity are paramount to avoiding future inefficiencies and higher costs and to enabling fulfillment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  14. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    Science.gov (United States)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

    Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine how the trade-off between the number of images selected within transects and the number of random points scored within images affects estimates of the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. An increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regularly spaced, etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare. The amount of sampling required, in terms of both the number of images and
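
    The paper's central trade-off can be reproduced with a quick Monte Carlo sketch: hold total effort (images × points) fixed and compare the precision of the two allocations. All numbers below are assumed for illustration, not the paper's simulation settings:

      import numpy as np

      rng = np.random.default_rng(0)
      image_cover = rng.beta(2, 18, size=1000)   # per-image cover, mean ~10%

      def estimate(n_images, n_points):
          """Cover estimate from scoring n_points random points in each of n_images."""
          imgs = rng.choice(image_cover, size=n_images, replace=False)
          hits = rng.binomial(n_points, imgs)    # points landing on target biota
          return hits.sum() / (n_images * n_points)

      # Same total effort (2,000 points), allocated differently:
      print("many images, few points: sd =",
            np.std([estimate(200, 10) for _ in range(500)]).round(4))
      print("few images, many points: sd =",
            np.std([estimate(50, 40) for _ in range(500)]).round(4))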

  15. Distribution of ground rigidity and ground model for seismic response analysis in Hualian project of large scale seismic test

    International Nuclear Information System (INIS)

    Kokusho, T.; Nishi, K.; Okamoto, T.; Tanaka, Y.; Ueshima, T.; Kudo, K.; Kataoka, T.; Ikemi, M.; Kawai, T.; Sawada, Y.; Suzuki, K.; Yajima, K.; Higashi, S.

    1997-01-01

    An international joint research program called HLSST is in progress. HLSST is a large-scale seismic test (LSST) to investigate soil-structure interaction (SSI) during large earthquakes in the field at Hualien, a highly seismic region of Taiwan. A 1/4-scale model building was constructed on the gravelly soil at this site, and backfill material of crushed stone was placed around the model plant after excavation for the construction. The model building and the foundation ground were also extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before/after base excavation, after structure construction and after backfilling. The distributions of the mechanical properties of the gravelly soil and the backfill were measured after the completion of construction by penetration tests, PS-logging, etc. This paper describes the distribution and the change of the shear wave velocity (Vs) measured by the field tests. Discussion is made of the effect of overburden pressure during the construction process on Vs in the neighbouring soil and, further, of the numerical soil model for SSI analysis. (orig.)

  16. Spatial and temporal distribution of pore gas concentrations during mainstream large-scale trough composting in China.

    Science.gov (United States)

    Zeng, Jianfei; Shen, Xiuli; Sun, Xiaoxi; Liu, Ning; Han, Lujia; Huang, Guangqun

    2018-05-01

    With the advantages of high treatment capacity and low operational cost, large-scale trough composting has become one of the mainstream composting patterns in composting plants in China. This study measured concentrations of O2, CO2, CH4 and NH3 on-site to investigate the spatial and temporal distribution of pore gas concentrations during mainstream large-scale trough composting in China. The results showed that the temperature in the center of the pile was clearly higher than that at the side of the pile. Pore O2 concentration decreased rapidly yet sustained the composting process when the pile was naturally aerated, which will contribute to improving the currently undesirable atmospheric environment in China.

  17. Feasibility analysis of using inverse modeling for estimating natural groundwater recharge from a large-scale soil moisture monitoring network

    Science.gov (United States)

    Wang, Tiejun; Franz, Trenton E.; Yue, Weifeng; Szilagyi, Jozsef; Zlotnik, Vitaly A.; You, Jinsheng; Chen, Xunhong; Shulski, Martha D.; Young, Aaron

    2016-02-01

    Despite the importance of groundwater recharge (GR), its accurate estimation remains one of the most challenging tasks in the field of hydrology. In this study, with the help of inverse modeling, long-term (6 years) soil moisture data at 34 sites from the Automated Weather Data Network (AWDN) were used to estimate the spatial distribution of GR across Nebraska, USA, where significant spatial variability exists in soil properties and precipitation (P). To ensure the generality of this study and its potential broad applications, data from public domains and the literature were used to parameterize the standard Hydrus-1D model. Although observed soil moisture differed significantly across the AWDN sites, mainly due to the variations in P and soil properties, the simulations were able to capture the dynamics of observed soil moisture under different climatic and soil conditions. The inferred mean annual GR from the calibrated models varied over three orders of magnitude across the study area. To assess the uncertainties of the approach, estimates of GR and actual evapotranspiration (ETa) from the calibrated models were compared to the GR and ETa obtained from other techniques in the study area (e.g., remote sensing, tracers, and regional water balance). The comparison clearly demonstrated the feasibility of inverse modeling and large-scale (>10⁴ km²) soil moisture monitoring networks for estimating GR. In addition, the model results were used to further examine the impacts of climate and soil on GR. The data showed that both P and soil properties had significant impacts on GR in the study area, with coarser soils generating higher GR; however, different relationships between GR and P emerged at the AWDN sites, defined by local climatic and soil conditions. In general, positive correlations existed between annual GR and P for the sites with coarser-textured soils or under wetter climatic conditions. With the rapidly expanding soil moisture monitoring networks around the
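
    The inverse-modeling step itself is generic: adjust soil parameters until simulated soil moisture matches the AWDN observations, then diagnose recharge from the calibrated model's drainage flux. A compact sketch with a single-parameter stand-in for the soil-water model (Hydrus-1D is an external code; the bucket model and all numbers here are assumptions for illustration):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def simulate_theta(k, precip, theta0=0.25):
          """Toy bucket model: moisture gains from rain, drains at rate k."""
          theta, out = theta0, []
          for p in precip:
              theta = theta + 0.05 * p - k * theta
              out.append(theta)
          return np.array(out)

      rng = np.random.default_rng(1)
      precip = rng.exponential(2.0, size=365)              # synthetic daily rain
      observed = simulate_theta(0.12, precip) + rng.normal(0, 0.005, 365)

      # Inverse step: pick the drainage constant k minimizing the misfit.
      res = minimize_scalar(
          lambda k: np.mean((simulate_theta(k, precip) - observed) ** 2),
          bounds=(0.01, 0.5), method="bounded")
      print(f"calibrated k = {res.x:.3f} (true value used above: 0.120)")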

  18. Radiations: large scale monitoring in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Linton, M.; Khalatbari, A.

    2011-10-15

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large-scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. At the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low readings, and because they know that some parts of the area are at least as contaminated as areas around Chernobyl were, some people are reluctant to return home

  19. Research highlights from a large scale residential monitoring study in a hot climate

    Energy Technology Data Exchange (ETDEWEB)

    Parker, Danny S. [Florida Solar Energy Center, Cocoa, FL (United States)

    2003-10-01

    A utility load research project has monitored a large number of residences in Central Florida, collecting detailed end-use data. The monitoring was performed to better estimate the impact of a load control program, as well as to obtain improved appliance energy load profiles. The monitoring measured total consumption as well as a number of electrical end-uses at 15-minute intervals. The measured end-uses included space cooling, heating, water heating, range and cooking, clothes drying, and swimming pool electricity use and demand. The project identified a number of influences on electrical demand that are not commonly described. (Author)

  20. Responses of Cloud Type Distributions to the Large-Scale Dynamical Circulation: Water Budget-Related Dynamical Phase Space and Dynamical Regimes

    Science.gov (United States)

    Wong, Sun; Del Genio, Anthony; Wang, Tao; Kahn, Brian; Fetzer, Eric J.; L'Ecuyer, Tristan S.

    2015-01-01

    Goals: (1) develop a water budget-related dynamical phase space; (2) connect large-scale dynamical conditions to the atmospheric water budget (including precipitation); (3) connect the atmospheric water budget to cloud type distributions.

  1. Outdoor thermal monitoring of large scale structures by infrared thermography integrated in an ICT based architecture

    Science.gov (United States)

    Dumoulin, Jean; Crinière, Antoine; Averty, Rodolphe

    2015-04-01

    An infrared system has been developed to monitor transport infrastructures in a standalone configuration. Results obtained on bridges open to traffic make it possible to retrieve the inner structure of the decks. To complete this study, experiments were carried out over several months to monitor two reinforced concrete beams, each 16 m long and weighing 21 t. A damaged area on one of the two beams was detected using a Pulse Phase Thermography approach, with measurements carried out over several months. Finally, conclusions on the robustness of the system are proposed and perspectives are presented.

  2. Large-scale control site selection for population monitoring: an example assessing Sage-grouse trends

    Science.gov (United States)

    Fedy, Bradley C.; O'Donnell, Michael; Bowen, Zachary H.

    2015-01-01

    Human impacts on wildlife populations are widespread and prolific, and understanding wildlife responses to human impacts is a fundamental component of wildlife management. The first step to understanding wildlife responses is the documentation of changes in wildlife population parameters, such as population size. Meaningful assessment of population changes in potentially impacted sites requires the establishment of monitoring at similar, nonimpacted, control sites. However, it is often difficult to identify appropriate control sites in wildlife populations. We demonstrated the use of Geographic Information System (GIS) data across large spatial scales to select biologically relevant control sites for population monitoring. Greater sage-grouse (Centrocercus urophasianus; hereafter, sage-grouse) are negatively affected by energy development, and monitoring of sage-grouse populations within energy development areas is necessary to detect population-level responses. We used population data (1995-2012) from an energy development area in Wyoming, USA, the Atlantic Rim Project Area (ARPA), and GIS data to identify control sites that were not impacted by energy development for population monitoring. Control sites were surrounded by similar habitat and were within climate areas similar to the ARPA. We developed nonlinear trend models for both the ARPA and control sites and compared long-term trends from the two areas. We found little difference between ARPA and control-site trends over time. This research demonstrated an approach for control site selection across large landscapes and can be used as a template for similar impact-monitoring studies. It is important to note that identification of changes in population parameters between control and treatment sites is only the first step in understanding the mechanisms that underlie those changes.

  3. Large-Scale Production of Monitored Drift Tube Chambers for the ATLAS Muon Spectrometer

    CERN Document Server

    Bauer, F.; Kortner, O; Kroha, H; Manz, A; Mohrdieck, S; Richter, R; Zhuravlov, V

    2016-01-01

    Precision drift tube chambers with a sense wire positioning accuracy of better than 20 microns are under construction for the ATLAS muon spectrometer. 70% of the 88 large chambers for the outermost layer of the central part of the spectrometer have been assembled. Measurements during chamber construction of the positions of the sense wires and of the sensors for the optical alignment monitoring system demonstrate that the requirements for the mechanical precision of the chambers are fulfilled.

  4. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks

    Directory of Open Access Journals (Sweden)

    Raja Jurdak

    2008-11-01

    Full Text Available Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.

  5. Large Scale Environmental Monitoring through Integration of Sensor and Mesh Networks.

    Science.gov (United States)

    Jurdak, Raja; Nafaa, Abdelhamid; Barbirato, Alessio

    2008-11-24

    Monitoring outdoor environments through networks of wireless sensors has received interest for collecting physical and chemical samples at high spatial and temporal scales. A central challenge to environmental monitoring applications of sensor networks is the short communication range of the sensor nodes, which increases the complexity and cost of monitoring commodities that are located in geographically spread areas. To address this issue, we propose a new communication architecture that integrates sensor networks with medium range wireless mesh networks, and provides users with an advanced web portal for managing sensed information in an integrated manner. Our architecture adopts a holistic approach targeted at improving the user experience by optimizing the system performance for handling data that originates at the sensors, traverses the mesh network, and resides at the server for user consumption. This holistic approach enables users to set high level policies that can adapt the resolution of information collected at the sensors, set the preferred performance targets for their application, and run a wide range of queries and analysis on both real-time and historical data. All system components and processes will be described in this paper.
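
    As one concrete reading of those "high level policies", the portal could push a target down to the gateways, which then trade sampling resolution against power and mesh congestion. A speculative sketch (the policy fields and thresholds are invented for illustration, not taken from the paper):

      from dataclasses import dataclass

      @dataclass
      class Policy:
          max_latency_s: float       # preferred end-to-end delivery target
          min_battery_pct: float     # below this, trade resolution for lifetime

      def sampling_interval_s(policy, battery_pct, mesh_backlog_s):
          """Lengthen the sampling interval when power or the mesh is strained."""
          interval = 60.0                              # base: one sample per minute
          if battery_pct < policy.min_battery_pct:
              interval *= 4                            # conserve sensor power
          if mesh_backlog_s > policy.max_latency_s:
              interval *= 2                            # relieve mesh congestion
          return interval

      policy = Policy(max_latency_s=300, min_battery_pct=20)
      print(sampling_interval_s(policy, battery_pct=15, mesh_backlog_s=420))  # 480.0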

  6. Multi-Sensing system for outdoor thermal monitoring: Application to large scale civil engineering components

    Science.gov (United States)

    Crinière, Antoine; Dumoulin, Jean; Manceau, Jean-Luc; Perez, Laetitia; Bourquin, Frederic

    2014-05-01

    Aging of transport infrastructures, combined with traffic and climatic solicitations, contributes to the reduction of their performance. To address and quantify the resilience of civil engineering structures, investigations into robust, fast and efficient methods are required. Among research works carried out at IFSTTAR, methods for long-term monitoring face an increasing demand. Such work benefits from the technological progress made in the ICT domain over the last decade. The present study follows the ISTIMES European project [1], which aimed to demonstrate the ability of different electromagnetic sensing techniques, processing methods and ICT architectures to be used for long-term monitoring of critical transport infrastructures. Through this project, a multi-sensing system able to date and synchronize measurements carried out by infrared thermography coupled with various other measurement data (i.e. weather parameters) was designed, developed and implemented on a real site [2]. Among experiments carried out on real transport infrastructure, it was shown for the "Musmesci" bridge deck (Italy) that, by using an infrared thermal image sequence together with weather measurements over several days, it was possible to develop analysis methods able to produce qualitative and quantitative data [3]. In the present study, functionalities were designed and added to the "IrLAW" system in order to reach full autonomy in terms of power supply, very long-term measurement capability (at least 1 year) and automated database feeding. The surveyed civil engineering structures consist of two concrete beams, each 16 m long and weighing 21 t. One of the two beams was damaged by a high-energy mechanical impact at the IFSTTAR falling rocks test station facilities located in the French Alps [4]. The system is composed of one uncooled microbolometric IR camera (FLIR SC325) with a 320x240 focal plane array detector in band III, a VAISALA WXT520 weather station, a GPS, a failover power supply

  7. DC-DC Converter Topology Assessment for Large Scale Distributed Photovoltaic Plant Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Agamy, Mohammed S; Harfman-Todorovic, Maja; Elasser, Ahmed; Sabate, Juan A; Steigerwald, Robert L; Jiang, Yan; Essakiappan, Somasundaram

    2011-07-01

    Distributed photovoltaic (PV) plant architectures are emerging as a replacement for classical central-inverter-based systems. However, power converters of smaller ratings may have a negative impact on system efficiency, reliability and cost. Therefore, it is necessary to design converters with very high efficiency and simpler topologies in order not to offset the benefits gained by using distributed PV systems. In this paper an evaluation of the selection criteria for dc-dc converters for distributed PV systems is performed; this evaluation includes efficiency, simplicity of design, reliability and cost. Based on this evaluation, recommendations can be made as to which class of converters is best suited to this application.

  8. Distributed Random Process for a Large-Scale Peer-to-Peer Lottery

    OpenAIRE

    Grumbach, Stéphane; Riemann, Robert

    2017-01-01

    Most online lotteries today fail to ensure the verifiability of the random process and rely on a trusted third party. This issue has received little attention since the emergence of distributed protocols like Bitcoin that demonstrated the potential of protocols with no trusted third party. We argue that the security requirements of online lotteries are similar to those of online voting, and propose a novel distributed online lottery protocol that applies techniques dev...

  9. Establishment and monitoring of large scale trials of short rotation coppice for energy

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, C.; Stevens, E.A.; Watters, M.P.

    1998-09-01

    The overall objective of the trials was to obtain information on the costs, logistics, productivity and biology of short rotation coppice crops in order to evaluate their potential for producing wood for fuel. More specifically, the objectives of the final and most recent phase of the research work were: the continuing management and monitoring of the coppice trial sites established during phases 1 and 2 of the project; to provide technical and economic data on the management and maintenance of the continuing coppice trial sites; to identify appropriate methods for stool removal and land reclamation and provide technical and economic data on those operations; and to undertake yield assessment at the remaining sites using appropriate methods of yield estimation. (author)

  10. The application of acoustic emission measurements on laboratory testpieces to large scale pressure vessel monitoring

    International Nuclear Information System (INIS)

    Ingham, T.; Dawson, D.G.

    1975-01-01

    A test pressure vessel containing 4 artificial defects was monitored for emission whilst pressure cycling to failure. Testpieces cut from both the failed vessel and from as-rolled plate material were tested in the laboratory. A marked difference in emission characteristics was observed between plate and vessel testpieces. Activity from vessel material was virtually constant after general yield and emission amplitudes were low. Plate testpieces showed maximum activity at general yield and more frequent high-amplitude emissions. An attempt has been made to compare the system sensitivities between the pressure vessel test and the laboratory tests. In the absence of an absolute calibration device, system sensitivities were estimated using dummy signals generated by the excitation of an emission sensor. The measurements have shown an overall difference in sensitivity between vessel and laboratory tests of approximately 25 dB. The reduced sensitivity in the vessel test is attributed to a combination of differences in sensors, acoustic couplant, attenuation, and dispersion relative to laboratory tests, and the relative significance of these factors is discussed. Signal amplitude analysis of the emissions monitored from laboratory testpieces showed that, with losses of the order of 25 to 30 dB, few emissions would be detected from the pressure vessel test. It is concluded that no reliable prediction of the acoustic behaviour of a structure may be made from laboratory tests unless testpieces of the actual structural material are used. A considerable improvement in detection sensitivity is also required for reliable detection of defects in low-strength ductile materials, and an absolute method of system calibration is required between tests

  11. Efficacy of extracting indices from large-scale acoustic recordings to monitor biodiversity.

    Science.gov (United States)

    Buxton, Rachel; McKenna, Megan F; Clapp, Mary; Meyer, Erik; Stabenau, Erik; Angeloni, Lisa M; Crooks, Kevin; Wittemyer, George

    2018-04-20

    Passive acoustic monitoring has the potential to be a powerful approach for assessing biodiversity across large spatial and temporal scales. However, extracting meaningful information from recordings can be prohibitively time consuming. Acoustic indices offer a relatively rapid method for processing acoustic data and are increasingly used to characterize biological communities. We examine the ability of acoustic indices to predict the diversity and abundance of biological sounds within recordings. First we reviewed the acoustic index literature and found that over 60 indices have been applied to a range of objectives with varying success. We then implemented a subset of the most successful indices on acoustic data collected at 43 sites in temperate terrestrial and tropical marine habitats across the continental U.S., developing a predictive model of the diversity of animal sounds observed in recordings. For terrestrial recordings, random forest models using a suite of acoustic indices as covariates predicted Shannon diversity, richness, and total number of biological sounds with high accuracy (R² ≥ 0.94). Of the indices assessed, roughness, acoustic activity, and acoustic richness contributed most to the predictive ability of the models. Performance of index models was negatively impacted by insect, weather, and anthropogenic sounds. For marine recordings, random forest models predicted Shannon diversity, richness, and total number of biological sounds with low accuracy, indicating that alternative methods are necessary in marine habitats. Our results suggest that using a combination of relevant indices in a flexible model can accurately predict the diversity of biological sounds in temperate terrestrial acoustic recordings. Thus, acoustic approaches could be an important contribution to biodiversity monitoring in some habitats in the face of accelerating human-caused ecological change.
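
    The modeling step is a standard supervised regression from per-recording index values to listener-derived diversity. A minimal sketch on synthetic data (the indices and response below are simulated stand-ins, not the study's data):

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      # Columns stand in for indices such as roughness, acoustic activity,
      # and acoustic richness; y stands in for richness of biological sounds.
      X = rng.normal(size=(200, 3))
      y = 10 + 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 1, 200)

      model = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
      model.fit(X, y)
      print("out-of-bag R^2:", round(model.oob_score_, 2))
      print("index importances:", model.feature_importances_.round(2))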

  12. Distributed Semidefinite Programming with Application to Large-scale System Analysis

    DEFF Research Database (Denmark)

    Khoshfetrat Pakazad, Sina; Hansson, Anders; Andersen, Martin S.

    2017-01-01

    Distributed algorithms for solving coupled semidefinite programs (SDPs) commonly require many iterations to converge. They also put high computational demands on the computational agents. In this paper we show that in case the coupled problem has an inherent tree structure, it is possible to devise...

  13. Large-Scale Distributed Bayesian Matrix Factorization using Stochastic Gradient MCMC

    NARCIS (Netherlands)

    Ahn, S.; Korattikara, A.; Liu, N.; Rajan, S.; Welling, M.

    2015-01-01

    Despite having various attractive qualities such as high prediction accuracy and the ability to quantify uncertainty and avoid overfitting, Bayesian Matrix Factorization has not been widely adopted because of the prohibitive cost of inference. In this paper, we propose a scalable distributed Bayesian...
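
    The Stochastic Gradient MCMC idea referenced in the title is easy to state: take mini-batch gradient steps on the log-posterior of the factor matrices and inject Gaussian noise scaled to the step size, so the iterates approximately sample the posterior rather than merely optimizing it. A single-machine toy sketch of such a Langevin update for matrix factorization (all sizes and hyperparameters are assumed; the paper's distributed scheme adds much more):

      import numpy as np

      rng = np.random.default_rng(0)
      n_users, n_items, k = 50, 40, 5
      R = rng.normal(size=(n_users, n_items))      # toy dense "ratings" matrix
      U = 0.1 * rng.normal(size=(n_users, k))
      V = 0.1 * rng.normal(size=(n_items, k))
      eps, lam = 1e-3, 0.1                         # step size, prior precision

      for step in range(2000):
          rows = rng.choice(n_users, size=10, replace=False)   # user mini-batch
          err = R[rows] - U[rows] @ V.T
          scale = n_users / len(rows)              # unbias the mini-batch gradient
          g_U = scale * (err @ V) - lam * U[rows]
          g_V = scale * (err.T @ U[rows]) - lam * V
          # Langevin step: half-step gradient plus sqrt(eps)-scaled Gaussian noise.
          U[rows] += 0.5 * eps * g_U + np.sqrt(eps) * rng.normal(size=g_U.shape)
          V += 0.5 * eps * g_V + np.sqrt(eps) * rng.normal(size=V.shape)

      print("reconstruction RMSE:", np.sqrt(np.mean((R - U @ V.T) ** 2)).round(3))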

  14. Large Scale Solar Power Integration in Distribution Grids : PV Modelling, Voltage Support and Aggregation Studies

    NARCIS (Netherlands)

    Samadi, A.

    2014-01-01

    Long-term support schemes for photovoltaic (PV) system installation have led to the accommodation of large numbers of PV systems within load pockets in distribution grids. High penetrations of PV systems can cause new technical challenges, such as voltage rise due to reverse power flow during light load

  15. Automated selected reaction monitoring data analysis workflow for large-scale targeted proteomic studies.

    Science.gov (United States)

    Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi

    2013-08-01

    Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.

  16. Large-scale indicators for monitoring forest diversity of the main forest types in Calabria (Italy)

    Directory of Open Access Journals (Sweden)

    Infusino M

    2017-02-01

    Full Text Available Recently, society's perception of forest resources has gone through significant changes. Forest ecosystems play a multifunctional role and host an important portion of overall biodiversity, particularly in the Mediterranean area. Remote sensing technologies provide a unique way to obtain spatially extensive information on forest ecosystems, but relatively few studies have used such information to evaluate forest habitat and biotic diversity. In this paper we evaluate the effectiveness of remote sensing for predicting forest diversity by linking remotely sensed information with diversity metrics obtained from ground measurements of butterfly diversity. The field work was carried out in Calabria in four different forest types (beech, chestnut, black pine and silver fir forests). The sampling of Lepidoptera was carried out with LED light traps. We positioned 9 traps per forest type, for a total of 36 sites chosen to sample the different stages of forest succession in each forest type. Sampling was carried out once a month from May to November 2015. Data from in situ butterfly measurements were compared with above-ground forest biomass estimated from airborne LiDAR and with NDVI estimated from Landsat 8. Results indicated that the Geometridae/Noctuidae ratio of lepidopteran communities was significantly correlated with tree biomass, its distribution among tree size classes and the NDVI. The Geometridae/Noctuidae ratio, therefore, represents an index that is easy to calculate and can be employed to integrate data acquired from remote sensing in order to obtain continuous spatial estimates of forest naturalness.

  17. Large-scale determinants of intestinal schistosomiasis and intermediate host snail distribution across Africa

    DEFF Research Database (Denmark)

    Stensgaard, Anna-Sofie; Utzinger, Jürg; Vounatsou, Penelope

    2013-01-01

    The geographical ranges of most species, including many infectious disease agents and their vectors and intermediate hosts, are assumed to be constrained by climatic tolerances, mainly temperature. It has been suggested that global warming will cause an expansion of the areas potentially suitable...... impacts of climatic changes. Snail species distribution models included several combinations of climatic and habitat-related predictors; the latter divided into "natural" and "human-impacted" habitat variables to measure anthropogenic influence. The predictive performance of the combined snail...... are more likely to contract and/or move into cooler areas in the south and east. Importantly, we also note that even though climate per se matters, the impact of humans on habitat play a crucial role in determining the distribution of the intermediate host snails in Africa. Thus, a future contraction...

  18. A cellphone based system for large-scale monitoring of black carbon

    Science.gov (United States)

    Ramanathan, N.; Lukac, M.; Ahmed, T.; Kar, A.; Praveen, P. S.; Honles, T.; Leong, I.; Rehman, I. H.; Schauer, J. J.; Ramanathan, V.

    2011-08-01

    Black carbon aerosols are a major component of soot and are also a major contributor to global and regional climate change. Reliable and cost-effective systems to measure near-surface black carbon (BC) mass concentrations (hereafter denoted as [BC]) globally are necessary to validate air pollution and climate models and to evaluate the effectiveness of BC mitigation actions. Toward this goal we describe a new wireless, low-cost, ultra low-power, BC cellphone based monitoring system (BC_CBM). BC_CBM integrates a Miniaturized Aerosol filter Sampler (MAS) with a cellphone for filter image collection, transmission and image analysis for determining [BC] in real time. The BC aerosols in the air accumulate on the MAS quartz filter, resulting in a coloration of the filter. A photograph of the filter is captured by the cellphone camera and transmitted by the cellphone to the analytics component of BC_CBM. The analytics component compares the image with a calibrated reference scale (also included in the photograph) to estimate [BC]. We demonstrate with field data collected from vastly differing environments, ranging from southern California to rural regions in the Indo-Gangetic plains of Northern India, that the total BC deposited on the filter is directly and uniquely related to the reflectance of the filter in the red wavelength, irrespective of its source or how the particles were deposited. [BC] varied from 0.1 to 1 μg m⁻³ in Southern California and from 10 to 200 μg m⁻³ in rural India in our field studies. In spite of the 3 orders of magnitude variation in [BC], the BC_CBM system was able to determine the [BC] well within the experimental error of two independent reference instruments for both indoor air and outdoor ambient air. Accurate, global-scale measurements of [BC] in urban and remote rural locations, enabled by the wireless, low-cost, ultra low-power operation of BC_CBM, will make it possible to better capture the large spatial and temporal variations in
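
    The filter-to-concentration step can be sketched directly: the photographed red-channel reflectance is converted to a BC loading through a calibration curve, then divided by the volume of air drawn through the filter. The Beer-Lambert-style form and every constant below are illustrative assumptions, not the published BC_CBM calibration:

      import numpy as np

      def bc_loading_ug(red_reflectance, blank_reflectance=0.95, k=120.0):
          """BC mass on the filter (micrograms) from red reflectance in (0, 1]."""
          return k * np.log(blank_reflectance / red_reflectance)

      def bc_concentration(red_reflectance, sampled_air_m3):
          """[BC] in micrograms per cubic metre of sampled air."""
          return bc_loading_ug(red_reflectance) / sampled_air_m3

      # Reflectance normalized against the reference scale in the same photo:
      print(round(bc_concentration(0.62, sampled_air_m3=2.0), 1), "ug/m^3")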

  19. Large scale patterns in vertical distribution and behaviour of mesopelagic scattering layers

    KAUST Repository

    Klevjer, Thor Aleksander

    2016-01-27

    Recent studies suggest that previous estimates of mesopelagic biomasses are severely biased, with the new, higher estimates underlining the need to unveil behaviourally mediated coupling between shallow and deep ocean habitats. We analysed vertical distribution and diel vertical migration (DVM) of mesopelagic acoustic scattering layers (SLs) recorded at 38 kHz across oceanographic regimes encountered during the circumglobal Malaspina expedition. Mesopelagic SLs were observed in all areas covered, but vertical distributions and DVM patterns varied markedly. The distribution of mesopelagic backscatter was deepest in the southern Indian Ocean (weighted mean daytime depth: WMD 590 m) and shallowest at the oxygen minimum zone in the eastern Pacific (WMD 350 m). DVM was evident in all areas covered, on average ~50% of mesopelagic backscatter made daily excursions from mesopelagic depths to shallow waters. There were marked differences in migrating proportions between the regions, ranging from ~20% in the Indian Ocean to ~90% in the Eastern Pacific. Overall the data suggest strong spatial gradients in mesopelagic DVM patterns, with implied ecological and biogeochemical consequences. Our results suggest that parts of this spatial variability can be explained by horizontal patterns in physical-chemical properties of water masses, such as oxygen, temperature and turbidity.

  20. 'Oorja' in India: Assessing a large-scale commercial distribution of advanced biomass stoves to households.

    Science.gov (United States)

    Thurber, Mark C; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham

    2014-04-01

    Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially-based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly-incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 "Oorja" stoves to households from 2006 onwards represents the largest commercially-based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold, as well as detailed interviews with BP and First Energy staff. Statistical models based on this data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households, for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, the result in large part of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of "agricultural waste" to make

  1. On the Fidelity of Semi-distributed Hydrologic Model Simulations for Large Scale Catchment Applications

    Science.gov (United States)

    Ajami, H.; Sharma, A.; Lakshmi, V.

    2017-12-01

    Application of semi-distributed hydrologic modeling frameworks is a viable alternative to fully distributed hyper-resolution hydrologic models, owing to their computational efficiency while still resolving the fine-scale spatial structure of hydrologic fluxes and states. However, the fidelity of semi-distributed model simulations is affected by (1) the formulation of hydrologic response units (HRUs), and (2) the aggregation of catchment properties for formulating simulation elements. Here, we evaluate the performance of a recently developed Soil Moisture and Runoff simulation Toolkit (SMART) for large catchment scale simulations. In SMART, topologically connected HRUs are delineated using thresholds obtained from topographic and geomorphic analysis of a catchment, and simulation elements are equivalent cross sections (ECS) representative of a hillslope in first-order sub-basins. Earlier investigations have shown that formulation of ECSs at the scale of a first-order sub-basin reduces computational time significantly without compromising simulation accuracy. However, this approach has not been fully explored for catchment scale simulations. To assess SMART performance, we set up the model over the Little Washita watershed in Oklahoma. Model evaluations using in-situ soil moisture observations show satisfactory model performance. In addition, we evaluated the performance of a number of soil moisture disaggregation schemes recently developed to provide spatially explicit soil moisture outputs at fine resolution. Our results illustrate that the statistical disaggregation scheme performs significantly better than the methods based on topographic data. Future work is focused on assessing the performance of SMART using remotely sensed soil moisture observations, with spatially based model evaluation metrics.

  2. Large-scale distribution patterns of mangrove nematodes: A global meta-analysis.

    Science.gov (United States)

    Brustolin, Marco C; Nagelkerken, Ivan; Fonseca, Gustavo

    2018-05-01

    Mangroves harbor diverse invertebrate communities, suggesting that macroecological distribution patterns of habitat-forming foundation species drive the associated faunal distribution. Whether these are driven by mangrove biogeography is still ambiguous. For small-bodied taxa, local factors and landscape metrics might be as important as macroecology. We performed a meta-analysis to address the following questions: (1) can richness of mangrove trees explain macroecological patterns of nematode richness? and (2) do local landscape attributes have equal or higher importance than biogeography in structuring nematode richness? Mangrove areas of Caribbean-Southwest Atlantic, Western Indian, Central Indo-Pacific, and Southwest Pacific biogeographic regions. We used random-effects meta-analyses based on natural logarithm of the response ratio (lnRR) to assess the importance of macroecology (i.e., biogeographic regions, latitude, longitude), local factors (i.e., aboveground mangrove biomass and tree richness), and landscape metrics (forest area and shape) in structuring nematode richness from 34 mangroves sites around the world. Latitude, mangrove forest area, and forest shape index explained 19% of the heterogeneity across studies. Richness was higher at low latitudes, closer to the equator. At local scales, richness increased slightly with landscape complexity and decreased with forest shape index. Our results contrast with biogeographic diversity patterns of mangrove-associated taxa. Global-scale nematode diversity may have evolved independently of mangrove tree richness, and diversity of small-bodied metazoans is probably more closely driven by latitude and associated climates, rather than local, landscape, or global biogeographic patterns.
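
    The effect size behind such a meta-analysis is the log response ratio, lnRR = ln(m_t/m_c), with sampling variance v = s_t²/(n_t·m_t²) + s_c²/(n_c·m_c²), pooled by inverse-variance weighting. A small sketch with invented site summaries (fixed-effect weights shown for brevity; a random-effects fit would add a between-study variance term):

      import numpy as np

      def ln_rr(m_t, s_t, n_t, m_c, s_c, n_c):
          """Log response ratio and its sampling variance."""
          lnrr = np.log(m_t / m_c)
          v = s_t**2 / (n_t * m_t**2) + s_c**2 / (n_c * m_c**2)
          return lnrr, v

      # Hypothetical nematode richness: low- vs high-latitude site pairs.
      effects = [ln_rr(28, 6, 10, 19, 5, 12), ln_rr(31, 8, 8, 22, 6, 9)]
      es = np.array([e for e, _ in effects])
      w = 1.0 / np.array([v for _, v in effects])   # inverse-variance weights
      print("pooled lnRR:", round(float(np.sum(w * es) / np.sum(w)), 3))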

  3. Large-scale monitoring of effects of clothianidin dressed oilseed rape seeds on pollinating insects in Northern Germany: implementation of the monitoring project and its representativeness

    OpenAIRE

    Heimbach, Fred; Russ, Anja; Schimmer, Maren; Born, Katrin

    2016-01-01

    Monitoring studies at the landscape level are complex, expensive and difficult to conduct. Many aspects have to be considered to avoid confounding effects which is probably the reason why they are not regularly performed in the context of risk assessments of plant protection products to pollinating insects. However, if conducted appropriately their contribution is most valuable. In this paper we identify the requirements of a large-scale monitoring study for the assessment of side-effects of ...

  4. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  5. Interoperable mesh components for large-scale, distributed-memory simulations

    International Nuclear Information System (INIS)

    Devine, K; Leung, V; Diachin, L; Miller, M

    2009-01-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. In this paper, we describe a software component - an abstract data model and programming interface - designed to provide support for parallel unstructured mesh operations. We describe key issues that must be addressed to successfully provide high-performance, distributed-memory unstructured mesh services and highlight some recent research accomplishments in developing new load balancing and MPI-based communication libraries appropriate for leadership class computing. Finally, we give examples of the use of parallel adaptive mesh modification in two SciDAC applications.
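
    As a toy picture of the distributed-memory mesh services described above, the following sketch models two mesh "parts" that each own a strip of cells and refresh one layer of ghost cells from their neighbor before applying a stencil; a real component would exchange halos with MPI over far richer entity sets, so this is purely illustrative:

```python
import numpy as np

class MeshPart:
    """One rank's piece of a 1-D cell field: owned cells plus ghost copies."""
    def __init__(self, owned_values):
        self.owned = np.asarray(owned_values, dtype=float)
        self.ghost_left = 0.0   # copies of neighbors' boundary cells
        self.ghost_right = 0.0

def halo_exchange(parts):
    """Update ghost layers from neighboring parts (stand-in for MPI sends)."""
    for i, part in enumerate(parts):
        if i > 0:
            part.ghost_left = parts[i - 1].owned[-1]
        if i < len(parts) - 1:
            part.ghost_right = parts[i + 1].owned[0]

# Two "ranks", each owning half of a 1-D mesh
parts = [MeshPart([1.0, 2.0, 3.0]), MeshPart([4.0, 5.0, 6.0])]
halo_exchange(parts)

# A ghost-dependent stencil: average each owned cell with its neighbors
for part in parts:
    padded = np.concatenate([[part.ghost_left], part.owned, [part.ghost_right]])
    part.owned = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

print(parts[0].owned, parts[1].owned)
```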

  6. Atmospheric HT and HTO: V. distribution and large-scale circulation

    International Nuclear Information System (INIS)

    Mason, A.S.; Oestlund, H.G.

    1979-01-01

    The two major chemical forms of atmospheric tritium are water vapour (HTO) and hydrogen gas (HT). These forms have quite different sources, distributions and sinks. The chemical conversion from HT to HTO in the atmosphere proceeds with a characteristic time of 6.5 years. Combined with the radioactive decay, a net lifetime of 4.8 years is estimated for atmospheric HT. HT is released predominantly at the surface in mid- to high latitudes in the northern hemisphere. A negative gradient southward has been found from aircraft transects and from sampling at surface stations. After many years of a relatively constant global inventory of 1.1 kg of tritium gas, the HT mixing ratios decreased during 1977, with the sharpest drop at high latitudes. The estimated decline in annual production was 100 g. At the end of 1977, the atmospheric HT burden was 1.0 kg, and the estimated annual release was 200 g. An unknown portion is present as T2 gas. The effect of T2 is to decrease the net lifetime to 3.7 years. In the troposphere, the cycle of HTO has been treated exhaustively by others. The stratospheric distribution of HTO has been sampled from aircraft, and found to increase rapidly with height above the troposphere. An annual cycle has been observed, in which the lower stratosphere is depleted during the spring, and replenished by subsidence from higher levels during summer and fall. The effects of a nuclear test by the People's Republic of China in November 1976 have been clearly observed in the stratospheric HTO; however, no HT deposition was found. Presumably, the HTO at higher levels was originally deposited by the large nuclear weapons tests of the 1960s. An estimated 5 kg of tritium are now present in the stratosphere below 19 km. (author)
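
    The 4.8-year net lifetime quoted above is the harmonic combination of the two loss channels. As a consistency check (a back-of-the-envelope reconstruction, using the roughly 17.8-year mean life implied by tritium's 12.3-year half-life, a figure not stated in the record itself):

```latex
\frac{1}{\tau_{\mathrm{net}}}
  = \frac{1}{\tau_{\mathrm{HT \to HTO}}} + \frac{1}{\tau_{\beta}}
  = \frac{1}{6.5~\mathrm{yr}} + \frac{1}{17.8~\mathrm{yr}}
  \approx \frac{1}{4.8~\mathrm{yr}}
```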

  7. Seismic Search Engine: A distributed database for mining large scale seismic data

    Science.gov (United States)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

    The International Monitoring System (IMS) of the CTBTO collects terabytes worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, through SSE, we intend to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.
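
    The record describes SSE queries as user-defined functions composed from standard tools and models. A hypothetical sketch of how the example query quoted above might be composed is shown below; the class layout, the detection_score stand-in, and the station/event names are all invented, not the actual SSE API:

```python
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    region: str

@dataclass
class Event:
    event_id: str
    region: str
    magnitude: float

def detection_score(station, event):
    """Placeholder for a per-station detection profile fit to waveform data."""
    # Toy rule: larger events are easier to detect (purely illustrative)
    return min(1.0, 0.2 * event.magnitude)

def best_receivers(stations, events, receiver_region, source_region, top=3):
    """Example query: best stations in one region for events in another."""
    candidates = [s for s in stations if s.region == receiver_region]
    targets = [e for e in events if e.region == source_region]
    ranked = sorted(
        candidates,
        key=lambda s: sum(detection_score(s, e) for e in targets),
        reverse=True,
    )
    return ranked[:top]

stations = [Station("STA1", "East Asia"), Station("STA2", "East Asia"),
            Station("STA3", "Australia")]
events = [Event("evt1", "Middle East", 4.1), Event("evt2", "Middle East", 3.6)]
print([s.name for s in best_receivers(stations, events, "East Asia", "Middle East")])
```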

  8. Response time distributions in rapid chess: a large-scale decision making experiment.

    Science.gov (United States)

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game, (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state-function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which remains an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.
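
    Long-tailed RT distributions like those reported above are often summarized by fitting a heavy-tailed parametric form; below is a sketch on synthetic data using a lognormal, which is one common choice and not necessarily the form used by the authors:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic move times in seconds (a stand-in for real rapid-chess RTs)
rts = rng.lognormal(mean=1.2, sigma=0.8, size=5000)

shape, loc, scale = stats.lognorm.fit(rts, floc=0)
print(f"fitted sigma = {shape:.2f}, median = {scale:.2f} s")

# Tail heaviness: compare the empirical 99th percentile with a normal fit
print("empirical p99:", np.quantile(rts, 0.99).round(1))
print("normal-fit p99:", stats.norm(rts.mean(), rts.std()).ppf(0.99).round(1))
```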

  9. Response time distributions in rapid chess: A large-scale decision making experiment

    Directory of Open Access Journals (Sweden)

    Mariano Sigman

    2010-10-01

    Full Text Available Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times and position values in rapid chess games. We measured robust emergent statistical observables: (1) response time (RT) distributions are long-tailed and show qualitatively distinct forms at different stages of the game, (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which remains an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.

  10. Distribution of circular proteins in plants: large-scale mapping of cyclotides in the Violaceae

    Directory of Open Access Journals (Sweden)

    Robert Burman

    2015-10-01

    Full Text Available During the last decade there has been increasing interest in small circular proteins found in plants of the violet family (Violaceae). These so-called cyclotides consist of a circular chain of approximately 30 amino acids, including six cysteines forming three disulfide bonds, arranged in a cyclic cystine knot motif. In this study we map the occurrence and distribution of cyclotides throughout the Violaceae. Plant material was obtained from herbarium sheets containing samples up to 200 years of age. Even the oldest specimens contained cyclotides in the preserved leaves, with no degradation products observable, confirming their place as one of the most stable proteins in nature. Over 200 samples covering 17 of the 23 genera in Violaceae were analysed, and cyclotides were positively identified in 150 species. Each species contained a unique set of between one and 25 cyclotides, with many exclusive to individual species. We estimate the number of different cyclotides in the Violaceae to be 5,000-25,000, and propose that cyclotides are ubiquitous among all Violaceae species. Twelve new cyclotides from eight phylogenetically dispersed genera were sequenced. Furthermore, the first glycosylated derivatives of cyclotides were identified and characterized, further increasing the diversity and complexity of this unique protein family.

  11. Embedded Electro-Optic Sensor Network for the On-Site Calibration and Real-Time Performance Monitoring of Large-Scale Phased Arrays

    National Research Council Canada - National Science Library

    Yang, Kyoung

    2005-01-01

    This final report summarizes the progress during the Phase I SBIR project entitled "Embedded Electro-Optic Sensor Network for the On-Site Calibration and Real-Time Performance Monitoring of Large-Scale Phased Arrays...

  12. A resource of large-scale molecular markers for monitoring Agropyron cristatum chromatin introgression in wheat background based on transcriptome sequences.

    Science.gov (United States)

    Zhang, Jinpeng; Liu, Weihua; Lu, Yuqing; Liu, Qunxing; Yang, Xinming; Li, Xiuquan; Li, Lihui

    2017-09-20

    Agropyron cristatum is a wild grass of the tribe Triticeae and serves as a gene donor for wheat improvement. However, very few markers can be used to monitor A. cristatum chromatin introgressions in wheat. Here, we report a resource of large-scale molecular markers for tracking alien introgressions in wheat based on transcriptome sequences. By aligning A. cristatum unigenes with the Chinese Spring reference genome sequences, we designed 9602 A. cristatum expressed sequence tag-sequence-tagged site (EST-STS) markers for PCR amplification and experimental screening. As a result, 6063 polymorphic EST-STS markers were specific for the A. cristatum P genome in a single-recipient wheat background. A total of 4956 randomly selected polymorphic EST-STS markers were further tested in eight wheat variety backgrounds, and 3070 markers displaying stable and polymorphic amplification were validated. These markers covered more than 98% of the A. cristatum genome, with a marker distribution density of approximately 1.28 cM. As an application case, the EST-STS markers were validated on the A. cristatum 6P chromosome and successfully applied to the tracking of alien A. cristatum chromatin. Altogether, this study provides a universal method of large-scale molecular marker development for monitoring wild relative chromatin in wheat.

  13. Distributed parallel cooperative coevolutionary multi-objective large-scale immune algorithm for deployment of wireless sensor networks

    DEFF Research Database (Denmark)

    Cao, Bin; Zhao, Jianwei; Yang, Po

    2018-01-01

    Using immune algorithms is generally a time-intensive process, especially for problems with a large number of variables. In this paper, we propose a distributed parallel cooperative coevolutionary multi-objective large-scale immune algorithm that is implemented using the message passing interface (MPI). The proposed algorithm is composed of three layers: objective, group and individual layers. First, for each objective in the multi-objective problem to be addressed, a subpopulation is used for optimization, and an archive population is used to optimize all the objectives. Second, the large ... Compared with state-of-the-art multi-objective evolutionary algorithms (the Cooperative Coevolutionary Generalized Differential Evolution 3, the Cooperative Multi-objective Differential Evolution and the Nondominated Sorting Genetic Algorithm III), the proposed algorithm addresses the deployment optimization problem efficiently and effectively.
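
    A minimal sketch of the three-layer MPI decomposition described above, assuming mpi4py; the toy objectives, population sizes, and the simple mutate-and-select loop are placeholders rather than the immune operators of the paper:

```python
# Run with e.g.: mpiexec -n 4 python coevo_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

N_OBJ = 2                       # objective layer: one group per objective
obj_id = rank % N_OBJ
group = comm.Split(color=obj_id, key=rank)   # group layer communicator

def f(x, k):                    # toy objective functions (placeholders)
    return float(np.sum((x - k) ** 2))

rng = np.random.default_rng(rank)
pop = rng.normal(size=(20, 5))  # individual layer: local subpopulation

for gen in range(50):           # crude mutate-and-select loop
    children = pop + rng.normal(scale=0.1, size=pop.shape)
    both = np.vstack([pop, children])
    order = np.argsort([f(x, obj_id) for x in both])
    pop = both[order[:20]]
    if gen % 10 == 9:           # group layer: migrate a peer's best individual
        bests = group.allgather(pop[0])
        pop[-1] = bests[int(rng.integers(len(bests)))]

# Archive layer: rank 0 gathers each subpopulation's best solution
archive = comm.gather((obj_id, pop[0]), root=0)
if rank == 0:
    for oid, x in archive:
        print(f"objective {oid}: best f = {f(x, oid):.4f}")
```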

  14. Government management and implementation of national real-time energy monitoring system for China large-scale public building

    International Nuclear Information System (INIS)

    Na Wei; Wu Yong; Song Yan; Dong Zhongcheng

    2009-01-01

    The supervision of energy efficiency in government office buildings and large-scale public buildings (GOBLPB) is the main embodiment of the government's exercise of public administration in the fields of resource saving and environmental protection. It is significant for the Chinese government in achieving its target of reducing building energy consumption by 11 million tons of standard coal before 2010. In the framework of a national demonstration project concerning the energy management system, Shenzhen Municipality has been selected for the implementation of the system. A data acquisition system and a methodology concerning the energy consumption of the GOBLPB have been developed. This paper summarizes the features of the system used to identify building energy consumption and energy-saving potential. It also describes the methods used to achieve real-time monitoring and diagnosis: meters installed in each building, data transmitted over the Internet to a central server, analysis and unification at the central server, and publication through the web. Furthermore, this paper introduces the plans to implement the system and to extend it countrywide. Finally, it presents measures for achieving a community of common benefit in the implementation of the building energy efficiency supervisory system for GOBLPB in its construction, reconstruction and operation stages.
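
    The meter-to-server pipeline described above can be pictured with a small sketch; the endpoint URL, payload fields, and reporting interval are invented for illustration and do not describe the Shenzhen system's actual protocol:

```python
import json
import time
import urllib.request

SERVER_URL = "http://energy-center.example.org/api/readings"  # hypothetical

def read_meter():
    """Stand-in for querying a building's electricity meter."""
    return {"building_id": "SZ-0042", "kwh": 1834.6, "ts": time.time()}

def post_reading(reading):
    """Upload one reading to the central server as JSON."""
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status

for _ in range(3):  # a real metering agent would loop indefinitely
    try:
        post_reading(read_meter())
    except OSError as err:  # network failure: log and retry next cycle
        print("upload failed, will retry:", err)
    time.sleep(5)  # e.g., 900 s in production; shortened for this sketch
```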

  15. Large-scale laboratory testing of bedload-monitoring technologies: overview of the StreamLab06 Experiments

    Science.gov (United States)

    Marr, Jeffrey D.G.; Gray, John R.; Davis, Broderick E.; Ellis, Chris; Johnson, Sara; Gray, John R.; Laronne, Jonathan B.; Marr, Jeffrey D.G.

    2010-01-01

    A 3-month-long, large-scale flume experiment involving research and testing of selected conventional and surrogate bedload-monitoring technologies was conducted in the Main Channel at the St. Anthony Falls Laboratory under the auspices of the National Center for Earth-surface Dynamics. These experiments, dubbed StreamLab06, involved 25 researchers and volunteers from academia, government, and the private sector. The research channel was equipped with a sediment-recirculation system and a sediment-flux monitoring system that allowed continuous measurement of sediment flux in the flume and provided a data set by which samplers were evaluated. Selected bedload-measurement technologies were tested under a range of flow and sediment-transport conditions. The experiment was conducted in two phases. The bed material in phase I was well-sorted siliceous sand (0.6-1.8 mm median diameter). A gravel mixture (1-32 mm median diameter) composed the bed material in phase II. Four conventional bedload samplers – a standard Helley-Smith, Elwha, BLH-84, and Toutle River II (TR-2) sampler – were manually deployed in both experiment phases. Bedload traps were deployed in phase II, as were two surrogate bedload samplers – stationary-mounted down-looking 600 kHz and 1200 kHz acoustic Doppler current profilers. This paper presents an overview of the experiment, including the specific data-collection technologies used and the ambient hydraulic, sediment-transport and environmental conditions measured as part of the experiment. All data collected as part of the StreamLab06 experiments are, or will be, available to the research community.

  16. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    Science.gov (United States)

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems, that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including a rigorous stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and of the design parameters of the proposed adaptive architectures on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.
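
    The event-triggering idea above can be illustrated with a toy sketch: the controller holds its last transmitted value and a new transmission occurs only when the state has drifted beyond a threshold, trading network traffic against control performance. The scalar plant, gain, and threshold below are invented for illustration and are not the architectures analyzed in the paper:

```python
# Scalar plant x' = a*x + b*u with stabilizing gain k (values illustrative)
a, b, k = 0.5, 1.0, 2.0
dt, steps = 0.01, 1000
threshold = 0.05          # event-triggering threshold on state deviation

x = 1.0                   # plant state
x_sent = x                # last state transmitted over the network
u = -k * x_sent           # control held constant between transmissions
events = 0

for _ in range(steps):
    if abs(x - x_sent) > threshold:   # event: transmit and update control
        x_sent = x
        u = -k * x_sent
        events += 1
    x += (a * x + b * u) * dt         # forward-Euler plant update

print(f"final |x| = {abs(x):.4f}, transmissions = {events} / {steps} steps")
```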

  17. Distributed Model Predictive Control over Multiple Groups of Vehicles in Highway Intelligent Space for Large Scale System

    Directory of Open Access Journals (Sweden)

    Tang Xiaofeng

    2014-01-01

    Full Text Available The paper presents three time warning distances for the safe driving of a large-scale system of multiple groups of vehicles in a highway tunnel environment, based on a distributed model predictive control approach. Generally speaking, the system includes two parts. First, multiple vehicles are divided into multiple groups, and the distributed model predictive control approach is proposed to calculate the information framework of each group. The optimization of each group considers both local optimization and the optimization characteristics of the neighboring subgroups, which ensures global optimization performance. Second, the three time warning distances are studied based on the basic principles used for highway intelligent space (HIS), and the information framework concept is applied to the multiple groups of vehicles. A mathematical model is built to avoid chain collisions between vehicles. The results demonstrate that the proposed highway intelligent space method can effectively ensure the driving safety of multiple groups of vehicles in fog, rain or snow.

  18. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    Directory of Open Access Journals (Sweden)

    Ali Albattat

    2016-08-01

    Full Text Available The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems, that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including a rigorous stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and of the design parameters of the proposed adaptive architectures on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.

  19. Monitor wholesale markets for natural gas and electricity 2010; Monitor groothandelsmarkten gas en elektriciteit 2010

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-02-15

    The Office of Energy Regulation ('Energiekamer') carries out its legal task by means of a monitor, a practical tool to assess and analyze the wholesale market for electricity. Monitoring the wholesale electricity market means following developments in the market continuously, accurately and in a structured way. The aim is to identify early signals from the market that could point to a decline in competition and transparency. The starting point of the monitor for the wholesale electricity market is the selection of indicators which give insight into actual competition, liquidity and transparency. [Translated from Dutch] The Energiekamer reports its findings on the degree of competition in the wholesale gas and electricity markets annually to the Minister of Economic Affairs; this report is the monitor. Since 2007, the findings on the wholesale gas and electricity markets have been combined in a single publication. Concretely, the Energiekamer collects market information such as prices and volumes, and studies whether these outcomes correspond to what was intended for the liberalized energy market. The Energiekamer also investigates whether the conditions (such as entry barriers and transparency) are optimal for effective competition, and proposes measures to improve the functioning of the market.

  20. Large-scale determinants of intestinal schistosomiasis and intermediate host snail distribution across Africa: does climate matter?

    Science.gov (United States)

    Stensgaard, Anna-Sofie; Utzinger, Jürg; Vounatsou, Penelope; Hürlimann, Eveline; Schur, Nadine; Saarnak, Christopher F L; Simoonga, Christopher; Mubita, Patricia; Kabatereine, Narcis B; Tchuem Tchuenté, Louis-Albert; Rahbek, Carsten; Kristensen, Thomas K

    2013-11-01

    The geographical ranges of most species, including many infectious disease agents and their vectors and intermediate hosts, are assumed to be constrained by climatic tolerances, mainly temperature. It has been suggested that global warming will cause an expansion of the areas potentially suitable for infectious disease transmission. However, the transmission of infectious diseases is governed by a myriad of ecological, economic, evolutionary and social factors. Hence, a deeper understanding of the total disease system (pathogens, vectors and hosts) and its drivers is important for predicting responses to climate change. Here, we combine a growing degree day model for Schistosoma mansoni with species distribution models for the intermediate host snail (Biomphalaria spp.) to investigate large-scale environmental determinants of the distribution of the African S. mansoni-Biomphalaria system and potential impacts of climatic changes. Snail species distribution models included several combinations of climatic and habitat-related predictors; the latter divided into "natural" and "human-impacted" habitat variables to measure anthropogenic influence. The predictive performance of the combined snail-parasite model was evaluated against a comprehensive compilation of historical S. mansoni parasitological survey records, and then examined for two climate change scenarios of increasing severity for 2080. Future projections indicate that while the potential S. mansoni transmission area expands, the snail ranges are more likely to contract and/or move into cooler areas in the south and east. Importantly, we also note that even though climate per se matters, the impact of humans on habitat plays a crucial role in determining the distribution of the intermediate host snails in Africa. Thus, a future contraction in the geographical range size of the intermediate host snails caused by climatic changes does not necessarily translate into a decrease or zero-sum change in human
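
    The growing degree day component mentioned above accumulates heat units above a development threshold; a generic sketch follows (the 16 °C base temperature and the synthetic temperature series are illustrative assumptions, not the parameters of the S. mansoni model):

```python
import numpy as np

def growing_degree_days(daily_mean_temp, t_base):
    """Accumulate daily heat units above the development threshold t_base."""
    return np.sum(np.maximum(0.0, daily_mean_temp - t_base))

# Synthetic year of daily mean temperatures for one grid cell (deg C)
days = np.arange(365)
temps = (24 + 6 * np.sin(2 * np.pi * days / 365)
         + np.random.default_rng(1).normal(0, 1.5, 365))

t_base = 16.0  # assumed development threshold, for illustration only
gdd = growing_degree_days(temps, t_base)
print(f"annual GDD above {t_base} C: {gdd:.0f} degree-days")
```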

  1. Global direct pressures on biodiversity by large-scale metal mining: Spatial distribution and implications for conservation.

    Science.gov (United States)

    Murguía, Diego I; Bringezu, Stefan; Schaldach, Rüdiger

    2016-09-15

    Biodiversity loss is widely recognized as a serious global environmental change process. While large-scale metal mining activities do not belong to the top drivers of such change, these operations exert or may intensify pressures on biodiversity by adversely changing habitats, directly and indirectly, at local and regional scales. So far, analyses of global spatial dynamics of mining and its burden on biodiversity have focused on the overlap between mines and protected areas or areas of high value for conservation. However, it is less clear how operating metal mines globally exert pressure on zones of different biodiversity richness; a similar gap exists for unmined but known mineral deposits. By using vascular plant diversity as a proxy to quantify overall biodiversity, this study provides a first examination of the global spatial distribution of mines and deposits for five key metals across different biodiversity zones. The results indicate that mines and deposits are not randomly distributed, but concentrated within intermediate and high diversity zones, especially for bauxite and silver. In contrast, iron, gold, and copper mines and deposits are closer to a proportional distribution while showing a high concentration in the intermediate biodiversity zone. Considering the five metals together, 63% and 61% of available mines and deposits, respectively, are located in intermediate diversity zones, comprising 52% of the global terrestrial land surface. 23% of mines and 20% of ore deposits are located in areas of high plant diversity, covering 17% of the land. 13% of mines and 19% of deposits are in areas of low plant diversity, comprising 31% of the land surface. Thus, there seems to be potential for opening new mines in areas of low biodiversity in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Social Network Analysis and Mining to Monitor and Identify Problems with Large-Scale Information and Communication Technology Interventions.

    Science.gov (United States)

    da Silva, Aleksandra do Socorro; de Brito, Silvana Rossy; Vijaykumar, Nandamudi Lankalapalli; da Rocha, Cláudio Alex Jorge; Monteiro, Maurílio de Abreu; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    The published literature reveals several arguments concerning the strategic importance of information and communication technology (ICT) interventions for developing countries where the digital divide is a challenge. Large-scale ICT interventions can be an option for countries whose regions, both urban and rural, present a high number of digitally excluded people. Our goal was to monitor and identify problems in interventions aimed at certification for a large number of participants in different geographical regions. Our case study is the training at the Telecentros.BR, a program created in Brazil to install telecenters and certify individuals to use ICT resources. We propose an approach that applies social network analysis and mining techniques to data collected from Telecentros.BR dataset and from the socioeconomics and telecommunications infrastructure indicators of the participants' municipalities. We found that (i) the analysis of interactions in different time periods reflects the objectives of each phase of training, highlighting the increased density in the phase in which participants develop and disseminate their projects; (ii) analysis according to the roles of participants (i.e., tutors or community members) reveals that the interactions were influenced by the center (or region) to which the participant belongs (that is, a community contained mainly members of the same region and always with the presence of tutors, contradicting expectations of the training project, which aimed for intense collaboration of the participants, regardless of the geographic region); (iii) the social network of participants influences the success of the training: that is, given evidence that the degree of the community member is in the highest range, the probability of this individual concluding the training is 0.689; (iv) the North region presented the lowest probability of participant certification, whereas the Northeast, which served municipalities with similar
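
    Two of the quantities discussed above, per-phase network density and a member's degree as a predictor of completing the training, are straightforward to compute with a graph library; a sketch using networkx on a made-up interaction log (the log entries and phase labels are invented):

```python
import networkx as nx

# Made-up interaction log: (participant_a, participant_b, training_phase)
interactions = [
    ("tutor1", "member1", 1), ("tutor1", "member2", 1),
    ("member1", "member2", 2), ("tutor1", "member3", 2),
    ("member1", "member3", 2), ("member2", "member3", 2),
]

# Density per training phase (higher density = more collaboration)
for phase in (1, 2):
    g = nx.Graph((a, b) for a, b, p in interactions if p == phase)
    print(f"phase {phase}: density = {nx.density(g):.2f}")

# Degree of each participant in the full network, highest first
g_all = nx.Graph((a, b) for a, b, _ in interactions)
for node, deg in sorted(g_all.degree, key=lambda kv: -kv[1]):
    print(node, deg)
```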

  3. Integrating SMOS brightness temperatures with a new conceptual spatially distributed hydrological model for improving flood and drought predictions at large scale.

    Science.gov (United States)

    Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick

    2017-04-01

    Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large-scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically-based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts increase significantly. As a matter of fact, a novel thematic science question to be investigated is whether a flexible conceptual model can match the performance of a complex physically-based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large-scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational efforts, this model enables early warnings for large areas. Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model

  4. Occurrence, distribution and bioaccumulation behaviour of hydrophobic organic contaminants in a large-scale constructed wetland in Singapore.

    Science.gov (United States)

    Wang, Qian; Kelly, Barry C

    2017-09-01

    This study involved a field-based investigation to assess the occurrence, distribution and bioaccumulation behaviour of hydrophobic organic contaminants in a large-scale constructed wetland. Samples of raw leachate, water and wetland plants, Typha angustifolia, were collected for chemical analysis. Target contaminants included polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs), as well as several halogenated flame retardants (HFRs) and personal care products (triclosan and synthetic musks). In addition to PCBs and OCPs, synthetic musks, triclosan (TCS) and dechlorane plus stereoisomers (syn- and anti-DPs) were frequently detected. Root concentration factors (log RCF, L/kg wet weight) of the various contaminants ranged between 3.0 and 7.9. Leaf concentration factors (log LCF, L/kg wet weight) ranged between 2.4 and 8.2. syn- and anti-DPs exhibited the greatest RCF and LCF values. A strong linear relationship was observed between log RCF and the octanol-water partition coefficient (log KOW). Translocation factors (log TFs) were negatively correlated with log KOW. The results demonstrate that more hydrophobic compounds exhibit higher degrees of partitioning into plant roots and are less effectively transported from roots to plant leaves. Methyl triclosan (MTCS) and 2,8-dichlorodibenzo-p-dioxin (DCDD), both TCS degradation products, exhibited relatively high concentrations in roots and leaves, highlighting the importance of degradation/biotransformation. The results further suggest that Typha angustifolia in this constructed wetland can aid the removal of hydrophobic organic contaminants present in this landfill leachate. The findings will aid future investigations regarding the fate and bioaccumulation of hydrophobic organic contaminants in constructed wetlands. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Implementation of Pilot Protection System for Large Scale Distribution System like The Future Renewable Electric Energy Distribution Management Project

    Science.gov (United States)

    Iigaya, Kiyohito

    A robust, fast and accurate protection system based on the pilot protection concept was developed previously; a few alterations were made to that algorithm to make it faster and more reliable, and it was then applied to smart distribution grids to verify the results. The new 10-sample window method was adapted into the pilot protection program and its performance for the test-bed system operation was tabulated. Following that, the hardware results for the same algorithm were compared with the simulation results. The development of the dual-slope percentage differential method, its comparison with the 10-sample average window pilot protection system, and the effects of CT saturation on the pilot protection system are also shown in this thesis. The 10-sample average window pilot protection system was implemented on multiple distribution grids such as Green Hub v4.3, IEEE 34, the LSSS loop and a modified LSSS loop. Case studies of these multi-terminal models are presented, and the results are also shown in this thesis. The results show that the new algorithm successfully identifies faults on the test bed, that the hardware and software simulation results match, and that the response time is less than approximately a quarter of a cycle, which is fast compared to present commercial protection systems and satisfies the FREEDM system requirement.
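
    A generic sketch of the dual-slope percentage differential characteristic named above: the element operates when the differential current exceeds a restraint-dependent threshold, with a steeper second slope at high through-current for security against CT saturation. The pickup, slopes, and breakpoint are illustrative settings, not those used in the thesis:

```python
def percentage_differential_trip(i_terminals, pickup=0.2, slope1=0.3,
                                 slope2=0.7, breakpoint=2.0):
    """Dual-slope percentage differential check (all currents in per-unit).

    i_terminals: signed currents measured flowing into the protected zone.
    """
    i_diff = abs(sum(i_terminals))                    # operating quantity
    i_rest = sum(abs(i) for i in i_terminals) / 2.0   # restraint quantity

    if i_rest <= breakpoint:
        threshold = max(pickup, slope1 * i_rest)
    else:
        # Steeper second slope above the breakpoint for CT-saturation security
        threshold = slope1 * breakpoint + slope2 * (i_rest - breakpoint)
    return i_diff > threshold

# Through-fault: current in equals current out -> stays secure
print(percentage_differential_trip([5.0, -4.9]))   # False (small diff, big restraint)
# Internal fault: both terminals feed the fault -> trips
print(percentage_differential_trip([2.0, 1.5]))    # True
```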

  6. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  7. Novel probabilistic and distributed algorithms for guidance, control, and nonlinear estimation of large-scale multi-agent systems

    Science.gov (United States)

    Bandyopadhyay, Saptarshi

    Multi-agent systems are widely used for constructing a desired formation shape, exploring an area, surveillance, coverage, and other cooperative tasks. This dissertation introduces novel algorithms in the three main areas of shape formation, distributed estimation, and attitude control of large-scale multi-agent systems. In the first part of this dissertation, we address the problem of shape formation for thousands to millions of agents. Here, we present two novel algorithms for guiding a large-scale swarm of robotic systems into a desired formation shape in a distributed and scalable manner. These probabilistic swarm guidance algorithms adopt an Eulerian framework, where the physical space is partitioned into bins and the swarm's density distribution over each bin is controlled using tunable Markov chains. In the first algorithm - Probabilistic Swarm Guidance using Inhomogeneous Markov Chains (PSG-IMC) - each agent determines its bin transition probabilities using a time-inhomogeneous Markov chain that is constructed in real-time using feedback from the current swarm distribution. This PSG-IMC algorithm minimizes the expected cost of the transitions required to achieve and maintain the desired formation shape, even when agents are added to or removed from the swarm. The algorithm scales well with a large number of agents and complex formation shapes, and can also be adapted for area exploration applications. In the second algorithm - Probabilistic Swarm Guidance using Optimal Transport (PSG-OT) - each agent determines its bin transition probabilities by solving an optimal transport problem, which is recast as a linear program. In the presence of perfect feedback of the current swarm distribution, this algorithm minimizes the given cost function, guarantees faster convergence, reduces the number of transitions for achieving the desired formation, and is robust to disturbances or damages to the formation. We demonstrate the effectiveness of these two proposed swarm
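
    The Eulerian bin-transition idea can be sketched compactly: construct a Markov matrix whose stationary distribution equals the desired swarm density and let every agent sample its next bin independently. The Metropolis-Hastings construction on a ring of bins below is one standard way to obtain such a matrix and is not necessarily the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(7)
n_bins = 5
target = np.array([0.05, 0.15, 0.4, 0.3, 0.1])  # desired swarm density

# Metropolis-Hastings chain on a ring of bins: propose a neighbor, accept
# with min(1, target[j]/target[i]); its stationary distribution is `target`.
P = np.zeros((n_bins, n_bins))
for i in range(n_bins):
    for j in ((i - 1) % n_bins, (i + 1) % n_bins):
        P[i, j] = 0.5 * min(1.0, target[j] / target[i])
    P[i, i] = 1.0 - P[i].sum()

# Each agent transitions independently using the row of its current bin
agents = rng.integers(0, n_bins, size=2000)   # arbitrary initial bins
for _ in range(100):
    agents = np.array([rng.choice(n_bins, p=P[b]) for b in agents])

achieved = np.bincount(agents, minlength=n_bins) / agents.size
print("achieved:", achieved.round(3))
print("target:  ", target)
```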

  8. Which spatial discretization for distributed hydrological models? Proposition of a methodology and illustration for medium to large-scale catchments

    Directory of Open Access Journals (Sweden)

    J. Dehotin

    2008-05-01

    ... modelling units (the third level of discretization).

    The first part of the paper presents a review of catchment discretization in hydrological models, from which we derived the principles of our general methodology. The second part of the paper focuses on the derivation of hydro-landscape units for medium to large scale catchments. For this sub-catchment discretization, we propose the use of principles borrowed from landscape classification. These principles are independent of the catchment size. They allow retaining the suitable features required in the catchment description in order to fulfil a specific modelling objective. The method leads to unstructured and homogeneous areas within the sub-catchments, which can be used to derive modelling meshes. It avoids map smoothing, which would suppress the smallest units whose role can be very important in hydrology, and provides a confidence map (the distance map for the classification). The confidence map can be used for further uncertainty analysis of modelling results. The final discretization remains consistent with the resolution of the input data and that of the source maps. The last part of the paper illustrates the method using available data for the upper Saône catchment in France. The interest of the method for an efficient representation of landscape heterogeneity is illustrated by a comparison with more traditional mapping approaches. Examples of possible models, which can be built on this spatial discretization, are finally given as perspectives for the work.

  9. The Design and Implementation of Smart Monitoring System for Large-Scale Railway Maintenance Equipment Cab Based on ZigBee Wireless Sensor Network

    OpenAIRE

    Hairui Wang; Junfu Yu

    2014-01-01

    In recent years, organizations use IEEE 802.15.4 and ZigBee technology to deliver solutions in a variety of areas, including home environment monitoring. ZigBee technology has the advantages of low cost, low power consumption and self-forming networks. With the rapid expansion of the Internet, there is a requirement for remote monitoring of large-scale railway maintenance equipment cabs. This paper discusses the disadvantages of the existing smart monitoring system, and proposes a solution. A ZigBee wireless senso...

  10. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state-of-the-art survey of currently available electrolysis modules was made. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)

  11. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Chase Qishi [New Jersey Inst. of Technology, Newark, NJ (United States); Univ. of Memphis, TN (United States); Zhu, Michelle Mengxia [Southern Illinois Univ., Carbondale, IL (United States)

    2016-06-06

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific

  12. The Design and Implementation of Smart Monitoring System for Large-Scale Railway Maintenance Equipment Cab Based on ZigBee Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Hairui Wang

    2014-06-01

    Full Text Available In recent years, organizations use IEEE 802.15.4 and ZigBee technology to deliver solutions in a variety of areas, including home environment monitoring. ZigBee technology has the advantages of low cost, low power consumption and self-forming networks. With the rapid expansion of the Internet, there is a requirement for remote monitoring of large-scale railway maintenance equipment cabs. This paper discusses the disadvantages of the existing smart monitoring system, and proposes a solution. The ZigBee wireless sensor network smart monitoring system and a Wi-Fi network are integrated through a home gateway to increase system flexibility. At the same time, the home gateway, cooperating with a pre-processing system, provides a flexible user interface and ensures the security and safety of the smart monitoring system. To verify the efficiency of the proposed system, temperature, humidity and light sensors were developed and evaluated in the smart monitoring system.

  13. Some Examples of Residence-Time Distribution Studies in Large-Scale Chemical Processes by Using Radiotracer Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bullock, R. M.; Johnson, P.; Whiston, J. [Imperial Chemical Industries Ltd., Billingham, Co., Durham (United Kingdom)

    1967-06-15

    The application of radiotracers to determine flow patterns in chemical processes is discussed with particular reference to the derivation of design data from model reactors for translation to large-scale units, the study of operating efficiency and design attainment in established plant, and the rapid identification of various types of process malfunction. The requirements governing the selection of tracers for various types of media are considered, and an example is given of the testing of the behaviour of a typical tracer before use in a particular large-scale process operating at 250 atm and 200°C. Information which may be derived from flow patterns is discussed, including the determination of mixing parameters, gas hold-up in gas/liquid reactions and the detection of channelling and stagnant regions. Practical results and their interpretation are given in relation to an olefine hydroformylation reaction system, a process for the conversion of propylene to isopropanol, a moving-bed catalyst system for the isomerization of xylenes and a three-stage gas-liquid reaction system. The use of mean residence-time data for the detection of leakage between reaction vessels and a heat interchanger system is given as an example of the identification of process malfunction. (author)
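
    Mean residence time of the kind used above is the first moment of the tracer response curve; a sketch on an invented pulse response (trapezoidal integration, arbitrary units):

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration (kept explicit to avoid NumPy version quirks)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Synthetic tracer response C(t) at a vessel outlet (arbitrary units)
t = np.linspace(0.0, 80.0, 161)          # minutes
c = (t / 8.0) * np.exp(-t / 8.0)         # an invented pulse response

area = trapz(c, t)
t_mean = trapz(t * c, t) / area          # first moment: mean residence time
var = trapz((t - t_mean) ** 2 * c, t) / area   # spread of the RTD

print(f"mean residence time = {t_mean:.1f} min")
print(f"RTD variance = {var:.1f} min^2")   # large spread can hint at channelling
```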

  14. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between the large scale structure formation and the baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h^-1 Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale of 120 h^-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  15. Deformation integrity monitoring for GNSS positioning services including local, regional and large scale hazard monitoring - the Karlsruhe approach and software(MONIKA)

    Science.gov (United States)

    Jaeger, R.

    2007-05-01

    GNSS positioning services like SAPOS/ascos in Germany and many others in Europe, America and worldwide are rapidly adopted for interdisciplinary, country-wide precise geo-referencing, replacing traditional low-order geodetic networks. It therefore becomes necessary that possible changes of the reference stations' coordinates are detected immediately. The GNSS reference-station MONitoring by the KArlsruhe approach and software (MONIKA) is designed for that task. The developments at Karlsruhe University of Applied Sciences in cooperation with the State Survey of Baden-Württemberg are further motivated by the 2006 official resolution of the German state survey departments' association (Arbeitsgemeinschaft der Vermessungsverwaltungen Deutschland (AdV)) on coordinate monitoring as a quality-control duty of GNSS positioning service providers. The presented approach can, besides the coordinate control of GNSS positioning services, also be used to set up any GNSS service for the tasks of an area-wide geodynamical and natural disaster-prevention service. The mathematical model of the approach, which enables a multivariate and multi-epochal design, is based on the input of the RINEX data of the GNSS service, followed by fully automatic processing of baselines and/or sessions, and a near-online setting up of epoch-state vectors and their covariance matrices in a rigorous 3D network adjustment. For large-scale and long-term monitoring situations, standard geodynamical trends (datum drift, plate movements, etc.) are accordingly considered and included in the mathematical model of MONIKA. The coordinate-based deformation monitoring, as the third step of the stepwise adjustments, is based on the above epoch-state vectors and, after splitting off geodynamic trends, on multivariate and multi-epochal congruency testing. As far as no other information exists, all points are assumed as being stable and congruent reference

  16. Report on a workshop on the application of thermoluminescence dosimetry to large scale individual monitoring, Ispra, 11-13 September 1985

    International Nuclear Information System (INIS)

    Barthe, J.R.; Christensen, P.; Driscoll, C.M.H.; Marshall, T.O.; Harvey, J.R.; Julius, H.W.; Marshall, M.; Oberhoffer, M.

    1987-01-01

    The workshop was held for the benefit of those actually involved in the operation of large scale automatic thermoluminescence dosimetry systems. It was organised around three overall themes subdivided into 13 subject headings: External constraints: User requirements, Quantities and Units, Legal requirements, Testing, Intercomparisons; Dosimetry systems: Materials, Dosemeter design, Reading systems, Annealing procedures, Rogue readings; Management: Distribution and organisation, Reporting and record keeping, Financial aspects. (author)

  17. Holistic information evaluation of divergence of soil's properties by using of legacy data of large scale monitoring surveys

    Science.gov (United States)

    Mikheeva, Irina

    2017-04-01

    Identifying tendencies in the transformation of soils is very important for an adequate ecological and economic assessment of soil degradation. However, monitoring the condition of soils and other natural objects raises a number of important methodological questions, including the probabilistic and statistical analysis of accumulated legacy data and their use for the verification of quantitative estimates of natural processes. Owing to considerable natural variability, a reliable assessment of contemporary soil evolution under the influence of environmental management and climate change is problematic. When studying the dynamics of soil quality it is necessary to consider the soil as an open complex system with parameters that vary significantly in space. The analysis of the probability distributions of the attributes of the studied system is informative for characterizing the holistic state and behavior of the system. We therefore earlier proposed a method for evaluating alterations of soils through changes in the probability density functions (pdf) of their properties and their statistical entropy. The analysis of pdf dynamics showed that opposite tendencies, towards both decrease and increase of a property, can often appear at the same time. To give an adequate quantitative evaluation of changes in soil properties, however, it is necessary to characterize them as a whole. We propose that it is reasonable to name the processes of modern change in soil properties, relative to their starting state, by the term "divergence" and to investigate it quantitatively. For this purpose we suggest using the value of information divergence, which is defined as a measure of the difference between the pdfs of compared objects or at different times. As a measure of dissimilarity, divergence should satisfy some conditions, the most important being the scale-invariance property. We used information divergence to evaluate differences between soils according to the heterogeneity of soil-formation factors and with the course of natural and
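
    Information divergence of the kind invoked above is commonly computed as the Kullback-Leibler divergence between two probability density functions; the sketch below estimates it from two samples via histograms (the binning and the synthetic before/after samples are illustrative, not soil data):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(P||Q) between two histograms (in nats)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(3)
# Synthetic soil property (e.g., humus content, %) before and after land use
before = rng.normal(4.0, 0.8, 2000)
after = rng.normal(3.4, 1.1, 2000)

bins = np.linspace(0, 8, 41)           # common support for both pdfs
p, _ = np.histogram(before, bins=bins)
q, _ = np.histogram(after, bins=bins)

print(f"D(before||after) = {kl_divergence(p, q):.3f} nats")
```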

  18. Large-scale monitoring of effects of clothianidin dressed oilseed rape seeds on pollinating insects in Northern Germany: implementation of the monitoring project and its representativeness.

    Science.gov (United States)

    Heimbach, Fred; Russ, Anja; Schimmer, Maren; Born, Katrin

    2016-11-01

    Monitoring studies at the landscape level are complex, expensive and difficult to conduct. Many aspects have to be considered to avoid confounding effects, which is probably the reason why they are not regularly performed in the context of risk assessments of plant protection products to pollinating insects. However, if conducted appropriately their contribution is most valuable. In this paper we identify the requirements of a large-scale monitoring study for the assessment of side-effects of clothianidin seed-treated winter oilseed rape on three species of pollinating insects (Apis mellifera, Bombus terrestris and Osmia bicornis) and present how these requirements were implemented. Two circular study sites were delineated next to each other in northeast Germany, each comprising almost 65 km². At the reference site, study fields were drilled with clothianidin-free oilseed rape seeds, while at the test site the oilseed rape seeds were coated with 10 g clothianidin and 2 g beta-cyfluthrin per kg of seeds (Elado®). The comparison of environmental conditions at the study sites indicated that they are as similar as possible in terms of climate, soil, land use, history and current practice of agriculture, as well as in the availability of oilseed rape and non-crop bee forage. Accordingly, local environmental conditions were considered not to have had any confounding effect on the results of the monitoring of the bee species. Furthermore, the study area was found to be representative of other oilseed rape cultivation regions in Europe.

  19. Cost-Effective Large-Scale Occupancy-Abundance Monitoring of Invasive Brushtail Possums (Trichosurus vulpecula) on New Zealand's Public Conservation Land.

    Directory of Open Access Journals (Sweden)

    Andrew M Gormley

    Full Text Available There is interest in large-scale and unbiased monitoring of biodiversity status and trend, but there are few published examples of such monitoring being implemented. The New Zealand Department of Conservation is implementing a monitoring program that involves sampling selected biota at the vertices of an 8-km grid superimposed over the 8.6 million hectares of public conservation land that it manages. The introduced brushtail possum (Trichosurus vulpecula) is a major threat to some biota and is one taxon that they wish to monitor and report on. A pilot study revealed that the traditional method of monitoring possums using leg-hold traps set for two nights, termed the Trap Catch Index, was a constraint on the cost and logistical feasibility of the monitoring program. A phased implementation of the monitoring program was therefore conducted to collect data for evaluating the trade-off between possum occupancy-abundance estimates and the costs of sampling for one night rather than two nights. Reducing trapping effort from two nights to one night along four trap-lines reduced the estimated costs of monitoring by 5.8% due to savings in labour, food and allowances; it had a negligible effect on estimated national possum occupancy but resulted in slightly higher and less precise estimates of relative possum abundance. Monitoring possums for one night rather than two nights would provide an annual saving of NZ$72,400, with 271 fewer field days required for sampling. Possums occupied 60% (95% credible interval: 53-68%) of sampling locations on New Zealand's public conservation land, with a mean relative abundance (Trap Catch Index) of 2.7% (2.0-3.5%). Possum occupancy and abundance were higher in forest than in non-forest habitats. Our case study illustrates the need to evaluate relationships between sampling design, cost, and occupancy-abundance estimates when designing and implementing large-scale occupancy-abundance monitoring programs.

  20. Spatiotemporal complexity of 2-D rupture nucleation process observed by direct monitoring during large-scale biaxial rock friction experiments

    Science.gov (United States)

    Fukuyama, Eiichi; Tsuchida, Kotoyo; Kawakata, Hironori; Yamashita, Futoshi; Mizoguchi, Kazuo; Xu, Shiqing

    2018-05-01

    We were able to successfully capture rupture nucleation processes on a 2-D fault surface during large-scale biaxial friction experiments using metagabbro rock specimens. Several rupture nucleation patterns were detected by a strain gauge array embedded inside the rock specimens as well as by one installed along the edge walls of the fault. In most cases, unstable rupture started just after the rupture front touched both ends of the rock specimen (i.e., when the rupture front extended across the entire width of the fault). In some cases, rupture initiated at multiple locations and the rupture fronts coalesced to generate unstable ruptures, which could only be detected from observations inside the rock specimen. Therefore, the 2-D nucleation process of the rupture needs to be examined carefully, especially when analyzing data measured only outside the rock specimen. At a minimum, measurements should be made on both sides of the fault to identify asymmetric rupture propagation on the fault surface, although even this does not fully resolve the process. In the present experiments, we observed three typical types of 2-D rupture propagation patterns, two of which initiated at a single location either close to the fault edge or inside the fault. This initiation could be accelerated by the free-surface effect at the fault edge. The third type initiated at multiple locations and involved rupture coalescence in the middle of the fault. These geometrically complicated rupture initiation patterns are important for understanding the earthquake nucleation process in nature.

  1. Reactor power distribution monitor

    International Nuclear Information System (INIS)

    Hoizumi, Atsushi.

    1986-01-01

    Purpose: To grasp the margin to the limit value of the power distribution peaking factor inside an operating reactor by using the reactor power distribution monitor. Constitution: The monitor is composed of a 'constant' file (storing in-reactor power distributions obtained from analysis), TIP and thermocouple instrumentation, a lateral power distribution calibrating apparatus, an axial power distribution synthesizer and a peaking factor synthesizer. The lateral power distribution calibrating apparatus performs calibration by comparing the power distribution obtained from the thermocouples with the power distribution obtained from the TIP, and then provides the lateral power distribution peaking factors. The axial power distribution synthesizer provides the axial power distribution peaking factors in accordance with the signals from the ex-core neutron flux detector. These axial and lateral power peaking factors are synthesized with high precision in a three-dimensional format and can be monitored at any time. (Kamimura, M.)

  2. Large-scale monitoring of effects of clothianidin-dressed oilseed rape seeds on pollinating insects in northern Germany: residues of clothianidin in pollen, nectar and honey

    OpenAIRE

    Rolke, Daniel; Persigehl, Markus; Peters, Britta; Sterk, Guido; Blenau, Wolfgang

    2016-01-01

    This study was part of a large-scale monitoring project to assess the possible effects of Elado® (10 g clothianidin & 2 g β-cyfluthrin/kg seed)-dressed oilseed rape seeds on different pollinators in Northern Germany. Firstly, residues of clothianidin and its active metabolites thiazolylnitroguanidine and thiazolylmethylurea were measured in nectar and pollen from Elado®-dressed (test site, T) and undressed (reference site, R) oilseed rape collected by honey bees confined within tunnel tents. ...

  3. Low-cost Photoacoustic-based Measurement System for Carbon Dioxide Fluxes with the Potential for large-scale Monitoring

    Science.gov (United States)

    Scholz, L. T.; Bierer, B.; Ortiz Perez, A.; Woellenstein, J.; Sachs, T.; Palzer, S.

    2016-12-01

    The determination of carbon dioxide (CO2) fluxes between ecosystems and the atmosphere is crucial for understanding ecological processes on regional and global scales. High-quality data sets with full uncertainty estimates are needed to evaluate model simulations. However, current flux monitoring techniques are unsuitable for providing reliable data over a large area at both a detailed level and an appropriate resolution, ideally in combination with a high sampling rate. Currently used sensing technologies, such as non-dispersive infrared (NDIR) gas analyzers, cannot be deployed in large numbers to provide high spatial resolution due to their costs and complex maintenance requirements. Here, we propose a novel CO2 measurement system whose gas sensing unit is made up of low-cost, low-power components only, such as an IR-LED and a photoacoustic detector. The sensor response time is just a few seconds. Since the sensor can be applied in situ without special precautions, it allows for environmental monitoring in a non-invasive way. Its low energy consumption enables long-term measurements. The low overall costs favor manufacturing in large quantities. This allows the operation of multiple sensors at a reasonable price and thus provides concentration measurements at any desired spatial coverage and at high temporal resolution. With an appropriate 3D configuration of the units, vertical and horizontal fluxes can be determined. By applying a closely meshed wireless sensor network, inhomogeneities as well as CO2 sources and sinks in the lower atmosphere can be monitored. In combination with sensors for temperature, pressure and humidity, our sensor paves the way towards reliable and extensive monitoring of ecosystem-atmosphere exchange rates. The technique can also be easily adapted to other relevant greenhouse gases.

  4. Use of two population metrics clarifies biodiversity dynamics in large-scale monitoring: the case of trees in Japanese old-growth forests.

    Science.gov (United States)

    Ogawa, Mifuyu; Yamaura, Yuichi; Abe, Shin; Hoshino, Daisuke; Hoshizaki, Kazuhiko; Iida, Shigeo; Katsuki, Toshio; Masaki, Takashi; Niiyama, Kaoru; Saito, Satoshi; Sakai, Takeshi; Sugita, Hisashi; Tanouchi, Hiroyuki; Amano, Tatsuya; Taki, Hisatomo; Okabe, Kimiko

    2011-07-01

    Many indicators/indices provide information on whether the 2010 biodiversity target of reducing declines in biodiversity has been achieved. The strengths and limitations of the various measures used to assess progress towards this target are now being discussed. Biodiversity dynamics are often evaluated by a single biological population metric, such as the abundance of each species. Here we examined tree population dynamics of 52 families (192 species) at 11 research sites (three vegetation zones) in Japanese old-growth forests using two population metrics: number of stems and basal area. We calculated indices that track the rate of change across all tree species by taking the geometric mean of changes in population metrics between the 1990s and the 2000s at the national level and at the levels of vegetation zone and family. We specifically focused on whether indices based on these two metrics behaved similarly. The indices showed that (1) the number of stems declined, whereas basal area did not change at the national level, and (2) the degree of change in the indices varied by vegetation zone and family. These results suggest that Japanese old-growth forests have not degraded and may even be developing in some vegetation zones, and indicate that the use of a single population metric (or indicator/index) may be insufficient to understand the state of biodiversity precisely. It is therefore important to incorporate more metrics into monitoring schemes to avoid the risk of misunderstanding or misrepresenting biodiversity dynamics.

  5. Distributed hierarchical radiation monitoring system

    International Nuclear Information System (INIS)

    Barak, D.

    1985-01-01

    A solution to the problem of monitoring the radiation levels in and around a nuclear facility is presented in this paper. This is a special case of a large-scale general-purpose data acquisition system with high reliability, high availability and short maintenance time. The physical layout of the detectors in the plant and the strict control demands dictated a distributed and hierarchical system. The system comprises three levels, each containing modules. Level one contains the Control modules, which collect data from groups of detectors and execute local emergency control tasks. Level two contains the Group controllers, which concentrate data from the Control modules and enable local display and communication. The system computer is at level three, enabling the plant operator to receive information from the detectors and execute control tasks. The described system was built and has been operating successfully for about two years. (author)
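    As an illustration of the architecture described above, a minimal sketch of the three-level hierarchy in Python; the class names, alarm threshold and polling logic are assumptions for illustration, not details from the paper:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlModule:                       # level 1: reads a detector group
    detectors: List[float] = field(default_factory=list)
    alarm_threshold: float = 50.0          # hypothetical limit, uSv/h

    def poll(self):
        readings = list(self.detectors)    # stand-in for hardware I/O
        if any(r > self.alarm_threshold for r in readings):
            self.local_trip()              # emergency action stays local
        return readings

    def local_trip(self):
        print("level-1 emergency control task executed")

@dataclass
class GroupController:                     # level 2: concentrates module data
    modules: List[ControlModule]

    def collect(self):
        return [r for m in self.modules for r in m.poll()]

class SystemComputer:                      # level 3: plant-wide view
    def __init__(self, groups):
        self.groups = groups

    def snapshot(self):
        return {i: g.collect() for i, g in enumerate(self.groups)}

plant = SystemComputer([GroupController([ControlModule([12.0, 55.0])])])
print(plant.snapshot())
```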

  6. Operational Analysis of Distribution Systems Featuring Large-scale Variable RES: Contributions of Energy Storage Systems and Switchable Capacitor Banks

    OpenAIRE

    Mário Pascoal Santos Pereira

    2017-01-01

    In the last decade, the level of variable renewable energy sources (RESs) integrated in distribution network systems has been continuously growing. This adds more uncertainty to these systems, which also face many traditional sources of uncertainty as well as those pertaining to other emerging technologies such as demand response and electric vehicles. As a result, distribution system operators are finding it increasingly difficult to maintain an optimal operation of such network systems. These ch...

  7. High performance architecture design for large scale fibre-optic sensor arrays using distributed EDFAs and hybrid TDM/DWDM

    Science.gov (United States)

    Liao, Yi; Austin, Ed; Nash, Philip J.; Kingsley, Stuart A.; Richardson, David J.

    2013-09-01

    A distributed amplified dense wavelength division multiplexing (DWDM) array architecture is presented for interferometric fibre-optic sensor array systems. This architecture employs a distributed erbium-doped fibre amplifier (EDFA) scheme to decrease the array insertion loss, and employs time division multiplexing (TDM) at each wavelength to increase the number of sensors that can be supported. The first experimental demonstration of this system is reported including results which show the potential for multiplexing and interrogating up to 4096 sensors using a single telemetry fibre pair with good system performance. The number can be increased to 8192 by using dual pump sources.

  8. An Efficient Audio Coding Scheme for Quantitative and Qualitative Large Scale Acoustic Monitoring Using the Sensor Grid Approach

    Directory of Open Access Journals (Sweden)

    Félix Gontier

    2017-11-01

    Full Text Available The spreading of urban areas and the growth of human population worldwide raise societal and environmental concerns. To better address these concerns, the monitoring of the acoustic environment in urban as well as rural or wilderness areas is an important matter. Building on the recent development of low-cost hardware acoustic sensors, we propose in this paper to consider a sensor grid approach to tackle this issue. In this kind of approach, the crucial question is the nature of the data that are transmitted from the sensors to the processing and archival servers. To this end, we propose an efficient audio coding scheme based on a third-octave band spectral representation that allows: (1) the estimation of standard acoustic indicators; and (2) the recognition of acoustic events at state-of-the-art performance rates. The former is useful to provide quantitative information about the acoustic environment, while the latter is useful to gather qualitative information and build perceptually motivated indicators, using for example the emergence of a given sound source. The coding scheme is also demonstrated to transmit spectrally encoded data that, reverted to the time domain using state-of-the-art techniques, are not intelligible, thus protecting the privacy of citizens.
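    A minimal sketch of this kind of third-octave band encoding in Python; the frame length, hop size and band set are illustrative assumptions rather than the paper's exact parameters:

```python
import numpy as np

def third_octave_levels(x, fs, frame_len=4096, hop=2048):
    """Encode a mono signal as per-frame third-octave band levels (dB).

    Only band energies are kept; phase and fine spectral detail are
    discarded, which is what makes the encoded stream unintelligible
    when naively inverted, while acoustic indicators (e.g. per-band
    equivalent levels) remain computable.
    """
    # Nominal third-octave centre frequencies (IEC 61260), 20 Hz - 12.5 kHz
    centres = np.array([20, 25, 31.5, 40, 50, 63, 80, 100, 125, 160,
                        200, 250, 315, 400, 500, 630, 800, 1000, 1250,
                        1600, 2000, 2500, 3150, 4000, 5000, 6300, 8000,
                        10000, 12500], dtype=float)
    lo = centres / 2 ** (1 / 6)          # lower band edge: fc / 2^(1/6)
    hi = centres * 2 ** (1 / 6)          # upper band edge: fc * 2^(1/6)
    freqs = np.fft.rfftfreq(frame_len, 1 / fs)
    window = np.hanning(frame_len)
    levels = []
    for start in range(0, len(x) - frame_len + 1, hop):
        spec = np.abs(np.fft.rfft(window * x[start:start + frame_len])) ** 2
        bands = [spec[(freqs >= l) & (freqs < h)].sum() for l, h in zip(lo, hi)]
        levels.append(10 * np.log10(np.maximum(bands, 1e-12)))
    return np.array(levels)              # shape: (n_frames, n_bands)

# Illustrative use on a synthetic signal
fs = 32000
t = np.arange(fs * 2) / fs
x = np.sin(2 * np.pi * 1000 * t) + 0.1 * np.random.randn(len(t))
print(third_octave_levels(x, fs).shape)
```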

  9. Monitoring Utilization of a Large Scale Addiction Treatment System: The Drug and Alcohol Treatment Information System (DATIS).

    Directory of Open Access Journals (Sweden)

    Nooshin Khobzi Rotondi

    2012-01-01

    Full Text Available Client-based information systems can yield data to address issues of system accountability and planning, and contribute information related to changing patterns of substance use in treatment populations and, indirectly, general populations. The Drug and Alcohol Treatment Information System (DATIS) monitors the number and types of clients treated in approximately 170 publicly funded addiction treatment agencies in Ontario. The purpose of this study was to estimate the caseload of addiction treatment agencies and to describe important characteristics of clients, their patterns of service utilization and trends over time from 2005 to 2010. In 2009-2010, 47,065 individuals were admitted to treatment. Since 2005-2006, there has been an increase in adolescents/youth in treatment, and a decrease in the male-female gender ratio. Alcohol problems predominated, but an increasing proportion of clients used cannabis and prescription opioids. DATIS is an evolving system and an integral component of Ontario's performance measurement system. Linkages with healthcare information systems will allow for longitudinal tracking of client health-related outcomes.

  10. Distributed intelligent urban environment monitoring system

    Science.gov (United States)

    Du, Jinsong; Wang, Wei; Gao, Jie; Cong, Rigang

    2018-02-01

    Current environmental pollution and destruction have developed into a major worldwide social problem that threatens human survival and development. Environmental monitoring is the prerequisite and basis of environmental governance but, overall, current environmental monitoring systems face a series of problems. Based on electrochemical sensors, this paper designs a small, low-cost, easy-to-deploy urban environmental quality monitoring terminal; multiple terminals constitute a distributed network. The system has been used in small-scale demonstration applications, which confirmed that it is suitable for large-scale deployment.

  11. The application of two-step linear temperature program to thermal analysis for monitoring the lipid induction of Nostoc sp. KNUA003 in large scale cultivation.

    Science.gov (United States)

    Kang, Bongmun; Yoon, Ho-Sung

    2015-02-01

    Recently, microalgae have been considered as a renewable source for fuel production because their production is nonseasonal and may take place on nonarable land. Despite all of these advantages, microalgal oil production is significantly affected by environmental factors. Furthermore, large variability remains an important problem in the measurement of algae productivity and in compositional analysis, especially of the total lipid content. Thus, there is considerable interest in the accurate determination of total lipid content during the biotechnological process. For this reason, various high-throughput technologies have been suggested for the accurate measurement of total lipids contained in microorganisms, especially oleaginous microalgae. In addition, more advanced technologies have been employed to quantify the total lipids of microalgae without pretreatment. However, these methods have difficulty measuring the total lipid content of wet-form microalgae obtained from large-scale production. In the present study, thermal analysis performed with a two-step linear temperature program was applied to measure the heat evolved in the temperature range from 310 to 351 °C for Nostoc sp. KNUA003 obtained from large-scale cultivation. We then examined the relationship between the heat evolved in 310-351 °C (HE) and the total lipid content of wet Nostoc cells cultivated in a raceway. As a result, a linear relationship was determined between the HE value and the total lipid content of Nostoc sp. KNUA003; in particular, the linear relationship between the HE value and the total lipid content of the tested microorganism reached 98%. Based on this relationship, the total lipid content converted from the heat evolved of wet Nostoc sp. KNUA003 can be used for monitoring its lipid induction in large-scale cultivation. Copyright © 2014 Elsevier Inc. All rights reserved.
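    The calibration described here amounts to fitting a straight line between the measured heat evolved and reference lipid measurements, then inverting it for routine monitoring. A sketch in Python with hypothetical calibration values (not the study's data):

```python
import numpy as np

# Hypothetical calibration data: heat evolved in the 310-351 C window
# (HE, in J/g) versus total lipid content (% dry weight) from a
# reference method. Values are illustrative, not from the paper.
he = np.array([41.0, 55.2, 63.8, 72.5, 88.1, 95.4])
lipid = np.array([8.2, 11.0, 12.9, 14.6, 17.8, 19.3])

slope, intercept = np.polyfit(he, lipid, 1)     # least-squares line
pred = slope * he + intercept
r2 = 1 - np.sum((lipid - pred) ** 2) / np.sum((lipid - lipid.mean()) ** 2)
print(f"lipid % = {slope:.3f} * HE + {intercept:.2f}  (R^2 = {r2:.3f})")

# Once calibrated, the line converts HE from routine thermal analysis of
# wet biomass into a lipid estimate for monitoring large-scale cultures.
new_he = 70.0
print(f"estimated lipid content: {slope * new_he + intercept:.1f} %")
```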

  12. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    Science.gov (United States)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface and its functions are examined, and the system architecture, components and infrastructure are addressed. The present status of the system and related future activities are also examined.

  13. Large-scale integration of renewable and distributed generation of electricity in Spain: Current situation and future needs

    International Nuclear Information System (INIS)

    Cossent, Rafael; Gómez, Tomás; Olmos, Luis

    2011-01-01

    Similar to other European countries, mechanisms for the promotion of electricity generation from renewable energy sources (RESs) and combined heat and power (CHP) production have caused a significant growth in distributed generation (DG) in Spain. Low DG/RES penetration levels do not have a major impact on electricity systems. However, several problems arise as DG shares increase. Smarter distribution grids are deemed necessary to facilitate DG/RES integration. This involves modifying the way distribution networks are currently planned and operated. Furthermore, DG and demand should also adopt a more active role. This paper reviews the current situation of DG/RES in Spain, including penetration rates, support payments for DG/RES, level of market integration, economic regulation of Distribution System Operators (DSOs), smart metering implementation, grid operation and planning, and incentives for DSO innovation. This paper identifies several improvements that could be made to the treatment of DG/RES. Key aspects of an efficient DG/RES integration are identified and several regulatory changes specific to the Spanish situation are recommended. - Highlights: ► Substantial DG/RES penetration levels are foreseen for the coming years in Spain. ► Integrating such amounts of DG/RES in electricity markets and networks is challenging. ► We review key regulatory aspects that may affect DG/RES integration in Spain. ► Several recommendations aimed at easing DG/RES integration in Spain are provided. ► Market integration and the transition towards smarter grids are deemed key issues.

  14. ‘Oorja’ in India: Assessing a large-scale commercial distribution of advanced biomass stoves to households

    Science.gov (United States)

    Thurber, Mark C.; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham

    2015-01-01

    Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially-based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly-incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 “Oorja” stoves to households from 2006 onwards represents the largest commercially-based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then its successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold, as well as detailed interviews with BP and First Energy staff. Statistical models based on these data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households, for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, in large part the result of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of “agricultural waste” to

  15. Ownership and usage of mosquito nets after four years of large-scale free distribution in Papua New Guinea

    Directory of Open Access Journals (Sweden)

    Hetzel Manuel W

    2012-06-01

    Full Text Available Abstract Background Papua New Guinea (PNG) is a highly malaria-endemic country in the South-West Pacific with a population of approximately 6.6 million (2009). In 2004, the country intensified its malaria control activities with support from the Global Fund. With the aim of achieving 80% ownership and usage, a country-wide campaign distributed two million free long-lasting insecticide-treated nets (LLINs). Methods In order to evaluate outcomes of the campaign against programme targets, a country-wide household survey based on stratified multi-stage random sampling was carried out in 17 of the 20 provinces after the campaign in 2008/09. In addition, a before-after assessment was carried out in six purposively selected sentinel sites. A structured questionnaire was administered to the heads of sampled households to elicit net ownership and usage information. Results After the campaign, 64.6% of households owned a LLIN and 80.1% any type of mosquito net. Overall usage by household members amounted to 32.5% for LLINs and 44.3% for nets in general. Amongst children under five years, 39.5% used a LLIN and 51.8% any type of net, whereas 41.3% of pregnant women used a LLIN and 56.1% any net. Accessibility of villages was the key determinant of net ownership, while usage was mainly determined by ownership. Most (99.5%) of the household members who did not sleep under a net did not have access to an (unused) net in their household. In the sentinel sites, LLIN ownership increased from 9.4% to 88.7%, ownership of any net from 52.7% to 94.1%. Usage of LLINs increased from 5.5% to 55.1%, usage of any net from 37.3% to 66.7%. Among children under five years, usage of LLINs and of nets in general increased from 8.2% to 67.0% and from 44.6% to 76.1%, respectively (all p ≤ 0.001). Conclusions While a single round of free distribution of LLINs significantly increased net ownership, an insufficient number of nets coupled with a heterogeneous distribution led to overall

  16. The large scale and long term evolution of the solar wind speed distribution and high speed streams

    International Nuclear Information System (INIS)

    Intriligator, D.S.

    1977-01-01

    The spatial and temporal evolution of the solar wind speed distribution and of high speed streams in the solar wind are examined. Comparisons of the solar wind streaming speeds measured at Earth, Pioneer 11, and Pioneer 10 indicate that between 1 AU and 6.4 AU the solar wind speed distributions are narrower (i.e. the 95% value minus the 5% value of the solar wind streaming speed is smaller) at extended heliocentric distances. These observations are consistent with an exchange of momentum in the solar wind between high speed streams and low speed streams as they propagate outward from the Sun. Analyses of solar wind observations at 1 AU from mid 1964 through 1973 confirm the earlier results reported by Intriligator (1974) that there are statistically significant variations in the solar wind in 1968 and 1969, years of solar maximum. High speed stream parameters show that the number of high speed streams in the solar wind in 1968 and 1969 was considerably higher than the predicted yearly average, and in 1965 and 1972 lower. Histograms of solar wind speed from 1964 through 1973 indicate that 1968 had the highest percentage of elevated solar wind speeds and 1965 and 1972 the lowest. Studies by others also confirm these results, although the respective authors did not indicate this fact. The duration of the streams and the histograms for 1973 imply a shifting in the primary stream source. (Auth.)
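    The distribution width used here is simply an inter-percentile range. A short Python illustration on synthetic speed samples (not spacecraft data):

```python
import numpy as np

def distribution_width(speeds, lo=5, hi=95):
    """Width of a solar wind speed distribution, defined as in the
    abstract: the 95th percentile minus the 5th percentile (km/s)."""
    p_lo, p_hi = np.percentile(speeds, [lo, hi])
    return p_hi - p_lo

# Illustrative samples of streaming speed at 1 AU and at a larger
# heliocentric distance; values are synthetic, not spacecraft data.
rng = np.random.default_rng(1)
speeds_1au = rng.gamma(shape=9, scale=50, size=5000)    # broad, streamy
speeds_6au = rng.gamma(shape=25, scale=18, size=5000)   # narrower

print(f"width at 1 AU:   {distribution_width(speeds_1au):6.1f} km/s")
print(f"width at 6.4 AU: {distribution_width(speeds_6au):6.1f} km/s")
```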

  17. A Large-Scale Distribution of Milk-Based Fortified Spreads: Evidence for a New Approach in Regions with High Burden of Acute Malnutrition

    Science.gov (United States)

    Defourny, Isabelle; Minetti, Andrea; Harczi, Géza; Doyon, Stéphane; Shepherd, Susan; Tectonidis, Milton; Bradol, Jean-Hervé; Golden, Michael

    2009-01-01

    Background There are 146 million underweight children in the developing world, which contribute to up to half of the world's child deaths. In high-burden regions for malnutrition, the treatment of individual children is limited by available resources. Here, we evaluate a large-scale distribution of a nutritional supplement on the prevention of wasting. Methods and Findings A new ready-to-use food (RUF) was developed as a diet supplement for children under three. The intervention consisted of six monthly distributions of RUF during the 2007 hunger gap in a district of Maradi region, Niger, for approximately 60,000 children (length: 60–85 cm). At each distribution, all children over 65 cm had their Mid-Upper Arm Circumference (MUAC) recorded. Admission trends for severe wasting (WFH <70% NCHS) in Maradi, 2002–2005, show an increase every year during the hunger gap. In contrast, in 2007, throughout the period of the distribution, the incidence of severe acute malnutrition (MUAC <110 mm) remained at extremely low levels. PMID:19421316

  18. Multivariate geostatistical modeling of the spatial sediment distribution in a large scale drainage basin, Upper Rhone, Switzerland

    Science.gov (United States)

    Schoch, Anna; Blöthe, Jan Henrik; Hoffmann, Thomas; Schrott, Lothar

    2018-02-01

    There is a notable discrepancy between detailed sediment budget studies in small headwater catchments and studies in large (>10³ km²) higher-order catchments applying modeling and/or remote sensing based approaches for major sediment storage delineation. To bridge the gap between these scales, we compiled an inventory of sediment and bedrock coverage from field mapping, remote sensing analysis and published data for five key sites in the Upper Rhone Basin (Val d'Illiez, Val de la Liène, Turtmanntal, Lötschental, Goms; 360.3 km², equivalent to 6.7% of the Upper Rhone Basin). This inventory was used as training and testing data for the classification of sediment and bedrock cover. From a digital elevation model (2 × 2 m ground resolution) and Landsat imagery we derived 22 parameters characterizing local morphometry, topography and position, contributing area, and climatic and biotic factors on different spatial scales, which were used as inputs for different statistical models (logistic regression, principal component logistic regression, generalized additive model). The best prediction results, with excellent performance (mean AUROC: 0.8721 ± 0.0012) and both high spatial and non-spatial transferability, were achieved applying a generalized additive model. Since the model has a high thematic consistency, the independent input variables, chosen based on their geomorphic relevance, are suitable to model the spatial distribution of sediment. Our high-resolution classification shows that 53.5 ± 21.7% of the Upper Rhone Basin is covered with sediment. These are by no means evenly distributed: small headwaters
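    The classification step can be prototyped with any probabilistic classifier scored by AUROC. A sketch in Python using plain logistic regression on synthetic stand-in data (the study's best model was a generalized additive model, and its 22 real terrain covariates are only mimicked here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the mapped inventory: rows are raster cells,
# columns are terrain covariates (slope, curvature, contributing area,
# elevation, ...); y is 1 for sediment cover, 0 for bedrock.
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 22))                 # 22 predictors, as in the study
logit = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
y = (logit + rng.logistic(size=2000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# AUROC on held-out cells, the performance measure quoted in the abstract
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUROC: {auc:.3f}")
```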

  19. Large scale experiments simulating hydrogen distribution in a spent fuel pool building during a hypothetical fuel uncovery accident scenario

    Energy Technology Data Exchange (ETDEWEB)

    Mignot, Guillaume; Paranjape, Sidharth; Paladino, Domenico; Jaeckel, Bernd; Rydl, Adolf [Paul Scherrer Institute, Villigen (Switzerland)

    2016-08-15

    Following the Fukushima accident and its extended station blackout, attention was brought to the importance of spent fuel pool (SFP) behavior in case of a prolonged loss of the cooling system. Since then, many analytical works have been performed to estimate the timing of hypothetical fuel uncovery for various SFP types. Experimentally, however, little has been done to investigate issues related to the formation of a flammable gas mixture, its distribution and stratification in the SFP building itself, and to some extent to assess the capability of codes to predict it correctly. This paper presents the main outcomes of the Experiments on Spent Fuel Pool (ESFP) project carried out under the auspices of Swissnuclear (Framework 2012-2013) in the PANDA facility at the Paul Scherrer Institut in Switzerland. It consists of an experimental investigation focused on hydrogen concentration build-up in a SFP building during a predefined scaled scenario for different venting positions. Tests follow a two-phase scenario: initially steam is released to mimic the boiling of the pool, followed by a helium/steam mixture release to simulate the deterioration of the oxidizing spent fuel. Results show that while the SFP building would mainly be inerted by the presence of a high concentration of steam, the volume located below the level of the pool in adjacent rooms would maintain a high air content. The interface of the two gas mixtures presents the highest risk of flammability. Additionally, it was observed that the gas mixture could become stagnant, leading locally to high hydrogen concentrations while steam condenses. Overall, the experiments provide relevant information on the potentially hazardous gas distribution formed in the SFP building and hints on accident management and on eventual retrofitting measures to be implemented in the SFP building.

  20. Large-scale upper tropospheric pollution observed by MIPAS HCN and C2H6 global distributions

    Science.gov (United States)

    Glatthor, N.; von Clarmann, T.; Stiller, G. P.; Funke, B.; Koukouli, M. E.; Fischer, H.; Grabowski, U.; Höpfner, M.; Kellmann, S.; Linden, A.

    2009-12-01

    We present global upper tropospheric HCN and C2H6 amounts derived from MIPAS/ENVISAT limb emission spectra. HCN and C2H6 are retrieved in the spectral regions 715.5-782.7 cm-1 and 811.5-835.7 cm-1, respectively. The datasets consist of 54 days between September 2003 and March 2004. This period covers the peak and decline of the southern hemispheric biomass burning period and some months thereafter. HCN is a nearly unambiguous tracer of biomass burning with an assumed tropospheric lifetime of several months. Indeed, the most significant feature in the MIPAS HCN dataset is an upper tropospheric plume of enhanced values caused by southern hemispheric biomass burning, which in September and October 2003 extended from tropical South America over Africa and Australia to the Southern Pacific. The spatial extent of this plume agrees well with the MOPITT CO distribution of September 2003. Furthermore, there is good agreement with the shapes and mixing ratios of the southern hemispheric HCN and C2H6 fields measured by the ACE experiment between September and November 2005. The MIPAS HCN plume extended from the lowermost observation height of 8 km up to about 16 km altitude, with maximum values of 500-600 pptv in October 2003. It was still clearly visible in December 2003, but had strongly decreased by March 2004, confirming the assumed tropospheric lifetime. The main sources of C2H6 are production and transmission of fossil fuels, followed by biofuel use and biomass burning. The C2H6 distribution also clearly reflected the southern hemispheric biomass burning plume and its seasonal variation, with maximum amounts of 600-700 pptv. Generally there was good spatial overlap between the southern hemispheric distributions of both pollution tracers, except for the region between Peru and the mid-Pacific. Here C2H6 was considerably enhanced, whereas the HCN amounts were low. Backward trajectory calculations suggested that industrial pollution was responsible for the elevated C2H6

  1. Large-scale upper tropospheric pollution observed by MIPAS HCN and C2H6 global distributions

    Directory of Open Access Journals (Sweden)

    A. Linden

    2009-12-01

    Full Text Available We present global upper tropospheric HCN and C2H6 amounts derived from MIPAS/ENVISAT limb emission spectra. HCN and C2H6 are retrieved in the spectral regions 715.5–782.7 cm−1 and 811.5–835.7 cm−1, respectively. The datasets consist of 54 days between September 2003 and March 2004. This period covers the peak and decline of the southern hemispheric biomass burning period and some months thereafter. HCN is a nearly unambiguous tracer of biomass burning with an assumed tropospheric lifetime of several months. Indeed, the most significant feature in the MIPAS HCN dataset is an upper tropospheric plume of enhanced values caused by southern hemispheric biomass burning, which in September and October 2003 extended from tropical South America over Africa and Australia to the Southern Pacific. The spatial extent of this plume agrees well with the MOPITT CO distribution of September 2003. Furthermore, there is good agreement with the shapes and mixing ratios of the southern hemispheric HCN and C2H6 fields measured by the ACE experiment between September and November 2005. The MIPAS HCN plume extended from the lowermost observation height of 8 km up to about 16 km altitude, with maximum values of 500–600 pptv in October 2003. It was still clearly visible in December 2003, but had strongly decreased by March 2004, confirming the assumed tropospheric lifetime. The main sources of C2H6 are production and transmission of fossil fuels, followed by biofuel use and biomass burning. The C2H6 distribution also clearly reflected the southern hemispheric biomass burning plume and its seasonal variation, with maximum amounts of 600–700 pptv. Generally there was good spatial overlap between the southern hemispheric distributions of both pollution tracers, except for the region between Peru and the mid-Pacific. Here C2H6 was considerably enhanced, whereas the HCN amounts were low. Backward trajectory calculations suggested that industrial pollution was responsible

  2. Heat recovery networks synthesis of large-scale industrial sites: Heat load distribution problem with virtual process subsystems

    International Nuclear Information System (INIS)

    Pouransari, Nasibeh; Maréchal, Francois

    2015-01-01

    Highlights: • Synthesizing industrial-size heat recovery networks with a match reduction approach. • Targeting TSI with minimum exchange between process subsystems. • Generating a feasible close-to-optimum network. • Reducing tremendously the HLD computational time and complexity. • Generating a realistic network with respect to the plant layout. - Abstract: This paper presents a targeting strategy to design a heat recovery network for an industrial plant by dividing the system into subsystems while considering the heat transfer opportunities between them. The methodology is based on a sequential approach. The heat recovery opportunities between process units and the optimal flow rates of utilities are first identified using a Mixed Integer Linear Programming (MILP) model. The site is then divided into a number of subsystems where the overall interaction is summarized by a pair of virtual hot and cold streams per subsystem, reconstructed by solving the heat cascade inside each subsystem. The Heat Load Distribution (HLD) problem is then solved between those packed subsystems in a sequential procedure where each time one of the subsystems is unpacked by switching from the virtual stream pair back to the original ones. The main advantages are to minimize the number of connections between process subsystems, to alleviate the computational complexity of the HLD problem and to generate a feasible network which is compatible with the minimum energy consumption objective. The application of the proposed methodology is illustrated through a number of case studies, discussed and compared with the relevant results from the literature.
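    The match-minimizing flavor of the HLD problem can be sketched as a small mixed-integer program. A toy Python/PuLP version under strong simplifying assumptions (illustrative stream duties, no temperature-feasibility constraints, and not the paper's actual formulation):

```python
import pulp

# Toy heat load distribution: assign heat from hot streams to cold
# streams while minimizing the number of matches (connections).
hot = {"H1": 300.0, "H2": 200.0}    # hypothetical hot stream duties, kW
cold = {"C1": 250.0, "C2": 250.0}   # hypothetical cold stream duties, kW
big_m = 300.0                       # upper bound on any single match duty

prob = pulp.LpProblem("heat_load_distribution", pulp.LpMinimize)
q = pulp.LpVariable.dicts("q", (hot, cold), lowBound=0)     # heat per match
z = pulp.LpVariable.dicts("z", (hot, cold), cat="Binary")   # match exists?

prob += pulp.lpSum(z[h][c] for h in hot for c in cold)      # fewest matches
for h in hot:   # each hot stream must reject its full duty
    prob += pulp.lpSum(q[h][c] for c in cold) == hot[h]
for c in cold:  # each cold stream must receive its full duty
    prob += pulp.lpSum(q[h][c] for h in hot) == cold[c]
for h in hot:   # heat can only flow through a match that exists
    for c in cold:
        prob += q[h][c] <= big_m * z[h][c]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for h in hot:
    for c in cold:
        if z[h][c].value() > 0.5:
            print(f"{h} -> {c}: {q[h][c].value():.0f} kW")
```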

  3. SOLID-DER. Reaching large-scale integration of Distributed Energy Resources in the enlarged European electricity market

    International Nuclear Information System (INIS)

    Van Oostvoorn, F.; Ten Donkelaar, M.

    2007-05-01

    The integration of DER (distributed energy resources) in the European electricity networks has become a key issue for energy producers, network operators, policy makers and the R and D community. In some countries it has already created a number of challenges for the stability of the electricity supply system, thereby creating new barriers to further expansion of the share of DER in supply. On the other hand, in many Member States there still exists a lack of awareness and understanding of the possible benefits and role of DER in the electricity system, while environmental goals and security of supply issues increasingly call for solutions that DER could provide in the future. The project SOLID-DER, a Coordination Action, will assess the barriers to further integration of DER and overcome both the lack of awareness of the benefits of DER solutions and the fragmentation of EU R and D results by consolidating all European DER research activities and reporting on their common findings. In particular, awareness of DER solutions and benefits will be raised in the new Member States, thereby addressing their specific issues and barriers and incorporating them into the existing EU DER R and D community. The SOLID-DER Coordination Action will run from November 2005 to October 2008.

  4. Spatiotemporal distribution of nitrogen dioxide within and around a large-scale wind farm – a numerical case study

    Directory of Open Access Journals (Sweden)

    J. Mo

    2017-12-01

    Full Text Available As a renewable and clean energy source, wind power has become the most rapidly growing energy resource worldwide in the past decades. Wind power has been thought not to exert any negative impacts on the environment. However, since a wind farm can alter local meteorological conditions and increase surface roughness lengths, it may affect air pollutants that pass through and over the wind farm after being released from their sources and transported to the wind farm. In the present study, we simulated the nitrogen dioxide (NO2) air concentration within and around the world's largest wind farm (the Jiuquan wind farm in Gansu Province, China) using the coupled meteorology and atmospheric chemistry model WRF-Chem. The results revealed an edge effect, which featured higher NO2 levels in the immediate upwind and border region of the wind farm and lower NO2 concentrations within the wind farm and the immediate downwind transition area of the wind farm. A surface roughness length scheme and a wind turbine drag force scheme were employed to parameterize the wind farm in this model investigation. Modeling results show that both parameterization schemes yield higher concentrations in the immediate upstream of the wind farm and lower concentrations within the wind farm compared to the case without the wind farm. We infer this edge effect and the spatial distribution of air pollutants to be the result of the internal boundary layer induced by the changes in wind speed and turbulence intensity driven by the rotation of the wind turbine rotor blades and the enhancement of surface roughness length over the wind farm. The step change in roughness length from smooth to rough surfaces (overshooting) upstream of the wind farm decelerates the atmospheric transport of air pollutants, leading to their accumulation. The step change from rough to smooth surfaces (undershooting) downstream of the wind farm accelerates the atmospheric transport of air pollutants, resulting in

  5. Large-scale CO2 injection demos for the development of monitoring and verification technology and guidelines (CO2ReMoVe)

    Energy Technology Data Exchange (ETDEWEB)

    Wildenborg, T.; David, P. [TNO Built Environment and Geosciences, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Bentham, M.; Chadwick, A.; Kirk, K. [British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham NG12 5GG (United Kingdom); Dillen, M. [SINTEF Petroleum Research, Trondheim (Norway); Groenenberg, H. [Unit Policy Studies, Energy Research Centre of the Netherlands ECN, Amsterdam (Netherlands); Deflandre, J.P.; Le Gallo, J. [Institut Francais du Petrole, Rueil-Malmaison (France)

    2009-04-15

    The objectives of the EU project CO2ReMoVe are to undertake the research and development necessary to establish scientifically based standards for monitoring future CCS operations and to develop the performance assessment methodologies necessary to demonstrate the long-term reliability of geological storage of CO2. This could in turn lead to guidelines for the certification of sites suitable for CCS on a wide scale. Crucial to the project portfolio are the continuing large-scale CO2 injection operation at Sleipner, the injection operation at In Salah (Algeria) and the recently started injection project at Snoehvit (Norway). Two pilot sites are also currently in the project portfolio, Ketzin in Germany and K12-B in the offshore continental shelf of the Netherlands.

  6. Experimental study of the large-scale axially heterogeneous liquid-metal fast breeder reactor at the fast critical assembly: Power distribution measurements and their analyses

    International Nuclear Information System (INIS)

    Iijima, S.; Obu, M.; Hayase, T.; Ohno, A.; Nemoto, T.; Okajima, S.

    1988-01-01

    Power distributions of the large-scale axially heterogeneous liquid-metal fast breeder reactor were studied by using the experimental results of fast critical assemblies XI, XII, and XIII and the results of their analyses. The power distributions were examined by the gamma-scanning method and by fission rate measurements using ²³⁹Pu and ²³⁸U fission counters and the foil irradiation method. In addition to the measurements in the reference core, the power distributions were measured in the core with a control rod inserted and in a modified core where the shape of the internal blanket was determined by the radial boundary. The calculation was made using JENDL-2 and the Japan Atomic Energy Research Institute's standard calculation system for fast reactor neutronics. A power flattening trend, caused by the decrease of the fast neutron flux, was observed in the axial and radial power distributions. The effect of the radial boundary shape of the internal blanket on the power distribution was determined in the core in which the thickness of the internal blanket was reduced at its radial boundary. The influence of the internal blanket was observed in the power distributions in the core with a control rod inserted. The calculation predicted a harder neutron spectrum in the internal blanket. In the radial distributions of ²³⁹Pu fission rates, a spatial dependency of the calculated-to-experimental values was found in the active core close to the internal blanket.

  7. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, by entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  8. Kinota: An Open-Source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring

    Science.gov (United States)

    Miles, B.; Chepudira, K.; LaBar, W.

    2017-12-01

    The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) extension, STA is also well-suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful for environmental monitoring sensor networks. Here we present Kinota™, an open-source NoSQL implementation of OGC SensorThings for large-scale, high-resolution, real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing for customization by adopters who can choose to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers like Azure, Amazon, Google, etc. The scalable, flexible and cloud-friendly architecture of Kinota makes it ideal for use in next
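    For context, writing and querying observations against any SensorThings implementation follows the standard STA REST conventions. A Python sketch; the endpoint URL and Datastream id are placeholders, and nothing here is specific to Kinota's internals:

```python
import json
import requests

# Hypothetical SensorThings deployment root; adjust to the actual server.
BASE = "http://localhost:8080/v1.0"

# An Observation is posted against an existing Datastream, per the
# OGC SensorThings API data model.
observation = {
    "phenomenonTime": "2017-09-01T14:22:05.000Z",
    "result": 12.7,                     # e.g. water level, cm (illustrative)
    "Datastream": {"@iot.id": 42},      # placeholder datastream id
}

resp = requests.post(
    f"{BASE}/Observations",
    data=json.dumps(observation),
    headers={"Content-Type": "application/json"},
)
resp.raise_for_status()
print("created:", resp.headers.get("Location"))

# Recent observations can be read back with OData-style query options:
q = f"{BASE}/Datastreams(42)/Observations?$orderby=phenomenonTime desc&$top=5"
print(requests.get(q).json())
```

    For high-rate feeds, the same Observation payloads can alternatively be published over the MQTT extension mentioned in the abstract, avoiding per-message HTTP overhead.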

  9. Large-scale monitoring of effects of clothianidin-dressed oilseed rape seeds on pollinating insects in Northern Germany: effects on red mason bees (Osmia bicornis).

    Science.gov (United States)

    Peters, Britta; Gao, Zhenglei; Zumkier, Ulrich

    2016-11-01

    The aim of this study was to investigate the effects of Elado® (10 g clothianidin & 2 g beta-cyfluthrin/kg seed)-dressed oilseed rape on the development and reproduction of mason bees (Osmia bicornis) as part of a large-scale monitoring field study in Northern Germany, where oilseed rape is usually cultivated on 25-33% of the arable land. Both the reference and test sites comprised 65 km², in which no other crops attractive to pollinating insects were present. Six study locations were selected per site and three nesting shelters were placed at each location. Of these locations, three were directly adjacent to oilseed rape fields, while the other three were situated 100 m from the nearest oilseed rape field. At each location, 1500 cocoons of O. bicornis were placed into the central nesting shelter. During the exposure phase, nest building activities and foraging behaviour were assessed repeatedly. Cocoons were harvested in autumn to assess parasitization and reproduction, including larval development. The following spring, the emergence of the next generation of adults from cocoons was monitored. High reproductive output and low parasitization rates indicated that Elado®-dressed oilseed rape did not cause any detrimental effects on the development or reproduction of mason bees.

  10. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    Science.gov (United States)

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open-source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open-source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually any computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit and the open-source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open-source distributed chemical computing environment built on Java RMI, and it is easily adaptable to user demands due to its "plug-in architecture". The complete source code as well as calculated properties, along with links to PubChem resources, are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
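    The per-compound workload that such a framework distributes looks roughly like the following. A Python sketch using RDKit as a freely available stand-in for the Marvin/JOELib toolkits named in the abstract; the molecule set is illustrative:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Per-compound property calculation of the kind a distributed harvester
# farms out to worker nodes: parse a structure, compute logP and TPSA.
smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"]  # illustrative set

for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    if mol is None:              # skip unparsable records, as a harvester must
        continue
    print(f"{smi:>28}  logP={Descriptors.MolLogP(mol):6.2f}  "
          f"TPSA={Descriptors.TPSA(mol):6.2f}")
```

    In the distributed setting, each worker would receive a chunk of identifiers, return its property vectors to the server, and cross-toolkit agreement would then be assessed on the merged results.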

  11. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
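    One common way to realize this kind of decomposition is dual (price-based) coordination between an aggregator and local units. A minimal Python sketch with illustrative quadratic costs and capacity limits, not the paper's exact formulation:

```python
import numpy as np

# Dual-decomposition sketch: each unit i minimizes its local cost
# a_i * p_i^2 minus price * p_i subject to capacity limits; the
# aggregator adjusts the price until production meets the target.
a = np.array([0.5, 1.0, 2.0, 4.0])        # per-unit cost coefficients
p_max = np.array([3.0, 3.0, 3.0, 3.0])    # capacity limits (MW)
target = 6.0                              # power balance requirement (MW)

price, step = 1.0, 0.2
for it in range(200):
    # Local subproblems: argmin_p a*p^2 - price*p  =>  p = price / (2a),
    # clipped to each unit's feasible range (solved independently).
    p = np.clip(price / (2 * a), 0.0, p_max)
    imbalance = target - p.sum()
    if abs(imbalance) < 1e-4:
        break
    price += step * imbalance             # aggregator price update (dual ascent)

print(f"price {price:.3f}, dispatch {np.round(p, 3)}, total {p.sum():.3f} MW")
```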

  12. A large-scale distribution of milk-based fortified spreads: evidence for a new approach in regions with high burden of acute malnutrition.

    Directory of Open Access Journals (Sweden)

    Isabelle Defourny

    Full Text Available BACKGROUND: There are 146 million underweight children in the developing world, which contribute to up to half of the world's child deaths. In high-burden regions for malnutrition, the treatment of individual children is limited by available resources. Here, we evaluate a large-scale distribution of a nutritional supplement on the prevention of wasting. METHODS AND FINDINGS: A new ready-to-use food (RUF) was developed as a diet supplement for children under three. The intervention consisted of six monthly distributions of RUF during the 2007 hunger gap in a district of Maradi region, Niger, for approximately 60,000 children (length: 60-85 cm). At each distribution, all children over 65 cm had their Mid-Upper Arm Circumference (MUAC) recorded. Admission trends for severe wasting (WFH <70% NCHS) in Maradi, 2002-2005, show an increase every year during the hunger gap. In contrast, in 2007, throughout the period of the distribution, the incidence of severe acute malnutrition (MUAC <110 mm) remained at extremely low levels. Comparison of year-over-year admissions to the therapeutic feeding program shows that the 2007 blanket distribution had essentially the same flattening effect on the seasonal rise in admissions as the 2006 individualized treatment of almost 60,000 moderately wasted children. CONCLUSIONS: These results demonstrate the potential for distribution of fortified spreads to reduce the incidence of severe wasting in large populations of children 6-36 months of age. Although further information is needed on the cost-effectiveness of such distributions, these results highlight the importance of re-evaluating current nutritional strategies and international recommendations for high-burden areas of childhood malnutrition.

  13. Residual stress measurement of large scaled welded pipe using neutron diffraction method. Effect of SCC crack propagation and repair weld on residual stress distribution

    International Nuclear Information System (INIS)

    Suzuki, Hiroshi; Katsuyama, Jinya; Tobita, Tohru; Morii, Yukio

    2011-01-01

    The RESA-1 neutron engineering diffractometer at the JRR-3 (Japan Research Reactor No. 3) of the Japan Atomic Energy Agency, which is used for stress measurements, was upgraded to enable residual stress measurements of large-scale mechanical components. A series of residual stress measurements was made to obtain through-thickness residual stress distributions in a Type 304 stainless steel butt-welded pipe of 500A-sch.80 using the upgraded RESA-1 diffractometer. We evaluated the effects of crack propagation, such as stress corrosion cracking (SCC), and of a part-circumference repair weld on the residual stress distributions induced by girth welding. Measured residual stress distributions near the original girth weld showed good agreement with typical results from previous works using the finite element method, deep hole drilling, as well as neutron diffraction. After introducing a mock crack of 10 mm depth in the heat affected zone on the inside wall of the pipe by electro-discharge machining, the axial residual stresses were found to be released in the region of the mock crack. However, changes in the through-wall bending stress component and the self-equilibrated stress component were negligible, and hence the axial residual stress distribution in the ligament remained close to the original residual stress distribution near the girth weld without the mock crack. Furthermore, changes in hoop and radial residual stresses were also small. The residual stress distributions after a partial repair welding on the outer circumference of the girth weld were significantly different from the residual stress distributions near the original girth weld. The through-thickness average axial residual stress was increased due to the increase of the tensile membrane stress and the mitigation of the bending stress after repair welding. Through the above studies, we demonstrated that the neutron diffraction technique is a useful and powerful tool for measuring residual stress distributions in large as well as thick mechanical
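    For context, residual stresses in such neutron diffraction measurements are recovered from lattice strains measured in three orthogonal directions via the Bragg peak shift and triaxial Hooke's law; in standard notation (d0 is the stress-free lattice spacing, E is Young's modulus, ν is Poisson's ratio):

```latex
% Lattice strain in each measurement direction from the Bragg peak shift:
\varepsilon_i = \frac{d_i - d_0}{d_0}, \qquad
  i \in \{\text{axial},\ \text{hoop},\ \text{radial}\}

% Triaxial Hooke's law recovers each stress component, e.g. axial:
\sigma_{\text{axial}} = \frac{E}{(1+\nu)(1-2\nu)}
  \left[ (1-\nu)\,\varepsilon_{\text{axial}}
       + \nu\,(\varepsilon_{\text{hoop}} + \varepsilon_{\text{radial}}) \right]
```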

  14. Large-scale monitoring of effects of clothianidin-dressed oilseed rape seeds on pollinating insects in Northern Germany: effects on honey bees (Apis mellifera).

    Science.gov (United States)

    Rolke, Daniel; Fuchs, Stefan; Grünewald, Bernd; Gao, Zhenglei; Blenau, Wolfgang

    2016-11-01

    Possible effects of clothianidin seed-treated oilseed rape on honey bee colonies were investigated in a large-scale monitoring project in Northern Germany, where oilseed rape usually comprises 25-33 % of the arable land. For both reference and test sites, six study locations were selected and eight honey bee hives were placed at each location. At each site, three locations were directly adjacent to oilseed rape fields and three locations were situated 400 m away from the nearest oilseed rape field. Thus, 96 hives were exposed to fully flowering oilseed rape crops. Colony sizes and weights, the amount of honey harvested, and infection with parasites and diseases were monitored between April and September 2014. The percentage of oilseed rape pollen was determined in pollen and honey samples. After oilseed rape flowering, the hives were transferred to an extensive isolated area for post-exposure monitoring. Total numbers of adult bees and brood cells showed seasonal fluctuations, and there were no significant differences between the sites. The honey, which was extracted at the end of the exposure phase, contained 62.0-83.5 % oilseed rape pollen. Varroa destructor infestation was low during most of the course of the study but increased at the end of the study due to flumethrin resistance in the mite populations. In summary, honey bee colonies foraging in clothianidin seed-treated oilseed rape did not show any detrimental symptoms as compared to colonies foraging in clothianidin-free oilseed rape. Development of colony strength, brood success as well as honey yield and pathogen infection were not significantly affected by clothianidin seed-treatment during this study.

  15. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for the application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control-support tool for daily operation and performance prediction of central solar heating plants. Finally, the CSDHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers to and breakthrough of the technology is given....

  16. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are reported from testing the materials' resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  17. Based on Real Time Remote Health Monitoring Systems: A New Approach for Prioritization "Large Scales Data" Patients with Chronic Heart Diseases Using Body Sensors and Communication Technology.

    Science.gov (United States)

    Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Albahri, O S; Albahri, A S

    2018-03-02

    This paper presents a new approach to prioritize "Large-scale Data" of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the scores of patients are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and non-sensory measurement tool, namely, text frame. Data alignment was conducted for the raw data and decision-making matrix by converting each extracted feature into an integer. This integer represents their state in the triage level based on medical guidelines to determine the features from different sources in a platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely, integrated multi-layer for analytic hierarchy process (MLAHP) and technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following. (1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient settings on triage and
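    The scoring step the abstract outlines can be illustrated with a compact TOPSIS implementation. The sketch below is not the authors' code: the criteria, the weights (standing in for the MLAHP-derived ones) and the patient scores are hypothetical, and only the standard TOPSIS ranking step is shown.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : (n_patients, n_criteria) scores
    weights : criterion weights summing to 1 (e.g. from an AHP step)
    benefit : boolean array, True where larger values mean higher urgency
    """
    # Vector-normalise each criterion column, then apply the weights.
    norm = matrix / np.linalg.norm(matrix, axis=0)
    v = norm * weights
    # The ideal best/worst depend on whether a criterion is benefit or cost.
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    # Closeness to the ideal solution; larger = higher triage priority.
    return d_worst / (d_best + d_worst)

# Hypothetical triage scores for 4 patients: ECG, SpO2 deficit, BP deviation, text frame.
scores = np.array([[3, 2, 1, 0], [1, 1, 0, 0], [3, 3, 2, 1], [2, 0, 1, 0]], float)
weights = np.array([0.4, 0.3, 0.2, 0.1])           # e.g. derived via MLAHP
priority = topsis(scores, weights, np.array([True] * 4))
print(np.argsort(priority)[::-1])                  # patients ordered by urgency
```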

  18. Large-scale monitoring of effects of clothianidin-dressed OSR seeds on pollinating insects in Northern Germany: effects on large earth bumble bees (Bombus terrestris).

    Science.gov (United States)

    Sterk, Guido; Peters, Britta; Gao, Zhenglei; Zumkier, Ulrich

    2016-11-01

    The aim of this study was to investigate the effects of Elado®-dressed winter oilseed rape (OSR, 10 g clothianidin & 2 g beta-cyfluthrin/kg seed) on the development, reproduction and behaviour of large earth bumble bees (Bombus terrestris) as part of a large-scale monitoring field study in Northern Germany, where OSR is usually cultivated on 25-33 % of the arable land. Both reference and test sites comprised 65 km², in which no other crops attractive to pollinating insects were present. Six study locations were selected per site and 10 bumble bee hives were placed at each location. At each site, three locations were directly adjacent to OSR fields and three locations were situated 400 m from the nearest OSR field. The development of colonies was monitored from the beginning of OSR flowering in April until June 2014. Pollen from returning foragers was analysed for its composition. An average of 44 % OSR pollen was found in the pollen loads of bumble bees, indicating that OSR was a major resource for the colonies. At the end of OSR flowering, hives were transferred to a nature reserve until the end of the study. Colony development in terms of hive weight and the number of workers showed a typical course, with no statistically significant differences between the sites. Reproductive output was comparatively high and not negatively affected by the exposure to treated OSR. In summary, Elado®-dressed OSR did not cause any detrimental effects on the development or reproduction of bumble bee colonies.

  19. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
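    For reference, the simplest end of the design space discussed above is greedy nearest-neighbour association between one frame's tracks and the next frame's detections. The sketch below shows that baseline only; the gate parameter is hypothetical, and a multi-hypothesis tracker would instead keep several candidate assignments alive rather than committing greedily.

```python
import numpy as np

def associate(tracks, detections, gate=5.0):
    """Greedy nearest-neighbour assignment of detections to existing tracks.

    tracks, detections : (N, 2) and (M, 2) arrays of pixel positions
    gate               : maximum allowed association distance (pixels)
    Returns a list of (track_index, detection_index) pairs.
    """
    if len(tracks) == 0 or len(detections) == 0:
        return []
    # Pairwise distances between every track and every detection.
    d = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    pairs = []
    while np.isfinite(d).any():
        i, j = np.unravel_index(np.argmin(d), d.shape)
        if d[i, j] > gate:
            break
        pairs.append((i, j))
        d[i, :] = np.inf   # each track and detection is used at most once
        d[:, j] = np.inf
    return pairs

tracks = np.array([[10.0, 10.0], [40.0, 40.0]])
detections = np.array([[41.0, 39.5], [11.0, 9.0], [90.0, 90.0]])
print(associate(tracks, detections))   # [(1, 0), (0, 1)]; detection 2 starts a new track
```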

  20. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  1. Study design of a cluster-randomized controlled trial to evaluate a large-scale distribution of cook stoves and water filters in Western Province, Rwanda.

    Science.gov (United States)

    Nagel, Corey L; Kirby, Miles A; Zambrano, Laura D; Rosa, Ghislane; Barstow, Christina K; Thomas, Evan A; Clasen, Thomas F

    2016-12-15

    In Rwanda, pneumonia and diarrhea are the first and second leading causes of death, respectively, among children under five. Household air pollution (HAP) resulting from cooking indoors with biomass fuels on traditional stoves is a significant risk factor for pneumonia, while consumption of contaminated drinking water is a primary cause of diarrheal disease. To date, there have been no large-scale effectiveness trials of programmatic efforts to provide either improved cookstoves or household water filters at scale in a low-income country. In this paper we describe the design of a cluster-randomized trial to evaluate the impact of a national-level program to distribute and promote the use of improved cookstoves and advanced water filters to the poorest quarter of households in Rwanda. We randomly allocated 72 sectors (administratively defined units) in Western Province to the intervention, with the remaining 24 sectors in the province serving as controls. In the intervention sectors, roughly 100,000 households received improved cookstoves and household water filters through a government-sponsored program targeting the poorest quarter of households nationally. The primary outcome measures are the incidence of acute respiratory infection (ARI) and diarrhea among children under five years of age. Over a one-year surveillance period, all cases of ARI and diarrhea identified by health workers in the study area will be extracted from records maintained at health facilities and by community health workers (CHW). In addition, we are conducting intensive, longitudinal data collection among a random sample of households in the study area for in-depth assessment of coverage, use, environmental exposures, and additional health measures. Although previous research has examined the impact of providing household water treatment and improved cookstoves on child health, there have been no studies of national-level programs to deliver these interventions

  2. The Integrated Use of DMSP-OLS Nighttime Light and MODIS Data for Monitoring Large-Scale Impervious Surface Dynamics: A Case Study in the Yangtze River Delta

    Directory of Open Access Journals (Sweden)

    Zhenfeng Shao

    2014-09-01

    The timely and reliable estimation of imperviousness is essential for the scientific understanding of human-Earth interactions. Due to its unique capacity for capturing artificial light luminosity and its long-term data records, the Defense Meteorological Satellite Program (DMSP) Operational Line-scan System (OLS) nighttime light (NTL) imagery offers an appealing opportunity for continuously characterizing impervious surface area (ISA) at regional and continental scales. Although different levels of success have been achieved, critical challenges still remain in the literature. ISA results generated by DMSP-OLS NTL alone suffer from limitations due to systemic defects of the sensor. Moreover, the majority of developed methodologies seldom consider spatial heterogeneity, which is a key issue in coarse imagery applications. In this study, we proposed a novel method for multi-temporal ISA estimation. This method is based on a linear regression model developed between the sub-pixel ISA fraction and a multi-source index with the integrated use of DMSP-OLS NTL and MODIS NDVI. In contrast with traditional regression analysis, we incorporated spatial information into the regression model to obtain spatially adaptive coefficients at the per-pixel level. To produce multi-temporal ISA maps using a mono-temporal reference dataset, temporally stable samples were extracted for model training and validation. We tested the proposed method in the Yangtze River Delta and generated annual ISA fraction maps for the decade 2000-2009. According to our assessments, the proposed method exhibited substantial improvements compared with the standard linear regression model and provided a feasible way to monitor large-scale impervious surface dynamics.
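    The abstract does not spell out the exact index or spatial scheme, so the following is only a plausible sketch under stated assumptions: a VANUI-style index combining NTL and NDVI, with a moving-window linear fit standing in for the spatially adaptive per-pixel coefficients. The function name and window parameter are hypothetical.

```python
import numpy as np

def isa_fraction(ntl, ndvi, isa_ref, win=7):
    """Estimate per-pixel impervious surface fraction from an NTL/NDVI index.

    ntl, ndvi : 2-D arrays, night-time light (scaled 0-1) and NDVI
    isa_ref   : reference ISA fraction map used to fit the regression
    win       : half-width of the moving window giving spatially adaptive coefficients
    """
    index = (1.0 - ndvi) * ntl            # one plausible multi-source index (VANUI-like)
    isa = np.full(ntl.shape, np.nan)
    for r in range(ntl.shape[0]):
        for c in range(ntl.shape[1]):
            # Fit a local linear model isa_ref ~ a * index + b inside the window,
            # so the regression coefficients vary across space.
            r0, r1 = max(r - win, 0), min(r + win + 1, ntl.shape[0])
            c0, c1 = max(c - win, 0), min(c + win + 1, ntl.shape[1])
            x = index[r0:r1, c0:c1].ravel()
            y = isa_ref[r0:r1, c0:c1].ravel()
            a, b = np.polyfit(x, y, 1)
            isa[r, c] = np.clip(a * index[r, c] + b, 0.0, 1.0)
    return isa

# Tiny synthetic example (random fields stand in for real imagery).
ntl, ndvi = np.random.rand(20, 20), np.random.rand(20, 20)
ref = np.clip((1 - ndvi) * ntl + 0.05 * np.random.randn(20, 20), 0, 1)
print(isa_fraction(ntl, ndvi, ref).mean())
```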

  3. Large-scale monitoring of effects of clothianidin-dressed oilseed rape seeds on pollinating insects in northern Germany: residues of clothianidin in pollen, nectar and honey.

    Science.gov (United States)

    Rolke, Daniel; Persigehl, Markus; Peters, Britta; Sterk, Guido; Blenau, Wolfgang

    2016-11-01

    This study was part of a large-scale monitoring project to assess the possible effects of Elado® (10 g clothianidin & 2 g β-cyfluthrin/kg seed)-dressed oilseed rape seeds on different pollinators in Northern Germany. Firstly, residues of clothianidin and its active metabolites thiazolylnitroguanidine and thiazolylmethylurea were measured in nectar and pollen from Elado®-dressed (test site, T) and undressed (reference site, R) oilseed rape collected by honey bees confined within tunnel tents. Clothianidin and its metabolites could not be detected or quantified in samples from R fields. Clothianidin concentrations in samples from T fields were 1.3 ± 0.9 μg/kg and 1.7 ± 0.9 μg/kg in nectar and pollen, respectively. Secondly, pollen and nectar for residue analyses were sampled from free-flying honey bees, bumble bees and mason bees, placed at six study locations each in the R and T sites at the start of oilseed rape flowering. Honey samples were analysed from all honey bee colonies at the end of oilseed rape flowering. Neither clothianidin nor its metabolites were detectable or quantifiable in R site samples. Clothianidin concentrations in samples from the T site were below the limit of quantification (LOQ, 1.0 µg/kg) in most pollen and nectar samples collected by bees and 1.4 ± 0.5 µg/kg in honey taken from honey bee colonies. In summary, the study provides reliable semi-field and field data on clothianidin residues in nectar and pollen collected by different bee species in oilseed rape fields under common agricultural conditions.

  4. Image-processing of time-averaged interface distributions representing CCFL characteristics in a large scale model of a PWR hot-leg pipe geometry

    International Nuclear Information System (INIS)

    Al Issa, Suleiman; Macián-Juan, Rafael

    2017-01-01

    Highlights: • CCFL characteristics are investigated in PWR large-scale hot-leg pipe geometry. • Image processing of the air-water interface produced time-averaged interface distributions. • Time-averages provide a comparative method for CCFL characteristics among different studies. • CCFL correlations depend upon the range of investigated water delivery for Dh ≫ 50 mm. • 1D codes are incapable of investigating CCFL because of the lack of interface distribution. - Abstract: Countercurrent Flow Limitation (CCFL) was experimentally investigated in the 1/3.9-downscaled COLLIDER facility with a 190 mm pipe diameter using air/water at atmospheric pressure. Previous investigations provided knowledge of the onset of CCFL mechanisms. In the current article, CCFL characteristics at the COLLIDER facility are measured and discussed along with time-averaged distributions of the air/water interface for a selected matrix of liquid/gas velocities. The article demonstrates the time-averaged interface as a useful method to identify CCFL characteristics at quasi-stationary flow conditions, eliminating variations that appear in single images and showing essential comparative flow features such as: the degree of restriction at the bend, the extension and the intensity of the two-phase mixing zones, and the average water level within the horizontal part and the steam generator. This makes it possible to compare interface distributions obtained in different investigations. The distributions are also beneficial for CFD validations of CCFL, as the instantaneous chaotic gas/liquid interface is impossible to reproduce in CFD simulations. The current study shows that the final CCFL characteristics curve (and the corresponding CCFL correlation) depends upon the covered measuring range of water delivery. It also shows that the hydraulic diameter should be sufficiently larger than 50 mm in order to obtain CCFL characteristics comparable to the 1:1 scale data (namely the UPTF data). Finally
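    The time-averaging idea is simple to express: binarise each instantaneous interface image and average over frames, so each pixel carries the fraction of time it was occupied by water. A minimal sketch under an assumed grey-level threshold (the actual segmentation used in the study is not given in the abstract):

```python
import numpy as np

def time_averaged_interface(frames, threshold=128):
    """Collapse a stack of grey-scale frames into a time-averaged phase map.

    frames    : (n_frames, height, width) array from the high-speed camera
    threshold : grey value separating water (dark) from air (bright); hypothetical
    Returns a map in [0, 1]: the fraction of time each pixel was occupied by water.
    """
    water = frames < threshold   # binarise each instantaneous interface image
    return water.mean(axis=0)    # average over time; the 0.5 contour ~ mean interface

# Example with synthetic frames: 500 frames of 64x256 pixels.
frames = np.random.randint(0, 256, size=(500, 64, 256), dtype=np.uint8)
phase_map = time_averaged_interface(frames)
print(phase_map.shape, phase_map.min(), phase_map.max())
```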

  5. Large Scale System Defense

    Science.gov (United States)

    2008-10-01

    [Report documentation page; only fragments are recoverable.] Performing organization: Columbia University, 1700 Broadway, New York NY 10019-5905. Sponsoring/monitoring agency: AFRL/RIGA, 525 Brooks Rd., Rome NY 13441-4505. Surviving abstract fragment: "...pealing because of the need to modify source code. Since source-level annotations serve as a vestigial policy, we articulated a way to augment self..."

  6. Reactor power distribution monitor

    International Nuclear Information System (INIS)

    Sekimizu, Koichi

    1980-01-01

    Purpose: To improve the performance and secure the safety of a nuclear reactor by rapidly computing and displaying the power density in the reactor using a plurality of processors. Constitution: Plant data for the nuclear reactor, containing the measured values from the local power monitors (LPRM), are sent to and recorded on a magnetic disc. They are also sent to a core performance computer, in which the burn-up degree distribution and the like are computed, and the results are sent to and recorded on the magnetic disc. A central processor loads programs into each of the processors and supplies the data recorded on the magnetic disc to each of them. From this information, each processor computes the power distribution in the four fuel assemblies surrounding the corresponding LPRM string. The central processor compiles the computation results and displays them on a display. In this way, the power distribution in the fuel assemblies can be computed rapidly, thereby improving the performance and safety of the reactor. (Seki, T.)
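    The record describes a dispatch-and-compile pattern: a central processor farms out one LPRM string per worker and collects the per-assembly results for display. The sketch below mirrors only that pattern using Python's multiprocessing; the data layout and the per-assembly arithmetic are placeholders, not the actual power-density model.

```python
from multiprocessing import Pool

def assembly_power(lprm_string):
    """Hypothetical per-processor task: power density of the four fuel
    assemblies surrounding one LPRM string (placeholder arithmetic)."""
    string_id, readings = lprm_string
    return string_id, [sum(readings) / len(readings)] * 4

if __name__ == "__main__":
    # The "central processor" distributes one LPRM string to each worker,
    # then compiles the per-assembly results for display.
    plant_data = [(i, [1.0 + 0.1 * i] * 4) for i in range(8)]   # mock LPRM readings
    with Pool(processes=4) as pool:
        results = dict(pool.map(assembly_power, plant_data))
    print(results)
```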

  7. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement (''the Task'') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  8. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  9. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows that the approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  10. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
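    To second order, the bias expansion the review derives takes the schematic form below, where δ is the matter density contrast, K_ij the tidal field, and b_1, b_2, b_{K²} are bias parameters (higher-order and higher-derivative terms are omitted here):

```latex
\delta_g(\mathbf{x}) = b_1\,\delta(\mathbf{x})
  + \frac{b_2}{2}\left[\delta^2(\mathbf{x}) - \langle\delta^2\rangle\right]
  + b_{K^2}\left[K_{ij}K^{ij} - \langle K_{ij}K^{ij}\rangle\right]
  + \dots
```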

  11. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as following. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  12. Monitoring of ground water quality and heavy metals in soil during large scale bioremediation of petroleum hydrocarbon contaminated waste in India: case studies

    Directory of Open Access Journals (Sweden)

    Ajoy Kumar Mandal

    2014-10-01

    Bioremediation using microbes has been well accepted as an environmentally friendly and economical treatment method for the disposal of hazardous petroleum hydrocarbon contaminated waste (oily waste), and this type of bioremediation has been successfully conducted in the laboratory and on a pilot scale in various countries, including India. Presently there are no federal regulatory guidelines available in India for carrying out field-scale bioremediation of oily waste using microbes. The present study describes the analysis of ground water quality as well as of selected heavy metals in oily waste in some of the large-scale field case studies on bioremediation of oily waste (solid waste) carried out at various oil installations in India. The results show that there was no contribution of oil and grease or of the selected heavy metals to the ground water in the nearby area due to the adoption of this bioremediation process. The results further reveal that there were no changes in the pH and EC of the groundwater due to bioremediation. In almost all cases the selected heavy metals in the residual oily waste were within the permissible limits as per Schedule-II of the Hazardous Waste Management, Handling and Transboundary Movement Act, Amendment 2008 (HWM Act 2008), by the Ministry of Environment and Forests (MoEF), Government of India (GoI).

  13. Quality monitored distributed voting system

    Science.gov (United States)

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.
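    The check-ballot logic lends itself to a short sketch: compare what the decoy voters were supposed to cast against what the system recorded, flagging absences as possible faults and alterations as possible fraud. The data layout and names below are hypothetical, not taken from the patented system.

```python
def audit_check_ballots(received, expected):
    """Compare check ballots cast by decoy voters against the predetermined list.

    received : dict mapping decoy-voter id -> ballot actually recorded (or None)
    expected : dict mapping decoy-voter id -> ballot that was supposed to be cast
    """
    faults, fraud = [], []
    for voter_id, ballot in expected.items():
        recorded = received.get(voter_id)
        if recorded is None:
            faults.append(voter_id)    # absent check ballot: possible system fault
        elif recorded != ballot:
            fraud.append(voter_id)     # altered check ballot: possible counterfeiting
    return faults, fraud

faults, fraud = audit_check_ballots(
    received={"decoy-1": "A", "decoy-3": "B"},
    expected={"decoy-1": "A", "decoy-2": "A", "decoy-3": "C"},
)
print(faults, fraud)   # ['decoy-2'] ['decoy-3']
```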

  14. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of the galaxy distribution. Images of cell structures obtained after reprocessing with a computer are given. Three hypotheses are discussed (vortical, entropic and adiabatic), suggesting different processes for the origin of galaxies and galaxy clusters. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a means of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and of the interactions within galaxy clusters and with the intergalactic medium, is recognized as a notable contribution to the development of theoretical and observational cosmology.

  15. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the design stage. (orig.)

  16. Novel insights in the fecal egg count reduction test for monitoring drug efficacy against soil-transmitted helminths in large-scale treatment programs.

    Directory of Open Access Journals (Sweden)

    Bruno Levecke

    2011-12-01

    The fecal egg count reduction test (FECRT) is recommended to monitor drug efficacy against soil-transmitted helminths (STHs) in public health. However, the impact of factors inherent to study design (sample size and detection limit of the fecal egg count (FEC) method) and host-parasite interactions (mean baseline FEC and aggregation of FEC across the host population) on the reliability of the FECRT is poorly understood. A simulation study was performed in which the FECRT was assessed under varying conditions of the aforementioned factors. Classification trees were built to explore the critical values of these factors required to obtain conclusive FECRT results. The outcome of this analysis was subsequently validated on five efficacy trials across Africa, Asia, and Latin America. Unsatisfactory (<85.0%) sensitivity and specificity results for detecting reduced efficacy were found if sample sizes were small (<10) or if sample sizes were moderate (10-49) combined with highly aggregated FEC (k<0.25). The FECRT remained inconclusive under any evaluated condition for drug efficacies ranging from 87.5% to 92.5% for a reduced-efficacy threshold of 90%, and from 92.5% to 97.5% for a threshold of 95%. The most discriminatory study design required 200 subjects independent of STH status (including subjects who are not excreting eggs). For this sample size, the detection limit of the FEC method and the level of aggregation of the FEC did not affect the interpretation of the FECRT. Only for a threshold of 90% did a mean baseline FEC <150 eggs per gram of stool lead to reduced discriminatory power. This study confirms that the interpretation of the FECRT is affected by a complex interplay of factors inherent to both study design and host-parasite interactions. The results also highlight that a revision of the current World Health Organization guidelines to monitor drug efficacy is indicated. We, therefore, propose novel guidelines to support future monitoring programs.
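    The interplay of sample size, aggregation, and detection limit can be explored with a small Monte Carlo sketch along the lines of the simulation study (this is not the authors' code): baseline FECs drawn from a negative binomial with aggregation parameter k, a multiplicative drug effect, and counts observed in multiples of the detection limit.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_fecrt(n, mean_before, k, true_efficacy, detection_limit=24):
    """Simulate one fecal egg count reduction test.

    n               : sample size (subjects)
    mean_before     : mean baseline FEC (eggs per gram)
    k               : negative-binomial aggregation parameter (small k = highly aggregated)
    true_efficacy   : fraction of eggs removed by the drug
    detection_limit : eggs per gram represented by one egg counted (assumed, e.g. McMaster)
    """
    # Negative binomial parameterised by mean m and aggregation k: p = k / (k + m).
    before = rng.negative_binomial(k, k / (k + mean_before), n).astype(float)
    m_after = mean_before * (1 - true_efficacy)
    after = rng.negative_binomial(k, k / (k + m_after), n).astype(float)
    # Counts are only observed in multiples of the detection limit.
    before = np.round(before / detection_limit) * detection_limit
    after = np.round(after / detection_limit) * detection_limit
    if before.mean() == 0:
        return np.nan
    return 100 * (1 - after.mean() / before.mean())

# Replicate the FECRT many times: how often is reduced efficacy (<90%) detected
# for n=50, highly aggregated counts (k=0.25) and a true efficacy of 87.5%?
reductions = [simulate_fecrt(50, 150, 0.25, 0.875) for _ in range(1000)]
print(np.mean([r < 90 for r in reductions if not np.isnan(r)]))
```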

  17. The Milan Project: A New Method for High-Assurance and High-Performance Computing on Large-Scale Distributed Platforms

    National Research Council Canada - National Science Library

    Kedem, Zvi

    2000-01-01

    ...: Calypso, Chime, and Charlotte, which enable applications developed for ideal, shared-memory parallel machines to execute on distributed platforms that are subject to failures, slowdowns, and changing resource availability...

  18. Skewness of citation impact data and covariates of citation distributions: A large-scale empirical analysis based on Web of Science data

    NARCIS (Netherlands)

    Bornmann, L.; Leydesdorff, L.

    Using percentile shares, one can visualize and analyze the skewness in bibliometric data across disciplines and over time. The resulting figures can be intuitively interpreted and are more suitable for detailed analysis of the effects of independent and control variables on distributions than

  19. Large-Scale Prediction of Seagrass Distribution Integrating Landscape Metrics and Environmental Factors: The Case of Cymodocea nodosa (Mediterranean–Atlantic)

    KAUST Repository

    Chefaoui, Rosa M.

    2015-05-05

    Understanding the factors that affect seagrass meadows encompassing their entire range of distribution is challenging yet important for their conservation. Here, we predict the realized and potential distribution for the species Cymodocea nodosa modelling its environmental niche in the Mediterranean and adjacent Atlantic coastlines. We use a combination of environmental variables and landscape metrics to perform a suite of predictive algorithms which enables examination of the niche and find suitable habitats for the species. The most relevant environmental variables defining the distribution of C. nodosa were sea surface temperature (SST) and salinity. We found suitable habitats at SST from 5.8 °C to 26.4 °C and salinity ranging from 17.5 to 39.3. Optimal values of mean winter wave height ranged between 1.2 and 1.5 m, while waves higher than 2.5 m seemed to limit the presence of the species. The influence of nutrients and pH, despite having weight on the models, was not so clear in terms of ranges that confine the distribution of the species. Landscape metrics able to capture variation in the coastline enhanced significantly the accuracy of the models, despite the limitations caused by the scale of the study. We found potential suitable areas not occupied by the seagrass mainly in coastal regions of North Africa and the Adriatic coast of Italy. The present study describes the realized and potential distribution of a seagrass species, providing the first global model of the factors that can be shaping the environmental niche of C. nodosa throughout its range. We identified the variables constraining its distribution as well as thresholds delineating its environmental niche. Landscape metrics showed promising prospects for the prediction of coastal species dependent on the shape of the coast. By contrasting predictive approaches, we defined the variables affecting the distributional areas that seem unsuitable for C. nodosa as well as those suitable habitats not

  20. Large-Scale Prediction of Seagrass Distribution Integrating Landscape Metrics and Environmental Factors: The Case of Cymodocea nodosa (Mediterranean–Atlantic)

    KAUST Repository

    Chefaoui, Rosa M.; Assis, Jorge; Duarte, Carlos M.; Serrão, Ester A.

    2015-01-01

    Understanding the factors that affect seagrass meadows encompassing their entire range of distribution is challenging yet important for their conservation. Here, we predict the realized and potential distribution for the species Cymodocea nodosa modelling its environmental niche in the Mediterranean and adjacent Atlantic coastlines. We use a combination of environmental variables and landscape metrics to perform a suite of predictive algorithms which enables examination of the niche and find suitable habitats for the species. The most relevant environmental variables defining the distribution of C. nodosa were sea surface temperature (SST) and salinity. We found suitable habitats at SST from 5.8 °C to 26.4 °C and salinity ranging from 17.5 to 39.3. Optimal values of mean winter wave height ranged between 1.2 and 1.5 m, while waves higher than 2.5 m seemed to limit the presence of the species. The influence of nutrients and pH, despite having weight on the models, was not so clear in terms of ranges that confine the distribution of the species. Landscape metrics able to capture variation in the coastline enhanced significantly the accuracy of the models, despite the limitations caused by the scale of the study. We found potential suitable areas not occupied by the seagrass mainly in coastal regions of North Africa and the Adriatic coast of Italy. The present study describes the realized and potential distribution of a seagrass species, providing the first global model of the factors that can be shaping the environmental niche of C. nodosa throughout its range. We identified the variables constraining its distribution as well as thresholds delineating its environmental niche. Landscape metrics showed promising prospects for the prediction of coastal species dependent on the shape of the coast. By contrasting predictive approaches, we defined the variables affecting the distributional areas that seem unsuitable for C. nodosa as well as those suitable habitats not

  1. Long-term spatial and temporal microbial community dynamics in a large-scale drinking water distribution system with multiple disinfectant regimes.

    Science.gov (United States)

    Potgieter, Sarah; Pinto, Ameet; Sigudu, Makhosazana; du Preez, Hein; Ncube, Esper; Venter, Stephanus

    2018-08-01

    Long-term spatial-temporal investigations of microbial dynamics in full-scale drinking water distribution systems are scarce. These investigations can reveal the process, infrastructure, and environmental factors that influence the microbial community, offering opportunities to re-think microbial management in drinking water systems. Often, these insights are missed or are unreliable in short-term studies, which are impacted by the stochastic variabilities inherent to large full-scale systems. In this two-year study, we investigated the spatial and temporal dynamics of the microbial community in a large, full-scale South African drinking water distribution system that uses three successive disinfection strategies (i.e. chlorination, chloramination and hypochlorination). Monthly bulk water samples were collected from the outlet of the treatment plant and from 17 points in the distribution system spanning nearly 150 km, and the bacterial community composition was characterised by Illumina MiSeq sequencing of the V4 hypervariable region of the 16S rRNA gene. As in previous studies, Alpha- and Betaproteobacteria dominated the drinking water bacterial communities, with an increase in Betaproteobacteria post-chloramination. In contrast with previous reports, the observed richness, diversity, and evenness of the bacterial communities were higher in the winter months as opposed to the summer months in this study. In addition to temperature effects, the seasonal variations were also likely influenced by changes in the average water age in the distribution system and corresponding changes in disinfectant residual concentrations. Spatial dynamics of the bacterial communities indicated distance decay, with bacterial communities becoming increasingly dissimilar with increasing distance between sampling locations. These spatial effects dampened the temporal changes in the bulk water community and were the dominant factor when considering the entire distribution system. However

  2. Large-scale gene flow in the barnacle Jehlius cirratus and contrasts with other broadly-distributed taxa along the Chilean coast

    Directory of Open Access Journals (Sweden)

    Baoying Guo

    2017-02-01

    We evaluate the population genetic structure of the intertidal barnacle Jehlius cirratus across a broad portion of its geographic distribution using data from the mitochondrial cytochrome oxidase I (COI) gene region. Despite sampling diversity from over 3,000 km of the linear range of this species, only slight regional structure is indicated, with an overall Φ_CT of 0.036 (p < 0.001) yet no support for isolation by distance. While these results suggest greater structure than previous studies of J. cirratus had indicated, the pattern of diversity is still far more subtle than in other similarly distributed species with similar larval and life history traits. We compare these data and results with recent findings in four other intertidal species that have planktotrophic larvae. There are no clear patterns among these taxa that can be associated with intertidal depth or other known life history traits.

  3. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    Science.gov (United States)

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distributions within the Haihe basin differ. The remote-sensing-derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946
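    The analysis step of such an assimilation can be written compactly. The sketch below is a generic (extended) Kalman filter update, not the WEP-L implementation: the two-state example, the state names and the linearised observation operator mapping states to a SEBS-style evapotranspiration observation are all assumptions for illustration.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """One EKF analysis step: assimilate an observation into the model state.

    x : model state estimate (e.g. soil moisture / ET-related states)
    P : state error covariance
    z : observation vector (e.g. remote-sensing evapotranspiration)
    H : linearised observation operator (Jacobian), state -> observation space
    R : observation error covariance
    """
    y = z - H @ x                              # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x + K @ y                          # updated state
    P_new = (np.eye(len(x)) - K @ H) @ P       # updated covariance
    return x_new, P_new

# Hypothetical 2-state example: [soil moisture, canopy storage], one ET observation.
x = np.array([0.30, 0.05])
P = np.diag([0.02, 0.01])
H = np.array([[2.0, 1.0]])      # assumed ET sensitivity to the states
x, P = ekf_update(x, P, np.array([0.72]), H, np.diag([0.05]))
print(x)
```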

  4. Reversing the Trend of Large Scale and Centralization in Manufacturing: The Case of Distributed Manufacturing of Customizable 3-D-Printable Self-Adjustable Glasses

    Directory of Open Access Journals (Sweden)

    Jephias Gwamuri

    2014-04-01

    Although the trend in manufacturing has been towards centralization to leverage economies of scale, the recent rapid technical development of open-source 3-D printers enables low-cost distributed bespoke production. This paper explores the potential advantages of a distributed manufacturing model for high-value products by investigating the application of 3-D printing to self-refraction eyeglasses. A series of parametric 3-D printable designs is developed, fabricated and tested to overcome limitations identified with mass-manufactured self-correcting eyeglasses designed for the developing world's poor. By utilizing 3-D printable self-adjustable glasses, communities not only gain access to far more diversity in product design, as the glasses can be customized for the individual, but 3-D printing also offers the potential for significant cost reductions. The results show that distributed manufacturing with open-source 3-D printing can empower developing world communities through the ability to print less expensive and customized self-adjusting eyeglasses. This offers the potential to displace both centrally manufactured conventional and self-adjusting glasses while completely eliminating the costs of the conventional optics correction experience, including those of highly-trained optometrists and ophthalmologists and their associated equipment. Although this study analyzed only a single product, it is clear that other products would benefit from the same approach in isolated regions of the developing world.

  5. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  6. Designing a low-cost effective network for monitoring large scale regional seismicity in a soft-soil region (Alsace, France)

    Science.gov (United States)

    Bès de Berc, M.; Doubre, C.; Wodling, H.; Jund, H.; Hernandez, A.; Blumentritt, H.

    2015-12-01

    The Seismological Observatory of the North-East of France (ObSNEF) is developing its monitoring network within the framework of several projects. Among these projects, RESIF (Réseau sismologique et géodésique français) allows the instrumentation of broad-band seismic stations separated by 50-100 km. With the recent and future development of geothermal industrial projects in the Alsace region, the ObSNEF is responsible for designing, building and operating a dense regional seismic network in order to detect and localize earthquakes with both a completeness magnitude of 1.5 and no clipping for M6.0. The project has to be completed before summer 2016. Such a project faces several complex technical and financial constraints. First, most of the Alsace region (150x150 km2), particularly the whole Upper Rhine Graben, is a soft-soil plain where seismic signals are dominated by a high-frequency noise level. Second, all signals have to be transmitted in near real-time. Finally, the total cost of the project must not exceed $450,000. Regarding the noise level in Alsace, in order to achieve a reduction of 40 dB for frequencies above 1 Hz, we plan to instrument 5 of the 8 planned new stations in 50 m deep wells with post-hole sensors. The 3 remaining stations would be located on bedrock along the Vosges piedmont. In order to be sensitive to low-magnitude regional events, we plan to install low-noise short-period post-hole velocimeters. In order to avoid saturation for high-potential local events (M6.0 at 10 km), each velocimeter will be coupled with a surface strong-motion sensor. Regarding connectivity, these stations will have no wired network, which reduces linking costs and delays. We will therefore use solar panels and a 3G/GPRS network. The infrastructure will be minimal and reduced to an outdoor box on a secured parcel of land. In addition to the data-logger, we will use a 12V ruggedized computer, hosting a seed-link server for near

  7. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  8. Large-scale biotic interaction effects - tree cover interacts with shade toler-ance to affect distribution patterns of herb and shrub species across the Alps

    DEFF Research Database (Denmark)

    Nieto-Lugilde, Diego; Lenoir, Jonathan; Abdulhak, Sylvain

    2012-01-01

    on the occurrence of light-demanding species via size-asymmetric competition for light, but a facilitative effect on shade-tolerant species. In order to compare the relative importance of tree cover, four models with different combinations of variables (climate, soil and tree cover) were run for each species. Then...... role. Results indicated that high tree cover causes range contraction, especially at the upper limit, for light-demanding species, whereas it causes shade-tolerant species to extend their range upwards and downwards. Tree cover thus drives plant-plant interactions to shape plant species distribution

  9. Two distinct mtDNA lineages of the blue crab reveal large-scale population structure in its native Atlantic distribution

    Science.gov (United States)

    Alaniz Rodrigues, Marcos; Dumont, Luiz Felipe Cestari; dos Santos, Cléverson Rannieri Meira; D'Incao, Fernando; Weiss, Steven; Froufe, Elsa

    2017-10-01

    For the first time, a molecular approach was used to evaluate the phylogenetic structure of the disjunct native American distribution of the blue crab Callinectes sapidus. Population structure was investigated by sequencing 648 bp of the cytochrome oxidase subunit 1 (COI) gene, in a total of 138 sequences stemming from individual samples from both the northern and southern hemispheres of the Western Atlantic distribution of the species. A Bayesian approach was used to construct a phylogenetic tree for all samples, and a 95% confidence parsimony network was created to depict the relationship among haplotypes. Results revealed two highly distinct lineages, one containing all samples from the United States and some from Brazil (lineage 1) and the second restricted to Brazil (lineage 2). In addition, gene flow (at least for females) was detected among estuaries at local scales and there is evidence for shared haplotypes in the south. Furthermore, the findings of this investigation support the contemporary introduction of haplotypes that have apparently spread from the south to the north Atlantic.

  10. Minimizing Harmonic Distortion Impact at Distribution System with Considering Large-Scale EV Load Behaviour Using Modified Lightning Search Algorithm and Pareto-Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    S. N. Syed Nasir

    2018-01-01

    This research focuses on the optimal placement and sizing of multiple variable passive filters (VPF) to mitigate harmonic distortion due to charging stations (CS) in a 449-bus distribution network. There are 132 CS units, which are scheduled based on user behaviour within 24 hours at 15-minute intervals. Considering the varying CS patterns and their harmonic impact, the Modified Lightning Search Algorithm (MLSA) is used to find the coordination of 22 VPF units, so that less harmonic current is injected from the 415 V bus into the medium-voltage network and power loss is also reduced. The power system harmonic flow, VPF, CS, battery, and the analysis are modelled in the MATLAB/m-file platform. High Performance Computing (HPC) is used to speed up the simulation. A Pareto-fuzzy technique is used to obtain the sizing of VPF from all nondominated solutions. From the results, the optimal placements and sizes of VPF are able to reduce the maximum voltage and current THD and the total apparent losses by up to 39.14%, 52.5%, and 2.96%, respectively. Therefore, it can be concluded that the MLSA is a suitable method to mitigate harmonics, and it is beneficial in minimizing the impact of aggressive CS installation in the distribution network.
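    A metaheuristic such as MLSA needs an objective for each candidate placement/sizing solution. The sketch below is a simplified weighted-sum stand-in for the paper's Pareto-fuzzy treatment: the weights are hypothetical, and the harmonic load flow producing the THD and loss figures is assumed to be computed elsewhere.

```python
def fitness(thd_v, thd_i, losses_kw, w=(0.4, 0.4, 0.2)):
    """Aggregate objective for one candidate VPF placement/sizing solution.

    thd_v, thd_i : worst-case voltage and current THD (%) from a harmonic load flow
    losses_kw    : total apparent losses of the feeder
    w            : weights for the three objectives (hypothetical)
    """
    return w[0] * thd_v + w[1] * thd_i + w[2] * losses_kw

# A metaheuristic would generate candidate solutions, evaluate each over the
# 24 h / 15 min charging schedule, and keep the non-dominated ones for the
# Pareto-fuzzy sizing step; here we just compare two mock candidates.
candidates = {"sol-A": (4.8, 7.9, 110.0), "sol-B": (5.3, 6.4, 102.0)}
best = min(candidates, key=lambda s: fitness(*candidates[s]))
print(best)
```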

  11. 8th International Symposium on Intelligent Distributed Computing & Workshop on Cyber Security and Resilience of Large-Scale Systems & 6th International Workshop on Multi-Agent Systems Technology and Semantics

    CERN Document Server

    Braubach, Lars; Venticinque, Salvatore; Badica, Costin

    2015-01-01

    This book represents the combined peer-reviewed proceedings of the Eighth International Symposium on Intelligent Distributed Computing - IDC'2014, of the Workshop on Cyber Security and Resilience of Large-Scale Systems - WSRL-2014, and of the Sixth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2014. All the events were held in Madrid, Spain, during September 3-5, 2014. The 47 contributions published in this book address several topics related to the theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, knowledge management, big data and ontologies, social networks, and swarm intelligence and videogames, among others.

  12. Large-scale generic test stand for testing of multiple configurations of air filters utilizing a range of particle size distributions

    Science.gov (United States)

    Giffin, Paxton K.; Parsons, Michael S.; Unz, Ronald J.; Waggoner, Charles A.

    2012-05-01

    The Institute for Clean Energy Technology (ICET) at Mississippi State University has developed a test stand capable of lifecycle testing of high-efficiency particulate air filters and other filters specified in the American Society of Mechanical Engineers Code on Nuclear Air and Gas Treatment (AG-1). The test stand is currently equipped to test AG-1 Section FK radial flow filters, and expansion is currently underway to increase testing capabilities for other types of AG-1 filters. The test stand is capable of producing differential pressures of 12.45 kPa (50 in. w.c.) at volumetric air flow rates up to 113.3 m3/min (4000 CFM). Testing is performed at elevated and ambient conditions of temperature and relative humidity. Current testing utilizes three challenge aerosols: carbon black, alumina, and Arizona road dust (A1-Ultrafine). Each aerosol has a different mass median diameter to test loading over a wide range of particle sizes. The test stand is designed to monitor and maintain relative humidity and temperature to required specifications. Instrumentation is implemented on the upstream and downstream sections of the test stand as well as on the filter housing itself. Representative data are presented herein illustrating the test stand's capabilities. Digital images of the filter pack collected during and after testing are displayed after the representative data are discussed. In conclusion, the ICET test stand with AG-1 filter testing capabilities has been developed, and hurdles such as test parameter stability and design flexibility have been overcome.

  13. Column Store for GWAC: A High-cadence, High-density, Large-scale Astronomical Light Curve Pipeline and Distributed Shared-nothing Database

    Science.gov (United States)

    Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan

    2016-11-01

    The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 degrees2 every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ˜175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS of MonetDB, taking advantage of MonetDB’s high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantic) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture is able to achieve both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC in real-time data processing and management of massive data.
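
    The two-level time partitioning described above can be illustrated with a short, hypothetical sketch using the pymonetdb client: workers hold the physical time slices, the master registers each slice as a REMOTE TABLE, and a MERGE TABLE presents them as one logical light-curve table. Host, database, and table names are assumptions, not the actual GWAC schema.

      import pymonetdb

      conn = pymonetdb.connect(database="gwac", hostname="master", port=50000)
      cur = conn.cursor()

      # Logical table on the master; queries against it fan out to partitions.
      cur.execute("""CREATE MERGE TABLE lightcurve
                     (src_id BIGINT, mjd DOUBLE, mag REAL, mag_err REAL)""")

      # One time slice stored on a worker (the physical table must already
      # exist there); the master sees it as a REMOTE TABLE of identical schema.
      cur.execute("""CREATE REMOTE TABLE lightcurve_2016_11
                     (src_id BIGINT, mjd DOUBLE, mag REAL, mag_err REAL)
                     ON 'mapi:monetdb://worker1:50000/gwac'""")
      cur.execute("ALTER TABLE lightcurve ADD TABLE lightcurve_2016_11")

      # A light-curve lookup now transparently reaches the relevant partitions.
      cur.execute("SELECT mjd, mag FROM lightcurve WHERE src_id = 42 ORDER BY mjd")
      print(cur.fetchall())
      conn.commit()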

  14. A large-scale intercomparison of stratospheric vertical distributions of NO2 and BrO retrieved from the SCIAMACHY limb measurements and ground-based twilight observations

    Science.gov (United States)

    Rozanov, Alexei; Hendrick, Francois; Lotz, Wolfhardt; van Roozendael, Michel; Bovensmann, Heinrich; Burrows, John P.

    This study is devoted to the intercomparison of NO2 and BrO vertical profiles obtained from satellite and ground-based measurements. Although the ground-based observations are performed only at selected locations, they have great potential for the validation of satellite measurements, since continuous long-term measurement series performed with the same instruments are available; long-term trends in the observed species can thus be analyzed and intercompared. Previous intercomparisons of the vertical distributions of NO2 and BrO retrieved from SCIAMACHY limb measurements at the University of Bremen and obtained at IASB-BIRA by applying a profiling technique to ground-based zenith-sky DOAS observations have shown a good agreement between the results of completely different measurement techniques. However, only a relatively short time period of one year was analyzed so far, which does not allow seasonal variations and trends to be investigated. Furthermore, some minor discrepancies remain to be analyzed. In the current study, datasets spanning several years obtained at Observatoire de Haute-Provence (OHP) in France and at Harestua in Norway will be compared to the retrievals from SCIAMACHY limb measurements. Seasonal and annual variations will be analyzed and possible reasons for the remaining discrepancies will be discussed.

  15. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 +/- 5 mu m. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further development of sensitive biosensor assays...

  16. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abro...

  17. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. For sd-shell nuclei, this model of number- and spin-projected two-quasiparticle excitations with realistic forces yields results as good as the 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus 46 Ti and results for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to 130 Ce and 128 Ba using the same effective nucleon-nucleon interaction. (Auth.)

  18. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  19. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  20. Sensor distributions for structural monitoring

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Bernal, Dionisio

    2017-01-01

    Deciding on the spatial distribution of output sensors for vibration-based structural health monitoring (SHM) is a task that has been, and still is, studied extensively. Yet, when referring to the conventional damage characterization hierarchy, composed of detection, localization, and quantification, it is primarily the first component that has been addressed with regard to optimal sensor placement. In this particular context, a common approach is to distribute sensors, of which the amount is determined a priori, such that some scalar function of the probability of detection for a pre-defined set of damage patterns is maximized. Obviously, the optimal sensor distribution, in terms of damage detection, is algorithm-dependent, but studies have shown how correlation generally exists between the different strategies. However, it still remains a question how this "optimality" correlates...
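
    The placement strategy described in this record, maximizing a scalar function of the probability of detection over a pre-defined set of damage patterns, can be illustrated with a greedy sketch. The detection-probability matrix and the heuristic itself are assumptions for the example, not the paper's method.

      import numpy as np

      def greedy_placement(pod, n_sensors):
          """pod[i, j] = P(sensor at candidate location i detects damage pattern j).
          Greedily add locations maximizing the mean probability that at least
          one chosen sensor detects each pattern (independence assumed)."""
          chosen = []
          miss = np.ones(pod.shape[1])     # P(no chosen sensor detects pattern j)
          for _ in range(n_sensors):
              scores = [(1.0 - miss * (1.0 - pod[i])).mean()
                        if i not in chosen else -1.0 for i in range(pod.shape[0])]
              best = int(np.argmax(scores))
              chosen.append(best)
              miss *= 1.0 - pod[best]
          return chosen

      pod = np.array([[0.9, 0.1, 0.2],     # invented candidate locations 0..3
                      [0.2, 0.8, 0.3],     # vs. damage patterns 0..2
                      [0.3, 0.2, 0.7],
                      [0.5, 0.5, 0.5]])
      print(greedy_placement(pod, 2))      # -> [3, 1] for this matrix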

  1. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World famous architects such as Herzog & de Meuron, Zaha Hadid, Richard Meier and David Chipperfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and in Zaha Hadid's black curved smooth concrete surfaces at Ordrupgård. Furthermore, one can point to initiatives such as "Synlig beton" (visible concrete), presented on the website www.synligbeton.dk, and spæncom's aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project "Lasting large scale glazed concrete formwork," which I am working on at DTU, Department of Architectural Engineering, will be able to complement these. It is a project where I...

  2. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  3. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large scale cross hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones, and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of meters. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie in a narrow range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by testing the high hydraulic conductivity regions specifically. The Piezomac and single hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could be either because there is no hydraulic connection, or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  4. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  5. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  6. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
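
    A toy sketch of the design idea above, vehicles as autonomous processes that independently select routes from shared link times: the two-route network, the times, and all names are invented for illustration.

      import multiprocessing as mp
      import random

      def vehicle(vid, link_times, report_q):
          """Behavior model: pick the currently fastest route, then report
          back to the TMC like an instrumented probe vehicle."""
          route = min(("A", "B"), key=lambda r: link_times[r])
          report_q.put((vid, route, link_times[route]))

      if __name__ == "__main__":
          mgr = mp.Manager()
          link_times = mgr.dict({"A": random.uniform(5, 15),
                                 "B": random.uniform(5, 15)})
          report_q = mp.Queue()
          procs = [mp.Process(target=vehicle, args=(v, link_times, report_q))
                   for v in range(4)]
          for p in procs: p.start()
          for p in procs: p.join()
          for _ in procs:                    # the "TMC" collects probe reports
              vid, route, t = report_q.get()
              print("vehicle %d chose route %s (%.1f min)" % (vid, route, t))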

  7. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet; more recent ones are the data centers in cloud environments. Managing such networks involves several tasks, such as traffic monitoring, security and performance optimization, which together are a major burden for the network administrator. This research report studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and newer Gossip-bas...

  8. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  9. Continuous atmospheric pollutant distribution monitor

    International Nuclear Information System (INIS)

    Burger, L.W.

    1984-06-01

    The feasibility of an on-line continuous pollutant distribution monitor has been established. The device employs a small microprocessor to simulate atmospheric dispersion using the segmented Gaussian plume model, an anemometer-bivane to measure the wind conditions, and a control 'box' with which certain model parameters can be specified and changed without interrupting the execution of the computer program. The output of the device is in the form of contours of equal ground-level concentration on a colour monitor, or a printer dump of the associated graphics screen. The device is only suitable for short range predictions due to the memory limitations of the microprocessor. Predictions are updated approximately once every minute. Various case studies were performed to investigate the behaviour of the model when subjected to controlled input data. A series of short-range experimental runs was conducted in which predicted SO2 concentrations were compared with measured values. The performance of the model in the case studies was acceptable, and comparisons with the measured SO2 concentrations showed the model to overpredict by about 30%, which is considered satisfactory. This study can be applied to the transport of radioactive effluents in the atmosphere.
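
    The core of the Gaussian plume calculation such a device performs is compact enough to sketch: ground-level concentration downwind of a point source, including ground reflection. The dispersion-coefficient fits below assume a single (neutral) stability class and are illustrative only.

      import numpy as np

      def ground_level_concentration(Q, u, H, x, y):
          """C(x, y, z=0) for emission rate Q (g/s), wind speed u (m/s),
          effective release height H (m), downwind x and crosswind y (m)."""
          sigma_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)   # assumed rural fits
          sigma_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
          return (Q / (np.pi * u * sigma_y * sigma_z)
                  * np.exp(-0.5 * (y / sigma_y) ** 2)
                  * np.exp(-0.5 * (H / sigma_z) ** 2))

      # 100 g/s SO2 source, 4 m/s wind, 50 m release height, 1 km downwind:
      print(ground_level_concentration(100.0, 4.0, 50.0, 1000.0, 0.0))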

  10. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  11. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces over a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty tooth samples were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  12. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1996-01-01

    This paper discusses the growing needs for distributed system monitoring and compares them to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring. (author)

  13. Prediction of etching-shape anomaly due to distortion of ion sheath around a large-scale three-dimensional structure by means of on-wafer monitoring technique and computer simulation

    International Nuclear Information System (INIS)

    Kubota, Tomohiro; Ohtake, Hiroto; Araki, Ryosuke; Yanagisawa, Yuuki; Samukawa, Seiji; Iwasaki, Takuya; Ono, Kohei; Miwa, Kazuhiro

    2013-01-01

    A system for predicting distortion of a profile during plasma etching was developed. The system consists of a combination of measurement and simulation. An ‘on-wafer sheath-shape sensor’ for measuring the plasma-sheath parameters (sheath potential and thickness) on the stage of the plasma etcher was developed. The sensor has numerous small electrodes for measuring sheath potential and saturation ion-current density, from which sheath thickness can be calculated. The results of the measurement show reasonable dependence on source power, bias power and pressure. Based on self-consistent calculation of potential distribution and ion- and electron-density distributions, simulation of the sheath potential distribution around an arbitrary 3D structure and the trajectory of incident ions from the plasma to the structure was developed. To confirm the validity of the distortion prediction by comparing it with experimentally measured distortion, silicon trench etching under chlorine inductively coupled plasma (ICP) was performed using a sample with a vertical step. It was found that the etched trench was distorted when the distance from the step was several millimetres or less. The distortion angle was about 20° at maximum. Measurement was performed using the on-wafer sheath-shape sensor in the same plasma condition as the etching. The ion incident angle, calculated as a function of distance from the step, successfully reproduced the experimentally measured angle, indicating that the combination of measurement by the on-wafer sheath-shape sensor and simulation can predict distortion of an etched structure. This prediction system will be useful for designing devices with large-scale 3D structures (such as those in MEMS) and determining the optimum etching conditions to obtain the desired profiles. (paper)
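
    Why a nearby step tilts the ion incidence angle can be seen from a toy trajectory integration (not the authors' simulator): an ion falling through the sheath picks up lateral velocity where the equipotentials bend around the step. The field model, ion species, and geometry are crude assumptions for illustration.

      import numpy as np

      E0 = 3.0e4                       # assumed vertical sheath field, V/m
      def field(x, z):
          """Lateral component decaying with distance x (m) from the step;
          the 1 mm decay length is an arbitrary choice."""
          return 0.2 * E0 * np.exp(-x / 1e-3), E0

      q_over_m = 1.602e-19 / (35 * 1.661e-27)   # Cl+ ion assumed
      x, z = 0.5e-3, 2.0e-3            # start 0.5 mm from step, 2 mm above wafer
      vx, vz, dt = 0.0, 0.0, 1e-10
      while z > 0.0:                   # integrate until the ion hits the wafer
          Ex, Ez = field(x, z)
          vx += q_over_m * Ex * dt
          vz -= q_over_m * Ez * dt
          x += vx * dt
          z += vz * dt

      print("incident angle: %.1f deg off normal" % np.degrees(np.arctan2(vx, -vz)))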

  14. Monitoring distributed object and component communication

    NARCIS (Netherlands)

    Diakov, N.K.

    2004-01-01

    This thesis presents our work in the area of monitoring distributed software applications (DSAs). We produce three main results: (1) a design approach for building monitoring systems, (2) a design of a system for MOnitoring Distributed Object and Component Communication (MODOCC) behavior in

  15. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  16. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984, in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  17. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  18. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  19. Large Scale EOF Analysis of Climate Data

    Science.gov (United States)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach to extracting EOFs from 3D climate data. We implement the method in Apache Spark and process multi-TB datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2 terabyte data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6 hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare them to the EOFs computed on the surface temperature field alone. Our analyses provide evidence of Kelvin and Rossby waves and of components of large-scale modes of oscillation, including the ENSO and PDO, that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time scale than the ocean, we expect that the results will demonstrate an even greater advantage to computing 3D EOFs in lieu of 2D EOFs.
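
    The distributed EOF computation lends itself to a condensed sketch: the EOFs are the right singular vectors of the mean-centered, latitude-weighted time-by-grid data matrix, obtainable with Spark's distributed RowMatrix SVD. The random toy data below stands in for the reanalysis grid.

      import numpy as np
      from pyspark import SparkContext
      from pyspark.mllib.linalg import Vectors
      from pyspark.mllib.linalg.distributed import RowMatrix

      sc = SparkContext(appName="eof-sketch")
      # Toy stand-in: 200 timesteps x 500 grid cells, already lat-weighted.
      rows = sc.parallelize([np.random.randn(500) for _ in range(200)]).cache()
      mean = rows.reduce(lambda a, b: a + b) / rows.count()   # temporal mean field

      centered = rows.map(lambda r: Vectors.dense(r - mean))
      svd = RowMatrix(centered).computeSVD(10, computeU=False)
      eofs = svd.V.toArray()           # columns are the spatial EOF patterns
      s = svd.s.toArray()
      # Share of variance captured by mode 1 among the 10 retained modes:
      print("EOF 1 variance fraction: %.3f" % (s[0] ** 2 / (s ** 2).sum()))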

  20. Power distribution monitor in a nuclear reactor

    International Nuclear Information System (INIS)

    Uematsu, Hitoshi

    1983-01-01

    Purpose: To enable accurate monitoring of the reactor power distribution within a short time when an abnormality occurs in the in-core neutron monitors, or when the reactor core state changes after calibration of the neutron monitors. Constitution: The power distribution monitor comprises: a power distribution calculator, which receives measured values from the reactor core instrumentation and calculates the in-core neutron flux distribution and power distribution based on previously incorporated physical models; an RCF calculator, which receives the counts from the in-core neutron monitors together with the neutron flux and power distributions calculated by the power distribution calculator, and compensates for the counting errors of the in-core neutron monitors and the calculation errors of the calculated power distribution, thereby obtaining the power distribution within the reactor core; and an input/output device for entering the data required by the power distribution calculator and displaying the results calculated by the RCF calculator. (Ikeda, J.)

  1. Large scale solar district heating. Evaluation, modelling and designing - Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The appendices present the following: A) Cad-drawing of the Marstal CSHP design. B) Key values - large-scale solar heating in Denmark. C) Monitoring - a system description. D) WMO-classification of pyranometers (solarimeters). E) The computer simulation model in TRNSYS. F) Selected papers from the author. (EHS)

  2. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  3. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1995-11-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), network services and applications, the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring

  4. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  5. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  6. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  7. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  8. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that the approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  9. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting have an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background guarantees which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods with a view to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting neither possesses the political system's competence to make decisions, nor can it successfully judge by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organizational form, (3) the theoretical conception of large-scale research and policy consulting. (orig.) [de]

  10. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, a number of PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  11. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos ("one-halo conformity"). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. However, recent observations have further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or "large-scale conformity"). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity, as the strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  12. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  13. Power distribution monitor for nuclear reactor

    International Nuclear Information System (INIS)

    Nishizawa, Yasuo; Kiguchi, Takashi.

    1974-01-01

    Object: To compare the measured local power range monitor (LPRM) readings with the result of a primary calculation and thereby correct the threshold conditions for the primary calculation, so as to rapidly grasp and monitor the existing power distribution. Structure: The readings of the LPRMs disposed in the nuclear reactor are processed in a data processor to remove noise and are transmitted to a threshold condition processor to be stored. The LPRM readings measured by the threshold condition processor are compared with the calculated LPRM values transmitted from the primary processor, whereby the threshold conditions are corrected and transmitted to the primary processor. After the completion of the calculation, the traversing in-core probe (TIP) readings are converted to a thermal output distribution or a linear output density distribution and transmitted to an output indicator or an output typewriter. The operator may monitor the existing power distribution by watching the output indicator. (Kamimura, M.)

  14. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented, with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks, given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society, with the DDN project as its...

  15. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  16. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN's commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  17. Towards flexible and scalable distributed monitoring with mobile agents

    NARCIS (Netherlands)

    Liotta, A.

    2001-01-01

    The tremendous success of the Internet has made possible and even encouraged the realisation of systems characterised by very large scale and high level of distribution. Managing such systems requires systematic communication between a centralised station and distributed components. In many cases, a

  18. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  19. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. ... arises, to reduce the spread in the LSGT 50% gap value.) The worst charges, such as those with the highest or lowest densities, the largest re-pressed...

  20. Continuous Distributed Top-k Monitoring over High-Speed Rail Data Stream in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Hanning Wang

    2013-01-01

    In the environment of cloud computing, real-time mass data about high-speed rail, derived from intense monitoring by large-scale sensing equipment, provides strong support for the safety and maintenance of high-speed rail. In this paper, we focus on continuous distributed Top-k monitoring over multisource distributed data streams for high-speed rail monitoring. Specifically, we formalize the Top-k monitoring model for high-speed rail and propose DTMR, a Top-k monitoring algorithm for random, continuous, or strictly monotone aggregation functions. DTMR was shown to be valid by extensive experiments.
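
    The setting can be made concrete with a baseline sketch (this is not the paper's DTMR algorithm, whose point is to limit communication): each monitoring node holds partial scores per object, and a coordinator merges them under a monotone aggregation function, here a sum, and keeps the global top-k. Object names and scores are invented.

      import heapq
      from collections import defaultdict

      def global_topk(node_scores, k):
          """node_scores: one {object_id: local_score} dict per node."""
          total = defaultdict(float)
          for partial in node_scores:
              for obj, s in partial.items():
                  total[obj] += s          # monotone aggregation: sum
          return heapq.nlargest(k, total.items(), key=lambda kv: kv[1])

      nodes = [{"axle_17": 4.2, "axle_3": 1.1, "pantograph_2": 2.5},
               {"axle_17": 3.8, "axle_9": 2.9},
               {"pantograph_2": 4.0, "axle_3": 0.7}]
      print(global_topk(nodes, 2))         # two highest-scoring objects overall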

  1. Performance Monitoring of Residential Hot Water Distribution Systems

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Anna; Lanzisera, Steven; Lutz, Jim; Fitting, Christian; Kloss, Margarita; Stiles, Christopher

    2014-08-11

    Current water distribution systems are designed such that users need to run the water for some time to achieve the desired temperature, wasting energy and water in the process. We developed a wireless sensor network for large-scale, long time-series monitoring of residential water end use. Our system consists of flow meters connected to wireless motes transmitting data to a central manager mote, which in turn posts data to our server via the internet. This project also demonstrates a reliable and flexible data collection system that could be configured for various other forms of end use metering in buildings. The purpose of this study was to determine water and energy use and waste in hot water distribution systems in California residences. We installed meters at every end use point and at the water heater in 20 homes and collected 1 s flow and temperature data over an 8 month period. For typical shower and dishwasher events, approximately half the energy is wasted. This relatively low efficiency highlights the importance of further examining the energy and water waste in hot water distribution systems.
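
    The kind of distribution-loss accounting such 1 s data enables can be sketched simply: during a draw, water delivered below a usefulness threshold counts as waste while the cooled line water purges. The threshold and the sample event are assumptions.

      def draw_waste(samples, t_useful=40.0):
          """samples: (flow_lpm, temp_C) pairs at 1 s intervals for one event.
          Returns (wasted_litres, total_litres)."""
          wasted = total = 0.0
          for flow_lpm, temp in samples:
              litres = flow_lpm / 60.0     # volume in this 1 s sample
              total += litres
              if temp < t_useful:          # still purging cooled line water
                  wasted += litres
          return wasted, total

      # 8 l/min shower: 25 s of cooled line water before hot water arrives.
      event = [(8.0, 22.0)] * 25 + [(8.0, 43.0)] * 300
      w, t = draw_waste(event)
      print("wasted %.1f of %.1f litres (%.0f%%)" % (w, t, 100.0 * w / t))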

  2. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  3. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  4. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  5. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  6. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  7. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  8. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  9. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  10. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  11. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  12. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  13. Experimental investigation of leak detection using mobile distributed monitoring system

    Science.gov (United States)

    Chen, Jiang; Zheng, Junli; Xiong, Feng; Ge, Qi; Yan, Qixiang; Cheng, Fei

    2018-01-01

    The leak detection of rockfill dams is currently hindered by spatial and temporal randomness and the wide monitoring range. The spatial resolution of fiber Bragg grating (FBG) temperature sensing technology depends on the distance between measuring points, so the number of measuring points must be increased to locate a leak precisely, which raises the monitoring cost and makes it difficult to apply this technology to rockfill dam leakage monitoring. In this paper, a practical mobile distributed monitoring system with dual tubes is used, combining an FBG sensing system and a hydrothermal cycling system. The dual-tube structure is composed of an outer heating pipe of polyethylene of raised temperature resistance (PE-RT), an inner polytetrafluoroethylene tube, and an FBG sensor string; the sensor string can be dragged freely inside the inner tube to change the position of the measuring points and improve the spatial resolution. To test the effectiveness of the system, a large-scale model test of concentrated leakage under 13 working conditions was carried out, identifying the location, number, and leakage rate of leakage passages. Based on Newton's law of cooling, the leakage state is identified using the seepage identification index ζv, determined from the cooling curve. Results suggest that the monitoring system shows high sensitivity, improves the spatial resolution with a limited number of measuring points, and thus better locates the leakage area. In addition, the seepage identification index ζv correlates well, qualitatively, with the leakage rate.
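
    As an illustration of the identification principle described above, the following is a minimal sketch (not the authors' code; the exact definition of ζv is not given in the abstract) that fits Newton's law of cooling to one FBG measuring point's temperature record and uses the fitted cooling-rate constant as a stand-in seepage indicator.

```python
# Minimal sketch: fit T(t) = T_inf + dT0 * exp(-k * t) to a cooling record.
# Sensors lying in a leakage path cool faster (larger k) because seepage
# carries heat away, so comparing k along the string localizes the leak.
import numpy as np
from scipy.optimize import curve_fit

def cooling_curve(t, T_inf, dT0, k):
    """Newton's law of cooling: ambient level plus decaying excess."""
    return T_inf + dT0 * np.exp(-k * t)

def fit_cooling_rate(t, T):
    """Fit the cooling curve to one measuring point's record; return k."""
    p0 = (T.min(), T.max() - T.min(), 0.01)          # rough initial guess
    (T_inf, dT0, k), _ = curve_fit(cooling_curve, t, T, p0=p0)
    return k
```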

  14. Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.

    Science.gov (United States)

    Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

    Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'- 45°7' N; 5°3'W - 28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of the spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results ran counter both to the traditional view based on the hump-shaped theory of bathymetric patterns and to the commonly accepted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness.
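
    As a pointer to what such multi-index comparisons involve, here is a minimal sketch (illustrative; the abstract does not list the 11 indices actually used) of three classical diversity indices computed from species counts in one trawl haul.

```python
import numpy as np

def species_richness(counts):
    """Number of species present in the haul."""
    return int((np.asarray(counts) > 0).sum())

def shannon_index(counts):
    """Shannon entropy of the relative abundances."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def simpson_index(counts):
    """Gini-Simpson index: probability two random individuals differ."""
    p = np.asarray(counts, dtype=float) / np.sum(counts)
    return float(1.0 - (p ** 2).sum())

haul = [12, 3, 0, 7, 1, 0, 25]   # individuals per species in one haul
print(species_richness(haul), shannon_index(haul), simpson_index(haul))
```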

  15. On precipitation monitoring with theoretical statistical distributions

    Science.gov (United States)

    Cindrić, Ksenija; Juras, Josip; Pasarić, Zoran

    2018-04-01

    A common practice in meteorological drought monitoring is to transform the observed precipitation amounts to the standardised precipitation index (SPI). Though the gamma distribution is usually employed for this purpose, other distributions may be used, particularly in regions where zero precipitation amounts are recorded frequently. In this study, two distributions are considered alongside the gamma distribution: the compound Poisson exponential distribution (CPE) and the square root normal distribution (SRN). They are fitted to monthly precipitation amounts measured at 24 stations in Croatia over the 55-year period 1961-2015. At five stations, long-term series (1901-2015) are available and have been used for a more detailed investigation. The fit of the theoretical distributions to the empirical ones is tested by comparing the corresponding empirical and theoretical ratios of the skewness and the coefficient of variation. Furthermore, following the common approach to precipitation monitoring (CLIMAT reports), the empirical and theoretical quintiles in the two periods (1961-1990 and 1991-2015) are compared. The results of the present study reveal that it would be more appropriate to implement theoretical distributions in such climate reports, since they provide a better basis for monitoring than the currently used empirical distribution. Nevertheless, deciding on an optimal theoretical distribution for different climate regimes and different time periods is not easy. For the Croatian stations (covering different climate regimes), the CPE or SRN distribution could also be the right choice in climatological practice, in addition to the gamma distribution.
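
    For concreteness, this is a minimal sketch of the standard SPI transformation the abstract refers to: fit a gamma distribution to the monthly totals, map each value through the fitted cumulative distribution function, and convert to standard normal quantiles, with the usual mixed-distribution correction for zero-precipitation months. (The CPE and SRN variants would only change which distribution is fitted.)

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardised precipitation index for a 1-D array of monthly totals."""
    precip = np.asarray(precip, dtype=float)
    nonzero = precip[precip > 0]
    q = (precip == 0).mean()                     # probability of a zero month
    # Fit the gamma distribution to the non-zero amounts (location fixed at 0).
    a, loc, scale = stats.gamma.fit(nonzero, floc=0)
    cdf = q + (1 - q) * stats.gamma.cdf(precip, a, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)                   # back-transform to z-scores
```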

  16. Distributed Monitoring of Voltage Collapse Sensitivity Indices

    OpenAIRE

    Simpson-Porco, John W.; Bullo, Francesco

    2016-01-01

    The assessment of voltage stability margins is a promising direction for wide-area monitoring systems. Accurate monitoring architectures for long-term voltage instability are typically centralized and lack scalability, while completely decentralized approaches relying on local measurements tend towards inaccuracy. Here we present distributed linear algorithms for the online computation of voltage collapse sensitivity indices. The computations are collectively performed by processors embedded ...

  17. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  18. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  19. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  20. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305 METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...typical iteration can be partitioned so that where B is an m x m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  1. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmological principles with possible large-scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle still compatible with reasonable inhomogeneities is a full conformal symmetry in the 3-space defined by the cosmological velocity field; but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  2. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  3. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  4. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  5. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt*, Patrick Hulin†, Tim Leek†, Fredrich Ulrich†, Ryan Whelan† (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  6. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  7. Distributed Monitoring System Based on ICINGA

    CERN Multimedia

    Haen, C; Neufeld, N

    2011-01-01

    The LHCb online system relies on a large and heterogeneous IT infrastructure: it comprises more than 2000 servers and embedded systems and more than 200 network devices. While for the control and monitoring of detectors, PLCs, and readout boards the industry-standard SCADA system PVSS II has been put in production, we use a low-level monitoring system to monitor the control infrastructure itself. While our previous system was based on a single central NAGIOS server, our current system uses a distributed ICINGA infrastructure.

  8. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large-scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today's typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large-scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview of the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  9. On the relevance of efficient, integrated computer and network monitoring in HEP distributed online environment

    CERN Document Server

    Carvalho, D F; Delgado, V; Albert, J N; Bellas, N; Javello, J; Miere, Y; Ruffinoni, D; Smith, G

    1996-01-01

    Large Scientific Equipments are controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer ...

  10. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    Science.gov (United States)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large Scientific Equipments are controlled by Computer Systems whose complexity is growing driven, on the one hand by the volume and variety of the information, its distributed nature, the sophistication of its treatment and, on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems or Distributed Computer Control Systems (DCCS) for those systems dealing more with real time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer System.

  11. Application of visible/near infrared spectroscopy to quality control of fresh fruits and vegetables in large-scale mass distribution channels: a preliminary test on carrots and tomatoes.

    Science.gov (United States)

    Beghi, Roberto; Giovenzana, Valentina; Tugnolo, Alessio; Guidetti, Riccardo

    2018-05-01

    The market for fruits and vegetables is mainly controlled by the mass distribution channel (MDC). MDC buyers do not have useful instruments to rapidly evaluate the quality of the products. Decisions by the buyers are driven primarily by pricing strategies rather than product quality. Simple, rapid and easy-to-use methods for objectively evaluating the quality of postharvest products are needed. The present study aimed to use visible and near-infrared (vis/NIR) spectroscopy to estimate some qualitative parameters of two low-price products (carrots and tomatoes) of various brands, as well as evaluate the applicability of this technique for use in stores. A non-destructive optical system (vis/NIR spectrophotometer with a reflection probe, spectral range 450-1650 nm) was tested. The differences in quality among carrots and tomatoes purchased from 13 stores on various dates were examined. The reference quality parameters (firmness, water content, soluble solids content, pH and colour) were correlated with the spectral readings. The models derived from the optical data gave positive results, in particular for the prediction of the soluble solids content and the colour, with better results for tomatoes than for carrots. The application of optical techniques may help MDC buyers to monitor the quality of postharvest products, leading to an effective optimization of the entire supply chain. © 2017 Society of Chemical Industry.
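
    A minimal sketch of an assumed chemometric workflow for such data (the abstract does not state which regression method was used): partial least squares regression relating vis/NIR spectra to a quality parameter such as soluble solids content. The arrays below are synthetic placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 600))      # 120 fruit spectra, 600 wavelengths (450-1650 nm)
y = rng.normal(14.0, 2.0, size=120)  # e.g. soluble solids content in degrees Brix

# Number of latent variables would normally be tuned by cross-validation;
# with real spectra the CV r^2 indicates how well the parameter is predicted.
pls = PLSRegression(n_components=8)
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
```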

  12. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  13. Integrating Data Streams from in-situ Measurements, Social Networks and Satellite Earth Observation to Augment Operational Flood Monitoring and Forecasting: the 2017 Hurricane Season in the Americas as a Large-scale Test Case

    Science.gov (United States)

    Matgen, P.; Pelich, R.; Brangbour, E.; Bruneau, P.; Chini, M.; Hostache, R.; Schumann, G.; Tamisier, T.

    2017-12-01

    Hurricanes Harvey, Irma and Maria generated large streams of heterogeneous data, coming notably from three main sources: imagery (satellite and aircraft), in-situ measurement stations and social media. Interpreting these data streams brings critical information to develop, validate and update prediction models. The study addresses existing gaps in the joint extraction of disaster risk information from multiple data sources and its usefulness for reducing the predictive uncertainty of large-scale flood inundation models. Satellite EO data, most notably the free-of-charge data streams generated by the Copernicus program, provided a wealth of high-resolution imagery covering the large areas affected. Our study focuses on the mapping of flooded areas from a sequence of Sentinel-1 SAR imagery using a classification algorithm recently implemented on the European Space Agency's Grid Processing On Demand environment. The end-to-end processing chain provided fast access to all relevant imagery and effective processing for near-real-time analyses. The classification algorithm was applied to pairs of images to rapidly and automatically detect, record and disseminate all observable changes of water bodies. Disaster information was also retrieved from photos as well as texts contributed on social networks, and the study shows how this information may complement EO and in-situ data and augment information content. As social media data are noisy and difficult to geo-localize, different techniques are being developed to automatically infer associated semantics and geotags. The presentation provides a cross-comparison between the hazard information obtained from the three data sources. We provide examples of how the generated database of geo-localized disaster information was finally integrated into a large-scale hydrodynamic model of the Colorado River emptying into Matagorda Bay on the Gulf of Mexico in order to reduce its predictive uncertainty. We describe the
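
    As a simplified illustration of change detection on SAR image pairs (not the actual G-POD algorithm named above), the sketch below flags pixels that are water-dark in the flood-time image but not in the reference image.

```python
import numpy as np

def flood_mask(sigma0_ref, sigma0_flood, threshold_db=-18.0):
    """Boolean mask of pixels that became water between two acquisitions.

    sigma0_* are 2-D arrays of calibrated backscatter in dB; open water is
    dark in SAR. In practice the threshold is derived per image (e.g. from
    the backscatter histogram) rather than fixed as it is here.
    """
    ref_water = sigma0_ref < threshold_db
    flood_water = sigma0_flood < threshold_db
    return flood_water & ~ref_water
```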

  14. Distributed intelligent monitoring and reporting facilities

    Science.gov (United States)

    Pavlou, George; Mykoniatis, George; Sanchez-P, Jorge-A.

    1996-06-01

    Distributed intelligent monitoring and reporting facilities are of paramount importance in both service and network management as they provide the capability to monitor quality of service and utilization parameters and notify degradation so that corrective action can be taken. By intelligent, we refer to the capability of performing the monitoring tasks in a way that has the smallest possible impact on the managed network, facilitates the observation and summarization of information according to a number of criteria and, in its most advanced form, permits the specification of these criteria dynamically to suit the particular policy in hand. In addition, intelligent monitoring facilities should minimize the design and implementation effort involved in such activities. The ISO/ITU Metric, Summarization and Performance management functions provide models that only partially satisfy the above requirements. This paper describes our extensions to the proposed models to support further capabilities, with the intention of eventually leading to fully dynamically defined monitoring policies. The concept of distributing intelligence is also discussed, including the consideration of security issues and the applicability of the model in ODP-based distributed processing environments.

  15. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function of large-scale integration (LSI) and very large-scale integration (VLSI) circuits. The article presents a comparative analysis of the factors which determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI circuits. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.

  16. Monitoring device for the reactor power distribution

    International Nuclear Information System (INIS)

    Uematsu, Hitoshi; Tsuiki, Makoto

    1982-01-01

    Purpose: To enable accurate monitoring of the power distribution in a short time, as well as independent detection of in-core neutron flux detectors that operate abnormally due to failures or other causes, thereby reliably providing substitute values. Constitution: Measured values are input from a reactor core status data detector to a power distribution calculation device, which calculates the in-core neutron flux density and the power distribution based on previously stored physical models. Meanwhile, a BCF calculation device takes the measured values from the in-core neutron detectors, together with the neutron flux distribution and power distribution calculated by the power distribution calculation device, compensates the counting errors in the detector readings and the calculation errors in the computed power distribution, and thereby calculates the power distribution in the reactor core. Further, necessary data are supplied to the power distribution calculation device by an input/output device, and the results calculated by the BCF calculation device are displayed. (Aizawa, K.)

  17. Monitoring fish distributions along electrofishing segments

    Science.gov (United States)

    Miranda, Leandro E.

    2014-01-01

    Electrofishing is widely used to monitor fish species composition and relative abundance in streams and lakes. According to standard protocols, multiple segments are selected in a body of water to monitor population relative abundance as the ratio of total catch to total sampling effort. The standard protocol provides an assessment of fish distribution at a macrohabitat scale among segments, but not within segments. An ancillary protocol was developed for assessing fish distribution at a finer scale within electrofishing segments. The ancillary protocol was used to estimate spacing, dispersion, and association of two species along shore segments in two local reservoirs. The added information provided by the ancillary protocol may be useful for assessing fish distribution relative to fish of the same species, to fish of different species, and to environmental or habitat characteristics.
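
    A minimal sketch (assumed statistics, for illustration) of the kind of fine-scale measures the ancillary protocol could yield from capture positions along one segment: the mean spacing between fish and a variance-to-mean dispersion index of counts per interval (>1 suggests clumping, <1 suggests regular spacing).

```python
import numpy as np

def spacing_and_dispersion(positions, segment_length, n_bins=20):
    """Mean nearest-neighbour spacing and variance-to-mean ratio of counts."""
    positions = np.sort(np.asarray(positions, dtype=float))
    spacing = np.diff(positions).mean() if len(positions) > 1 else np.nan
    counts, _ = np.histogram(positions, bins=n_bins, range=(0, segment_length))
    dispersion = counts.var() / counts.mean()   # variance-to-mean ratio
    return spacing, dispersion

# e.g. capture positions (m) along a 100 m electrofishing segment
print(spacing_and_dispersion([5, 7, 8, 40, 42, 90], segment_length=100.0))
```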

  18. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    One of the best ways for agriculture to become independent from shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinklers were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler reached 95 ha. In 1989 there were 98 sprinklers, covering an area of more than 10 130 ha. The study was conducted on 7 large sprinklers with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinklers underwent significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There was also a threefold increase in electricity prices. The operation of large-scale irrigation encountered all kinds of practical barriers and limitations of system solutions, supply difficulties and high levels of equipment failure, which discouraged rational use of the available sprinklers. A survey of the local area was carried out to show the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  19. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
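
    A minimal sketch of the selection idea (an assumed heuristic; the paper's actual criteria are not given in the abstract): pick a rendering style for each screen region from the ratio of its node count to the number of nodes the region can visually resolve.

```python
def choose_style(n_nodes, region_px_area, px_per_node=400):
    """Pick a rendering style for one screen region of a large hierarchy."""
    capacity = region_px_area / px_per_node      # nodes the region can resolve
    if n_nodes <= capacity:
        return "node-link"                       # draw every node and edge
    elif n_nodes <= 10 * capacity:
        return "collapsed-subtrees"              # roll dense subtrees into glyphs
    return "density-heatmap"                     # show only aggregate density

print(choose_style(n_nodes=5000, region_px_area=200 * 200))
```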

  20. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  1. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'scaled' quantum mechanical like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  2. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  3. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  4. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  5. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large-scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  6. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  7. Design and Realization of Online Monitoring System of Distributed New Energy and Renewable Energy

    Science.gov (United States)

    Tang, Yanfen; Zhou, Tao; Li, Mengwen; Zheng, Guotai; Li, Hao

    2018-01-01

    To address the difficulty of centrally monitoring and managing current distributed new energy and renewable energy generation projects, owing to their great variety, differing communication protocols and wide range of scales, this paper designs an online monitoring system for new energy and renewable energy characterized by distributed deployment, tailorable functions, extendible applications and fault self-healing. The system is designed based on the international standard grid information data model, formulates a unified data acquisition and transmission standard for different types of new energy and renewable energy generation projects, and can realize unified data acquisition and real-time monitoring of new energy and renewable energy generation projects, such as solar energy, wind power, biomass energy, etc. within its jurisdiction. The system has been applied in Beijing; at present, 576 projects are connected to it. Good results have been achieved, and the stability and reliability of the system have been validated.

  8. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  9. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). This observational evidence has led us to a great deal of consensus on the so-called LambdaCDM cosmological model and tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement of cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measured a huge dipolar modulation in the CMB, which mainly originated from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for the detection of the kinematic dipole in future surveys.

  10. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities face these measures with little experience of how to conduct them and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings permanently have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There are a variety of measures suited to support universities in international recruitment. These include e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads for the project.

  11. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

    Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local own consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.
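
    The hourly comparison described above reduces to a simple energy balance; here is a minimal sketch, assuming matched hourly series of PV production and building consumption (names and inputs are illustrative, not taken from the paper).

```python
import numpy as np

def pv_balance(pv_kwh, load_kwh):
    """Hourly energy balance between PV production and building demand."""
    pv, load = np.asarray(pv_kwh, float), np.asarray(load_kwh, float)
    self_consumed = np.minimum(pv, load).sum()   # PV energy used on-site each hour
    return {
        "self_consumption": self_consumed / pv.sum(),    # share of PV used locally
        "self_sufficiency": self_consumed / load.sum(),  # share of demand met by PV
    }
```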

  12. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  13. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  14. Mapping spatial patterns of denitrifiers at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Ramette, A.; Saby, N.; Bru, D.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31 500 km2 region of France, using a 16 x 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 739 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of denitrifiers at large scale can help close the artificial gap between the investigation of microbial processes and microbial community ecology, thereby facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.
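
    A minimal sketch (illustrative, not the authors' code) of the geostatistical first step behind such predictive maps: an empirical semivariogram of abundance over site pairs, from which an autocorrelation range (739 km above) can be read off before kriging.

```python
import numpy as np

def empirical_variogram(coords_km, values, bin_edges_km):
    """Mean semivariance of `values` per distance bin over all site pairs."""
    coords, values = np.asarray(coords_km, float), np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)       # each pair counted once
    d, g = d[iu], g[iu]
    # bins with no pairs yield nan; in practice bin edges are chosen to avoid this
    return [g[(d >= lo) & (d < hi)].mean()
            for lo, hi in zip(bin_edges_km[:-1], bin_edges_km[1:])]
```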

  15. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction procedure and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
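
    For reference, this is a minimal sketch of the serial baseline being parallelized, using scikit-learn's affinity propagation with a precomputed similarity matrix, the object whose construction the authors also accelerate. (The memory-shared/distributed parallel split itself is not reproduced here; the data below are synthetic.)

```python
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.metrics import pairwise_distances

X = np.random.default_rng(0).normal(size=(300, 50))   # e.g. expression profiles
S = -pairwise_distances(X, metric="euclidean") ** 2    # similarity = -squared distance

# Affinity propagation clusters directly from the pairwise similarity matrix.
ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
print(len(ap.cluster_centers_indices_), "clusters found")
```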

  16. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For the evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to the optimization of rescue measures, regular training and exercises are advisable, as is the analysis of previous large-scale Alpine accidents.

  17. LHCb: Monitoring the DIRAC Distribution System

    CERN Multimedia

    Nandakumar, R; Santinelli, R

    2009-01-01

    DIRAC is the LHCb gateway to any computing grid infrastructure (currently supporting WLCG) and is intended to reliably run large data mining activities. The DIRAC system consists of various services (which wait to be contacted to perform actions) and agents (which carry out periodic activities) to direct jobs as required. An important part of ensuring the reliability of the infrastructure is the monitoring and logging of these DIRAC distributed systems. The monitoring is done by collecting information from two sources - one is from pinging the services or by keeping track of the regular heartbeats of the agents, and the other from the analysis of the error messages generated by both agents and services and collected by the logging system. This allows us to ensure that the components are running properly and to collect useful information regarding their operations. The process status monitoring is displayed using the SLS sensor mechanism which also automatically allows one to plot various quantities and also keep ...
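
    A minimal sketch (assumed logic, not DIRAC code) of the two checks described above: a TCP-level ping of a service endpoint, and a staleness test on an agent's last recorded heartbeat.

```python
import socket
import time

def service_alive(host, port, timeout=3.0):
    """TCP-level ping: can we open a connection to the service at all?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def agent_ok(last_heartbeat_ts, period_s, grace=2.0):
    """An agent is healthy if it has beaten within `grace` expected periods."""
    return (time.time() - last_heartbeat_ts) < grace * period_s
```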

  18. Nonlinear evolution of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-01-01

    Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein-de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function ξ(r) and the visual appearance of our adiabatic (or "pancake") models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of ξ(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be λ_min/r_0 = 5.1; its expected value in a neutrino-dominated universe is 4(Ωh)^-1 (H_0 = 100h km s^-1 Mpc^-1). At early epochs these models predict a negligible amplitude for ξ(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω < 1
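
    A minimal sketch of a two-point correlation function estimator of the kind used to compare such simulations with the galaxy distribution (the Peebles-Hauser DD/RR form is assumed here; the abstract does not say which estimator the authors used).

```python
import numpy as np

def xi_peebles_hauser(data, randoms, bin_edges):
    """xi(r) = (DD/RR) * (N_R(N_R-1))/(N_D(N_D-1)) - 1 from binned pair counts.

    `data` and `randoms` are (N, 3) arrays of positions; `randoms` samples
    the same volume uniformly. Bins with no random pairs should be avoided.
    """
    def pair_counts(pts):
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        iu = np.triu_indices(len(pts), k=1)      # each pair counted once
        return np.histogram(d[iu], bins=bin_edges)[0].astype(float)

    dd, rr = pair_counts(data), pair_counts(randoms)
    norm = (len(randoms) * (len(randoms) - 1)) / (len(data) * (len(data) - 1))
    return dd / rr * norm - 1.0
```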

  19. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  20. Grid sensitivity capability for large scale structures

    Science.gov (United States)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
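
    The validation approach mentioned above, comparing analytic design sensitivities against an overall finite difference, is easy to illustrate. The structural response below is a stand-in function of two design variables, not an MSC/NASTRAN model.

```python
# Sketch of sensitivity validation: an analytic (hand-derived) gradient is
# checked against a central finite difference over the whole response.
import numpy as np

def response(x: np.ndarray) -> float:
    """Stand-in structural response (e.g. a displacement measure)."""
    return float(x[0] ** 2 + 3.0 * x[0] * x[1] + np.sin(x[1]))

def analytic_sensitivity(x: np.ndarray) -> np.ndarray:
    """Hand-derived gradient of the stand-in response."""
    return np.array([2.0 * x[0] + 3.0 * x[1],
                     3.0 * x[0] + np.cos(x[1])])

def fd_sensitivity(f, x, h=1e-6):
    """Overall central finite difference, one perturbation per variable."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x0 = np.array([1.0, 2.0])
print(analytic_sensitivity(x0))
print(fd_sensitivity(response, x0))   # should agree to ~1e-9
```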

  1. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive section, the fixture device and the wheels. The control system design covers both hardware and software: the hardware is based on a single-chip microcontroller system, and the software handles the photoelectric autocollimator and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device was verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  2. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization, in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with a hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization, using LIDAR DEM.
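
    The generalization step described above, a focal majority over a categorical landform raster, can be approximated with SciPy's generic_filter. The 5x5 window and the synthetic class raster are assumptions for illustration; the paper used the ArcGIS Focal Statistics (Majority) tool.

```python
# Sketch of cartographic generalization by focal majority filtering:
# each cell is replaced by the most frequent class in its moving window.
import numpy as np
from scipy.ndimage import generic_filter

def majority(window: np.ndarray) -> float:
    """Return the most frequent class code in the moving window."""
    vals, counts = np.unique(window, return_counts=True)
    return vals[np.argmax(counts)]

rng = np.random.default_rng(1)
landforms = rng.integers(0, 5, size=(60, 60))       # stand-in class raster
generalized = generic_filter(landforms, majority, size=5, mode="nearest")
print(np.unique(generalized, return_counts=True))   # class counts after smoothing
```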

  3. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach...... for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...... is the IBA Emscher Park in the Ruhr area in Germany. Over a 10 years period (1988-1998), more than a 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines...

  4. Testing Einstein's Gravity on Large Scales

    Science.gov (United States)

    Prescod-Weinstein, Chandra

    2011-01-01

    A little over a decade has passed since two teams studying high-redshift Type Ia supernovae announced the discovery that the expansion of the universe was accelerating. After all this time, we're still not sure how cosmic acceleration fits into the theory that tells us about the large-scale universe: General Relativity (GR). As part of our search for answers, we have been forced to question GR itself. But how will we test our ideas? We are fortunate enough to be entering the era of precision cosmology, where the standard model of gravity can be subjected to more rigorous testing. Various techniques will be employed over the next decade or two in the effort to better understand cosmic acceleration and the theory behind it. In this talk, I will describe cosmic acceleration, current proposals to explain it, and weak gravitational lensing, an observational effect that allows us to do the necessary precision cosmology.

  5. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  6. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
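
    The core step shared by all SQP variants is the QP subproblem: minimize a quadratic model of the Lagrangian subject to linearized constraints. The sketch below solves the equality-constrained case via the KKT system; it illustrates only the basic step, not the paper's reduced-Hessian or sparse machinery.

```python
# Minimal illustration of the QP subproblem at the core of SQP:
# min 0.5 p'Hp + g'p  subject to  Ap + c = 0, solved via the KKT system.
import numpy as np

def sqp_step(H, g, A, c):
    """Return the step p and constraint multipliers from the KKT system."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = -np.concatenate([g, c])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]

H = np.array([[2.0, 0.0], [0.0, 2.0]])   # model Hessian (exact for x^2 + y^2)
g = np.array([2.0, 2.0])                 # objective gradient at iterate (1, 1)
A = np.array([[1.0, 1.0]])               # Jacobian of constraint x + y - 1 = 0
c = np.array([1.0])                      # constraint value at (1, 1)
p, lam = sqp_step(H, g, A, c)
print("step:", p, "multipliers:", lam)   # step lands on the minimizer (0.5, 0.5)
```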

  7. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    survival and recruitment estimates from the French CES scheme to assess the relative contributions of survival and recruitment to overall population changes. He develops a novel approach to modelling survival rates from such multi–site data by using within–year recaptures to provide a covariate of between–year recapture rates. This provided parsimonious models of variation in recapture probabilities between sites and years. The approach provides promising results for the four species investigated and can potentially be extended to similar data from other CES/MAPS schemes. The final paper by Blandine Doligez, David Thomson and Arie van Noordwijk (Doligez et al., 2004 illustrates how large-scale studies of population dynamics can be important for evaluating the effects of conservation measures. Their study is concerned with the reintroduction of White Stork populations to the Netherlands where a re–introduction programme started in 1969 had resulted in a breeding population of 396 pairs by 2000. They demonstrate the need to consider a wide range of models in order to account for potential age, time, cohort and “trap–happiness” effects. As the data are based on resightings such trap–happiness must reflect some form of heterogeneity in resighting probabilities. Perhaps surprisingly, the provision of supplementary food did not influence survival, but it may havehad an indirect effect via the alteration of migratory behaviour. Spatially explicit modelling of data gathered at many sites inevitably results in starting models with very large numbers of parameters. The problem is often complicated further by having relatively sparse data at each site, even where the total amount of data gathered is very large. Both Julliard (2004 and Doligez et al. (2004 give explicit examples of problems caused by needing to handle very large numbers of parameters and show how they overcame them for their particular data sets. Such problems involve both the choice of appropriate

  8. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
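
    Giraph programs are written in Java against its Computation API; the following is a language-neutral Python sketch of the Pregel-style "think like a vertex" model the book teaches, in which every vertex consumes incoming messages, updates its value, and sends messages along its out-edges once per superstep. Shown here: 10 supersteps of PageRank on a toy graph.

```python
# Pregel-style superstep loop (illustrative, not Giraph's actual API).
graph = {0: [1, 2], 1: [2], 2: [0]}             # adjacency: vertex -> out-edges
rank = {v: 1.0 / len(graph) for v in graph}
DAMPING, SUPERSTEPS = 0.85, 10

for _ in range(SUPERSTEPS):
    inbox = {v: [] for v in graph}
    for v, targets in graph.items():            # "compute" phase: send messages
        share = rank[v] / len(targets)
        for t in targets:
            inbox[t].append(share)
    for v in graph:                             # next superstep: consume messages
        rank[v] = (1 - DAMPING) / len(graph) + DAMPING * sum(inbox[v])

print(rank)   # converges toward the stationary PageRank vector
```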

  9. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  10. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  11. Large-scale implementation of disease control programmes: a cost-effectiveness analysis of long-lasting insecticide-treated bed net distribution channels in a malaria-endemic area of western Kenya-a study protocol.

    Science.gov (United States)

    Gama, Elvis; Were, Vincent; Ouma, Peter; Desai, Meghna; Niessen, Louis; Buff, Ann M; Kariuki, Simon

    2016-11-21

    Historically, Kenya has used various distribution models for long-lasting insecticide-treated bed nets (LLINs), with variable results in population coverage. The models presently vary widely in scale, target population and strategy. There is limited information to determine the best combination of distribution models that will lead to sustained high coverage and be operationally efficient and cost-effective. Standardised cost information is needed, in combination with programme effectiveness estimates, to judge the efficiency of LLIN distribution models and options for improvement in implementing malaria control programmes. The study aims to address this information gap by estimating the distribution cost and the effectiveness of different LLIN distribution models and comparing them in an economic evaluation. Cost and coverage will be evaluated for five different distribution models in Busia County, an area of perennial malaria transmission in western Kenya. Cost data will be collected retrospectively from health facilities, the Ministry of Health, donors and distributors. Programme-effectiveness data, defined as the number of people with access to an LLIN per 1000 population, will be collected through triangulation of data from a nationally representative, cross-sectional malaria survey, a cross-sectional survey administered to a subsample of beneficiaries in Busia County, and LLIN distributors' records. Descriptive statistics and regression analysis will be used for the evaluation. A cost-effectiveness analysis will be performed from a health-systems perspective, and cost-effectiveness ratios will be calculated using bootstrapping techniques. The study has been evaluated and approved by the Kenya Medical Research Institute Scientific and Ethical Review Unit (SERU number 2997). All participants will provide written informed consent. The findings of this economic evaluation will be disseminated through peer-reviewed publications.
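
    The bootstrapped cost-effectiveness ratio mentioned in the protocol can be sketched as follows: resample (cost, effect) observations with replacement and recompute the ratio to obtain a percentile confidence interval. All numbers below are synthetic placeholders, not study data.

```python
# Sketch of a bootstrap confidence interval for a cost-effectiveness ratio.
import numpy as np

rng = np.random.default_rng(42)
n = 200
cost   = rng.normal(5.0, 1.0, n)    # cost per person reached (hypothetical USD)
effect = rng.normal(0.8, 0.1, n)    # LLIN access gained per person (hypothetical)

def cer(c, e):
    """Cost per unit of effectiveness."""
    return c.mean() / e.mean()

boot = np.array([
    cer(cost[idx], effect[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"CER = {cer(cost, effect):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```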

  12. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) = v²/2 - M cos x - P cos k(x-t), which describes the motion of one particle in two electrostatic waves. A renormalization transformation T_r is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, T_r yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with T_r, it allows one to prove the link between KAM tori and nearby cycles, conjectured by J. Greene, and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
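
    The Chirikov standard map referred to in the abstract is the textbook paradigm for the onset of large-scale stochasticity, and iterating it is a one-screen exercise: KAM tori survive below the critical coupling (K_c ≈ 0.97) and break up above it. The spread statistic below is a crude indicator chosen for illustration.

```python
# Iterate Chirikov's standard map: p' = p + K sin(x); x' = x + p' (mod 2*pi).
import numpy as np

def standard_map(x, p, K, steps):
    """Return the orbit of the standard map from (x, p)."""
    xs, ps = [x], [p]
    for _ in range(steps):
        p = (p + K * np.sin(x)) % (2 * np.pi)
        x = (x + p) % (2 * np.pi)
        xs.append(x); ps.append(p)
    return np.array(xs), np.array(ps)

for K in (0.5, 1.5):                       # below and above the threshold
    x, p = standard_map(1.0, 0.5, K, 5000)
    print(f"K = {K}: momentum spread = {p.std():.3f}")   # grows once tori break
```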

  13. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, in how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. Carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence showing that nanotoxicity can have implications in the de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ has been shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene has been shown to disrupt bacterial cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggest therapeutic potential in using the cytotoxicity of nanoparticles against cancer or bacteria cells.

  14. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.
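
    For reference, the statement that tidal effects are encoded by the Weyl tensor can be anchored to two standard textbook definitions (these are general conventions, not equations reproduced from the paper): the Newtonian tidal tensor as the trace-free second derivative of the potential, and its relativistic counterpart as the electric part of the Weyl tensor projected along the fluid four-velocity.

```latex
% Standard definitions (textbook conventions, not taken from the paper):
% the Newtonian tidal tensor and the electric part of the Weyl tensor.
\[
  t_{ij} = \left(\partial_i \partial_j - \tfrac{1}{3}\,\delta_{ij}\,\nabla^2\right)\Phi ,
  \qquad
  E_{\mu\nu} = C_{\mu\alpha\nu\beta}\, u^{\alpha} u^{\beta} .
\]
```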

  15. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations. (letter)

  16. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ~ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  17. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D on distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in this R&D, is an MPI library executable in a heterogeneous computing environment. This improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)
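
    The general pattern behind such parallel visualization pipelines, decompose the dataset across ranks, process each chunk locally, then gather partial results for composition, can be sketched with generic MPI calls. This uses mpi4py, not the actual PST/STAMPI APIs, and the "visualization work" is reduced to a per-chunk maximum for brevity.

```python
# Sketch of a scatter/process/gather visualization pipeline with mpi4py.
# Run with e.g.: mpiexec -n 4 python this_script.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    data = np.arange(1_000_000, dtype=np.float64)
    chunks = np.array_split(data, size)        # decompose the dataset
else:
    chunks = None

chunk = comm.scatter(chunks, root=0)           # distribute chunks to ranks
local_max = chunk.max()                        # per-rank "visualization" work

maxima = comm.gather(local_max, root=0)        # compose partial results
if rank == 0:
    print("global maximum:", max(maxima))
```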

  18. Large scale laboratory diffusion experiments in clay rocks

    International Nuclear Information System (INIS)

    Garcia-Gutierrez, M.; Missana, T.; Mingarro, M.; Martin, P.L.; Cormenzana, J.L.

    2005-01-01

    Clay formations are potential host rocks for high-level radioactive waste repositories. In clay materials radionuclide diffusion is the main transport mechanism. Thus, understanding the diffusion processes and determining diffusion parameters in conditions as similar as possible to the real ones are critical for the performance assessment of a deep geological repository. Diffusion coefficients are mainly measured in the laboratory using small samples, after a preparation to fit into the diffusion cell. In addition, a few field tests are usually performed to confirm laboratory results and analyse scale effects. In field or 'in situ' tests the experimental set-up usually includes the injection of a tracer diluted in reconstituted formation water into a packed-off section of a borehole. Both experimental systems may produce artefacts in the determination of diffusion coefficients. In the laboratory the preparation of the sample can generate structural changes, mainly if the consolidated clay has a layered fabric, and in field tests the introduction of water could modify the properties of the saturated clay in the first few centimeters, just where radionuclide diffusion is expected to take place. In this work, a large scale laboratory diffusion experiment is proposed, using a large cylindrical sample of consolidated clay that can overcome the above mentioned problems. The tracers used were mixed with clay obtained by drilling a central hole, re-compacted into the hole at approximately the same density as the consolidated block and finally sealed. Neither additional treatment of the sample nor external monitoring are needed. After the experimental time needed for diffusion to take place (estimated by scoping calculations) the block was sampled to obtain a 3D distribution of the tracer concentration and the results were modelled. An additional advantage of the proposed configuration is that it could be used in 'in situ
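
    The basic inference behind such experiments, recovering a diffusion coefficient from a measured tracer profile, follows from the instantaneous planar-source solution of Fick's second law, whose profile is Gaussian with variance 2Dt. The numbers below are synthetic and only order-of-magnitude typical for clays, not the experiment's data.

```python
# Estimate D from a tracer profile via its spatial variance: sigma^2 = 2*D*t.
import numpy as np

D_true = 1e-11          # m^2/s, illustrative order of magnitude for clays
t = 3.0e7               # s (~1 year of diffusion time)
x = np.linspace(-0.1, 0.1, 401)                    # m, sampling positions
profile = np.exp(-x**2 / (4 * D_true * t))         # Gaussian tracer profile

w = profile / profile.sum()                        # normalize to weights
sigma2 = np.sum(w * x**2) - np.sum(w * x) ** 2     # spatial variance
D_est = sigma2 / (2 * t)
print(f"estimated D = {D_est:.2e} m^2/s (true {D_true:.1e})")
```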

  19. Monitoring community mobilisation and organisational capacity among high-risk groups in a large-scale HIV prevention programme in India: selected findings using a Community Ownership and Preparedness Index.

    Science.gov (United States)

    Narayanan, Pradeep; Moulasha, K; Wheeler, Tisha; Baer, James; Bharadwaj, Sowmyaa; Ramanathan, T V; Thomas, Tom

    2012-10-01

    In a participatory approach to health and development interventions, defining and measuring community mobilisation is important, but it is challenging to do this effectively, especially at scale. A cross-sectional, participatory monitoring tool was administered in 2008-2009 and 2009-2010 across a representative sample of 25 community-based groups (CBGs) formed under the Avahan India AIDS Initiative, to assess their progress in mobilisation, and to inform efforts to strengthen the groups and make them sustainable. The survey used a weighted index to capture both qualitative and quantitative data in numeric form. The index permitted broad, as well as highly detailed, analysis of community mobilisation, relevant at the level of individual groups, as well as state-wide and across the whole programme. The survey demonstrated that leadership and programme management were the strongest areas among the CBGs, confirming the programme's investment in these areas. Discussion of the Round 1 results led to efforts to strengthen governance and democratic decision making in the groups, and progress was reflected in the Round 2 survey results. CBG engagement with state authorities to gain rights and entitlements and securing the long-term financial stability of groups remain a challenge. The survey has proven useful for informing the managers of programmes about what is happening on the ground, and it has opened spaces for discussion within community groups about the nature of leadership, decision making and their goals, which is leading to accelerated progress. The tool provided useful data to manage community mobilisation in Avahan.
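
    The weighted index at the heart of such a tool can be sketched as a weighted average of per-domain scores. The domains and weights below are hypothetical, chosen to mirror the areas the abstract mentions, and are not the actual Community Ownership and Preparedness Index definition.

```python
# Minimal sketch of a weighted community-mobilisation index.
DOMAIN_WEIGHTS = {"leadership": 0.3, "programme_management": 0.3,
                  "governance": 0.2, "external_engagement": 0.2}

def copi_score(domain_scores: dict) -> float:
    """Weighted average of per-domain scores in [0, 1]."""
    return sum(DOMAIN_WEIGHTS[d] * s for d, s in domain_scores.items())

group = {"leadership": 0.8, "programme_management": 0.7,
         "governance": 0.5, "external_engagement": 0.4}
print(f"index = {copi_score(group):.2f}")   # 0.63 for this example
```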

  20. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Co...

  1. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    CERN Document Server

    Andreeva, J; Karavakis, E; Kokoszkiewicz, L; Nowotka, M; Saiz, P; Tuckett, D

    2012-01-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Comp...

  2. Sensitivity technologies for large scale simulation

    International Nuclear Information System (INIS)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
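
    The distinction between direct and adjoint sensitivities that the report develops can be shown on a toy linear model A x = b with objective J = c'x: the direct method solves one extra system per parameter, while the adjoint method solves a single transposed system and reuses it for all parameters. This is a generic textbook example, not the report's production tooling.

```python
# Direct vs adjoint sensitivities for J = c'x with A x = b.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned system matrix
b = rng.normal(size=n)
c = rng.normal(size=n)

x = np.linalg.solve(A, b)

# Direct: dJ/db_k requires solving A (dx/db_k) = e_k for each parameter k.
direct = np.array([c @ np.linalg.solve(A, e) for e in np.eye(n)])

# Adjoint: solve A' lambda = c once; then dJ/db_k = lambda_k for all k.
lam = np.linalg.solve(A.T, c)

print(np.allclose(direct, lam))   # True: both give the gradient dJ/db
```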

  3. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I.; Bradley, D.; Livny, M.

    2009-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures results in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.
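
    The essence of pseudo-interactive monitoring, running a fixed set of read-only diagnostic commands in the job's environment rather than granting a live shell, can be sketched generically. This is an illustration of the idea only, not Condor's actual mechanism, and the allowed-command table is invented.

```python
# Generic sketch: execute whitelisted diagnostic commands and capture output.
import subprocess

ALLOWED = {
    "ls": ["ls", "-l"],
    "ps": ["ps", "aux"],
    "netstat": ["netstat", "-tn"],
}

def run_diagnostic(name: str, cwd: str = ".") -> str:
    """Run one whitelisted diagnostic command in the job's directory."""
    if name not in ALLOWED:
        raise ValueError(f"command {name!r} not permitted")
    result = subprocess.run(ALLOWED[name], cwd=cwd,
                            capture_output=True, text=True, timeout=30)
    return result.stdout or result.stderr

if __name__ == "__main__":
    print(run_diagnostic("ls"))
```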

  4. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I; Bradley, D; Livny, M

    2010-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures results in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  5. Pseudo-interactive monitoring in distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Sfiligoi, I.; /Fermilab; Bradley, D.; Livny, M.; /Wisconsin U., Madison

    2009-05-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures results in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  6. Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing

    Science.gov (United States)

    Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.

    2016-04-01

    We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances, and we use a large-scale, high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with a full width at half maximum (FWHM) of ~15 nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance can shift as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. Real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform for realizing label-free surface plasmon resonance (SPR) sensing without sophisticated optical instrumentation.

  7. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration which is getting closer to the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems was emulated. The authors present a brief overview of the online system structure, its components, and the large scale integration tests and their results.

  8. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  9. Experimental Investigation of Large-Scale Bubbly Plumes

    Energy Technology Data Exchange (ETDEWEB)

    Zboray, R.; Simiano, M.; De Cachard, F

    2004-03-01

    Carefully planned and instrumented experiments under well-defined boundary conditions have been carried out on large-scale, isothermal, bubbly plumes. The data obtained is meant to validate newly developed, high-resolution numerical tools for 3D transient, two-phase flow modelling. Several measurement techniques have been utilised to collect data from the experiments: particle image velocimetry, optical probes, electromagnetic probes, and visualisation. Bubble and liquid velocity fields, void-fraction distributions, bubble size and interfacial-area-concentration distributions have all been measured in the plume region, as well as recirculation velocities in the surrounding pool. The results obtained from the different measurement techniques have been compared. In general, the two-phase flow data obtained from the different techniques are found to be consistent, and of high enough quality for validating numerical simulation tools for 3D bubbly flows. (author)

  10. Experimental Investigation of Large-Scale Bubbly Plumes

    International Nuclear Information System (INIS)

    Zboray, R.; Simiano, M.; De Cachard, F.

    2004-01-01

    Carefully planned and instrumented experiments under well-defined boundary conditions have been carried out on large-scale, isothermal, bubbly plumes. The data obtained is meant to validate newly developed, high-resolution numerical tools for 3D transient, two-phase flow modelling. Several measurement techniques have been utilised to collect data from the experiments: particle image velocimetry, optical probes, electromagnetic probes, and visualisation. Bubble and liquid velocity fields, void-fraction distributions, bubble size and interfacial-area-concentration distributions have all been measured in the plume region, as well as recirculation velocities in the surrounding pool. The results obtained from the different measurement techniques have been compared. In general, the two-phase flow data obtained from the different techniques are found to be consistent, and of high enough quality for validating numerical simulation tools for 3D bubbly flows. (author)

  11. Integral criteria for large-scale multiple fingerprint solutions

    Science.gov (United States)

    Ushmaev, Oleg S.; Novikov, Sergey O.

    2004-08-01

    We propose the definition and analysis of the optimal integral similarity score criterion for large scale multimodal civil ID systems. Firstly, the general properties of score distributions for genuine and impostor matches for different systems and input devices are investigated. The empirical statistics were taken from real biometric tests. Then we carry out the analysis of simultaneous score distributions for a number of combined biometric tests, primarily for multiple fingerprint solutions. Explicit and approximate relations for the optimal integral score, which provides the least value of the FRR while the FAR is predefined, have been obtained. The results of a real multiple fingerprint test show good correspondence with the theoretical results over a wide range of False Acceptance and False Rejection Rates.
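
    The FAR/FRR trade-off described here can be sketched directly from the two score distributions: pick the threshold that attains a predefined FAR on the impostor scores, then read off the resulting FRR on the genuine scores. The score distributions below are synthetic, not the paper's empirical data.

```python
# Threshold selection: minimize FRR subject to a predefined FAR.
import numpy as np

rng = np.random.default_rng(7)
genuine  = rng.normal(0.7, 0.1, 10000)   # matcher scores for true matches
impostor = rng.normal(0.3, 0.1, 10000)   # matcher scores for impostors

def frr_at_far(genuine, impostor, target_far):
    """Threshold achieving the target FAR, and the FRR it implies."""
    threshold = np.quantile(impostor, 1.0 - target_far)  # FAR = P(impostor >= t)
    frr = np.mean(genuine < threshold)
    return threshold, frr

for far in (1e-2, 1e-3):
    t, frr = frr_at_far(genuine, impostor, far)
    print(f"FAR = {far:g}: threshold = {t:.3f}, FRR = {frr:.4f}")
```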

  12. Large Scale Simulation Platform for NODES Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Sotorrio, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Qin, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Min, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator, and includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating more than 10,000 individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/Var control.

  13. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  14. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    International Nuclear Information System (INIS)

    Andreeva, J; Dzhunov, I; Karavakis, E; Kokoszkiewicz, L; Nowotka, M; Saiz, P; Tuckett, D

    2012-01-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.

  15. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    Science.gov (United States)

    Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.

    2012-12-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.

  16. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other side, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the disk and the neutral gas is done via electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  17. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  18. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000 MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguarding of strategic special nuclear materials: plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000 t/a) can reprocess fuel from reactors producing 50,000-65,000 MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also of mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  19. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise for example in LNG marine transportation accidents, or in liquid cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although occurrence of such large scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single drop experiments. In the last two years several theoretical models for large scale explosions have appeared which attempt a self-contained explanation of at least some stages of such high yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  20. Detecting the Spatio-temporal Distribution of Soil Salinity and Its Relationship to Crop Growth in a Large-scale Arid Irrigation District Based on Sampling Experiment and Remote Sensing

    Science.gov (United States)

    Ren, D.; Huang, G., Sr.; Xu, X.; Huang, Q., Sr.; Xiong, Y.

    2016-12-01

    Soil salinity analysis on a regional scale is of great significance for protecting agricultural production and maintaining eco-environmental health in arid and semi-arid irrigated areas. In this study, the Hetao Irrigation District (Hetao) in the Inner Mongolia Autonomous Region, which has long suffered from soil salinization, was selected as the case study area. Field sampling experiments and investigations related to soil salt contents, crop growth and yields were carried out across the whole area from April to August 2015. Soil salinity characteristics in space and time were systematically analyzed for Hetao, as well as the corresponding impacts on crops. A remotely sensed map of surface soil salinity distribution was also derived from Landsat OLI data at 30 m resolution. The results elaborate the temporal and spatial dynamics of soil salinity and their relationships with irrigation, groundwater depth and crop water consumption in Hetao. In addition, the strong spatial variability of salinization is clearly presented by the remotely sensed soil salinity map. Further, the relationship between soil salinity and crop growth was analyzed, and the degree of impact of soil salinization on cropping pattern, leaf area index, plant height and crop yield was preliminarily revealed. Overall, this study provides useful information for salinization control and can guide future agricultural production and soil-water management for arid irrigation districts analogous to Hetao.

  1. Distributed acoustic sensing for pipeline monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Hill, David; McEwen-King, Magnus [OptaSense, QinetiQ Ltd., London (United Kingdom)

    2009-07-01

    Optical fibre is deployed widely across the oil and gas industry. As well as being deployed regularly to provide high-bandwidth telecommunications and infrastructure for SCADA, it is increasingly being used to sense pressure, temperature and strain along buried pipelines, on subsea pipelines and downhole. In this paper we present results from the latest sensing capability using standard optical fibre to detect acoustic signals along the entire length of a pipeline. In Distributed Acoustic Sensing (DAS) an optical fibre is used for both sensing and telemetry. In this paper we present results from the OptaSense™ system, which has been used to detect third party intervention (TPI) along buried pipelines. In a typical deployment the system is connected to an existing standard single-mode fibre, up to 50 km in length, and was used to independently listen to the acoustic/seismic activity at every 10 metre interval. We will show that through the use of advanced array processing of the independent, simultaneously sampled channels it is possible to detect and locate activity within the vicinity of the pipeline and, through sophisticated acoustic signal processing, to obtain the acoustic signature to classify the type of activity. By combining spare fibre capacity in existing buried fibre optic cables, processing and display techniques commonly found in sonar, and the state of the art in fibre-optic distributed acoustic sensing, we will describe the new monitoring capabilities that are available to the pipeline operator. Without the expense of retrofitting sensors to the pipeline, this technology can provide a high-performance, rapidly deployable and cost-effective method of providing gapless and persistent monitoring of a pipeline. We will show how this approach can be used to detect, classify and locate activity such as: third party interference (including activity indicative of illegal hot tapping); real-time tracking of pigs; and leak detection. We will also show how an
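    The processing chain sketched in the abstract (simultaneously sampled channels at a fixed spacing along the fibre, screened for acoustic activity) can be illustrated with a minimal example. This is not the OptaSense implementation: the 10 m channel spacing comes from the abstract, while the energy-threshold detector and every name below are illustrative assumptions.

```python
import numpy as np

CHANNEL_SPACING_M = 10.0  # assumed sensing interval along the fibre (per the abstract)

def detect_activity(traces: np.ndarray, threshold_db: float = 12.0):
    """Flag DAS channels whose acoustic energy exceeds the background
    noise floor by `threshold_db`, and report their position on the fibre.

    traces: (n_channels, n_samples) array of strain-rate samples.
    """
    energy = np.mean(traces**2, axis=1)    # mean power per channel
    background = np.median(energy)         # robust noise-floor estimate
    snr_db = 10.0 * np.log10(energy / background)
    hits = np.flatnonzero(snr_db > threshold_db)
    return [(int(ch), ch * CHANNEL_SPACING_M, float(snr_db[ch])) for ch in hits]

# Example: synthetic 5 km fibre with a disturbance near the 3.2 km mark
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=(500, 2048))
data[320] += 10.0 * np.sin(2 * np.pi * 50 * np.arange(2048) / 4096.0)
print(detect_activity(data))  # -> [(320, 3200.0, ...)]
```

    A real system would classify the detected signature (excavation, vehicle, pig passage, leak) with far more sophisticated array and signal processing than this simple threshold test.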

  2. Systematic renormalization of the effective theory of Large Scale Structure

    International Nuclear Information System (INIS)

    Abolhasani, Ali Akbar; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-01-01

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contributions of short-distance perturbations to the large-scale density contrast δ and momentum density π(k) scale as k² and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.
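    In the standard EFT-of-LSS notation (a schematic reminder, not an equation quoted from this paper), the k² scaling of the leading density counterterm reads:

```latex
% Leading counterterm in the effective theory of large-scale structure:
% short modes renormalize the long-wavelength density contrast as
\delta(\mathbf{k},\tau) \;\supset\; -\,c_{\rm s}^{2}(\tau)\,\frac{k^{2}}{k_{\rm NL}^{2}}\,\delta_{\rm lin}(\mathbf{k},\tau),
\qquad
P_{\rm ct}(k,\tau) \;=\; -\,2\,c_{\rm s}^{2}(\tau)\,\frac{k^{2}}{k_{\rm NL}^{2}}\,P_{\rm lin}(k,\tau),
% with c_s^2 the (time-dependent) coefficient fixed by matching to
% simulations or data, and k_NL the nonlinear scale.
```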

  3. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
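    As a rough illustration of the tile low-rank format the abstract describes (dense diagonal tiles; off-diagonal tiles compressed to low rank at a prescribed accuracy), here is a numpy sketch. HiCMA itself is a compiled library scheduled by StarPU, so everything below, from the function names to the truncated-SVD compressor and the tolerance, is an illustrative assumption.

```python
import numpy as np

def compress_tile(tile: np.ndarray, tol: float):
    """Truncated-SVD compression of one off-diagonal tile: keep the smallest
    rank k such that the discarded singular values fall below `tol`."""
    U, s, Vt = np.linalg.svd(tile, full_matrices=False)
    k = max(int(np.searchsorted(-s, -tol)), 1)  # number of singular values > tol
    return U[:, :k] * s[:k], Vt[:k, :]          # stored as (m*k + k*n) numbers

def tlr_compress(A: np.ndarray, nb: int, tol: float = 1e-8):
    """Split A into an nb x nb grid of tiles; keep diagonal tiles dense and
    compress off-diagonal (data-sparse) tiles to low rank."""
    t = A.shape[0] // nb
    tiles = {}
    for i in range(nb):
        for j in range(nb):
            tile = A[i*t:(i+1)*t, j*t:(j+1)*t]
            tiles[i, j] = tile.copy() if i == j else compress_tile(tile, tol)
    return tiles
```

    The memory saving comes from the (m*k + k*n) storage of each compressed tile versus m*n for a dense tile, which is dramatic when the admissible rank k stays small, as it does for the covariance matrices arising in geospatial statistics.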

  4. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  5. The combustion behavior of large scale lithium titanate battery

    Science.gov (United States)

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety problems are always a big obstacle for lithium batteries marching toward large-scale application, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated to fire. The variation in flame size is depicted to analyze the combustion behavior directly, while the mass loss rate, temperature and heat release rate are used to analyze the underlying reactions in more depth. Based on the observed phenomena, the combustion process is divided into three basic stages; it becomes more complicated at higher SOC, with sudden ejections of smoke. The reason is that a phase change occurs in the Li(NixCoyMnz)O2 material from a layer structure to a spinel structure. The critical ignition temperatures are 112-121°C on the anode tab and 139-147°C on the upper surface for all cells, but the heating time and combustion time become shorter with ascending SOC. The results indicate that the battery fire hazard increases with SOC. Internal shorting and the Li+ distribution are analyzed to be the main causes of the difference. PMID:25586064

  6. DEMNUni: massive neutrinos and the bispectrum of large scale structures

    Science.gov (United States)

    Ruggeri, Rossana; Castorina, Emanuele; Carbone, Carmelita; Sefusatti, Emiliano

    2018-03-01

    The main effect of massive neutrinos on the large-scale structure consists in a few percent suppression of matter perturbations on all scales below their free-streaming scale. This effect is of particular importance as it allows one to constrain the value of the sum of neutrino masses from measurements of the galaxy power spectrum. In this work, we present the first measurements of the next higher-order correlation function, the bispectrum, from N-body simulations that include massive neutrinos as particles. This is the simplest statistic characterising the non-Gaussian properties of the matter and dark matter halo distributions. We investigate, in the first place, the suppression due to massive neutrinos on the matter bispectrum, comparing our measurements with the simplest perturbation theory predictions, and find that treating neutrinos as contributing at quadratic order in perturbation theory provides a good fit to the measurements in the simulations. On the other hand, as expected, a linear approximation for neutrino perturbations would lead to O(f_ν) errors on the total matter bispectrum at large scales. We then attempt an extension of previous results on the universality of linear halo bias in neutrino cosmologies to non-linear and non-local corrections, finding results consistent with the power spectrum analysis.
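    For reference, the lowest-order ("simplest") perturbation-theory prediction such measurements are compared against is the tree-level matter bispectrum built from the standard second-order kernel:

```latex
% Tree-level matter bispectrum in standard perturbation theory:
B(k_1,k_2,k_3) = 2\,F_2(\mathbf{k}_1,\mathbf{k}_2)\,P(k_1)\,P(k_2) \;+\; \text{2 cyclic permutations},
\qquad
F_2(\mathbf{k}_1,\mathbf{k}_2) = \frac{5}{7}
  + \frac{\mathbf{k}_1\cdot\mathbf{k}_2}{2\,k_1 k_2}\left(\frac{k_1}{k_2}+\frac{k_2}{k_1}\right)
  + \frac{2}{7}\,\frac{(\mathbf{k}_1\cdot\mathbf{k}_2)^{2}}{k_1^{2}\,k_2^{2}}.
```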

  7. Safeguarding of large scale reprocessing and MOX plants

    International Nuclear Information System (INIS)

    Howsley, R.; Burrows, B.; Longevialle, H. de; Kuroi, H.; Izumi, A.

    1997-01-01

    In May 1997, the IAEA Board of Governors approved the final measures of the '93+2' safeguards strengthening programme, thus improving the international non-proliferation regime by enhancing the effectiveness and efficiency of safeguards verification. These enhancements are not, however, a revolution in current practices, but rather an important step in the continuous evolution of the safeguards system. The principles embodied in 93+2, for broader access to information and increased physical access, already apply, in a pragmatic way, to large scale reprocessing and MOX fabrication plants. In these plants, qualitative measures and process monitoring play an important role, in addition to accountancy and material balance evaluations, in attaining the safeguards goals. This paper will reflect on the safeguards approaches adopted for these large bulk-handling facilities and draw analogies, conclusions and lessons for the forthcoming implementation of the 93+2 Programme. (author)

  8. Large scale indenter test program to measure sub-gouge displacements

    Energy Technology Data Exchange (ETDEWEB)

    Been, Ken; Lopez, Juan [Golder Associates Inc, Houston, TX (United States); Sancio, Rodolfo [MMI Engineering Inc., Houston, TX (United States)

    2011-07-01

    The production of submarine pipelines in an offshore environment covered with ice is very challenging. Several precautions must be taken such as burying the pipelines to protect them from ice movement caused by gouging. The estimation of the subgouge displacements is a key factor in pipeline design for ice gouged environments. This paper investigated a method to measure subgouge displacements. An experimental program was implemented in an open field to produce large scale idealized gouges on engineered soil beds (sand and clay). The horizontal force required to produce the gouge, the subgouge displacements in the soil and the strain imposed by these displacements were monitored on a buried model pipeline. The results showed that for a given keel, the gouge depth was inversely proportional to undrained shear strength in clay. The subgouge displacements measured did not show a relationship with the gouge depth, width or soil density in sand and clay tests.

  9. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

    Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasma by simulation, and to exchange information and carry out joint research with scientists around the world over the internet. The characteristics of SIMON are as follows: 1) reduced simulation load through a trigger-sending method; 2) visualization of simulation results and a hierarchical structure of analysis; 3) a reduced number of licenses, by using the command line when software is used; 4) improved support for using simulation data output over the network through HTML (Hyper Text Markup Language); 5) avoidance of complex built-in work in the client part; and 6) small and portable software. The visualization method for large scale simulation, the remote collaboration system based on HTML, the trigger-sending method, the hierarchical analysis method, the introduction into a three-dimensional electromagnetic transport code, and the technologies of the SIMON system are explained. (S.Y.)

  10. Large-scale demonstration of waste solidification in saltstone

    International Nuclear Information System (INIS)

    McIntyre, P.F.; Oblath, S.B.; Wilhite, E.L.

    1988-05-01

    The saltstone lysimeters are a large scale demonstration of a disposal concept for decontaminated salt solution resulting from in-tank processing of defense waste. The lysimeter experiment has provided data on the leaching behavior of large saltstone monoliths under realistic field conditions. The results also will be used to compare the effect of capping the wasteform on contaminant release. Biweekly monitoring of sump leachate from three lysimeters has continued on a routine basis for approximately 3 years. An uncapped lysimeter has shown the highest levels of nitrate and 99Tc release. Gravel- and clay-capped lysimeters have shown levels equivalent to or slightly higher than background rainwater levels. Mathematical model predictions have been compared to lysimeter results. The models will be applied to predict the impact of saltstone disposal on groundwater quality. 9 refs., 5 figs., 3 tabs

  11. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long-term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper describes: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; and the research and development of the new electronic custody transfer method

  12. A Java based environment to control and monitor distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.

    1997-01-01

    Distributed processing systems are being considered to meet the challenging requirements of triggering and data acquisition systems for future HEP experiments. The aim of this work is to present a software environment to control and monitor large scale parallel processing systems, based on a distributed client-server approach developed in Java. One server task may control several processing nodes, switching elements or controllers for different sub-systems. Servers are designed as multi-threaded applications for efficient communication with other objects. Servers communicate among themselves by using Remote Method Invocation (RMI) in a peer-to-peer mechanism. This distributed server layer has to provide dynamic and transparent access from any client to all the resources in the system. The graphical user interface programs, which are platform independent, may be transferred to any client via the http protocol. In this scheme the control and monitoring tasks are distributed among servers, and the network controls the flow of information among servers and clients, providing a flexible mechanism for monitoring and controlling large heterogeneous distributed systems. (author)
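    The environment described is built on Java RMI. Purely as an analogue of the same pattern, a multi-threaded server task that gathers node state and exposes it to remote clients, here is a sketch using Python's standard xmlrpc library; the node names, port and polling interval are invented.

```python
from xmlrpc.server import SimpleXMLRPCServer
import threading, time

class NodeServer:
    """One 'server task' overseeing several processing nodes; clients call
    its methods remotely, mirroring the peer-to-peer RMI scheme."""
    def __init__(self):
        self._status = {}
        threading.Thread(target=self._poll, daemon=True).start()

    def _poll(self):
        while True:  # background thread periodically gathers node state
            self._status["node-1"] = {"load": 0.42, "ts": time.time()}
            time.sleep(5)

    def get_status(self, node_id):
        return self._status.get(node_id, {})

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True, logRequests=False)
server.register_instance(NodeServer())
server.serve_forever()
```

    A client would query it with xmlrpc.client.ServerProxy("http://host:8000").get_status("node-1"); RMI additionally lets servers invoke each other the same way, which is what makes the layer peer-to-peer.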

  13. Large scale injection test (LASGIT) modelling

    International Nuclear Information System (INIS)

    Arnedo, D.; Olivella, S.; Alonso, E.E.

    2010-01-01

    Document available in extended abstract form only. With the objective of understanding gas flow processes through clay barriers in schemes of radioactive waste disposal, the Lasgit in situ experiment was planned and is currently in progress. Modelling the experiment will permit a better understanding of the responses, confirm hypotheses about mechanisms and processes, and provide lessons for the design of future experiments. The experiment and modelling activities are included in the project FORGE (FP7). The in situ large scale injection test Lasgit is currently being performed at the Aespoe Hard Rock Laboratory by SKB and BGS. A schematic layout of the test is shown. The deposition hole follows the KBS3 scheme. A copper canister is installed on the axis of the deposition hole, surrounded by blocks of highly compacted MX-80 bentonite. A concrete plug is placed at the top of the buffer. A metallic lid anchored to the surrounding host rock is included in order to prevent vertical movements of the whole system during gas injection stages (high gas injection pressures are expected to be reached). Hydration of the buffer material is achieved by injecting water through filter mats, two placed at the rock walls and two at the interfaces between bentonite blocks. Water is also injected through the 12 canister filters. Gas injection stages are performed by injecting gas into some of the canister injection filters. Since the water pressure and the stresses (swelling pressure development) will be high during gas injection, it is necessary to inject at high gas pressures. This implies mechanical couplings, as gas penetrates once the gas entry pressure is reached and may produce deformations which in turn lead to permeability increases. A 3D hydro-mechanical numerical model of the test using CODE-BRIGHT is presented. The domain considered for the modelling is shown. The materials considered in the simulation are the MX-80 bentonite blocks (cylinders and rings), the concrete plug

  14. Large-scale fuel cycle centers

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The United States Nuclear Regulatory Commission (NRC) has considered the nuclear energy center concept for fuel cycle plants in the Nuclear Energy Center Site Survey - 1975 (NECSS-75) -- an important study mandated by the U.S. Congress in the Energy Reorganization Act of 1974 which created the NRC. For the study, NRC defined fuel cycle centers to consist of fuel reprocessing and mixed oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle center sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000 MWe. The types of fuel cycle facilities located at the fuel cycle center permit the assessment of the role of fuel cycle centers in enhancing the safeguarding of strategic special nuclear materials -- plutonium and mixed oxides. Siting of fuel cycle centers presents a considerably smaller problem than the siting of reactors. A single reprocessing plant of the scale projected for use in the United States (1500-2000 MT/yr) can reprocess the fuel from reactors producing 50,000-65,000 MWe. Only two or three fuel cycle centers of the upper limit size considered in the NECSS-75 would be required in the United States by the year 2000. The NECSS-75 fuel cycle center evaluations showed that large scale fuel cycle centers present no real technical difficulties in siting from a radiological effluent and safety standpoint. Some construction economies may be attainable with fuel cycle centers; such centers offer opportunities for improved waste management systems. Combined centers consisting of reactors and fuel reprocessing and mixed oxide fuel fabrication plants were also studied in the NECSS. Such centers can eliminate not only the shipment of plutonium but also that of mixed oxide fuel. Increased fuel cycle costs result from implementation of combined centers unless the fuel reprocessing plants are commercial-sized. Development of plutonium-burning reactors could reduce any

  15. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear-align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area and low-cost color reflective displays, inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to the brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  16. Large scale implementation of guided wave based broken rail monitoring

    Science.gov (United States)

    Burger, Francois A.; Loveday, Philip W.; Long, Craig S.

    2015-03-01

    A guided wave ultrasound system has been developed over the past 17 years to detect breaks in continuously welded rail track. Installation of the version 4 system on an 840 km long heavy-duty freight line was conducted between January 2013 and June 2014. The system operates in pitch-catch mode with alternate transmit and receive transducers spaced approximately 1 km apart. If the acoustic signal is not received at the receive station, an alarm is triggered to indicate a break in the rail between the transmit station and the receive station. The system is permanently installed, powered by solar panels, and issues broken rail alarms over the GSM network where available and digital radio technology in other areas. A total of 931 stations were installed and the entire length of rail is interrogated every fifteen minutes. The system operates reliably, although some problems involving unreliable GSM communication and theft of solar panels have been experienced. In the first two months of operation four broken rails were detected and train operation was halted temporarily for repairs.
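    The pitch-catch alarm logic amounts to a watchdog per transmit-receive span: if the guided wave stops arriving, the rail in between is suspect. A toy sketch, with the fifteen-minute interrogation period taken from the abstract and all other names and values assumed:

```python
import time

INTERROGATION_PERIOD_S = 15 * 60      # each span is probed every fifteen minutes
SPANS = ["km-001/002", "km-002/003"]  # hypothetical transmit->receive spans

last_received = {span: time.time() for span in SPANS}

def on_signal_received(span: str) -> None:
    """Called whenever the receive station detects the transmitted wave."""
    last_received[span] = time.time()

def broken_rail_alarms(now: float, grace: int = 2) -> list:
    """Return spans that have missed `grace` consecutive interrogation
    cycles (the rail between their stations may be broken)."""
    return [s for s, t in last_received.items()
            if now - t > grace * INTERROGATION_PERIOD_S]
```

    A deployed system must also suppress false alarms from failed GSM links or powered-down stations, which is why communication problems are reported separately from rail breaks.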

  17. Security and VO management capabilities in a large-scale Grid operating system

    OpenAIRE

    Aziz, Benjamin; Sporea, Ioana

    2014-01-01

    This paper presents a number of security and VO management capabilities in a large-scale distributed Grid operating system. The capabilities formed the basis of the design and implementation of a number of security and VO management services in the system. The main aim of the paper is to give an idea of the various functional cases that need to be considered when designing similar large-scale systems in the future.

  18. Complex modular structure of large-scale brain networks

    Science.gov (United States)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  19. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

    Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical application: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and is inspired by the wall of the fungus-like oomycetes, which is reproduced by introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive materials (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  20. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Marcello [Univ. of Toronto, ON (Canada); Baldauf, T. [Inst. of Advanced Studies, Princeton, NJ (United States); Bond, J. Richard [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Dalal, N. [Univ. of Illinois, Urbana-Champaign, IL (United States); Putter, R. D. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Dore, O. [Jet Propulsion Lab., Pasadena, CA (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Green, Daniel [Univ. of Toronto, ON (Canada); Canadian Inst. for Advanced Research, Toronto, ON (Canada); Hirata, Chris [The Ohio State Univ., Columbus, OH (United States); Huang, Zhiqi [Univ. of Toronto, ON (Canada); Huterer, Dragan [Univ. of Michigan, Ann Arbor, MI (United States); Jeong, Donghui [Pennsylvania State Univ., University Park, PA (United States); Johnson, Matthew C. [York Univ., Toronto, ON (Canada); Perimeter Inst., Waterloo, ON (Canada); Krause, Elisabeth [Stanford Univ., CA (United States); Loverde, Marilena [Univ. of Chicago, IL (United States); Meyers, Joel [Univ. of Toronto, ON (Canada); Meeburg, Daniel [Univ. of Toronto, ON (Canada); Senatore, Leonardo [Stanford Univ., CA (United States); Shandera, Sarah [Pennsylvania State Univ., University Park, PA (United States); Silverstein, Eva [Stanford Univ., CA (United States); Slosar, Anze [Brookhaven National Lab. (BNL), Upton, NY (United States); Smith, Kendrick [Perimeter Inst., Waterloo, Toronto, ON (Canada); Zaldarriaga, Matias [Univ. of Toronto, ON (Canada); Assassi, Valentin [Cambridge Univ. (United Kingdom); Braden, Jonathan [Univ. of Toronto, ON (Canada); Hajian, Amir [Univ. of Toronto, ON (Canada); Kobayashi, Takeshi [Perimeter Inst., Waterloo, Toronto, ON (Canada); Univ. of Toronto, ON (Canada); Stein, George [Univ. of Toronto, ON (Canada); Engelen, Alexander van [Univ. of Toronto, ON (Canada)

    2014-12-15

    The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc (f_NL^eq), natural target levels of sensitivity are Δf_NL^(loc, eq) ≃ 1. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.
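    The local template amplitude quoted above is conventionally defined through a quadratic expansion of the curvature perturbation around a Gaussian field:

```latex
% Local-type primordial non-Gaussianity (standard convention):
\zeta(\mathbf{x}) \;=\; \zeta_{g}(\mathbf{x})
  \;+\; \tfrac{3}{5}\, f_{\rm NL}^{\rm loc}\left(\zeta_{g}^{2}(\mathbf{x}) - \langle \zeta_{g}^{2}\rangle\right),
% with zeta_g Gaussian; the factor 3/5 translates between the curvature
% perturbation zeta and the Newtonian potential Phi = (3/5) zeta in
% which f_NL was originally defined.
```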

  1. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real-world applications in areas like sensor placement in large scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming, based on a sampling-based approach for uncertainty analysis and statistical reweighting to obtain probability information, is demonstrated in this book. Stochastic optimization problems are difficult to solve since they involve dealing with optimization and uncertainty loops. There are two fundamental approaches used to solve such problems: the first is decomposition techniques, and the second identifies problem-specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the objective function type or the underlying distributions for the uncertain variables. Moreover, these ...

  2. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated background. The method is demonstrated on images of particles captured through a microscope, and we show how it can handle transparent particles with significant glare points. The method generalizes to other problems. This is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation.
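    A minimal sketch of segmentation as outlier detection under a Gaussian background model, fitted robustly; the paper's exact estimator and threshold-selection rule may differ, so treat every choice below as an assumption:

```python
import numpy as np
from scipy import stats

def segment_by_testing(img: np.ndarray, alpha: float = 1e-3) -> np.ndarray:
    """Treat each pixel as a hypothesis test: H0 = 'pixel is background'.
    The background is modelled as Gaussian with robust location/scale
    estimates; pixels with upper-tail p-value < alpha form the segment."""
    med = np.median(img)
    mad = np.median(np.abs(img - med))
    sigma = 1.4826 * mad                          # MAD -> std for a Gaussian
    p = stats.norm.sf(img, loc=med, scale=sigma)  # upper-tail p-value per pixel
    return p < alpha                              # boolean outlier (segment) mask
```

    Because the threshold is stated as a probability (alpha) rather than a raw intensity, the same rule adapts across images with different global intensity levels, which is the point the abstract makes.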

  3. Thermal power generation projects "Large Scale Solar Heating"; EU THERMIE projects "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of the subject technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and which mainly serves the transfer of technology. (orig.)

  4. Modeling Human Behavior at a Large Scale

    Science.gov (United States)

    2012-01-01

    which is consistent with previous findings (Kwak et al., 2010; Java et al., 2007). Unique locations are the result of iterative clustering that merges... subjects living in the Seattle metropolitan area. The colored triangular cells represent a probability distribution of the person's location given an... 2010. Using a Python script, we periodically queried Twitter with requests of all recent tweets within 150 kilometers of LA's city center, and 100...

  5. Algorithms for Large-Scale Astronomical Problems

    Science.gov (United States)

    2013-08-01


  6. Large-Scale Structure Behind The Milky Way with ALFAZOA

    Science.gov (United States)

    Sanchez Barrantes, Monica; Henning, Patricia A.; Momjian, Emmanuel; McIntyre, Travis; Minchin, Robert F.

    2018-06-01

    The region of the sky behind the Milky Way (the Zone of Avoidance; ZOA) is not well studied due to high obscuration from gas and dust in our galaxy as well as stellar confusion, which results in a low detection rate of galaxies in this region. Because of this, little is known about the distribution of galaxies in the ZOA, and all-sky redshift surveys have incomplete maps (e.g. the 2MASS Redshift Survey in the NIR has a gap of 5-8 deg around the Galactic plane). There is still controversy about the dipole anisotropy calculated from the comparison between the CMB and galaxy and redshift surveys, in part due to the incomplete sky mapping and redshift depth of these surveys. Fortunately, there is no ZOA at radio wavelengths, because such wavelengths pass unimpeded through dust and are not affected by stellar confusion. Therefore, we can detect and map the distribution of obscured galaxies through the 21 cm neutral hydrogen emission line, and trace the large-scale structure across the Galactic plane. The Arecibo L-Band Feed Array Zone of Avoidance (ALFAZOA) survey is a blind HI survey for galaxies behind the Milky Way that covers more than 1000 square degrees of the sky, conducted in two phases: shallow (completed) and deep (ongoing). We show the results of the finished shallow phase of the survey, which mapped a region between galactic longitude l = 30-75 deg and latitude |b| < 10 deg, and detected 418 galaxies to about 12,000 km/s, including galaxy properties and mapped large-scale structure. We do the same for new results from the ongoing deep phase, which covers 30 < l < 75 deg and |b| < 2 deg for the inner galaxy, and 175 < l < 207 deg with -2 < b < 1 for the outer galaxy.

  7. Locating inefficient links in a large-scale transportation network

    Science.gov (United States)

    Sun, Li; Liu, Like; Xu, Zhongzhi; Jie, Yang; Wei, Dong; Wang, Pu

    2015-02-01

    Based on data from a geographical information system (GIS) and daily commuting origin-destination (OD) matrices, we estimated the distribution of traffic flow in the San Francisco road network and studied Braess's paradox in a large-scale transportation network with realistic travel demand. We measured the variation of total travel time ΔT when a road segment is closed, and found that |ΔT| follows a power-law distribution both for ΔT > 0 and for ΔT < 0. This implies that most roads have a negligible effect on the efficiency of the road network, while the failure of a few crucial links would result in severe travel delays, and closure of a few inefficient links would counter-intuitively reduce travel costs considerably. Generating three theoretical networks, we discovered that the heterogeneously distributed travel demand may be the origin of the observed power-law distributions of |ΔT|. Finally, a genetic algorithm was used to pinpoint inefficient link clusters in the road network. We found that closing specific road clusters would further improve the transportation efficiency.

  8. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes placed in big rooms, it is very easy for managers to confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms for large-scale cluster systems, and proposes how to realize automatic management of such systems. (authors)

  9. Distributed monitoring system based on Icinga

    International Nuclear Information System (INIS)

    Haen, C.; Bonaccorsi, E.; Neufeld, N.

    2012-01-01

    The LHCb online system relies on a large and heterogeneous IT infrastructure: it comprises more than 2000 servers and embedded systems and more than 200 network devices. Much of this equipment is critical to running the experiment, so it is important to have a monitoring solution efficient enough that the experts can diagnose problems and act quickly. While our previous system was based on a central Nagios server, our current system uses a distributed Icinga infrastructure. We have a single instance of Icinga that schedules the checks and deals with the results, but other servers (called 'workers') perform the checks. The load induced on the workers is negligible, whereas the central server is fully busy. A client/server model based on Gearman manages the queues the clients use to get their tasks and return their results. The other interesting feature of Icinga is the database back-end: Icinga logs the result of every action and check result in a database. The new installation is now running 36,000 service checks on 2,100 hosts with 50 Gearman workers. Performance has improved dramatically
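    The Gearman queue between the Icinga scheduler and its check workers can be sketched with the python-gearman client library (assuming its 2.x API; the job-server address, queue name and result encoding below are invented):

```python
import subprocess
import gearman  # python-gearman client library (2.x API assumed)

def run_check(worker, job):
    """Execute the plugin command handed over by the scheduler and return
    its exit code and output (the check result)."""
    proc = subprocess.run(job.data, shell=True, capture_output=True, text=True)
    return "%d|%s" % (proc.returncode, proc.stdout.strip())

worker = gearman.GearmanWorker(["gearman-server:4730"])  # assumed job-server address
worker.register_task("check_results", run_check)         # assumed queue name
worker.work()  # block forever, pulling checks off the queue
```

    Scaling out is then a matter of starting more such worker processes on more machines; the central scheduler never executes plugins itself, which is what keeps its load bounded.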

  10. Large scale flow in the dayside magnetosheath

    International Nuclear Information System (INIS)

    Crooker, N.U.; Siscoe, G.L.; Eastman, T.E.; Frank, L.A.; Zwickl, R.D.

    1984-01-01

    The degree of control over plasma flow direction exerted by the compressed magnetic field in the dayside magnetosheath is examined by comparing ISEE 1 LEPEDEA data with hydrodynamic and magnetohydrodynamic predictions. Measured flow directions projected toward the subsolar region pass within approximately 1 R_E of the aberrated theoretical hydrodynamic stagnation point in 11 of 20 cases analyzed. The remaining nine cases pass within approximately 2-3 R_E of the stagnation point. One case with large deflection has been studied in detail with high-time-resolution plasma and magnetic field data both from ISEE 1 and from ISEE 3, in the role of a solar wind monitor. The deflected flow is persistent over a period of 1 1/2 hours, and its direction is consistent with a stagnation point displacement resulting from increased, asymmetric magnetic field pressure contributions during periods of low Alfven Mach number, as predicted by Russell et al. Of the other eight cases with large deflections, four are associated with flux transfer events identified independently by Berchem and Russell. The observed deflections in these cases are consistent with either the subsolar merging line or the antiparallel merging hypothesis, but not exclusively with one or the other. The results relating to the formation of a stagnation line rather than a stagnation point are inconclusive

  11. Large-scale field testing on flexible shallow landslide barriers

    Science.gov (United States)

    Bugnion, Louis; Volkwein, Axel; Wendeler, Corinna; Roth, Andrea

    2010-05-01

    Open shallow landslides occur regularly in a wide range of natural terrains. Generally, they are difficult to predict and result in damage to property and disruption of transportation systems. In order to improve the knowledge about the physical process itself and to develop new protection measures, large-scale field experiments were conducted in Veltheim, Switzerland. Material was released down a 30° inclined test slope into a flexible barrier. The flow as well as the impact into the barrier was monitored using various measurement techniques. Laser devices recording flow heights, a special force plate measuring normal and shear basal forces, as well as load cells for impact pressures were installed along the test slope. In addition, load cells were built into the support and retaining cables of the barrier to provide data for detailed back-calculation of the load distribution during impact. For the last test series an additional guiding wall in the flow direction was installed on both sides of the barrier to achieve higher impact pressures in the middle of the barrier; with these guiding walls the flow is not able to spread out before hitting the barrier. A specially constructed release mechanism simulating the sudden failure of the slope was designed such that about 50 m3 of mixed earth and gravel saturated with water can be released in an instant. Analysis of cable forces combined with impact pressures and velocity measurements during a test series now allows us to develop a load model for the barrier design. First numerical simulations with the software tool FARO, originally developed for rockfall barriers and afterwards calibrated for debris flow impacts, have already led to structural improvements in barrier design. Decisive for the barrier design is first the dynamic impact pressure, which depends on the flow velocity, and afterwards the hydrostatic pressure of the complete retained material behind the barrier. Therefore volume estimation of open shallow landslides by assessing

  12. The Effect of Large Scale Salinity Gradient on Langmuir Turbulence

    Science.gov (United States)

    Fan, Y.; Jarosz, E.; Yu, Z.; Jensen, T.; Sullivan, P. P.; Liang, J.

    2017-12-01

    Langmuir circulation (LC) is believed to be one of the leading-order causes of turbulent mixing in the upper ocean. It is important for momentum and heat exchange across the mixed layer (ML) and directly impacts the dynamics and thermodynamics of the upper ocean and lower atmosphere, including the vertical distributions of chemical, biological, optical, and acoustic properties. Based on Craik and Leibovich (1976) theory, large eddy simulation (LES) models have been developed to simulate LC in the upper ocean, yielding new insights that could not be obtained from field observations and turbulence closure models. Due to its high computational cost, LES is usually limited to small domain sizes and cannot resolve large-scale flows. Furthermore, most LES models used in LC simulations apply periodic boundary conditions in the horizontal directions, which assumes that the physical properties (i.e. temperature and salinity) and expected flow patterns in the area of interest are of a periodically repeating nature, so that the limited small LES domain is representative of the larger area. Using periodic boundary conditions can significantly reduce computational effort, and the assumption is a good one for isotropic shear turbulence. However, LC is anisotropic (McWilliams et al 1997) and was observed to be modulated by crosswind tidal currents (Kukulka et al 2011). Using symmetrical domains, idealized LES studies also indicate LC could interact with oceanic fronts (Hamlington et al 2014) and standing internal waves (Chini and Leibovich, 2005). The present study expands our previous LES modeling investigations of Langmuir turbulence to realistic ocean conditions with large-scale environmental motion, featuring fresh water inflow into the study region. Large-scale gradient forcing is introduced to the NCAR LES model through scale separation analysis. The model is applied to a field observation in the Gulf of Mexico in July 2016, when the measurement site was impacted by
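    For reference, in Craik-Leibovich theory the surface-wave Stokes drift u_s enters the LES momentum equation as a vortex force (written schematically here, omitting buoyancy, Coriolis and subgrid-scale terms):

```latex
% Craik--Leibovich (1976) momentum equation underlying such LES models:
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
  \;=\; -\,\nabla \pi \;+\; \mathbf{u}_{s} \times \boldsymbol{\omega}
  \;+\; \nu\,\nabla^{2}\mathbf{u} \;+\; \dots
% The vortex force u_s x omega (omega = curl u) is the term that
% generates the counter-rotating Langmuir cells.
```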

  13. Large-scale, long-term silvicultural experiments in the United States: historical overview and contemporary examples.

    Science.gov (United States)

    R. S. Seymour; J. Guldin; D. Marshall; B. Palik

    2006-01-01

    This paper provides a synopsis of large-scale, long-term silviculture experiments in the United States. Large-scale in a silvicultural context means that experimental treatment units encompass entire stands (5 to 30 ha); long-term means that results are intended to be monitored over many cutting cycles or an entire rotation, typically for many decades. Such studies...

  14. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  15. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  16. Autonomous Sensors for Large Scale Data Collection

    Science.gov (United States)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" which remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. By incorporating recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification, this system achieves cost savings in manufacturing, deployment and lifetime operation. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can be folded into the data collection and analysis architecture easily, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments: the limited size, weight and power (SWaP) and the challenging thermal environment demand the development of a new generation of instruments, and the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the

  17. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent
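    The two screening checks described (runoff coefficients above one, and losses exceeding the potential-evaporation limit) reduce to a few lines per basin. A sketch with assumed column names and units:

```python
import pandas as pd

def screen_basins(df: pd.DataFrame) -> pd.DataFrame:
    """Flag basins whose long-term water balance cannot close.
    Expects columns (assumed names): 'P', 'Q', 'PET', all in mm/yr."""
    out = df.copy()
    out["runoff_coeff"] = out["Q"] / out["P"]
    # RC > 1: more discharge than precipitation, e.g. snow undercatch in P
    out["rc_too_high"] = out["runoff_coeff"] > 1.0
    # Losses P - Q beyond PET: evaporation exceeding its physical limit
    out["exceeds_pet_limit"] = (out["P"] - out["Q"]) > out["PET"]
    return out

# Toy example: basin 1 fails the runoff check, basin 2 the PET check
basins = pd.DataFrame({"P": [800, 400], "Q": [900, 100], "PET": [600, 250]})
print(screen_basins(basins)[["runoff_coeff", "rc_too_high", "exceeds_pet_limit"]])
```

    Basins failing either check would be treated as potentially disinformative and inspected before being used for model calibration or evaluation.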

  18. Thermal activation of dislocations in large scale obstacle bypass

    Science.gov (United States)

    Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique

    2017-08-01

    Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy on these interactions is possible with a framework provided by harmonic transition state theory (HTST) enabling direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large-scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects plays an important role in the activation energies for bypass. A phenomenological model for activation energy stress dependence is shown to describe well the effect of a distribution of activation energies, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.
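
    The HTST machinery referred to above amounts to evaluating an Arrhenius rate for each barrier and then compressing a population of barriers into one effective value. The Python sketch below shows one simple coarse-graining rule (matching the ensemble-mean rate); this averaging is an assumption for illustration, not necessarily the rule used by the authors.

        import math

        K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

        def arrhenius_rate(delta_e_ev, temp_k, nu0=1.0e12):
            """HTST bypass rate (1/s) for one barrier, with attempt frequency nu0."""
            return nu0 * math.exp(-delta_e_ev / (K_B_EV * temp_k))

        def effective_barrier(barriers_ev, temp_k):
            """Effective activation energy (eV) that reproduces the ensemble-mean rate."""
            mean_boltz = sum(math.exp(-e / (K_B_EV * temp_k))
                             for e in barriers_ev) / len(barriers_ev)
            return -K_B_EV * temp_k * math.log(mean_boltz)

        barriers = [0.8, 0.9, 1.1, 1.4]            # illustrative barrier spectrum, eV
        print(arrhenius_rate(1.0, 600.0))          # ~4e3 1/s at 600 K
        print(effective_barrier(barriers, 600.0))  # dominated by the lowest barriers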

  19. Episodic memory in aspects of large-scale brain networks

    Science.gov (United States)

    Jeong, Woorim; Chung, Chun Kee; Kim, June Sic

    2015-01-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as relying mostly on medial temporal lobe (MTL) structures. However, recent studies have suggested the involvement of a more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved not only by the MTL system but also by a distributed network, particularly the default-mode network (DMN). Furthermore, researchers have begun to investigate memory networks throughout the entire brain, not restricted to the specific resting-state network (RSN). Altered patterns of functional connectivity (FC) among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment. PMID:26321939

  20. Episodic memory in aspects of large-scale brain networks

    Directory of Open Access Journals (Sweden)

    Woorim eJeong

    2015-08-01

    Full Text Available Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as relying mostly on medial temporal lobe (MTL) structures. However, recent studies have suggested the involvement of a more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved not only by the MTL system but also by a distributed network, particularly the default-mode network. Furthermore, researchers have begun to investigate memory networks throughout the entire brain, not restricted to the specific resting-state network. Altered patterns of functional connectivity among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment.

  1. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of the mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  2. Hierarchical formation of large scale structures of the Universe: observations and models

    International Nuclear Information System (INIS)

    Maurogordato, Sophie

    2003-01-01

    In this report for an Accreditation to Supervise Research (HDR), the author gives an overview of her research work in cosmology. This work notably addressed the large-scale distribution of the Universe (with constraints on the formation scenario, on the bias relationship, and on the structuring into clusters), the analysis of galaxy clusters during coalescence, and the mass distribution within relaxed clusters [fr]

  3. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  4. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    The subroutine package "ATLAS" has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  5. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries - Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  6. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    ... structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. Introduction: Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  7. On the relevancy of efficient, integrated computer and network monitoring in HEP distributed online environment

    International Nuclear Information System (INIS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Javello, J.; Miere, Y.; Ruffinoni, D.; Albert, J.N.; Bellas, N.; Smith, G.

    1996-01-01

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call these systems generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system. (author)

  8. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  9. Monitoring extensions for component-based distributed software

    NARCIS (Netherlands)

    Diakov, N.K.; Papir, Z.; van Sinderen, Marten J.; Quartel, Dick

    2000-01-01

    This paper defines a generic class of monitoring extensions to component-based distributed enterprise software. Introducing a monitoring extension to a legacy application system can be very costly. In this paper, we identify the minimum support for application monitoring within the generic

  10. 2MASS Constraints on the Local Large-Scale Structure: A Challenge to LCDM?

    OpenAIRE

    Frith, W. J.; Shanks, T.; Outram, P. J.

    2004-01-01

    We investigate the large-scale structure of the local galaxy distribution using the recently completed 2 Micron All Sky Survey (2MASS). First, we determine the K-band number counts over the 4000 sq.deg. APM survey area where evidence for a large-scale `local hole' has previously been detected and compare them to a homogeneous prediction. Considering a LCDM form for the 2-point angular correlation function, the observed deficiency represents a 5 sigma fluctuation in the galaxy distribution. We...

  11. Large Scale Relationship between Aquatic Insect Traits and Climate.

    Science.gov (United States)

    Bhowmik, Avit Kumar; Schäfer, Ralf B

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage patterns on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has rarely been examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution patterns under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of the temperature-preference grouping feature, which are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution patterns. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperatures showed the strongest response, while insects preferring cold and trichopterans preferring moderate temperatures showed the highest potential for changing distribution. We showed that seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution patterns. Our findings support the development of trait-based metrics to predict and detect climate

  12. Large scale anisotropy studies with the Auger Observatory

    International Nuclear Information System (INIS)

    Santos, E.M.; Letessier-Selvon, A.

    2006-01-01

    With the increasing Auger surface array data sample of the highest energy cosmic rays, large-scale anisotropy studies in this part of the spectrum become a promising path towards understanding the origin of ultra-high energy cosmic particles. We describe the methods underlying the search for distortions in the cosmic-ray arrival directions over large angular scales, that is, larger than those commonly employed in the search for correlations with point-like sources. The widely used tools, known as coverage maps, are described and some of the issues involved in their calculation are presented through Monte Carlo based studies. Coverage computation requires deep knowledge of the local detection efficiency, including the influence of weather parameters such as temperature and pressure. Particular attention is devoted to a newly proposed method to extract the coverage, based upon the assumption of time factorization of an extensive air shower detector's acceptance. We use Auger monitoring data to test the goodness of this hypothesis. We finally show the necessity of using more than one coverage to extract any possible anisotropic pattern on the sky, by pointing to some of the biases present in commonly used methods based, for example, on the scrambling of the UTC arrival times for each event. (author)

  13. Distributed Monitoring of the R² Statistic for Linear Regression

    Science.gov (United States)

    Bhaduri, Kanishka; Das, Kamalika; Giannella, Chris R.

    2011-01-01

    The problem of monitoring a multivariate linear regression model is relevant in studying the evolving relationship between a set of input variables (features) and one or more dependent target variables. This problem becomes challenging for large-scale data in a distributed computing environment when only a subset of instances is available at individual nodes and the local data changes frequently. Data centralization and periodic model recomputation can add high overhead to tasks like anomaly detection in such dynamic settings. Therefore, the goal is to develop techniques for monitoring and updating the model over the union of all nodes' data in a communication-efficient fashion. Correctness guarantees on such techniques are also often highly desirable, especially in safety-critical application scenarios. In this paper we develop DReMo, a distributed algorithm with very low resource overhead, for monitoring the quality of a regression model in terms of its coefficient of determination (the R² statistic). When the nodes collectively determine that R² has dropped below a fixed threshold, the linear regression model is recomputed via a network-wide convergecast and the updated model is broadcast back to all nodes. We show empirically, using both synthetic and real data, that our proposed method is highly communication-efficient and scalable, and also provide theoretical guarantees on correctness.
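
    The monitoring condition itself is easy to state: the regression is recomputed only when the collectively estimated R² falls below a threshold. The Python sketch below shows that condition in centralized form as an illustration; the actual DReMo protocol evaluates it with local thresholding and a convergecast rather than by pooling the raw data as done here.

        import numpy as np

        def r_squared(y_true, y_pred):
            """Coefficient of determination of predictions y_pred for targets y_true."""
            ss_res = np.sum((y_true - y_pred) ** 2)
            ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
            return 1.0 - ss_res / ss_tot

        def needs_recompute(local_batches, model_w, threshold=0.8):
            """True when the current global model's R^2 over all nodes' data has
            dropped below the threshold (data pooled here only for illustration)."""
            ys = [y for _, y in local_batches]
            yhats = [X @ model_w for X, _ in local_batches]
            return r_squared(np.concatenate(ys), np.concatenate(yhats)) < threshold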

  14. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    Science.gov (United States)

    2016-07-25

    Second, we need to mitigate the threat from insiders. Though the system only allows authenticated clients to locate and access its service, insiders ... and constrain the distributed persistent inside crawlers that have valid credentials to access the web services. The main idea is to add a marker

  15. Distributed optical fiber sensors for integrated monitoring of railway infrastructures

    Science.gov (United States)

    Minardo, Aldo; Coscetta, Agnese; Porcaro, Giuseppe; Giannetta, Daniele; Bernini, Romeo; Zeni, Luigi

    2014-05-01

    We propose the application of a distributed optical fiber sensor based on stimulated Brillouin scattering as an integrated system for safety monitoring of railway infrastructures. The strain distribution was measured dynamically along a 60-meter length of rail track, as well as along a 3-m stone arch bridge. The results indicate that distributed sensing technology is able to provide useful information for railway traffic and safety monitoring.
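
    Behind such measurements is a linear calibration: the Brillouin frequency shift at each point along the fibre varies linearly with strain (and temperature). The Python sketch below performs that conversion using typical literature coefficients for standard single-mode fibre at 1550 nm; these are stand-in values, not the calibration constants of this installation.

        def strain_profile(bfs_mhz, bfs_ref_mhz, c_strain=0.05, delta_t_c=0.0, c_temp=1.0):
            """Strain (microstrain) at each sensing point from measured Brillouin
            frequency shifts, after removing a temperature contribution.
            c_strain in MHz per microstrain; c_temp in MHz per degree Celsius."""
            return [(nu - nu0 - c_temp * delta_t_c) / c_strain
                    for nu, nu0 in zip(bfs_mhz, bfs_ref_mhz)]

        # Rail-section example: 10 MHz of excess shift at the middle sensing point.
        print(strain_profile([10852.0, 10862.0, 10852.0], [10852.0] * 3))
        # -> [0.0, 200.0, 0.0] microstrain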

  16. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using supercomputers, which generally requires long CPU time and produces large amounts of data, has been extensively studied as a third pillar in various advanced science fields, in parallel to theory and experiment. Such simulation is expected to lead to new scientific discoveries through the elucidation of various complex phenomena which are hardly identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces an idea of update processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method which transmits various requests for update processing from the simulation (client) running on a supercomputer to a workstation (server). Namely, the simulation running on a supercomputer actively controls the timing of update processing. The server, having received requests from the ongoing simulation (data transfer, data analyses, visualizations, etc.), starts operations according to those requests during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project of laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another

  17. A Novel Architecture for Large-scale Communication in the IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things (IoT) and networked physical systems. However, few have described in detail the large-scale communication architecture of the IoT. In fact, non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  18. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  19. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulics & Coastal Engineering Laboratory, Aalborg University, Denmark and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from ... small and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  20. Analysis of environmental impact assessment for large-scale X-ray medical equipments

    International Nuclear Information System (INIS)

    Fu Jin; Pei Chengkai

    2011-01-01

    Based on an Environmental Impact Assessment (EIA) project, this paper elaborates the basic analysis essentials of EIA for the sale of large-scale X-ray medical equipment, and provides the procedure for environmental impact analysis and the dose estimation method under normal and accident conditions. The key points of EIA for the sale of large-scale X-ray medical equipment include determining the pollution factors and management limit values according to the project's actual situation, and utilizing various assessment and prediction methods such as analogy, actual measurement and calculation to analyze, monitor, calculate and predict the pollution under normal and accident conditions. (authors)

  1. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50TB dataset generated by a large-scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
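
    FastQuery builds on FastBit-style bitmap indexing, whose core idea fits in a few lines: each discretized bin of a variable keeps a bitmap of the records falling into it, and a range query reduces to OR-ing bitmaps. The toy Python index below shows only this idea; real FastBit adds bitmap compression (WAH), candidate checks at bin boundaries, and parallel I/O, all omitted here.

        import numpy as np

        class ToyBitmapIndex:
            def __init__(self, values, bin_edges):
                self.bin_edges = np.asarray(bin_edges)
                bins = np.digitize(values, self.bin_edges)  # bin id per record
                self.bitmaps = [bins == b for b in range(len(bin_edges) + 1)]

            def query_greater(self, threshold):
                """Records with value > threshold, as a union of whole-bin bitmaps
                (boundary-bin candidate checking is simplified away here)."""
                first_bin = int(np.searchsorted(self.bin_edges, threshold))
                hits = np.zeros_like(self.bitmaps[0])
                for bitmap in self.bitmaps[first_bin + 1:]:
                    hits |= bitmap
                return np.flatnonzero(hits)

        energies = np.array([0.1, 5.2, 9.8, 3.3, 7.7])
        index = ToyBitmapIndex(energies, bin_edges=[2.0, 4.0, 6.0, 8.0])
        print(index.query_greater(6.0))  # indices of records in bins above 6.0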

  2. Cooling pipeline disposing structure for large-scaled cryogenic structure

    International Nuclear Information System (INIS)

    Takahashi, Hiroyuki.

    1996-01-01

    The present invention concerns an electromagnetic-force supporting structure for superconductive coils. As the size of a cryogenic structure increases, cooling takes longer, the temperature difference between the cooling pipelines and the cryogenic structure grows over a wide range, and differences in thermal shrinkage increase the thermal stresses. In the cooling pipelines for a large-scale cryogenic structure, therefore, the pipelines and the structure are connected by way of a thin metal plate made of a material whose heat conductivity is higher than that of the structure's material by an order of magnitude or more, and the thin metal plate is bent. The displacement between the cryogenic structure and the cooling pipelines caused by heat shrinkage is absorbed by the elongation and shrinkage of the bent section of the thin metal plate, and the thermal stresses due to the displacement are reduced. In addition, the heat of the cryogenic structure is transferred by way of the thin metal plate. The cooling pipelines can thus be secured to the cryogenic structure such that cooling by heat transfer is enabled, while absorbing the large deviations or three-dimensional displacements caused by the differences in temperature distribution between the enlarged, three-dimensional cryogenic structure and the cooling pipelines. (N.H.)

  3. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  4. LARGE-SCALE INDICATIVE MAPPING OF SOIL RUNOFF

    Directory of Open Access Journals (Sweden)

    E. Panidi

    2017-11-01

    Full Text Available In our study we estimate relationships between quantitative parameters of relief, the soil runoff regime, and the spatial distribution of radioactive pollutants in the soil. The study is conducted on a test arable area located in the basin of the upper Oka River (Orel region, Russia). Previously we collected a rich set of soil samples, which makes it possible to investigate the redistribution of Chernobyl-origin cesium-137 in soil material and, as a consequence, the soil runoff magnitude at sampling points. Currently we are describing and discussing the technique applied to large-scale mapping of the soil runoff. The technique is based upon cesium-137 radioactivity measurements in the different relief structures. Key stages are the allocation of places for soil sampling points (we used very high resolution space imagery as supporting data); soil sample collection and analysis; calibration of the mathematical model (using the estimated background value of the cesium-137 radioactivity); and automated compilation of the predictive map of the studied territory (a digital elevation model is used for this purpose, and cesium-137 radioactivity can be predicted using quantitative parameters of the relief). The maps can be used as support data for precision agriculture and for recultivation or melioration purposes.
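
    The calibration step described above can be illustrated with the common first-order approach in cesium-137 budgeting: the relative depletion (or excess) of measured activity with respect to the background reference value is read as an index of soil loss (or deposition). The Python sketch below shows that generic approximation; it is not necessarily the authors' exact model, and all values are illustrative.

        def soil_redistribution_index(activity_bq_m2, reference_bq_m2):
            """Negative values indicate net soil loss (runoff), positive net deposition."""
            return (activity_bq_m2 - reference_bq_m2) / reference_bq_m2

        # Illustrative sampling points on different relief structures:
        samples = {"crest": 2100.0, "midslope": 1500.0, "hollow": 2800.0}
        for point, activity in samples.items():
            print(point, round(soil_redistribution_index(activity, 2000.0), 2))
        # crest 0.05, midslope -0.25 (erosion), hollow 0.4 (deposition)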

  5. Monte Carlo modelling of large scale NORM sources using MCNP.

    Science.gov (United States)

    Wallace, J D

    2013-12-01

    The representative Monte Carlo modelling of large-scale planar sources (for comparison to external environmental radiation fields) is undertaken using planar cylindrical sources of substantial diameter and thin profile. The relative impact of source extent, soil thickness and sky-shine is investigated to guide decisions relating to representative geometries. In addition, the impact of source-to-detector distance on the nature of the detector response, for a range of source sizes, has been investigated. These investigations, using an MCNP-based model, indicate that a soil cylinder of greater than 20 m diameter and no less than 50 cm depth/height, combined with a 20 m deep sky section above the soil cylinder, is needed to representatively model the semi-infinite plane of uniformly distributed NORM sources. Initial investigation of the effect of detector placement indicates that smaller source sizes may be used to achieve a representative response at shorter source-to-detector distances. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  6. Alignment between galaxies and large-scale structure

    International Nuclear Information System (INIS)

    Faltenbacher, A.; Li Cheng; White, Simon D. M.; Jing, Yi-Peng; Mao Shude; Wang Jie

    2009-01-01

    Based on the Sloan Digital Sky Survey DR6 (SDSS) and the Millennium Simulation (MS), we investigate the alignment between galaxies and large-scale structure. For this purpose, we develop two new statistical tools, namely the alignment correlation function and the cos(2θ)-statistic. The former is a two-dimensional extension of the traditional two-point correlation function and the latter is related to the ellipticity correlation function used for cosmic shear measurements. Both are based on the cross-correlation between a sample of galaxies with orientations and a reference sample which represents the large-scale structure. We apply the new statistics to the SDSS galaxy catalog. The alignment correlation function reveals an overabundance of reference galaxies along the major axes of red, luminous (L ∼ L*) galaxies out to projected separations of 60 h⁻¹ Mpc. The signal increases with central galaxy luminosity. No alignment signal is detected for blue galaxies. The cos(2θ)-statistic yields very similar results. Starting from an MS semi-analytic galaxy catalog, we assign an orientation to each red, luminous central galaxy, based on that of the central region of the host halo (with size similar to that of the stellar galaxy). As an alternative, we use the orientation of the host halo itself. We find a mean projected misalignment between a halo and its central region of ∼25 deg. The misalignment decreases slightly with increasing luminosity of the central galaxy. Using the orientations and luminosities of the semi-analytic galaxies, we repeat our alignment analysis on mock surveys of the MS. Agreement with the SDSS results is good if the central orientations are used. Predictions using the halo orientations as proxies for central galaxy orientations overestimate the observed alignment by more than a factor of 2. Finally, the large volume of the MS allows us to generate a two-dimensional map of the alignment correlation function, which shows the reference
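
    The cos(2θ)-statistic has a compact operational form: θ is the angle between the projected major axis of a galaxy with a measured orientation and the direction towards a reference galaxy, and the mean of cos(2θ) over pairs vanishes for isotropy and turns positive for alignment. The Python sketch below computes only that mean; the pair selection, projected separations, and weighting used in the paper are omitted.

        import numpy as np

        def cos2theta_statistic(position_angles, pair_direction_angles):
            """Mean of cos(2*theta) over galaxy-reference pairs; angles in radians,
            measured from the same axis convention on the sky."""
            theta = position_angles - pair_direction_angles
            return float(np.mean(np.cos(2.0 * theta)))

        rng = np.random.default_rng(1)
        iso = cos2theta_statistic(rng.uniform(0, np.pi, 10000),
                                  rng.uniform(0, np.pi, 10000))
        print(round(iso, 3))  # ~0: no alignment for random orientations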

  7. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  8. Large-scale CO2 storage — Is it feasible?

    Directory of Open Access Journals (Sweden)

    Johansen H.

    2013-06-01

    Full Text Available CCS is generally estimated to have to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even if the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, In Salah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into 4 main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in pressure increase in the subsurface, 2) there is no production of fluids that give important feedback on reservoir performance, and 3) the monitoring requirement will have to extend for a much longer time into the future than what is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may impose great damage on both man-made and natural materials, if proper precautions are not executed. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals. The reduced opportunity for direct monitoring of fluid samples

  9. Large-scale CO2 storage — Is it feasible?

    Science.gov (United States)

    Johansen, H.

    2013-06-01

    CCS is generally estimated to have to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even if the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, In Salah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into 4 main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in pressure increase in the subsurface, 2) there is no production of fluids that give important feedback on reservoir performance, and 3) the monitoring requirement will have to extend for a much longer time into the future than what is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may impose great damage on both man-made and natural materials, if proper precautions are not executed. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals. The reduced opportunity for direct monitoring of fluid samples close to the

  10. Large-scale land transformations in Indonesia: The role of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... enable timely responses to the impacts of large-scale land transformations in Central Kalimantan ...

  11. Resolute large scale mining company contribution to health services of

    African Journals Online (AJOL)

    Resolute large-scale mining company contribution to the health services of Lusu ... in terms of socio-economic, health, education, employment, safe drinking water, ... The data were analyzed using the Statistical Package for the Social Sciences (SPSS).

  12. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    fuel/energy, climate, and finance has occurred and one of the most ... this wave of large-scale land acquisitions. In fact, esti... Environmental Rights Action/Friends of the Earth, Nigeria ... map the differentiated impacts (gender, ethnicity, ...

  13. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  14. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Security can help increase accountability for large-scale land acquisitions in ... to build decent economic livelihoods and participate meaningfully in decisions ...

  15. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale one-tenth of the box size. The energy fluxes and shell-to-shell transfer rates computed from the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  16. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  17. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large-scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large-scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  18. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
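
    A hedged illustration of the shrinkage effect at work: under the textbook conjugate inverse-Wishart prior IW(Psi, nu), the posterior mean of the covariance pulls the sample scatter toward the prior scale matrix, which curbs the overfitting the abstract refers to. This is the generic conjugate model, not the authors' specific hierarchy with dependent covariance parameters.

        import numpy as np

        def posterior_mean_covariance(X, psi, nu):
            """Posterior mean of Sigma under an inverse-Wishart(psi, nu) prior,
            for n i.i.d. rows of X (mean assumed known and already subtracted)."""
            n, p = X.shape
            scatter = X.T @ X  # sum of outer products of the centered rows
            return (psi + scatter) / (nu + n - p - 1)

        rng = np.random.default_rng(0)
        X = rng.standard_normal((50, 10))
        sigma_hat = posterior_mean_covariance(X, psi=np.eye(10), nu=12.0)
        print(sigma_hat.shape)  # (10, 10), shrunk toward the identity prior scale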

  19. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  20. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  1. Comprehensive Monitoring for Heterogeneous Geographically Distributed Storage

    Energy Technology Data Exchange (ETDEWEB)

    Ratnikova, N. [Fermilab; Karavakis, E. [CERN; Lammel, S. [Fermilab; Wildish, T. [Princeton U.

    2015-12-23

    Storage capacity at CMS Tier-1 and Tier-2 sites reached over 100 Petabytes in 2014, and will be substantially increased during Run 2 data taking. The allocation of storage for individual users' analysis data, which is not accounted for as centrally managed storage space, will be increased to up to 40%. For comprehensive tracking and monitoring of storage utilization across all participating sites, CMS developed a space monitoring system which provides a central view of the geographically dispersed heterogeneous storage systems. The first prototype was deployed at pilot sites in summer 2014, and has been substantially reworked since then. In this paper we discuss the functionality of the system and our experience of its deployment and operation on the full CMS scale.

  2. Monitoring Distributed Systems: A Relational Approach.

    Science.gov (United States)

    1982-12-01

    ... structure in order to ensure specified invariants, usually relating to synchronization [Hoare 74]. Both definitions emphasize the control, rather than the ... monitor must understand that there are such things as processors, processes, memory, message ports, semaphores, etc., and that certain relationships

  3. Inflationary tensor fossils in large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Dimastrogiovanni, Emanuela [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Fasiello, Matteo [Department of Physics, Case Western Reserve University, Cleveland, OH 44106 (United States); Jeong, Donghui [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Kamionkowski, Marc, E-mail: ema@physics.umn.edu, E-mail: mrf65@case.edu, E-mail: duj13@psu.edu, E-mail: kamion@jhu.edu [Department of Physics and Astronomy, 3400 N. Charles St., Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  4. Large-scale agent-based social simulation : A study on epidemic prediction and control

    NARCIS (Netherlands)

    Zhang, M.

    2016-01-01

    Large-scale agent-based social simulation is gradually proving to be a versatile methodological approach for studying human societies, which could make contributions from policy making in social science, to distributed artificial intelligence and agent technology in computer science, and to theory

  5. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    Science.gov (United States)

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  6. Sustainability of small reservoirs and large scale water availability under current conditions and climate change

    NARCIS (Netherlands)

    Krol, Martinus S.; de Vries, Marjella J.; van Oel, Pieter R.; Carlos de Araújo, José

    2011-01-01

    Semi-arid river basins often rely on reservoirs for water supply. Small reservoirs may impact on large-scale water availability both by enhancing availability in a distributed sense and by subtracting water for large downstream user communities, e.g. served by large reservoirs. Both of these impacts

  7. Primordial Non-Gaussianity in the Large-Scale Structure of the Universe

    Directory of Open Access Journals (Sweden)

    Vincent Desjacques

    2010-01-01

    generated the cosmological fluctuations observed today. Any detection of significant non-Gaussianity would thus have profound implications for our understanding of cosmic structure formation. The large-scale mass distribution in the Universe is a sensitive probe of the nature of initial conditions. Recent theoretical progress together with rapid developments in observational techniques will enable us to critically confront predictions of inflationary scenarios and set constraints as competitive as those from the Cosmic Microwave Background. In this paper, we review past and current efforts in the search for primordial non-Gaussianity in the large-scale structure of the Universe.
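
    A convenient reference point for such searches is the standard local-type parametrization of primordial non-Gaussianity (a textbook definition, not a result specific to this review), in which the primordial potential is a quadratic functional of a Gaussian field:

        \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}} \left[ \phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle \right]

    Here \phi is a Gaussian random field, and the dimensionless amplitude f_{\mathrm{NL}} is the quantity that large-scale-structure measurements aim to constrain at a level competitive with the Cosmic Microwave Background.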

  8. Scalable, Asynchronous, Distributed Eigen-Monitoring of Astronomy Data Streams

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, we develop a distributed algorithm for monitoring the principal components (PCs) for the next generation of astronomy petascale data pipelines such as the...

  9. Distributed Rocket Engine Testing Health Monitoring System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The on-ground and Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) provides a system architecture and software tools for performing diagnostics...

  10. Distributed Rocket Engine Testing Health Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Leveraging the Phase I achievements of the Distributed Rocket Engine Testing Health Monitoring System (DiRETHMS) including its software toolsets and system building...

  11. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2013-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visua...

  12. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2014-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During LHC Run I a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and re-usability of the visua...

  13. Investigation on the integral output power model of a large-scale wind farm

    Institute of Scientific and Technical Information of China (English)

    BAO Nengsheng; MA Xiuqian; NI Weidou

    2007-01-01

    The integral output power model of a large-scale wind farm is needed when estimating the farm's output over a period of time in the future. The actual wind speed power model and calculation method for a wind farm made up of many wind turbine units are discussed. After analyzing the incoming wind flow characteristics and their energy distributions, and after considering the mutual effects among the wind turbine units under certain assumptions, the incoming wind flow model for multiple units is built. The calculation algorithms and steps of the integral output power model of a large-scale wind farm are provided. Finally, an actual power output of the wind farm is calculated and analyzed using practically measured wind speed data. The characteristics of a large-scale wind farm are also discussed.
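
    The record does not include the model's equations, but the general idea of integrating a farm's output from measured wind speeds can be sketched. The following is a simplified illustration assuming an idealized cubic power curve and identical wind at all units; it deliberately ignores the multi-unit wake effects that the paper models, and all parameter values are invented.

    ```python
    import numpy as np

    def turbine_power(v, rated_kw=2000.0, v_in=3.0, v_rated=12.0, v_out=25.0):
        """Idealized power curve: cubic ramp between cut-in and rated speed."""
        v = np.asarray(v, dtype=float)
        p = rated_kw * ((v - v_in) / (v_rated - v_in)) ** 3
        p = np.where((v >= v_in) & (v < v_rated), p, 0.0)
        p = np.where((v >= v_rated) & (v < v_out), rated_kw, p)
        return p

    def farm_energy_mwh(speeds, n_units=50, dt_hours=1.0, availability=0.95):
        """Integrate farm output over a measured wind-speed time series."""
        # Simplification: all units see the same wind; the paper instead
        # models multi-unit interactions, which this sketch ignores.
        return availability * n_units * np.sum(turbine_power(speeds)) * dt_hours / 1000.0

    speeds = np.array([4.2, 7.8, 11.5, 13.0, 9.6])  # hourly mean speeds, m/s
    print(f"estimated farm energy: {farm_energy_mwh(speeds):.1f} MWh")
    ```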

  14. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation, or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of formation of large scale structure. (orig.)

  15. Research and development of safeguards measures for the large scale reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Kikuchi, Masahiro; Sato, Yuji; Yokota, Yasuhiro; Masuda, Shoichiro; Kobayashi, Isao; Uchikoshi, Seiji; Tsutaki, Yasuhiro; Nidaira, Kazuo [Nuclear Material Control Center, Tokyo (Japan)

    1994-12-31

    The Government of Japan agreed on the safeguards concepts for a commercial-size reprocessing plant under the bilateral agreement for cooperation between Japan and the United States. In addition, LASCAR, the forum on large scale reprocessing plant safeguards, obtained fruitful results in the spring of 1992. The research and development of safeguards measures for the Rokkasho Reprocessing Plant should proceed with full regard to the concepts described in both documents. Basically, the material accountancy and monitoring system should be established based on NRTA and other measures in order to meet the timeliness goal for plutonium, together with an unattended-mode inspection approach based on an integrated containment/surveillance system coupled with radiation monitoring in order to reduce the inspection effort. NMCC has been studying the following measures for large scale reprocessing plant safeguards: (1) a radiation gate monitor and integrated surveillance system; (2) near real time shipper/receiver difference monitoring; (3) a near real time material accountancy system operated for the bulk handling area; (4) a volume measurement technique for a large scale input accountancy vessel; (5) an in-process inventory estimation technique applied to process equipment such as the pulse column and evaporator; (6) a solution transfer monitoring approach applied to buffer tanks in the chemical process; (7) a timely analysis technique such as a hybrid K-edge densitometer operated in the on-site laboratory. (J.P.N.)
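
    Near-real-time accountancy rests on closing short material balances and tracking the cumulative material unaccounted for, MUF = beginning inventory + receipts - shipments - ending inventory. A minimal sketch follows; the quantities and the action level are invented for illustration and are not from the record.

    ```python
    def muf(begin_inventory, receipts, shipments, end_inventory):
        """Material Unaccounted For over one balance period (all in kg Pu)."""
        return begin_inventory + receipts - shipments - end_inventory

    # NRTA closes short balance periods and watches the cumulative MUF
    # trend rather than a single annual balance.
    periods = [
        # (begin, receipts, shipments, end) -- illustrative numbers only
        (120.0, 40.0, 38.0, 121.8),
        (121.8, 35.0, 36.0, 120.6),
        (120.6, 42.0, 41.0, 121.4),
    ]

    cumulative = 0.0
    for i, p in enumerate(periods, 1):
        m = muf(*p)
        cumulative += m
        flag = "ALERT" if abs(cumulative) > 0.5 else "ok"  # assumed action level
        print(f"period {i}: MUF={m:+.2f} kg, cumulative={cumulative:+.2f} kg [{flag}]")
    ```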

  16. Distributed Space Missions for Earth System Monitoring

    CERN Document Server

    2013-01-01

    A key addition to Springer's Space Technology Library series, this edited volume features the work of dozens of authors and offers a wealth of perspectives on distributed Earth observation missions. In sum, it is an eloquent synthesis of the fullest possible range of current approaches to a fast-developing field characterized by growing membership of the 'space club' to include nations formerly regarded as part of the Third World. The volume's four discrete sections focus on the topic's various aspects, including the key theoretical and technical issues arising from the division of payloads onto different satellites. The first is devoted to analyzing distributed synthetic aperture radars, with bi- and multi-static radars receiving separate treatment. This is followed by a full discussion of relative dynamics, guidance, navigation and control. Here, the separate topics of design; establishment, maintenance and control; and measurements are developed with relative trajectory as a reference point, while the dis...

  17. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
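
    The final step described above, turning a correlated Gaussian field into a binary raining/non-raining mask that honours a prescribed rain occupation rate, can be sketched compactly. The following assumes an isotropic correlation for brevity (the paper uses an anisotropic covariance), and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)

    def binary_rain_mask(shape=(512, 512), corr_sigma=25.0, occupation=0.12):
        """Gaussian field -> binary 'is raining' mask with a given area fraction."""
        # Smooth white noise to obtain a spatially correlated Gaussian field.
        g = gaussian_filter(rng.standard_normal(shape), corr_sigma)
        # Threshold at the (1 - occupation) quantile so the raining fraction
        # matches the prescribed large-scale rain occupation rate.
        thresh = np.quantile(g, 1.0 - occupation)
        return g > thresh

    mask = binary_rain_mask()
    print(f"raining fraction: {mask.mean():.3f}")  # ~0.12 by construction
    ```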

  18. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by broad communities, far in excess of the raw instrument outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" in order to remain productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway to address these issues.

  19. Large-scale projects between regional planning and environmental protection

    International Nuclear Information System (INIS)

    Schmidt, G.

    1984-01-01

    The first part of the work discusses the current law of land-use planning, municipal and technical construction planning, and licensing under the atomic energy law and the federal law on immission protection. In the second part some theses suggesting modifications are submitted. In the sector of land-use planning, substantial contributions to the protection of the environment can only be expected from programs and plans (aims). For the environmental conflicts likely to arise from large-scale projects (nuclear power plants, fossil-fuel power plants) this holds good for most site selection plans. They bear on environmental protection in that they presuppose thorough examination of facts, help to recognize possible conflicts at an early date, and provide a frame for solving those problems. Municipal construction planning is guided by the following principles: environmental protection is an equivalent planning target; environmental data and facts and their methodical processing play a fundamental part, as they constitute the basis of evaluation. Under section 5, number 2 of the federal law on immission protection - prevention of nuisances - operators are obliged to take preventive care against risks. That section is not concerned with planning or distribution. Neither does the licensing of nuclear plants have planning character. So far as the legal preconditions of licensing are fulfilled, the scope for rejecting an application under section 7, subsection 2 of the atomic energy law in view of site selection and the requirement of a plant hardly carries any practical weight. (orig./HP)

  20. Holographic monitoring of spatial distributions of singlet oxygen in water

    Science.gov (United States)

    Belashov, A. V.; Bel'tyukova, D. M.; Vasyutinskii, O. S.; Petrov, N. V.; Semenova, I. V.; Chupov, A. S.

    2014-12-01

    A method for monitoring spatial distributions of singlet oxygen in biological media has been developed. Singlet oxygen was generated using Radachlorin® photosensitizer, while thermal disturbances caused by nonradiative deactivation of singlet oxygen were detected by the holographic interferometry technique. Processing of interferograms yields temperature maps that characterize the deactivation process and show the distribution of singlet oxygen species.

  1. Tight Bounds for Distributed Functional Monitoring

    DEFF Research Database (Denmark)

    Woodruff, David P.; Zhang, Qin

    2011-01-01

    …, our bound resolves their main open question. Our lower bounds are based on new direct sum theorems for approximate majority, and yield significant improvements to problems in the data stream model, improving the bound for estimating $F_p$, $p > 2$, in $t$ passes from $\tilde{\Omega}(n^{1-2/p}/(\epsilon^{2/p} t))$ to $\tilde{\Omega}(n^{1-2/p}/(\epsilon^{4/p} t))$, giving the first bound for estimating $F_0$ in $t$ passes of $\Omega(1/(\epsilon^2 t))$ bits of space that does not use the gap-hamming problem, and showing a distribution for the gap-hamming problem with high external information cost or super…

  2. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  3. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. Recordings of surface seismic vibrations from more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.
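
    The safety evaluation described, comparing measured ground vibration against a permissible vibration velocity, amounts to a peak-particle-velocity check. A minimal sketch follows; the permissible limit and the synthetic record are invented for illustration.

    ```python
    import numpy as np

    def peak_particle_velocity(trace_mm_s):
        """Peak absolute ground velocity of a recorded vibration trace, mm/s."""
        return float(np.max(np.abs(trace_mm_s)))

    def seismic_safety_check(trace_mm_s, permissible_mm_s=20.0):
        """Compare measured PPV against a permissible velocity (assumed limit)."""
        ppv = peak_particle_velocity(trace_mm_s)
        return ppv, ppv <= permissible_mm_s

    # Synthetic decaying blast vibration record sampled at 1 kHz (illustrative).
    t = np.linspace(0.0, 2.0, 2000)
    trace = 25.0 * np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 15.0 * t)
    ppv, ok = seismic_safety_check(trace)
    print(f"PPV = {ppv:.1f} mm/s -> {'within' if ok else 'exceeds'} permissible value")
    ```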

  4. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  5. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large scale features of the turbulent channel flow is obtained using two-point correlations of the velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of the large scale features of turbulence and the temperature field.
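
    Snapshot POD of the kind used here is commonly computed via a singular value decomposition of mean-subtracted snapshots. The following is a small illustrative sketch on toy data, not the authors' code; the field size and wave parameters are invented.

    ```python
    import numpy as np

    def snapshot_pod(snapshots):
        """Snapshot POD: columns of `snapshots` are flattened field samples.

        Returns spatial modes (columns) and the energy fraction of each mode.
        """
        x = snapshots - snapshots.mean(axis=1, keepdims=True)  # remove mean field
        modes, sigma, _ = np.linalg.svd(x, full_matrices=False)
        energy = sigma**2 / np.sum(sigma**2)
        return modes, energy

    # Toy data: 64-point field, 200 snapshots of two traveling waves plus noise.
    rng = np.random.default_rng(1)
    xg = np.linspace(0, 2 * np.pi, 64)[:, None]
    tg = np.linspace(0, 10, 200)[None, :]
    field = (np.sin(xg - tg) + 0.3 * np.sin(3 * (xg + tg))
             + 0.05 * rng.standard_normal((64, 200)))
    modes, energy = snapshot_pod(field)
    print("energy captured by first 4 modes:", np.round(energy[:4], 3))
    ```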

  6. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  7. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  8. System and Method for Monitoring Distributed Asset Data

    Science.gov (United States)

    Gorinevsky, Dimitry (Inventor)

    2015-01-01

    A computer-based monitoring system, and a monitoring method implemented in computer software, for detecting, estimating, and reporting the condition states, their changes, and anomalies for many assets. The assets are of the same type, are operated over a period of time, and are outfitted with data collection systems. The proposed monitoring method accounts for the variability of working conditions for each asset by using a regression model that characterizes asset performance. The assets are of the same type but not identical. The proposed monitoring method accounts for asset-to-asset variability; it also accounts for drifts and trends in the asset condition and data. The proposed monitoring system can perform distributed processing of massive amounts of historical data without discarding any useful information, in cases where moving all the asset data into one central computing system would be infeasible. The overall processing includes distributed preprocessing of data records from each asset to produce compressed data.
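
    The core idea, fitting a per-asset regression of performance on working conditions and monitoring the residuals for anomalies, can be sketched briefly. This is an illustrative reading of the approach, not the patented implementation; the fault injection and threshold are invented.

    ```python
    import numpy as np

    def fit_performance_model(conditions, performance):
        """Least-squares regression of performance on working conditions."""
        X = np.column_stack([np.ones(len(conditions)), conditions])  # intercept + slope
        beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
        return beta

    def condition_anomalies(conditions, performance, beta, z_thresh=3.0):
        """Flag records whose residual deviates strongly from the asset's model."""
        X = np.column_stack([np.ones(len(conditions)), conditions])
        residuals = performance - X @ beta
        z = (residuals - residuals.mean()) / residuals.std()
        return np.flatnonzero(np.abs(z) > z_thresh)

    rng = np.random.default_rng(2)
    load = rng.uniform(0.2, 1.0, 500)                 # working condition
    perf = 10.0 + 4.0 * load + 0.1 * rng.standard_normal(500)
    perf[250] -= 2.0                                  # injected fault
    beta = fit_performance_model(load, perf)
    print("anomalous records:", condition_anomalies(load, perf, beta))
    ```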

  9. Understanding large scale groundwater flow in fractured crystalline rocks to aid in repository siting

    International Nuclear Information System (INIS)

    Davison, C.; Brown, A.; Gascoyne, M.; Stevenson, D.; Ophori, D.

    2000-01-01

    Atomic Energy of Canada Limited (AECL) conducted a ten-year groundwater flow study of a 1050 km2 region of fractured crystalline rock in southeastern Manitoba to illustrate how an understanding of large scale groundwater flow can be used to assist in selecting a hydraulically favourable location for the deep geological disposal of nuclear fuel waste. The study involved extensive field investigations that included the drilling, testing, sampling and monitoring of twenty deep boreholes distributed over detailed study areas across the region. The surface and borehole geotechnical investigations were used to construct a conceptual model of the main litho-structural features that control groundwater flow through the crystalline rocks of the region. Eighty-three large fracture zones, together with spatial domains of moderately fractured and sparsely fractured rock, were represented in a finite element model of the area to simulate regional groundwater flow. The groundwater flow model was calibrated to match the observed groundwater recharge rate and the hydraulic heads measured in the network of deep boreholes. Particle tracking was used to determine the pathways and travel times from different depths in the velocity field of the calibrated groundwater flow model. The results were used to identify locations in the regional flow field that maximize the time it takes for groundwater to travel to surface discharge areas through long, slow groundwater pathways. One of these locations was chosen as a good hypothetical location for situating a nuclear fuel waste disposal vault at 750 m depth. (authors)
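
    Particle tracking through a calibrated velocity field reduces to integrating pathlines and accumulating travel time. A minimal forward-Euler sketch follows; the velocity field, step size and depths are invented for illustration and bear no relation to the calibrated AECL model.

    ```python
    import numpy as np

    def track_particle(start, vel_fn, dt_years=10.0, max_years=400000, z_surface=0.0):
        """Forward-Euler particle tracking in a steady velocity field.

        `vel_fn(p)` returns the (x, z) groundwater velocity in m/yr at point p.
        Returns the path and the travel time until the particle reaches surface.
        """
        p = np.array(start, dtype=float)
        path, t = [p.copy()], 0.0
        while p[1] < z_surface and t < max_years:
            p += dt_years * np.asarray(vel_fn(p))
            t += dt_years
            path.append(p.copy())
        return np.array(path), t

    # Illustrative field: slow upward drift that accelerates toward discharge.
    def vel(p):
        x, z = p
        return (0.05, 0.002 + 0.00001 * (z + 750.0))  # m/yr, hypothetical

    path, years = track_particle(start=(0.0, -750.0), vel_fn=vel)
    print(f"travel time from 750 m depth to surface: {years:.0f} years")
    ```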

  10. Streamflow Observations From Cameras: Large-Scale Particle Image Velocimetry or Particle Tracking Velocimetry?

    Science.gov (United States)

    Tauro, F.; Piscopia, R.; Grimaldi, S.

    2017-12-01

    Image-based methodologies, such as large scale particle image velocimetry (LSPIV) and particle tracking velocimetry (PTV), have increased our ability to noninvasively conduct streamflow measurements by affording spatially distributed observations at high temporal resolution. However, progress in optical methodologies has not been paralleled by the implementation of image-based approaches in environmental monitoring practice. We attribute this fact to the sensitivity of LSPIV, by far the most frequently adopted algorithm, to visibility conditions and to the occurrence of visible surface features. In this work, we test both LSPIV and PTV on a data set of 12 videos captured in a natural stream wherein artificial floaters are homogeneously and continuously deployed. Further, we apply both algorithms to a video of a high flow event on the Tiber River, Rome, Italy. In our application, we propose a modified PTV approach that only takes into account realistic trajectories. Based on our findings, LSPIV largely underestimates surface velocities with respect to PTV in both favorable (12 videos in a natural stream) and adverse (high flow event in the Tiber River) conditions. On the other hand, PTV is in closer agreement than LSPIV with benchmark velocities in both experimental settings. In addition, the accuracy of PTV estimations can be directly related to the transit of physical objects in the field of view, thus providing tangible data for uncertainty evaluation.
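
    Both LSPIV and PTV ultimately estimate surface displacements between frames; LSPIV typically does so by matching an interrogation window from one frame to the next. A minimal brute-force matching sketch (using sum of squared differences rather than the usual normalized cross-correlation) is given below on synthetic frames; window size and search radius are arbitrary.

    ```python
    import numpy as np

    def window_displacement(frame_a, frame_b, win, search=8):
        """Best-match displacement (dy, dx) of window `win` between two frames.

        `win` is (y0, y1, x0, x1) in frame_a; frame_b is scanned within
        +/- `search` pixels, minimizing the sum of squared differences.
        """
        y0, y1, x0, x1 = win
        tpl = frame_a[y0:y1, x0:x1].astype(float)
        best, best_d = np.inf, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = frame_b[y0 + dy:y1 + dy, x0 + dx:x1 + dx].astype(float)
                ssd = np.sum((cand - tpl) ** 2)
                if ssd < best:
                    best, best_d = ssd, (dy, dx)
        return best_d

    # Synthetic test: shift a random texture by (2, 3) pixels between frames.
    rng = np.random.default_rng(3)
    a = rng.random((64, 64))
    b = np.roll(np.roll(a, 2, axis=0), 3, axis=1)
    print(window_displacement(a, b, win=(20, 36, 20, 36)))  # -> (2, 3)
    ```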

  11. Open source tools for large-scale neuroscience.

    Science.gov (United States)

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience.

  12. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate, for individual sites, the exceedance probability with which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs. large scale).
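
    As a stand-in for the dimensionality-reduction-plus-clustering step described above, the following sketch applies ordinary (unsupervised) kernel PCA from scikit-learn followed by k-means; the supervised variant used by the authors is not part of scikit-learn, and the data here are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    # Stand-in data: 300 "events", each a flattened 10x10 moisture-flux field.
    fields = rng.standard_normal((300, 100))
    fields[:150] += 1.5  # crude proxy for a distinct circulation regime

    # Reduce to a low-dimensional space, then cluster circulation patterns.
    kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.01)
    coords = kpca.fit_transform(fields)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
    print("events per cluster:", np.bincount(labels))
    ```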

  13. An innovative experimental setup for Large Scale Particle Image Velocimetry measurements in riverine environments

    Science.gov (United States)

    Tauro, Flavia; Olivieri, Giorgio; Porfiri, Maurizio; Grimaldi, Salvatore

    2014-05-01

    Large Scale Particle Image Velocimetry (LSPIV) is a powerful methodology to nonintrusively monitor surface flows. Its use has been beneficial to the development of rating curves in riverine environments and to mapping geomorphic features in natural waterways. Typical LSPIV experimental setups rely on mast-mounted cameras for the acquisition of natural stream reaches. Such cameras are installed on stream banks and are angled with respect to the water surface to capture large scale fields of view. Despite its promise and the simplicity of the setup, the practical implementation of LSPIV is affected by several challenges, including the acquisition of ground reference points for image calibration and time-consuming, highly user-assisted procedures to orthorectify images. In this work, we perform LSPIV studies on stream sections in the Aniene and Tiber basins, Italy. To alleviate the limitations of traditional LSPIV implementations, we propose an improved video acquisition setup comprising a telescopic mast, an inexpensive GoPro Hero 3 video camera, and a system of two lasers. The setup allows for maintaining the camera axis perpendicular to the water surface, thus mitigating uncertainties related to image orthorectification. Further, the mast encases a laser system for remote image calibration, thus allowing for nonintrusive video calibration without acquiring ground reference points. We conduct measurements on two different water bodies to outline the performance of the methodology for varying flow regimes, illumination conditions, and distributions of surface tracers. Specifically, the Aniene river is characterized by high surface flow velocity, the presence of abundant, homogeneously distributed ripples and water reflections, and a meagre number of buoyant tracers. On the other hand, the Tiber river presents lower surface flows, isolated reflections, and several floating objects. Videos are processed through image-based analyses to correct for lens

  14. Report of the LASCAR forum: Large scale reprocessing plant safeguards

    International Nuclear Information System (INIS)

    1992-01-01

    This report has been prepared to provide information on the studies which were carried out from 1988 to 1992 under the auspices of the multinational forum known as Large Scale Reprocessing Plant Safeguards (LASCAR) on safeguards for four large scale reprocessing plants operated or planned to be operated in the 1990s. The report summarizes all of the essential results of these studies. The participants in LASCAR were from France, Germany, Japan, the United Kingdom, the United States of America, the Commission of the European Communities - Euratom, and the International Atomic Energy Agency

  15. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  16. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  17. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC.Marc/Mentat and analyzes its fatigue strain; load simulation of the flange fatigue working condition is carried out with the Bladed software, and the flange fatigue load spectrum is acquired with the rain-flow counting method. Finally, fatigue analysis of the top flange is carried out with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach for flange fatigue analysis of large-scale wind turbine generators and have practical engineering value.
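
    The Palmgren-Miner step, summing fractional damage n_i/N_i over the stress-range bins of a rain-flow-counted load spectrum, is compact enough to sketch. The S-N curve constants and the spectrum below are invented for illustration.

    ```python
    import numpy as np

    def miner_damage(cycle_counts, stress_ranges, C=1.0e12, m=3.0):
        """Palmgren-Miner cumulative damage with a Basquin S-N curve N = C / S^m.

        `cycle_counts[i]` cycles occur at stress range `stress_ranges[i]` (MPa);
        failure is predicted when the damage sum reaches 1.0.
        """
        n = np.asarray(cycle_counts, dtype=float)
        s = np.asarray(stress_ranges, dtype=float)
        n_allow = C / s**m  # allowable cycles at each stress range
        return float(np.sum(n / n_allow))

    # Illustrative load spectrum, e.g. as produced by rain-flow counting.
    counts = [2.0e6, 5.0e5, 1.0e4]
    ranges = [40.0, 80.0, 160.0]  # MPa
    d = miner_damage(counts, ranges)
    print(f"damage sum D = {d:.3f} -> {'fails' if d >= 1.0 else 'survives'} design life")
    ```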

  18. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible, self-gravitating turbulent medium. A closed equation describing the evolution of large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect the vortex dynamo. It is possible that essentially the same mechanism is responsible both for the amplification and maintenance of density waves and for magnetic fields in the gaseous disks of spiral galaxies. (author). 29 refs

  19. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  20. Large-scale volcanism associated with coronae on Venus

    Science.gov (United States)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated