WorldWideScience

Sample records for distributed analysis environment

  1. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.
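The master-worker decomposition this record generalizes can be illustrated with a short, hypothetical sketch. Python's multiprocessing pool stands in for DIANE's worker pool; all function and parameter names here are invented for illustration, not DIANE's actual API:

```python
# Minimal master-worker sketch: the master splits a dataset into tasks,
# workers process them in parallel, and the master merges the results.
# Illustrative only -- the real framework adds code distribution,
# load balancing and authentication as pluggable modules.
from multiprocessing import Pool

def analyze_chunk(chunk):
    """Worker callback: stand-in for pre-compiled user analysis code."""
    return sum(x * x for x in chunk)

def master(data, n_workers=4, chunk_size=250):
    tasks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(n_workers) as pool:
        partial_results = pool.map(analyze_chunk, tasks)
    return sum(partial_results)

if __name__ == "__main__":
    print(master(list(range(1000))))
```

The key property is that the worker callback is opaque to the master, which only schedules tasks and merges results.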

  2. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows to easily replace them with modul...

  3. Distributed Scheduling in Time Dependent Environments: Algorithms and Analysis

    OpenAIRE

    Shmuel, Ori; Cohen, Asaf; Gurewitz, Omer

    2017-01-01

    Consider the problem of a multiple access channel in a time dependent environment with a large number of users. In such a system, mostly due to practical constraints (e.g., decoding complexity), not all users can be scheduled together, and usually only one user may transmit at any given time. Assuming a distributed, opportunistic scheduling algorithm, we analyse the system's properties, such as delay, QoS and capacity scaling laws. Specifically, we start with analyzing the performance while ...

  4. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  5. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    Full Text Available are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave with a reduced error between actual and estimated distributions. The improved probability density function (PDF) representation...
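The K-distribution fitting described in this record rests on the standard compound representation of sea clutter: gamma-distributed texture modulating Rayleigh speckle. A minimal simulation sketch follows; the shape and scale parameters are illustrative, not values from the study:

```python
# Simulate K-distributed sea-clutter amplitudes via the compound
# representation: gamma texture (slow local power fluctuation) times
# Rayleigh speckle. nu and scale are illustrative assumptions.
import numpy as np

def simulate_k_clutter(n, nu=1.5, scale=1.0, seed=0):
    rng = np.random.default_rng(seed)
    texture = rng.gamma(shape=nu, scale=scale / nu, size=n)  # local power
    speckle = rng.rayleigh(scale=np.sqrt(0.5), size=n)       # unit-power speckle
    return np.sqrt(texture) * speckle

amplitudes = simulate_k_clutter(100_000)
# A heavier-than-Rayleigh tail is the signature of K-distributed clutter.
print(amplitudes.mean(), amplitudes.max())
```

Fitting per wave segment, as the record describes, amounts to estimating nu separately for each segment of samples.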

  6. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.
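A recursive (triangular) model of the kind described — power inequality -> policy -> environmental stress -> public health — can be estimated equation by equation with ordinary least squares. The sketch below uses synthetic data and invented coefficients purely for illustration, not the paper's estimates:

```python
# Recursive model sketch: each endogenous variable depends only on
# variables earlier in the chain, so each link is a separate OLS fit.
# All data and true coefficients below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 50  # 50 US states
inequality = rng.normal(0, 1, n)
policy = -0.8 * inequality + rng.normal(0, 0.3, n)  # inequality weakens policy
stress = -0.7 * policy + rng.normal(0, 0.3, n)      # weaker policy -> more stress
health = -0.6 * stress + rng.normal(0, 0.3, n)      # more stress -> worse health

def ols_slope(x, y):
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

print(ols_slope(inequality, policy),  # ~ -0.8
      ols_slope(policy, stress),      # ~ -0.7
      ols_slope(stress, health))      # ~ -0.6
```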

  7. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  9. Learning Networks Distributed Environment

    NARCIS (Netherlands)

    Martens, Harrie; Vogten, Hubert; Koper, Rob; Tattersall, Colin; Van Rosmalen, Peter; Sloep, Peter; Van Bruggen, Jan; Spoelstra, Howard

    2005-01-01

    Learning Networks Distributed Environment is a prototype of an architecture that allows the sharing and modification of learning materials through a number of transport protocols. The prototype implements a p2p protocol using JXTA.

  10. A spatial pattern analysis of the halophytic species distribution in an arid coastal environment.

    Science.gov (United States)

    Badreldin, Nasem; Uria-Diez, J; Mateu, J; Youssef, Ali; Stal, Cornelis; El-Bana, Magdy; Magdy, Ahmed; Goossens, Rudi

    2015-05-01

    Obtaining information about the spatial distribution of desert plants is a serious challenge for ecologists and environmental modelling, owing to the intensive field work and infrastructure required in harsh and remote arid environments. A new method was applied for assessing the spatial distribution of the halophytic species (HS) in an arid coastal environment. This method was based on object-based image analysis of a high-resolution Google Earth satellite image. The integration of image processing techniques and field work provided accurate information about the spatial distribution of HS. The extracted objects were based on assumptions that explained the plant-pixel relationship. Three different types of digital image processing techniques were implemented and validated to obtain an accurate HS spatial distribution. A total of 2703 individuals of the HS community were found in the case study, and approximately 82% were located above an elevation of 2 m. The micro-topography exhibited a significant negative relationship with pH and EC (r = -0.79 and -0.81, respectively, p < 0.001). The spatial structure was modeled using stochastic point processes, in particular a hybrid family of Gibbs processes. A new model is proposed that uses a hard-core structure at very short distances, together with a cluster structure at short-to-medium distances and a Poisson structure at larger distances. This model was found to fit the data very well.
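One simple diagnostic behind a hard-core component like the one proposed above is the minimum nearest-neighbour distance of the mapped individuals: if plants exclude each other at very short range, no pair is closer than the hard-core radius. A sketch with synthetic coordinates (not the study's data):

```python
# Nearest-neighbour distance diagnostic for a hard-core spatial
# structure. Coordinates are synthetic stand-ins for plant positions.
import numpy as np

def nearest_neighbour_distances(points):
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(d, np.inf)          # exclude self-distance
    return d.min(axis=1)

rng = np.random.default_rng(42)
pts = rng.uniform(0, 100, size=(200, 2))   # synthetic 100 m x 100 m plot
nnd = nearest_neighbour_distances(pts)
hard_core_radius = nnd.min()               # lower bound on any hard core
print(round(hard_core_radius, 3))
```

A full Gibbs-model fit would go further, comparing pair correlation at short, medium and long ranges against the fitted hard-core/cluster/Poisson components.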

  11. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    Science.gov (United States)

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid. It is, thus, beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information theory methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and the eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, MDI-QKD is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results under eavesdropping is 16%. Moreover, the probability that the relay outputs successful results is higher under eavesdropping than without it. The paper validates that the MDI-QKD protocol has good robustness.

  12. Distributed Research Center for Analysis of Regional Climatic Changes and Their Impacts on Environment

    Science.gov (United States)

    Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.

    2016-12-01

    Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, exporting of data
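The "shared nothing" storage model mentioned in task 2 can be sketched as deterministic partitioning of dataset tiles across nodes, so every node owns a disjoint subset and no shared storage is needed. Node names and tile keys below are hypothetical:

```python
# "Shared nothing" partitioning sketch: hash each tile key to pick the
# single node that owns it. Any client can recompute the owner without
# consulting a central catalog. Names are invented for illustration.
import hashlib

NODES = ["imces-node", "unh-node"]

def owner(tile_key, nodes=NODES):
    h = int(hashlib.sha256(tile_key.encode()).hexdigest(), 16)
    return nodes[h % len(nodes)]

tiles = ["60N_090E_temp", "60N_090E_precip", "70N_150W_temp"]
assignment = {t: owner(t) for t in tiles}
print(assignment)
```

Deterministic hashing keeps the mapping stable across clients, which is what lets each node process only its own partition.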

  13. Analysis of steady state temperature distribution in rod arrays exposed to stagnant gaseous environment

    International Nuclear Information System (INIS)

    Pal, G.; MaarKandeya, S.G.; Raj, V.V.

    1991-01-01

    This paper deals with the calculation of radiative heat exchange in a rod array exposed to a stagnant gaseous environment. A computer code has been developed for this purpose and has been used for predicting the steady state temperature distribution in a nuclear fuel sub-assembly. Nuclear fuels continue to generate heat even after their removal from the core. During the transfer of nuclear fuel sub-assemblies from the core to the storage bay, they pass through a stagnant gaseous environment and may remain there for extended periods under abnormal conditions. Radiative heat exchange will be the dominant mode of heat transfer within the sub-assembly, though axial heat conduction through the fuel pins also needs to be accounted for. The computer code RHEINA-3D (Radiative Heat Exchange In Nuclear Assemblies - 3D) has been developed based on a simplified numerical model which considers both of the above-mentioned phenomena. The analytical model and the results obtained are briefly discussed in this paper.
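The dominance of radiation in the stagnant-gas scenario follows from the Stefan-Boltzmann T^4 dependence. A back-of-envelope sketch of net radiative exchange between two gray surfaces (emissivity, temperatures and the parallel-plate geometry are illustrative assumptions, not RHEINA-3D inputs):

```python
# Net radiative flux between two parallel gray surfaces of equal
# emissivity: q = sigma * (Th^4 - Tc^4) / (2/eps - 1).
# All numbers are illustrative, not code inputs from the paper.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(t_hot_K, t_cold_K, emissivity=0.8):
    """Net flux in W/m^2 from the hot to the cold surface."""
    eff = 1.0 / (2.0 / emissivity - 1.0)   # effective exchange factor
    return eff * SIGMA * (t_hot_K**4 - t_cold_K**4)

print(round(net_radiative_flux(900.0, 600.0), 1))
```

The T^4 scaling is why radiation dominates at fuel-pin temperatures while conduction through the pins remains a correction term.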

  14. Analysis of Ecological Distribution and Genomic Content from a Clade of Bacteroidetes Endemic to Sulfidic Environments

    Science.gov (United States)

    Zhou, K.; Sylvan, J. B.; Hallam, S. J.

    2017-12-01

    The Bacteroidetes are a ubiquitous phylum of bacteria found in a wide variety of habitats. Marine Bacteroidetes are known to utilize complex carbohydrates and have a potentially important role in the global carbon cycle through processing these compounds, which are not digestible by many other microbes. Some members of the phylum are known to perform denitrification and are facultative anaerobes, but Bacteroidetes are not known to participate in sulfur redox cycling. Recently, it was shown that a clade of uncultured Bacteroidetes, including the VC2.1_Bac22 group, appears to be endemic to sulfidic environments, including hydrothermal vent sulfide chimneys, sediments and marine water column oxygen minimum zones (OMZs). This clade, dubbed the Sulfiphilic Bacteroidetes, is not detected in 16S rRNA amplicon studies from non-sulfidic environments. To test the hypothesis that the Sulfiphilic Bacteroidetes are involved in sulfur redox chemistry, we updated our meta-analysis of the clade using 16S rRNA sequences from public databases and employed single-cell genomics to survey their genomic potential using 19 single amplified genomes (SAGs) isolated from Saanich Inlet, a seasonally anoxic basin in British Columbia. Initial analysis of these SAGs indicates the Sulfiphilic Bacteroidetes may perform sulfur redox reactions using a three-gene psrABC operon encoding the polysulfide reductase enzyme complex, with a thiosulfate sulfurtransferase (rhodanese), which putatively uses cyanide to convert thiosulfate to sulfite, just upstream. Interestingly, this is the same configuration as discovered recently in some Marine Group A bacteria. Further aspects of the Sulfiphilic Bacteroidetes' genomic potential will be presented in light of their presence in sulfidic environments.

  15. Biosensor for laboratory and lander-based analysis of benthicnitrate plus nitrite distribution in marine environments

    DEFF Research Database (Denmark)

    Revsbech, N. P.; Glud, Ronnie Nøhr

    2009-01-01

    We present a psychrophilic bacteria-based biosensor that can be used in low-temperature seawater for the analysis of nitrate + nitrite (NOx-). The sensor can be used to resolve concentrations below 1 µmol L-1 at low temperature (

  16. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks

    International Nuclear Information System (INIS)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-01-01

    Highlights: • Thirty-eight antibiotics were systematically investigated in the marine environment. • The distribution of antibiotics was significantly correlated with COD and NO3-N. • Untreated domestic sewage was the primary source of antibiotics. • Fluoroquinolones showed a strong sorption capacity onto sediments. • Oxytetracycline, norfloxacin and erythromycin-H2O indicated high risks. - Abstract: In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of 38 antibiotics were detected, with concentrations ranging from <0.08 (clarithromycin) to 15,163 ng/L (oxytetracycline), 2.12 (methacycline) to 1318 ng/L (erythromycin-H2O), and <1.95 (ciprofloxacin) to 184 ng/g (chlortetracycline) in the seawater, discharged effluent and sediment samples, respectively. The concentrations of antibiotics in the water phase were correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin-H2O posed high risks to aquatic organisms.
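The pseudo-partitioning coefficient invoked above to explain fluoroquinolone sorption is simply the ratio of sediment concentration to water concentration, with a unit conversion. A minimal sketch with illustrative (not measured) values:

```python
# Pseudo-partitioning coefficient Kd' = C_sediment / C_water,
# the quantity cited to explain sorption of fluoroquinolones onto
# sediments. The concentrations below are invented for illustration.
def pseudo_partition_coefficient(c_sediment_ng_per_g, c_water_ng_per_L):
    """Return Kd' in L/kg: (ng/g) / (ng/L) * 1000 g/kg."""
    return c_sediment_ng_per_g / c_water_ng_per_L * 1000.0

kd = pseudo_partition_coefficient(c_sediment_ng_per_g=50.0, c_water_ng_per_L=25.0)
print(kd)  # -> 2000.0 L/kg
```

A high Kd' means the compound preferentially accumulates in the sediment phase rather than staying dissolved.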

  17. Spatial distribution and ecological environment analysis of great gerbil in Xinjiang Plague epidemic foci based on remote sensing

    International Nuclear Information System (INIS)

    Gao, Mengxu; Wang, Juanle; Li, Qun; Cao, Chunxiang

    2014-01-01

    Yersinia pestis (the plague bacterium) was isolated from great gerbils in 2005 in the Xinjiang Dzungarian Basin, which confirmed the presence of the plague epidemic foci. This study analysed the spatial distribution and suitable habitat of the great gerbil based on monitoring data from the Chinese Center for Disease Control and Prevention, as well as ecological environment elements obtained from remote sensing products. The results showed that: (1) 88.5% (277/313) of great gerbils were distributed in areas with an elevation between 200 and 600 meters. (2) All the positive points were located in areas with a slope of 0-3 degrees, and no clear preference for sunny aspects was observed. (3) All 313 positive points of great gerbils were in areas with an average annual temperature from 5 to 11 °C, and 165 points in areas with an average annual temperature from 7 to 9 °C. (4) 72.8% (228/313) of great gerbils survived in areas with an annual precipitation of 120-200 mm. (5) The positive points of great gerbils increased with increasing NDVI value, but no positive points were found where NDVI was higher than 0.521, indicating the suitability of vegetation for the great gerbil. This study explored a broad and important application for the monitoring and prevention of plague using remote sensing and geographic information systems.
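Habitat studies of this kind typically report Pearson correlations between environmental variables, like the r = -0.79 relationship between micro-topography and pH in the halophyte record above. A sketch of the computation on synthetic data (not the study's measurements):

```python
# Pearson correlation between two habitat variables. The synthetic data
# are built with a negative trend, mimicking the kind of significant
# negative relationship such studies report.
import numpy as np

rng = np.random.default_rng(1)
elevation = rng.uniform(0, 4, size=313)                # micro-topography (m)
ph = 9.0 - 0.5 * elevation + rng.normal(0, 0.2, 313)   # synthetic response

r = np.corrcoef(elevation, ph)[0, 1]
print(round(r, 2))  # strongly negative, mirroring the reported pattern
```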

  18. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

    Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt it to the ATLAS distributed computing and data environment, especially its data management system RUCIO, to help improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used by the HTTP protocol. We are optimizing Xcache for HPC environments, and adapting it to the HL-LHC Data Lakes design as a component for data delivery. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize the deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,
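The cache hit rate this record aims to improve can be illustrated with a toy model: an LRU cache sitting in front of remote file reads, counting hits and misses. This is a generic sketch, not Xcache's eviction policy; sizes and names are invented:

```python
# Toy LRU cache with hit/miss accounting -- the metric a proxy cache
# like Xcache tries to maximize. Purely illustrative.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity, self.store = capacity, OrderedDict()
        self.hits = self.misses = 0

    def get(self, key, fetch):
        if key in self.store:
            self.store.move_to_end(key)   # mark as recently used
            self.hits += 1
        else:
            self.misses += 1
            self.store[key] = fetch(key)  # simulate remote read
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used
        return self.store[key]

cache = LRUCache(capacity=2)
for name in ["a", "b", "a", "c", "a"]:
    cache.get(name, fetch=lambda k: f"data:{k}")
print(cache.hits, cache.misses)  # -> 2 3
```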

  19. Rare earth elements determination and distribution patterns in sediments of polluted marine environment by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Akyil, S.; Yusof, A.M.; Wood, A.K.H.

    2001-01-01

    Sediment core samples taken from a fairly polluted marine environment were analyzed for their rare earth element (REE) contents to determine the concentrations of La, Ce, Sm, Eu, Tb, Dy and Yb using instrumental neutron activation analysis. Core samples were divided into strata at 2 to 3 cm intervals and prepared in powdered form before irradiating them in a TRIGA Mk.II reactor. Down-core concentration profiles of La, Ce, Sm, Eu, Tb, Dy and Yb in 3 core sediments from three sites were obtained. The shale-normalized REE pattern from each site was examined and later used to explain the history of sedimentation by natural processes, such as shoreline erosion and weathering products deposited on the seabed, and to furnish some baseline data and/or the pollution trend occurring within the study area.

  20. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  1. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is unusual in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
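The network-striping idea behind a system like the DPSS can be sketched as a round-robin layout of logical blocks across disk servers, letting a client fetch a byte range from several servers in parallel. Server names and the layout function are hypothetical, not the DPSS implementation:

```python
# Round-robin block striping sketch: block b of a large data object
# lives on server (b mod N) at local slot (b div N). Names invented.
def block_layout(n_blocks, servers):
    """Map block index -> (server, local block slot), round-robin."""
    return {b: (servers[b % len(servers)], b // len(servers))
            for b in range(n_blocks)}

servers = ["dpss-a", "dpss-b", "dpss-c"]
layout = block_layout(7, servers)
print(layout[3], layout[4])  # -> ('dpss-a', 1) ('dpss-b', 1)
```

Because consecutive blocks land on different servers, a sequential read naturally spreads its load across the whole array.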

  2. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks.

    Science.gov (United States)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-06-15

    In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of 38 antibiotics were detected, with concentrations ranging from <0.08 (clarithromycin) to 15,163 ng/L (oxytetracycline), 2.12 (methacycline) to 1318 ng/L (erythromycin-H2O), and <1.95 (ciprofloxacin) to 184 ng/g (chlortetracycline) in the seawater, discharged effluent and sediment samples, respectively. The concentrations of antibiotics in the water phase were correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin-H2O posed high risks to aquatic organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Problem solving environment for distributed interactive applications

    NARCIS (Netherlands)

    Rycerz, K.; Bubak, M.; Sloot, P.; Getov, V.; Gorlatch, S.; Bubak, M.; Priol, T.

    2008-01-01

    Interactive Problem Solving Environments (PSEs) offer an integrated approach for constructing and running complex systems, such as distributed simulation systems. To achieve efficient execution of High Level Architecture (HLA)-based distributed interactive simulations on the Grid, we introduce a PSE

  4. Rare earth elements determination and distribution patterns in sediments of a polluted marine environment by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Yusof, A.M.

    2001-01-01

    Sediment core samples taken from a fairly polluted marine environment were analyzed for their rare earth element (REE) contents to determine the concentrations of La, Ce, Sm, Eu, Tb, Dy and Yb using instrumental neutron activation analysis. Core samples were divided into strata at 2 to 3 cm intervals and prepared in powdered form before irradiating them in a neutron flux of about 5.0 x 10^12 n cm^-2 s^-1 in a Triga Mark II reactor. Down-core concentration profiles of La, Ce, Sm, Eu, Tb, Dy and Yb in 3 core sediments from three sites were obtained. The shale-normalized REE pattern from each site was examined and later used to explain the history of sedimentation by natural processes, such as shoreline erosion and weathering products deposited on the seabed, and to furnish some baseline data and/or the pollution trend occurring within the study area. The shale-normalized REE patterns also showed that the LREEs in the sediment samples exhibit enrichment relative to the HREEs; in particular, La and Sm show enrichment compared to the ratios in shale. REE concentrations of 124 μg/g at the surface of sediments collected at two of the three sites were found to decrease to 58 and 95 μg/g, respectively. This was of particular interest when used to explain the anomalies occurring in the marine sediment as a result of geochemical processes over a long period of time. Changes in concentrations from surface to bottom of the sediments ratioed to Sm concentrations, and the correlation between the concentrations of Sm and these elements, were also investigated; correlation coefficients were calculated for all REEs and sites. Validation of the method was done using a Soil-7 SRM. (author)
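Shale normalization, used in both REE records above, divides each measured concentration by that element's abundance in a reference shale, flattening the natural zig-zag abundance pattern so enrichment or depletion stands out. The sketch below uses rough PAAS-like reference values and hypothetical measurements, for illustration only:

```python
# Shale normalization sketch: ratio of sample REE concentration to a
# reference-shale concentration. Reference values are rough PAAS-like
# numbers and the sample values are hypothetical.
shale_reference_ppm = {"La": 38.0, "Ce": 80.0, "Sm": 5.6, "Eu": 1.1,
                       "Tb": 0.77, "Dy": 4.7, "Yb": 2.8}

def shale_normalize(sample_ppm):
    return {el: sample_ppm[el] / shale_reference_ppm[el] for el in sample_ppm}

sample = {"La": 19.0, "Ce": 40.0, "Sm": 2.8}   # hypothetical measurements
normalized = shale_normalize(sample)
print(normalized)  # each ratio is 0.5: half the shale abundance
```

A flat normalized pattern near 1 indicates shale-like detrital input; LREE ratios sitting above HREE ratios is the enrichment signature the record describes.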

  5. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in current analysis activities.

  6. Communication Analysis of Environment.

    Science.gov (United States)

    Malik, M. F.; Thwaites, H. M.

    This textbook was developed for use in a Concordia University (Quebec) course entitled "Communication Analysis of Environment." Designed as a practical application of information theory and cybernetics in the field of communication studies, the course is intended to be a self-instructional process, whereby each student chooses one…

  7. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  8. Sensors in Distributed Mixed Reality Environments

    Directory of Open Access Journals (Sweden)

    Felix Hamza-Lup

    2005-04-01

    Full Text Available A distributed mixed-reality (MR) or virtual-reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With the advances in sensors and computer networks, we have seen an increase in the number of potential MR/VR applications that require large amounts of information from the real world, collected through sensors (e.g. position and orientation tracking sensors). These sensors collect data from the real environment in real time at different locations, and a distributed environment connecting them must ensure data distribution among collaborating sites at interactive speeds. With the advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes. This paper proposes a new architecture for sensor-based interactive distributed MR/VR environments that falls in between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants who manipulate the sensors attached to these nodes.

  9. Texture mapping in a distributed environment

    NARCIS (Netherlands)

    Nicolae, Goga; Racovita, Zoea; Telea, Alexandru

    2003-01-01

    This paper presents a tool for texture mapping in a distributed environment. A parallelization method based on the master-slave model is described. The purpose of this work is to lower the image generation time in the complex 3D scenes synthesis process. The experimental results concerning the

  10. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  11. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  12. Dynamic, Interactive and Visual Analysis of Population Distribution and Mobility Dynamics in an Urban Environment Using the Mobility Explorer Framework

    Directory of Open Access Journals (Sweden)

    Jan Peters-Anders

    2017-05-01

    Full Text Available This paper investigates the extent to which a mobile data source can be utilised to generate new information intelligence for decision-making in smart city planning processes. In this regard, the Mobility Explorer framework is introduced and applied to the City of Vienna (Austria by using anonymised mobile phone data from a mobile phone service provider. This framework identifies five necessary elements that are needed to develop complex planning applications. As part of the investigation and experiments a new dynamic software tool, called Mobility Explorer, has been designed and developed based on the requirements of the planning department of the City of Vienna. As a result, the Mobility Explorer enables city stakeholders to interactively visualise the dynamic diurnal population distribution, mobility patterns and various other complex outputs for planning needs. Based on the experiences during the development phase, this paper discusses mobile data issues, presents the visual interface, performs various user-defined analyses, demonstrates the application’s usefulness and critically reflects on the evaluation results of the citizens’ motion exploration that reveal the great potential of mobile phone data in smart city planning but also depict its limitations. These experiences and lessons learned from the Mobility Explorer application development provide useful insights for other cities and planners who want to make informed decisions using mobile phone data in their city planning processes through dynamic visualisation of Call Data Record (CDR data.

  13. Analysis of Impact of Geographical Environment and Socio-economic Factors on the Spatial Distribution of Kaohsiung Dengue Fever Epidemic

    Science.gov (United States)

    Hsu, Wei-Yin; Wen, Tzai-Hung; Yu, Hwa-Lung

    2013-04-01

    Taiwan is located in subtropical and tropical regions, with high temperature and high humidity in the summer. This kind of climatic condition is a hotbed for the propagation and spread of the dengue vector mosquito. Kaohsiung City has been the city worst affected by dengue fever epidemics in Taiwan. During the study period, from January 1998 to December 2011, the Taiwan CDC recorded 7071 locally acquired dengue cases in Kaohsiung City, along with 118 imported cases. Our research uses quantile regression on the spatial distribution of the disease to analyze the correlation between the dengue epidemic and geographic environmental and socio-economic factors in Kaohsiung. According to our statistics, agriculture and natural forest have a positive relation to dengue fever (5.5~34.39 and 3.91~15.52): the epidemic rises as the proportion of agriculture and natural forest increases. Residential ratio has a negative relation for quantiles 0.1 to 0.4 (-0.005~-0.78) and a positive relation for quantiles 0.5 to 0.9 (0.01~18.0). Mean income is also a significant socio-economic factor, with a negative relation to dengue fever (-0.01~-0.04). We conclude that the main factor affecting the degree of dengue fever in predilection areas is the residential proportion, while the ratio of agriculture and natural forest plays an important role in non-predilection areas. Moreover, the serious epidemic areas located by the regression model match the actual conditions in Kaohsiung. This model can be used to predict serious epidemic areas of dengue fever and provide references for the health agencies.
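Quantile regression fits conditional quantiles by minimizing the asymmetric "pinball" (check) loss rather than squared error, which is why a covariate's effect can change sign across quantiles as reported above. A toy sketch with invented numbers (not the study's data), showing that minimizing the pinball loss over a constant recovers the empirical quantile:

```python
def pinball_loss(residuals, q):
    """Check loss for quantile q: overshoots cost q, undershoots cost (1 - q)."""
    return sum(q * r if r >= 0 else (q - 1) * r for r in residuals)

data = [1.0, 2.0, 3.0, 4.0, 10.0]  # made-up observations

def best_constant(q):
    """The constant minimizing the pinball loss is the q-th empirical quantile."""
    return min(data, key=lambda c: pinball_loss([y - c for y in data], q))

median_fit = best_constant(0.5)  # picks 3.0, the empirical median
upper_fit = best_constant(0.9)   # picks 10.0, near the upper tail
```

Full quantile-regression fits (e.g. the 0.1 to 0.9 coefficient ranges quoted above) solve the same minimization with a linear predictor in place of the constant.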

  14. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner-Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner-Ville distribution in the field of signal processing, using the Scilab environment.
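The energy property stated in the abstract can be checked numerically. A pure-Python sketch of a discrete Wigner-Ville distribution (an illustrative textbook-style definition, not the paper's Scilab code):

```python
import cmath

def wigner_ville(x):
    """Discrete Wigner-Ville distribution:
    W[n][k] = sum_m x[n+m] * conj(x[n-m]) * exp(-2j*pi*(2m)*k/N)."""
    N = len(x)
    W = [[0.0] * N for _ in range(N)]
    for n in range(N):
        half = min(n, N - 1 - n)  # largest lag keeping both indices in range
        for k in range(N):
            acc = 0j
            for m in range(-half, half + 1):
                kernel = cmath.exp(-2j * cmath.pi * (2 * m) * k / N)
                acc += x[n + m] * (x[n - m]).conjugate() * kernel
            W[n][k] = acc.real  # the symmetric sum over m is real by construction
    return W

# Complex exponential at 0.125 cycles/sample; its total energy is 16.0.
sig = [cmath.exp(2j * cmath.pi * 0.125 * n) for n in range(16)]
W = wigner_ville(sig)
energy = sum(abs(v) ** 2 for v in sig)
volume = sum(sum(row) for row in W) / len(sig)  # equals the total energy
```

For this single-tone signal the distribution also concentrates along the frequency bin matching the tone, illustrating the instantaneous-frequency localization the abstract describes.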

  15. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.

  16. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  17. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS Computing Management Board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  18. Distributed autonomous mapping of indoor environments

    Science.gov (United States)

    Rogers, J.; Paluri, M.; Cunningham, A.; Christensen, H. I.; Michael, N.; Kumar, V.; Ma, J.; Matthies, L.

    2011-06-01

    This paper describes the results of a Joint Experiment performed on behalf of the MAST CTA. The system developed for the Joint Experiment makes use of three robots which work together to explore and map an unknown environment. Each of the robots used in this experiment is equipped with a laser scanner for measuring walls and a camera for locating doorways. Information from both of these types of structures is concurrently incorporated into each robot's local map using a graph based SLAM technique. A Distributed-Data-Fusion algorithm is used to efficiently combine local maps from each robot into a shared global map. Each robot computes a compressed local feature map and transmits it to neighboring robots, which allows each robot to merge its map with the maps of its neighbors. Each robot caches the compressed maps from its neighbors, allowing it to maintain a coherent map with a common frame of reference. The robots utilize an exploration strategy to efficiently cover the unknown environment which allows collaboration on an unreliable communications channel. As each new branching point is discovered by a robot, it broadcasts the information about where this point is along with the robot's path from a known landmark to the other robots. When the next robot reaches a dead-end, new branching points are allocated by auction. In the event of communication interruption, the robot which observed the branching point will eventually explore it; therefore, the exploration is complete in the face of communication failure.
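The auction step for allocating branching points can be sketched in a few lines. A hedged illustration (the robot positions, branch labels, and the straight-line-distance bid are all invented here; the experiment's actual bids would use path cost through the partial map):

```python
import math

def auction(branch_points, robot_positions):
    """Assign each branching point to the robot submitting the lowest bid,
    where a bid is the robot's estimated cost to reach the point."""
    assignments = {}
    for label, point in branch_points.items():
        bids = {rid: math.dist(pos, point) for rid, pos in robot_positions.items()}
        assignments[label] = min(bids, key=bids.get)  # lowest bid wins
    return assignments

# Hypothetical 2-D positions for two robots and two unexplored branch points.
robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0)}
branches = {"b1": (2.0, 1.0), "b2": (9.0, 3.0)}
result = auction(branches, robots)  # each branch goes to the nearer robot
```

Because each robot also broadcasts discovered branch points with its path from a known landmark, any robot holding the shared map can run this same allocation locally if communication drops, which is what makes the exploration complete under communication failure.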

  19. Single Particulate SEM-EDX Analysis of Iron-Containing Coarse Particulate Matter in an Urban Environment: Sources and Distribution of Iron within Cleveland, Ohio

    Science.gov (United States)

    The physicochemical properties of coarse-mode, iron-containing particles, and their temporal and spatial distributions are poorly understood. Single particle analysis combining x-ray elemental mapping and computer-controlled scanning electron microscopy (CCSEM-EDX) of passively ...

  20. Distribution of peanut allergen in the environment.

    Science.gov (United States)

    Perry, Tamara T; Conover-Walker, Mary Kay; Pomés, Anna; Chapman, Martin D; Wood, Robert A

    2004-05-01

    Patients with peanut allergy can have serious reactions to very small quantities of peanut allergen and often go to extreme measures to avoid potential contact with this allergen. The purpose of this study was to detect peanut allergen under various environmental conditions and examine the effectiveness of cleaning agents for allergen removal. A monoclonal-based ELISA for Arachis hypogaea allergen 1 (Ara h 1; range of detection, 30-2000 ng/mL) was used to assess peanut contamination on cafeteria tables and other surfaces in schools, the presence of residual peanut protein after using various cleaning products on hands and tabletops, and airborne peanut allergen during the consumption of several forms of peanut. After hand washing with liquid soap, bar soap, or commercial wipes, Ara h 1 was undetectable. Plain water and antibacterial hand sanitizer left detectable Ara h 1 on 3 of 12 and 6 of 12 hands, respectively. Common household cleaning agents removed peanut allergen from tabletops, except dishwashing liquid, which left Ara h 1 on 4 of 12 tables. Of the 6 area preschools and schools evaluated, Ara h 1 was found on 1 of 13 water fountains, 0 of 22 desks, and 0 of 36 cafeteria tables. Airborne Ara h 1 was undetectable in simulated real-life situations when participants consumed peanut butter, shelled peanuts, and unshelled peanuts. The major peanut allergen, Ara h 1, is relatively easily cleaned from hands and tabletops with common cleaning agents and does not appear to be widely distributed in preschools and schools. We were not able to detect airborne allergen in many simulated environments.

  1. Functional Analysis in Virtual Environments

    Science.gov (United States)

    Vasquez, Eleazar, III; Marino, Matthew T.; Donehower, Claire; Koch, Aaron

    2017-01-01

    Functional analysis (FA) is an assessment procedure involving the systematic manipulation of an individual's environment to determine why a target behavior is occurring. An analog FA provides practitioners the opportunity to manipulate variables in a controlled environment and formulate a hypothesis for the function of a behavior. In previous…

  2. The distributed development environment for SDSS software

    International Nuclear Information System (INIS)

    Berman, E.; Gurbani, V.; Mackinnon, B.; Newberg, H.; Nicinski, T.; Petravick, D.; Pordes, R.; Sergey, G.; Stoughton, C.; Lupton, R.

    1994-04-01

    The authors present an integrated science software development environment, code maintenance and support system for the Sloan Digital Sky Survey (SDSS), now being actively used throughout the collaboration.

  3. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect a distributed computing environment from certain types of malicious impacts.

  4. Distributed computing environment for Mine Warfare Command

    OpenAIRE

    Pritchard, Lane L.

    1993-01-01

    Approved for public release; distribution is unlimited. The Mine Warfare Command in Charleston, South Carolina has been converting its information systems architecture from a centralized mainframe-based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May 1992. The building blocks of a distributed architecture are discussed in relation to the choices the Mine Warfare Command has made to date. Ar...

  5. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1996-01-01

    This paper discusses the growing needs for distributed system monitoring and compares them to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address Local Area Network (LAN), Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring. (author)

  6. Study of the distribution characteristics of rare earth elements in Solanum lycocarpum from different tropical environments in Brazil by neutron activation analysis

    International Nuclear Information System (INIS)

    Maria, Sheila Piorino

    2001-01-01

    In this work, the concentrations of eight rare earth elements (REE), La, Ce, Nd, Sm, Eu, Tb, Yb and Lu, were determined by instrumental neutron activation analysis (INAA) in plant leaves of Solanum lycocarpum. This species is a typical Brazilian 'cerrado' plant, widely distributed in Brazil. The analysis of the plant reference materials CRM Pine Needles (NIST 1575) and Spruce Needles (BCR 101) proved that the methodology applied was sufficiently accurate and precise for the determination of REE in plants. In order to better evaluate the uptake of the REE from the soil by the plant, the host soil was also analyzed by INAA. The studied areas were Salitre, MG; Serra do Cipo, MG; Lagoa da Pampulha and Mangabeiras, in Belo Horizonte, MG; and Cerrado de Emas, in Pirassununga, SP. The results were analyzed through the calculation of soil-plant transfer factors and by using diagrams normalized to chondrites. The data obtained showed different transfer factors from soil to plant as the substrate changes. Similar distribution patterns for the soil and the plant were obtained at all the studied sites, presenting an enrichment of the light REE (La to Sm), in contrast to the heavy REE (Eu to Lu), which are less absorbed. These results indicate that the light REE remain available to the plant in the more superficial soil layers. The similarity between the distribution patterns indicates a typical REE absorption by this species, in spite of the significant differences in the substrate. (author)
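A soil-plant transfer factor is simply the ratio of the concentration in the plant tissue to that in the host soil. A minimal sketch with hypothetical concentrations (not the study's measurements):

```python
# Hypothetical concentrations in μg/g; invented for illustration only.
plant = {"La": 1.2, "Ce": 2.0, "Sm": 0.20, "Yb": 0.02}  # dry leaf
soil = {"La": 40.0, "Ce": 80.0, "Sm": 6.0, "Yb": 2.0}   # host soil

# Transfer factor: plant concentration divided by soil concentration.
tf = {el: plant[el] / soil[el] for el in plant}
# In this sketch the light REE (La, Ce, Sm) transfer more readily than
# the heavy REE (Yb), mirroring the enrichment pattern described above.
```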

  7. Supportability Analysis in LCI Environment

    OpenAIRE

    Dragan Vasiljevic; Ana Horvat

    2013-01-01

    Starting from the basic pillars of the supportability analysis this paper queries its characteristics in LCI (Life Cycle Integration) environment. The research methodology contents a review of modern logistics engineering literature with the objective to collect and synthesize the knowledge relating to standards of supportability design in e-logistics environment. The results show that LCI framework has properties which are in fully compatibility with the requirement of s...

  8. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1995-11-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), network services and applications, the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring

  9. Plutonium in the ocean environment. Its distributions and behavior

    International Nuclear Information System (INIS)

    Hirose, Katsumi

    2009-01-01

    Marine environments have been extensively contaminated by plutonium as a result of global fallout from atmospheric nuclear-weapons testing. Knowledge of the levels and behavior of plutonium in marine environments is necessary to assess its radiological and ecological effects. Analytical techniques such as radiochemical analysis, α-spectrometry, and mass spectrometry have been developed over the past five decades to analyze plutonium in seawater. Because of its complex chemical properties (e.g. high reactivity to particles), plutonium in the ocean exhibits more complicated behavior than other long-lived anthropogenic radionuclides, such as 137Cs. The present study reviews the research history of plutonium in the ocean, including spatial and temporal changes in plutonium levels and distributions, and its oceanographic behavior. (author)

  10. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.

  11. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular, we discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We also describe PROOF integration with the ATLAS distributed data management system and the prospects of running PROOF on geographically distributed analysis farms.

  12. An Intelligent System for Document Retrieval in Distributed Office Environments.

    Science.gov (United States)

    Mukhopadhyay, Uttam; And Others

    1986-01-01

    MINDS (Multiple Intelligent Node Document Servers) is a distributed system of knowledge-based query engines for efficiently retrieving multimedia documents in an office environment of distributed workstations. By learning document distribution patterns and user interests and preferences during system usage, it customizes document retrievals for…

  13. PIXE analysis of work environment aerosols

    International Nuclear Information System (INIS)

    Akabayashi, Hideo; Fujimoto, Fuminori; Komaki, Kenichiro; Ootuka, Akio; Kobayashi, Koichi; Yamashita, Hiroshi

    1988-01-01

    In the work environment, chemical substances are present in the air in greater quantities, and in more diversified kinds, than in the general home environment. It is well known that some substances contained in workplace aerosols (dust floating in the atmosphere), such as asbestos and hexavalent chromium, can cause serious injuries such as cancer of the respiratory organs. In order to identify the harmful substances to which workers are exposed and to take measures for removing them, it is necessary to investigate in detail the many factors related to the effect of aerosols on human health, such as elemental composition, chemical state, concentration, dust particle size, and temporal and spatial distributions. For this purpose, sampling and analysis must be carried out so that as much information as possible can be extracted from a minute amount of sample. Particle induced X-ray emission (PIXE) analysis is very effective for this application. In this paper, the development of a PIXE analysis system and the findings obtained by sampling and measuring aerosols in an indoor work environment are reported. The work environment selected is the workshop of the Department of Liberal Arts, University of Tokyo. Sampling, the experimental apparatus, the method of data analysis and the results of the analysis are described. (Kako, I.)

  14. Distributed intelligent urban environment monitoring system

    Science.gov (United States)

    Du, Jinsong; Wang, Wei; Gao, Jie; Cong, Rigang

    2018-02-01

    Current environmental pollution and destruction have developed into a worldwide social problem that threatens human survival and development. Environmental monitoring is the prerequisite and basis of environmental governance but, overall, current environmental monitoring systems face a series of problems. Based on electrochemical sensors, this paper designs a small, low-cost, easy-to-deploy urban environmental quality monitoring terminal; multiple terminals constitute a distributed network. The system has been used in small-scale demonstration applications, which have confirmed that it is suitable for large-scale deployment.

  15. Flexible session management in a distributed environment

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Zach; /Wisconsin U., Madison; Bradley, Dan; /Wisconsin U., Madison; Tannenbaum, Todd; /Wisconsin U., Madison; Sfiligoi, Igor; /Fermilab

    2010-01-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.
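    The layered design described in the record above (one authentication handshake establishes a reusable session key, which can then be delegated to other components) can be sketched as follows. This is a toy Python illustration, not CEDAR's actual API; all class and method names are hypothetical.

```python
import os
import hmac
import hashlib

class SessionCache:
    """Toy session layer decoupled from authentication: the expensive
    authentication handshake runs once per peer, and the resulting
    session key is cached, reused, and optionally delegated."""

    def __init__(self):
        self._keys = {}       # peer id -> session key
        self.handshakes = 0   # how many full authentications were performed

    def _authenticate(self, peer):
        # Stand-in for a multi-round-trip handshake (e.g. Kerberos).
        self.handshakes += 1
        return os.urandom(32)

    def session_key(self, peer):
        # Reuse the cached key; only the first contact pays the handshake cost.
        if peer not in self._keys:
            self._keys[peer] = self._authenticate(peer)
        return self._keys[peer]

    def delegate(self, peer):
        # Hand the established key to another component (e.g. a worker node)
        # so it can communicate without re-authenticating: a chain of trust.
        return {"peer": peer, "key": self.session_key(peer)}

    def mac(self, peer, payload: bytes) -> str:
        # Integrity-protect a message under the cached session key.
        return hmac.new(self.session_key(peer), payload, hashlib.sha256).hexdigest()

cache = SessionCache()
tag1 = cache.mac("schedd.example.org", b"job status?")
tag2 = cache.mac("schedd.example.org", b"job status?")
# Two messages, one handshake: the session layer amortizes authentication.
```

    The point of the sketch is the separation of concerns: however authentication is performed, everything above it only ever sees the session key.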

  16. Flexible session management in a distributed environment

    International Nuclear Information System (INIS)

    Miller, Zach; Bradley, Dan; Tannenbaum, Todd; Sfiligoi, Igor

    2010-01-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.

  17. Flexible session management in a distributed environment

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Zach; Bradley, Dan; Tannenbaum, Todd [University of Wisconsin, Madison, WI (United States); Sfiligoi, Igor, E-mail: zmiller@cs.wisc.ed [Fermi National Accelerator Laboratory, Batavia, IL (United States)

    2010-04-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.

  18. Flexible session management in a distributed environment

    Science.gov (United States)

    Miller, Zach; Bradley, Dan; Tannenbaum, Todd; Sfiligoi, Igor

    2010-04-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.

  19. The comprehensive intermodal planning system in distributed environment

    Directory of Open Access Journals (Sweden)

    Olgierd Dziamski

    2013-06-01

    Full Text Available Background: Distribution of goods by containers has opened new opportunities in the global supply chain network. Standardization of transportation units simplifies transport logistics operations and enables the integration of different types of transport. Currently, container transport is the most popular means of transport by sea, but recent studies show an increase in its popularity in land transport. This paper presents the concept of a comprehensive intermodal planning system in a distributed environment, which enables more efficient use of containers in global transport. The aim of this paper was to formulate and develop a new concept of an Internet Intermodal Planning System in a distributed environment, supporting the common interests of consignors and logistics service providers by integrating transportation modes, routing, logistics operations and scheduling in order to create intermodal transport plans. Methods: The paper presents and discusses a new approach to the management of logistics resources used in international intermodal transport. A detailed analysis of the proposed classification of the variety of transportation means has been carried out, along with their specific properties, which may affect the time and cost of transportation. A modular web-based distribution planning system supporting container transportation planning is presented. An analysis of the main planning process is conducted, together with a discussion of the effectiveness of the new web-based container transport planning system. Results and conclusions: This paper presents a new concept of a distributed container transport planning system integrating the logistics service providers available on the market with freight exchange. New transport categories specific to container planning have been identified, which simplify and facilitate the management of the intermodal planning process. The paper presents a modular structure of Integrated

  20. Authorization Administration in a Distributed Multi-application Environment

    Institute of Scientific and Technical Information of China (English)

    DUAN Sujuan; HONG Fan; LI Xinhua

    2004-01-01

    To meet the authorization administration requirements in a distributed computer network environment, this paper extends the role-based access control model with multiple application dimensions and establishes a new access control model, ED-RBAC (Extended Role Based Access Control Model), for the distributed environment. We propose an extendable hierarchical authorization assignment framework and design effective role-registering, role-applying and role-assigning protocols with symmetric and asymmetric cryptographic systems. The model can be used to simplify authorization administration in a distributed environment with multiple applications.
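    The core idea of role-based access control extended with an application dimension, as in the record above, can be sketched minimally: permissions attach to (application, role) pairs, and users hold roles per application. All names and the permission data below are illustrative, not taken from ED-RBAC.

```python
# Toy RBAC check with per-application role assignments (hypothetical data).
ROLE_PERMS = {
    ("payroll", "admin"):   {"read", "write", "approve"},
    ("payroll", "clerk"):   {"read"},
    ("inventory", "admin"): {"read", "write"},
}
USER_ROLES = {
    "alice": {("payroll", "admin")},
    "bob":   {("payroll", "clerk"), ("inventory", "admin")},
}

def allowed(user, app, perm):
    # A user may act if any of their roles *in this application* grants the permission.
    return any(a == app and perm in ROLE_PERMS.get((a, r), set())
               for a, r in USER_ROLES.get(user, set()))
```

    Keying roles by application keeps one application's "admin" from implying authority in another, which is the administrative simplification a multi-application model buys.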

  1. Mobile agent location in distributed environments

    Science.gov (United States)

    Fountoukis, S. G.; Argyropoulos, I. P.

    2012-12-01

    An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they are capable of both behaving autonomously and showing an elementary decision ability regarding movement and certain specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Agents are also used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id to discriminate itself from other agents, and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for locating mobile agents. The most basic of all employs a fixed computing node, which acts as an agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. A location method, which is investigated in this paper, attempts to exploit this model.
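    The baseline scheme the record describes, a fixed node acting as a position repository that agents report to and that answers location queries, can be sketched in a few lines. This illustrates only that baseline, not the pair/triple model the paper proposes; all identifiers are hypothetical.

```python
class AgentLocationRegistry:
    """Toy central repository: migrating agents report their new hosting
    node, and users, nodes, or other agents query current positions."""

    def __init__(self):
        self._pos = {}  # agent id -> address of current hosting node

    def report(self, agent_id, node):
        # An agent announces the node it has just migrated to.
        self._pos[agent_id] = node

    def locate(self, agent_id):
        # Answer a position query; None means the agent is unknown.
        return self._pos.get(agent_id)

registry = AgentLocationRegistry()
registry.report("agent-7", "node-A")
registry.report("agent-7", "node-C")  # the agent migrated
```

    The obvious weakness, and the motivation for more elaborate schemes, is that every migration and every query hits the single fixed node.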

  2. Multimedia Services in Open Distributed Telecommunications Environments

    NARCIS (Netherlands)

    Leydekkers, Peter

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  3. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  4. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high-luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  5. Exploring distributed user interfaces in ambient intelligent environments

    NARCIS (Netherlands)

    Dadlani Mahtani, P.M.; Peregrin Emparanza, J.; Markopoulos, P.; Gallud, J.A.; Tesoriero, R.; Penichet, V.M.R.

    2011-01-01

    In this paper we explore the use of Distributed User Interfaces (DUIs) in the field of Ambient Intelligence (AmI). We first introduce the emerging area of AmI, followed by describing three case studies where user interfaces or ambient displays are distributed and blending in the user’s environments.

  6. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  7. Internet-centric collaborative design in a distributed environment

    International Nuclear Information System (INIS)

    Kim, Hyun; Kim, Hyoung Sun; Do, Nam Chul; Lee, Jae Yeol; Lee, Joo Haeng; Myong, Jae Hyong

    2001-01-01

    Recently, advanced information technologies including internet-related technology and distributed object technology have opened new possibilities for collaborative designs. In this paper, we discuss computer supports for collaborative design in a distributed environment. The proposed system is the internet-centric system composed of an engineering framework, collaborative virtual workspace and engineering service. It allows the distributed designers to more efficiently and collaboratively work their engineering tasks throughout the design process

  8. Cooperative Control of Distributed Autonomous Vehicles in Adversarial Environments

    Science.gov (United States)

    2006-08-14

    COOPERATIVE CONTROL OF DISTRIBUTED AUTONOMOUS VEHICLES IN ADVERSARIAL ENVIRONMENTS. Grant #F49620-01-1-0361, Final Report. Jeff Shamma, Department of... single dominant language or a distribution of languages. A relation to multivehicle systems is understanding how highly autonomous vehicles on extended

  9. Dynamic shared state maintenance in distributed virtual environments

    Science.gov (United States)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. 
In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for
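    The record above mentions an objective assessment of shared-state consistency using quaternion calculus. A generic way to quantify orientation divergence of a replicated entity between two sites is the angle between their unit quaternions; the sketch below shows that metric, which is illustrative and not necessarily the paper's exact criterion.

```python
import math

def quat_angle(q1, q2):
    """Angular difference (radians) between two unit quaternions (w, x, y, z).
    Small angles mean the two sites' copies of an entity's orientation agree."""
    # |dot| handles the double cover: q and -q represent the same rotation.
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    return 2.0 * math.acos(min(1.0, dot))

identity = (1.0, 0.0, 0.0, 0.0)
# 90-degree rotation about the z axis.
rot90_z = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
```

    Sampling such a metric over time at each site gives a latency-independent measure of how consistently the dynamic shared state is being maintained.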

  10. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  11. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  12. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    Full Text Available The paper describes a practical implementation of a system protecting distributed computing in a heterogeneous environment from malicious code. The choice of technologies, the development of data structures, and a performance evaluation of the implemented security system are presented.

  13. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  14. Task distribution mechanism for effective collaboration in virtual environments

    International Nuclear Information System (INIS)

    Khalid, S.; Ullah, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) are computer-generated worlds where two or more users can simultaneously interact with synthetic objects to perform a task. User performance is one of the main issues, degraded by loose coordination, low awareness, or poor communication among collaborating users. In this paper, a new model for task distribution is proposed, which defines a task distribution strategy among multiple users in CVEs. The model assigns tasks to collaborating users in CVEs on either a static or a dynamic basis. In static distribution there is loose dependency and less communication is required during task realization, whereas in dynamic distribution users are more dependent on each other and thus require more communication. In order to study the effect of static and dynamic task distribution strategies on user performance in CVEs, a collaborative virtual environment was developed in which twenty-four (24) teams (each consisting of two users) perform a task in collaboration under both strategies (static and dynamic). Results reveal that static distribution is more effective and increases user performance in CVEs. The outcome of this work will help the development of effective CVEs in the fields of virtual assembly, repair, education and entertainment. (author)
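    The static/dynamic contrast in the record above can be made concrete with a small sketch: static distribution partitions the task list up front, while dynamic distribution assigns each task to whichever user is free next, implying ongoing coordination. The task names and user ids are hypothetical.

```python
# Illustrative contrast between static and dynamic task distribution
# for a collaborative task split among several users.

tasks = ["part-1", "part-2", "part-3", "part-4"]

def static_distribution(tasks, users):
    # Tasks are split up front; users then work independently
    # (loose dependency, little communication during realization).
    return {u: tasks[i::len(users)] for i, u in enumerate(users)}

def dynamic_distribution(tasks, users, next_free):
    # Each task goes to whichever user reports free next, which
    # requires continuous coordination among the collaborators.
    assignment = {u: [] for u in users}
    for t in tasks:
        assignment[next_free()].append(t)
    return assignment
```

    With `static_distribution(tasks, ["u1", "u2"])`, u1 gets part-1 and part-3 and u2 gets part-2 and part-4, fixed before work begins; the dynamic variant only discovers the assignment as users become free.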

  15. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  16. Investment decisions in environment of uncertainties integrated to the viability analysis of the distribution and transmission projects; Decisao de investimentos em ambiente de incertezas integrada a analise de viabilidade de projetos de transmissao e distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Gazzi, L.M.P. [Universidade de Sao Paulo (USP), SP (Brazil). Curso de Pos-graduacao em Engenharia Eletrica; Ramos, D.S. [Universidade de Sao Paulo (USP), SP (Brazil)

    2009-07-01

    This work aims to support the investment decisions necessary for expanding the 138/88 kV network belonging to electric energy distribution companies. The economic and financial analysis considers the Regulator's requirements for tariff recognition of an investment, using engineering-economics techniques to evaluate the return on invested capital while considering an environment of uncertainty, where the best decision depends on variables exogenous to the traditional planning process. To make feasible the inclusion of the main random-behavior variables that condition the economic and financial performance of the alternatives analyzed for decision making, a methodology based on 'Real Options' was chosen, a technique that has long been used in the financial market.

  17. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
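    Predicting response time under a given external load, as the record above does with Queueing Petri Nets, can be illustrated with a far simpler model: treat each tier of the Web system as an M/M/1 queue with mean response time R = 1/(mu - lambda) and sum over tiers in series. The tier rates below are illustrative, not data from the paper.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: R = 1 / (mu - lambda)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable (utilization >= 1)")
    return 1.0 / (service_rate - arrival_rate)

# A distributed Web system as tiers in series (rates in requests/second):
tiers = [(80.0, 100.0),   # front end
         (80.0, 120.0),   # application tier
         (80.0, 200.0)]   # database tier
total = sum(mm1_response_time(lam, mu) for lam, mu in tiers)
print(f"predicted mean response time: {total * 1000:.1f} ms")  # 83.3 ms
```

    The value of richer formalisms like Queueing Petri Nets is precisely what this toy model cannot capture: synchronization, simultaneous resource possession, and software contention.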

  18. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  19. Business models for distributed generation in a liberalized market environment

    International Nuclear Information System (INIS)

    Gordijn, Jaap; Akkermans, Hans

    2007-01-01

    The analysis of the potential of emerging innovative technologies calls for a systems-theoretic approach that takes into account technical as well as socio-economic factors. This paper reports the main findings of several business case studies of different future applications in various countries of distributed power generation technologies, all based on a common methodology for networked business modeling and analysis. (author)

  20. A Virtual Hosting Environment for Distributed Online Gaming

    Science.gov (United States)

    Brossard, David; Prieto Martinez, Juan Luis

    With enterprise boundaries becoming fuzzier, it has become clear that businesses need to share resources, expose services, and interact in many different ways. In order to achieve such distribution in a dynamic, flexible, and secure way, we have designed and implemented a virtual hosting environment (VHE) which aims at integrating business services across enterprise boundaries and virtualising the ICT environment within which these services operate, in order to exploit economies of scale for the businesses as well as achieve shorter concept-to-market time scales. To illustrate the relevance of the VHE, we have applied it to the online gaming world. Online gaming is an early adopter of distributed computing, and more than 30% of game developer companies, aware of this shift, are focusing on developing high-performance platforms for the new online trend.

  1. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  2. A distributed execution environment for large data visualization

    International Nuclear Information System (INIS)

    Huang Jian; Liu Huadong; Beck, Micah; Gao Jinzhu; Moore, Terry

    2006-01-01

    Over the years, homogeneous computer clusters have been the most popular, and, in some sense, the only viable, platform for parallel visualization. In this work, we designed an execution environment for data-intensive visualization that is suited to handling SciDAC-scale datasets. This environment is based solely on computers distributed across the Internet that are owned and operated by independent institutions while being openly shared for free. Such Internet computers are inherently heterogeneous in hardware configuration and run a variety of operating systems. Using 100 such processors, we have been able to obtain the same level of performance as a 64-node cluster of 2.2 GHz P4 processors while processing a 75 GB subset of TSI simulation data. Due to its inherently shared nature, this execution environment for data-intensive visualization could provide a viable means of collaboration among geographically separated SciDAC scientists.

  3. Distribution of {sup 129}I in terrestrial surface water environments

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xuegao [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Gong, Meng [College of Hydrology and Water Resources, Hohai University, Nanjing (China); Yi, Peng, E-mail: pengyi1915@163.com [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Aldahan, Ala [Department of Earth Sciences, Uppsala University, Uppsala (Sweden); Department of Geology, United Arab Emirates University, Al Ain (United Arab Emirates); Yu, Zhongbo [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Possnert, Göran [Tandem Laboratory, Uppsala University, Uppsala (Sweden); Chen, Li [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China)

    2015-10-15

    The global distribution of the radioactive isotope iodine-129 in surface waters (lakes and rivers) is presented here and compared with the atmospheric deposition and the distribution in surface marine waters. The results indicate relatively high concentrations in surface water systems in close vicinity of the anthropogenic release sources as well as in parts of Western Europe, North America and Central Asia. The {sup 129}I level is generally higher in the terrestrial surface water of the Northern Hemisphere than in the Southern Hemisphere. The highest values of {sup 129}I appear around 50°N and 40°S in the northern and southern hemispheres, respectively. Direct gaseous and marine atmospheric emissions are the most likely avenues for the transport of {sup 129}I from the sources to terrestrial surface waters. To apply iodine-129 as a process tracer in the terrestrial surface water environment, more data are needed on {sup 129}I distribution patterns both locally and globally.

  4. Study of cache performance in distributed environment for data processing

    International Nuclear Information System (INIS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-01-01

    Processing data in a distributed environment has found application in many fields of science (Nuclear and Particle Physics (NPP), astronomy, and biology, to name only a few). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing knowledge of data provenance, as well as data placed in the transfer cache, to further expand the availability of sources for files and data-sets. Although a great variety of caching algorithms is known, a study is needed to evaluate which one delivers the best data-access performance under realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were run to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations covered caches of different sizes, within the interval 0.001-90% of the complete data-set, with low-watermarks within 0-90%. Records of data access were taken from several experiments and different time intervals in order to validate the results. In this paper, we discuss different data caching strategies, from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, and identify the best algorithm in the context of physics data analysis in NPP. While the results of these studies have been implemented in RIFT, they can also be used when setting up a cache in any other computational work-flow (Cloud processing, for example) or when managing data storage with partial replicas of the entire data-set.
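
As an illustration of the trace-driven approach described above, the sketch below replays a synthetic Zipf-like access trace (an assumption for illustration; the paper uses real NPP access records) through an LRU cache with a low-watermark eviction rule:

```python
import random
from collections import OrderedDict

def simulate_lru(trace, cache_size, low_watermark):
    """Replay an access trace through an LRU cache.

    cache_size and low_watermark are counts of equally sized files; when
    the cache overflows, least-recently-used files are evicted down to
    the low watermark. Returns the cache hit rate.
    """
    cache = OrderedDict()  # file id -> None, kept in LRU order
    hits = 0
    for f in trace:
        if f in cache:
            hits += 1
            cache.move_to_end(f)  # refresh recency on a hit
        else:
            cache[f] = None
            if len(cache) > cache_size:
                while len(cache) > low_watermark:
                    cache.popitem(last=False)  # evict the LRU entry
    return hits / len(trace)

# Synthetic Zipf-like demand: a few popular files dominate the requests.
random.seed(1)
files = list(range(1000))
weights = [1.0 / (i + 1) for i in range(1000)]
trace = random.choices(files, weights=weights, k=20000)

for size in (10, 100, 500):
    rate = simulate_lru(trace, cache_size=size, low_watermark=size // 2)
    print(f"cache size {size:4d}: hit rate {rate:.2f}")
```

The same harness accepts any replacement policy, which is how the relative merits of canonical and hybrid strategies can be compared on one trace.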

  5. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management on existing protocol solutions, while taking into account

  6. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

    Modern multimedia systems are more and more shifting toward distributed and networked structures. This includes audio systems, where networks of wireless distributed microphones are replacing the traditional microphone arrays. This allows for flexibility of placement and high spatial diversity. However, it comes with the price of several challenges, including the limited power and bandwidth resources for wireless transmission of audio recordings. In such a setup, we study the problem of source coding for the compression of the audio recordings before transmission, in order to reduce the power consumption and/or transmission bandwidth by reducing the transmission rates. Source coding for wireless microphones in reverberant environments has several special characteristics which make it more challenging in comparison with regular audio coding. The signals which are acquired by the microphones...

  7. Environment effects on the flattening distribution of galaxies

    International Nuclear Information System (INIS)

    Freitas Pacheco, J.A. de; Souza, R.E. de; Arakaki, L.

    1983-01-01

    The results of a study of environment effects on the flattening distribution of galaxies are presented. Samples of different morphological types in regions of high and low projected galaxian density were prepared using the Uppsala General Catalogue of Galaxies. No significant differences were detected for the E and spiral galaxies. However, this study suggests that there is an excess of flat S0 systems in regions of high galaxian density with respect to the field systems. These 'flat' S0's are interpreted as being gas-stripped Sa galaxies, and the consequences of this hypothesis are also discussed. (Author)

  8. Community problem-solving framed as a distributed information use environment: bridging research and practice

    Directory of Open Access Journals (Sweden)

    Joan C. Durrance

    2006-01-01

    Introduction. This article results from a qualitative study of (1) information behaviour in community problem-solving, framed as a distributed information use environment, and (2) the approaches used by a best-practice library to anticipate information needs associated with community problem-solving. Method. Several approaches to data collection were used: focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation (reports, articles and library communications) was also coded. Results. The study showed (1) how the components of an information use environment (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem-solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.

  9. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. Simulation techniques have been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of the hazard modeling distribution, which approaches different distributions as its parameters vary.
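
The comparison can be made concrete with the hazard functions themselves. The sketch below (distribution choices and parameters are illustrative, not taken from the paper) shows how the Weibull hazard reduces to the constant exponential hazard and can model both increasing and decreasing failure rates:

```python
import math

def weibull_hazard(t, shape, scale):
    """h(t) = f(t)/S(t) for the Weibull distribution."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def exponential_hazard(t, rate):
    """The exponential distribution has a constant hazard."""
    return rate

# With shape = 1 the Weibull hazard reduces to the exponential hazard:
for t in (0.5, 1.0, 2.0):
    assert math.isclose(weibull_hazard(t, 1.0, 2.0), exponential_hazard(t, 0.5))

# shape > 1 gives an increasing hazard (wear-out failures);
# shape < 1 gives a decreasing hazard (infant-mortality failures).
print([round(weibull_hazard(t, 2.0, 1.0), 3) for t in (0.5, 1.0, 2.0)])
print([round(weibull_hazard(t, 0.5, 1.0), 3) for t in (0.5, 1.0, 2.0)])
```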

  10. A Distributed Feature-based Environment for Collaborative Design

    Directory of Open Access Journals (Sweden)

    Wei-Dong Li

    2003-02-01

    This paper presents a client/server design environment based on 3D feature-based modelling and Java technologies to enable design information to be shared efficiently among members of a design team. In this environment, design tasks and clients are organised through working sessions generated and maintained by a collaborative server. The information from an individual design client during a design process is updated and broadcast to other clients in the same session through an event-driven, call-back mechanism. The downstream manufacturing analysis modules can be wrapped as agents and plugged into the open environment to support the design activities. At the server side, a feature-feature relationship is established and maintained to filter the varied information of a working part, so as to facilitate efficient information updates during the design process.

  11. 99Tc in the environment. Sources, distribution and methods

    International Nuclear Information System (INIS)

    Garcia-Leon, Manuel

    2005-01-01

    {sup 99}Tc is a β-emitter (E{sub max} = 294 keV) with a very long half-life (T{sub 1/2} = 2.11 x 10{sup 5} y). It is mainly produced in the fission of {sup 235}U and {sup 239}Pu, with a yield of about 6%. This yield, together with its long half-life, makes it a significant nuclide throughout the nuclear fuel cycle, from which it can be introduced into the environment at different rates depending on the cycle step. A gross estimation shows that, adding all possible sources, at least 2000 TBq had been released into the environment up to 2000, and that by the mid-nineties of the last century some 64000 TBq had been produced worldwide. Nuclear explosions have liberated some 160 TBq into the environment. In this work, the environmental distribution of {sup 99}Tc as well as the methods for its determination are discussed. Emphasis is put on the environmental relevance of {sup 99}Tc, mainly with regard to the future committed radiation dose received by the population and to the problem of nuclear waste management. Its determination at environmental levels is a challenging task; special mention is therefore made of the mass spectrometric methods for its measurement. (author)
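
The figures in the abstract allow a quick back-of-the-envelope check using only the standard decay relations A = λN and λ = ln 2 / T½ (constants are round textbook values):

```python
import math

AVOGADRO = 6.022e23     # atoms/mol
YEAR_S = 3.156e7        # seconds per year
HALF_LIFE_Y = 2.11e5    # 99Tc half-life, from the abstract
MOLAR_MASS = 99.0       # g/mol, approximately

# Decay constant: lambda = ln 2 / T_1/2
decay_const = math.log(2) / (HALF_LIFE_Y * YEAR_S)

# Specific activity A = lambda * N for one gram of pure 99Tc
atoms_per_gram = AVOGADRO / MOLAR_MASS
specific_activity = decay_const * atoms_per_gram      # Bq/g
print(f"specific activity = {specific_activity:.2e} Bq/g")

# Mass equivalent of the ~2000 TBq estimated to have been released by 2000
released_mass_kg = 2000e12 / specific_activity / 1000
print(f"2000 TBq corresponds to about {released_mass_kg:.0f} kg of 99Tc")
```

The specific activity works out to roughly 0.6 GBq/g, so the estimated 2000 TBq release corresponds to only a few tonnes of the nuclide, which is why mass spectrometric rather than decay-counting methods dominate at environmental levels.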

  12. Generalized Load Sharing for Homogeneous Networks of Distributed Environment

    Directory of Open Access Journals (Sweden)

    A. Satheesh

    2008-01-01

    We propose a method for job migration policies that considers effective use of global memory in addition to CPU load sharing in distributed systems. When a node is found to lack sufficient memory space to serve its jobs, one or more of its jobs are migrated to remote nodes with low memory allocations. If the memory space is sufficiently large, jobs are scheduled by a CPU-based load sharing policy. Following the principle of sharing both CPU and memory resources, we present several load sharing alternatives. Our objective is to reduce the number of page faults caused by unbalanced memory allocations for jobs among distributed nodes, so that the overall performance of a distributed system can be significantly improved. We have conducted trace-driven simulations to compare CPU-based load sharing policies with our policies. We show that our load sharing policies not only improve the performance of memory-bound jobs, but also maintain the same load sharing quality as the CPU-based policies for CPU-bound jobs. Regarding remote execution and preemptive migration strategies, our experiments indicate that strategy selection in load sharing depends on the memory demand of jobs: remote execution is more effective for memory-bound jobs, and preemptive migration is more effective for CPU-bound jobs. Our CPU-memory-based policy, using either the high-performance or the high-throughput approach together with the remote execution strategy, performs best for both CPU-bound and memory-bound jobs in homogeneous networks of distributed environments.
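
A toy sketch of the migration rule described above (the node and job representations are hypothetical, not the paper's simulator): jobs run under ordinary CPU-based load sharing while memory suffices, and are remotely executed on the memory-richest node under memory pressure:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    mem_total: int                 # MB
    cpu_load: float                # run-queue length proxy
    jobs: list = field(default_factory=list)

    @property
    def mem_free(self):
        return self.mem_total - sum(j["mem"] for j in self.jobs)

def place_job(job, local, remotes):
    """CPU-memory-based rule: migrate for memory, balance for CPU."""
    if job["mem"] <= local.mem_free:
        # Sufficient memory: fall back to CPU-based load sharing.
        target = min([local] + remotes, key=lambda n: n.cpu_load)
    else:
        # Memory pressure: remote execution on the node with the most
        # free memory, trading one migration for many avoided page faults.
        target = max(remotes, key=lambda n: n.mem_free)
    target.jobs.append(job)
    target.cpu_load += 1.0
    return target.name

a = Node("a", mem_total=1024, cpu_load=2.0)
b = Node("b", mem_total=4096, cpu_load=0.5)
c = Node("c", mem_total=2048, cpu_load=3.0)

print(place_job({"mem": 256}, a, [b, c]))   # CPU-bound job -> least-loaded node
print(place_job({"mem": 2048}, a, [b, c]))  # memory-bound job -> most free memory
```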

  13. Sources and distribution of anthropogenic radionuclides in different marine environments

    International Nuclear Information System (INIS)

    Holm, E.

    1997-01-01

    The knowledge of the distribution in time and space radiologically important radionuclides from different sources in different marine environments is important for assessment of dose commitment following controlled or accidental releases and for detecting eventual new sources. Present sources from nuclear explosion tests, releases from nuclear facilities and the Chernobyl accident provide a tool for such studies. The different sources can be distinguished by different isotopic and radionuclide composition. Results show that radiocaesium behaves rather conservatively in the south and north Atlantic while plutonium has a residence time of about 8 years. On the other hand enhanced concentrations of plutonium in surface waters in arctic regions where vertical mixing is small and iceformation plays an important role. Significantly increased concentrations of plutonium are also found below the oxic layer in anoxic basins due to geochemical concentration. (author)

  14. Taxing energy to improve the environment. Efficiency and distributional effects

    International Nuclear Information System (INIS)

    Heijdra, B.J.; Van der Horst, A.

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs

  15. Taxing energy to improve the environment. Efficiency and distributional effects

    Energy Technology Data Exchange (ETDEWEB)

    Heijdra, B.J.; Van der Horst, A. [Faculty of Economics and Econometrics, University of Amsterdam, Amsterdam (Netherlands)

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs.

  16. DESIGN COORDINATION IN DISTRIBUTED ENVIRONMENTS USING VIRTUAL REALITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rusdi HA

    2003-01-01

    This paper presents a research project which investigates the use of virtual reality and computer communication technology to facilitate building design coordination in distributed environments. The emphasis of the system, called VR-based DEsign COordination (VRDECO), is on providing a communication tool that remote designers can use to settle ideas before they fully engage in concurrent engineering environments. VRDECO provides the necessary design tools, a library of building elements, and communication procedures for designers in remote places to perform and coordinate their initial tasks. It has been implemented using available commercial software packages, and has been used in designing a simple house. VRDECO facilitates the creation of a preliminary design and simple communication with the client. There are, however, some difficulties in developing the full version of VRDECO: creating an adequate number of building elements, building a specification database with a sufficient number of choices, and establishing systematic rules to determine which parts of a building are updateable.

  17. Detecting Distributed SQL Injection Attacks in a Eucalyptus Cloud Environment

    Science.gov (United States)

    Kebert, Alan; Barnejee, Bikramjit; Solano, Juan; Solano, Wanda

    2013-01-01

    The cloud computing environment offers malicious users the ability to spawn multiple instances of cloud nodes that are similar to virtual machines, except that they can have separate external IP addresses. In this paper we demonstrate how this ability can be exploited by an attacker to distribute his/her attacks, in particular SQL injection attacks, in such a way that an intrusion detection system (IDS) could fail to identify them. To demonstrate this, we set up a small private cloud, established a vulnerable website in one instance, and placed an IDS within the cloud to monitor the network traffic. We found that an attacker could quite easily defeat the IDS by periodically altering his/her IP address. To detect such an attacker, we propose to use multi-agent plan recognition, where the multiple source IPs are considered as different agents mounting a collaborative attack. We show that such a formulation of this problem yields a more sophisticated approach to detecting SQL injection attacks within a cloud computing environment.
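
A simplified sketch of why pooling traffic across source IPs helps (the regex "plan steps" and the sliding window are illustrative inventions; the paper's actual method is multi-agent plan recognition): request fragments that look benign per IP match a known injection plan once combined:

```python
import re

# Fragments of a classic SQL injection probe, each innocuous in isolation.
PLAN_STEPS = [r"'", r"\bOR\b", r"=", r"--"]

def detect_distributed_sqli(events, window=60.0):
    """events: (timestamp, source_ip, query_param) tuples.

    Returns the set of IPs participating in a pooled match of all plan
    steps within any sliding time window, or an empty set if none.
    """
    events = sorted(events)
    for i, (t0, _, _) in enumerate(events):
        pool = [e for e in events[i:] if e[0] - t0 <= window]
        payload = " ".join(p for _, _, p in pool)
        if all(re.search(step, payload, re.IGNORECASE) for step in PLAN_STEPS):
            return {ip for _, ip, _ in pool}
    return set()

# Three cloud instances, each sending one harmless-looking fragment:
trace = [
    (0.0, "10.0.0.1", "name=admin'"),
    (5.0, "10.0.0.2", "id=1 OR 1=1"),
    (9.0, "10.0.0.3", "comment=hello --"),
]
print(detect_distributed_sqli(trace))  # all three IPs flagged together
```

A per-IP detector sees each fragment separately and raises no alarm; only the pooled view reveals the collaborative plan, which is the intuition behind treating the IPs as cooperating agents.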

  18. ESC Track Fusion Demonstration Tool for Distributed Environments

    Science.gov (United States)

    Cox, C.; Degraaf, E.; Perry, R.; Diaz, R.

    A key requirement of future net-centric Space Situational Awareness systems development and operations will be decentralized operations, including multi-level distributed data fusion. Raytheon has developed a demonstration for ESC 850 ELSG/NS that fuses sensor-supplied tracks in a dense resident space object (RSO) environment. The demonstration uses state vector and covariance input data from single-pass orbit solutions and applies track-to-track correlation algorithms to fuse the individual tracks into composite orbits. Platform-independent Java technology and an agent-based software design using asynchronous inter-process communications were used in developing the demonstration tool. The tool has been tested against a simulated scenario corresponding to the future 100,000+ object catalog environment. Ten days of simulated data from Fylingdales, Shemya, Eglin, and a future Space Fence sensor were generated for a co-orbiting family of 122 sun-synchronous objects between 700 and 800 km altitude, drawn from the NASA simulated small debris for 2015. The selected set exceeds the average object densities for the 100,000+ RSO environment, and provides a scenario similar to an evolved breakup where the debris has had time to disperse. The demonstration produced very good results using fast and simple astrodynamic models. A total of 16678 input tracks were fused, with less than 1.6% misassociated. Pure tracks were generated for 65% of the 122 truth objects, and 97% of the objects had a low misassociation rate. The result is a tool that can be used to assess current breakups such as the Chinese ASAT event.

  19. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWare™) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes, encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline networks facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  20. Prediction of oil droplet size distribution in agitated aquatic environments

    International Nuclear Information System (INIS)

    Khelifa, A.; Lee, K.; Hill, P.S.

    2004-01-01

    Oil spilled at sea undergoes many transformations driven by physical, biological and chemical processes. Vertical dispersion is the hydrodynamic mechanism controlled by turbulent mixing due to breaking waves, vertical velocity, density gradients and other environmental factors. Spilled oil is dispersed in the water column as small oil droplets. In order to estimate the mass of an oil slick in the water column, it is necessary to know how the droplets form. Moreover, the vertical dispersion and fate of oil spilled in aquatic environments can be modelled if the droplet-size distribution of the oil droplets is known, and an oil spill remediation strategy can then be implemented. This paper presented a newly developed Monte Carlo model to predict the droplet-size distribution due to Brownian motion, turbulence and differential settling at equilibrium. A kinematic model was integrated into the proposed model to simulate droplet breakage. The key physical input of the model is the maximum droplet size permissible in the simulation. Laboratory studies were found to be in good agreement with field studies. 26 refs., 1 tab., 5 figs
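
An illustrative Monte Carlo sketch of droplet breakage under a maximum-size constraint (the binary volume-conserving split rule and all parameters are assumptions for illustration, not the paper's kinematic model):

```python
import random

def breakup(diameters, d_max, rng):
    """Binary breakage: any droplet above d_max splits into two daughters
    that conserve volume (proportional to d^3), with a random volume
    fraction going to each daughter. Repeats until all droplets comply."""
    out = []
    queue = list(diameters)
    while queue:
        d = queue.pop()
        if d <= d_max:
            out.append(d)
            continue
        f = rng.uniform(0.2, 0.8)            # daughter volume fraction
        queue.append(d * f ** (1 / 3))       # volume-conserving split
        queue.append(d * (1 - f) ** (1 / 3))
    return out

rng = random.Random(42)
parents = [rng.uniform(50e-6, 500e-6) for _ in range(200)]  # 50-500 micron droplets
d_max = 100e-6                 # the model's key input: maximum permissible size

droplets = breakup(parents, d_max, rng)
total_volume = lambda ds: sum(d ** 3 for d in ds)
print(len(droplets), "droplets, all below d_max:", max(droplets) <= d_max)
print("oil volume conserved:", abs(total_volume(droplets) - total_volume(parents)) < 1e-15)
```

The resulting list of diameters is the simulated droplet-size distribution; histogramming it against the cap d_max is the equilibrium output such a model provides.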

  1. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which combined with automation, underpin the emerging concept of the "smart grid". This book is supported by theoretical concepts with real-world applications and MATLAB exercises.

  2. Distributed temperature and distributed acoustic sensing for remote and harsh environments

    Science.gov (United States)

    Mondanos, Michael; Parker, Tom; Milne, Craig H.; Yeo, Jackson; Coleman, Thomas; Farhadiroushan, Mahmoud

    2015-05-01

    Advances in opto-electronics and associated signal processing have enabled the development of distributed acoustic and temperature sensors. Unlike systems relying on discrete optical sensors, a distributed system does not rely upon manufactured sensors but utilises passive custom optical fibre cables resistant to harsh environments, including high-temperature applications (600°C). The principle of distributed sensing is well known from the distributed temperature sensor (DTS), which uses the interaction of the source light with thermal vibrations (Raman scattering) to determine the temperature at all points along the fibre. Distributed Acoustic Sensing (DAS) uses a novel digital optical detection technique to precisely capture the true full acoustic field (amplitude, frequency and phase) over a wide dynamic range at every point simultaneously. A number of signal processing techniques have been developed to process a large array of acoustic signals and quantify the coherent temporal and spatial characteristics of the acoustic waves. These systems were developed predominantly for the oil and gas industry, to assist reservoir engineers in optimising well lifetime. Nowadays they find a wide variety of applications as integrity monitoring tools in process vessels, storage tanks and piping systems, offering the operator tools to schedule maintenance programs and maximize service life.
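
The DTS principle mentioned above can be sketched with the textbook relation between temperature and the anti-Stokes/Stokes Raman intensity ratio (the pump wavelength and Raman shift below are illustrative round values for silica fibre, not taken from this abstract):

```python
import math

H = 6.626e-34       # Planck constant, J s
C = 3.0e8           # speed of light, m/s
K_B = 1.381e-23     # Boltzmann constant, J/K
PUMP = 1064e-9      # pump wavelength, m (illustrative)
SHIFT = 4.4e4       # ~440 cm^-1 Raman shift of silica, expressed in m^-1

LAM_S = 1 / (1 / PUMP - SHIFT)    # Stokes band (red-shifted)
LAM_AS = 1 / (1 / PUMP + SHIFT)   # anti-Stokes band (blue-shifted)

def antistokes_stokes_ratio(temp_k):
    """Textbook Raman DTS relation: the anti-Stokes/Stokes intensity ratio
    rises with temperature, which is what lets a DTS infer the temperature
    at every point along the fibre."""
    return (LAM_S / LAM_AS) ** 4 * math.exp(-H * C * SHIFT / (K_B * temp_k))

def temperature_from_ratio(ratio):
    """Invert the relation to recover absolute temperature in kelvin."""
    return H * C * SHIFT / (K_B * math.log((LAM_S / LAM_AS) ** 4 / ratio))

ratio = antistokes_stokes_ratio(300.0)
print(f"anti-Stokes/Stokes ratio at 300 K: {ratio:.3f}")
print(f"recovered temperature: {temperature_from_ratio(ratio):.1f} K")
```

In a real instrument the ratio is measured per backscatter time bin, so each bin maps to a temperature at a known distance along the fibre.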

  3. Depositional environments and porosity distribution in regressive limestone reservoirs of the Mishrif Formation, Southern Iraq

    International Nuclear Information System (INIS)

    AlDabbas, Moutaz; AlJassim Jassim; AlJumaily Saad

    2010-01-01

    Eight subsurface sections and a large number of thin sections of the Mishrif Limestone were studied to unravel the depositional facies and environments. The allochems in the Mishrif Formation are dominated by bioclasts, whereas peloids, ooids, and intraclasts are less abundant. The sedimentary microfacies of the Mishrif Formation include mudstone, wackestone, packstone, grainstone, floatstone, and rudstone, deposited in basinal, outer shelf, and slope settings, followed by shoal, reef, and lagoonal environments. The formation displays various extents of dolomitization and is cemented by calcite and dolomite. The formation has a gradational contact with the underlying Rumaila Formation but is unconformably overlain by the Khasib Formation. The unconformity is recognized because the skeletal grains are dominated by Charophyta (algae), which denotes a change of environment from fully marine to lacustrine. Thus, the vertical bioclast analysis indicates that the Mishrif Formation is characterized by two regressive cycles, which control the distribution of reservoir quality as well as the patterns of calcite and dolomite cement distribution. That the Mishrif Formation gradationally overlies the Rumaila Formation is indicated by the presence of the green parts of Charophyta (algae) as the main skeletal grains at the uppermost part of well Zb-47, which refer to a lacustrine or fresh-water environment. Petrographic study shows that fossils, peloids, ooids, and intraclasts represent the main allochems. Calcite and dolomite (as diagenetic products) are the predominant mineral components of the Mishrif Formation. Fossils were studied as indicators of environment, age, and facies boundaries; they were plotted in charts using personal computer programs according to their distributions and the first appearance of species. Fifteen principal sedimentary microfacies have been identified in the Mishrif Formation, including lime mudstone, mudstone-wackestone, wackestone

  4. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  5. Metagenomes Reveal Global Distribution of Bacterial Steroid Catabolism in Natural, Engineered, and Host Environments

    Directory of Open Access Journals (Sweden)

    Johannes Holert

    2018-01-01

    Steroids are abundant growth substrates for bacteria in natural, engineered, and host-associated environments. This study analyzed the distribution of the aerobic 9,10-seco steroid degradation pathway in 346 publicly available metagenomes from diverse environments. Our results show that steroid-degrading bacteria are globally distributed and prevalent in particular environments, such as wastewater treatment plants, soil, plant rhizospheres, and the marine environment, including marine sponges. Genomic signature-based sequence binning recovered 45 metagenome-assembled genomes containing a majority of 9,10-seco pathway genes. Only Actinobacteria and Proteobacteria were identified as steroid degraders, but we identified several alpha- and gammaproteobacterial lineages not previously known to degrade steroids. Actinobacterial and proteobacterial steroid degraders coexisted in wastewater, while soil and rhizosphere samples contained mostly actinobacterial ones. Actinobacterial steroid degraders were found in deep ocean samples, while mostly alpha- and gammaproteobacterial ones were found in other marine samples, including sponges. Isolation of steroid-degrading bacteria from sponges confirmed their presence. Phylogenetic analysis of key steroid degradation proteins suggested their biochemical novelty in genomes from sponges and other environments. This study shows that the ecological significance as well as the taxonomic and biochemical diversity of bacterial steroid degradation have so far been largely underestimated, especially in the marine environment.

  6. Distribution coefficients for radionuclides in aquatic environments. Volume 2. Dialysis experiments in marine environments

    International Nuclear Information System (INIS)

    Sibley, T.H.; Nevissi, A.E.; Schell, W.R.

    1981-05-01

The overall objective of this research program was to obtain new information that can be used to predict the fate of radionuclides that may enter the aquatic environment from nuclear power plants, waste storage facilities or fuel reprocessing plants. Important parameters for determining fate are the distribution of radionuclides between the soluble and particulate phases and the partitioning of radionuclides among various suspended particulates. This report presents the results of dialysis experiments that were used to study the distribution of radionuclides among suspended sediments, phytoplankton, organic detritus, and filtered sea water. Three experiments were conducted to investigate the adsorption kinetics and equilibrium distribution of (59)Fe, (60)Co, (65)Zn, (106)Ru, (137)Cs, (207)Bi, (238)Pu, and (241)Am in marine systems. Diffusion across the dialysis membranes depends upon the physico-chemical form of the radionuclides, proceeding quite rapidly for ionic species of (137)Cs and (60)Co but much more slowly for radionuclides which occur primarily as colloids and solid precipitates, such as (59)Fe, (207)Bi, and (241)Am. All the radionuclides adsorb to suspended particulates, although the amount of adsorption depends upon the specific types and concentration of particulates in the system and the selected radionuclide. The high affinity of some radionuclides - e.g., (106)Ru and (241)Am - for detritus and phytoplankton suggests that suspended organics may significantly affect the eventual fate of those radionuclides in marine ecosystems.
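The distribution coefficient underlying such experiments is the ratio of activity per unit mass of particulate to activity per unit volume of solution. A minimal sketch; the function name and the numbers are illustrative, not values from the report:

```python
def distribution_coefficient(activity_solid_bq, mass_solid_g,
                             activity_solution_bq, volume_solution_ml):
    """Kd (mL/g): activity per gram of particulate over activity per mL of solution."""
    return (activity_solid_bq / mass_solid_g) / (activity_solution_bq / volume_solution_ml)

# Hypothetical numbers, for illustration only:
kd = distribution_coefficient(500.0, 0.1, 25.0, 100.0)
print(kd)  # a large Kd indicates strong affinity for the particulate phase
```

A high Kd corresponds to the strong particulate affinity noted above for (106)Ru and (241)Am.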

  7. Knowledge Flow Rules of Modern Design under Distributed Resource Environment

    Directory of Open Access Journals (Sweden)

    Junning Li

    2013-01-01

Full Text Available The process of modern design under the distributed resource environment is interpreted as a process of knowledge flow and integration. As the acquisition of new knowledge strongly depends on resources, knowledge flow can be influenced by technical, economic, social relation, and other factors. In order to achieve greater efficiency of knowledge flow and make the product more competitive, the root causes behind these factors should be identified first. In this paper, the authors attempt to reveal the nature of design knowledge flow from the perspectives of fluid dynamics and energy. The knowledge field effect and the knowledge agglomeration effect are analyzed, in which a knowledge field effect model for a single task node and a single knowledge energy model for the knowledge flow are established, and the general expression of knowledge energy conservation, accounting for both the kinetic and potential energy of knowledge, is derived. The knowledge flow rules and their influencing factors, including complete and incomplete transfer of design knowledge, are then studied. Finally, the coupling knowledge flows in the knowledge service platform for modern design are analyzed to demonstrate the feasibility of the approach.

  8. Distributions and kinetics of inert radionuclides in environment

    International Nuclear Information System (INIS)

    Fukui, Masami; Yamasaki, Keizo; Okamoto, Ken-ichi; Yoshimoto, Taka-aki

    1988-01-01

Recently there has been increased concern over radiation exposures not only in the nuclear energy field but also in the natural radiation environment. This paper therefore presents a study on the distributions and kinetics of radioactive gases such as 41 Ar and radon progeny in the Kyoto University Reactor Research Institute (KUR-RI), and of radon in the ground, which is one of the main outdoor radiation sources and is continually generated. The amount of 41 Ar released from the KUR has decreased to about one-tenth over the past seven years, and further efforts have focused on minimizing the released activity according to the concept of ALARA. The simultaneity of the diurnal variations in radioactive dust observed in several buildings at the KUR-RI indicates that the concentration changes are not caused by human activity. Peak concentrations in the vertical profile of radon are observed at about 40 cm below the ground surface, and a rising water table following precipitation causes a burst of radon concentration through upflow in the ground. Exposure levels of workers and the public resulting from KUR operation, and from natural radiation induced by radon progeny, will be discussed at this meeting. (author)

  9. Guest Editor's introduction: Special issue on distributed virtual environments

    Science.gov (United States)

    Lea, Rodger

    1998-09-01

Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene 'see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including: Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio. Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration. Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications. e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers.
The technology

  10. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter, we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging...... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA-clones spotted onto a hybridisation filter. The line process has proven to perform a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which......
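The reduced-update machinery aside, the core predict/update cycle of a Kalman filter can be shown in one dimension. A toy sketch, not the thesis's two-dimensional formulation; the noise parameters are arbitrary:

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter: one predict/update cycle per measurement.

    q: process noise variance, r: measurement noise variance (arbitrary here).
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: variance grows under the random walk
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update: move estimate toward the measurement
        p = (1 - k) * p          # posterior variance shrinks after the update
        estimates.append(x)
    return estimates
```

Fed a constant noisy signal, the estimate converges to the signal while the gain settles to a small steady-state value.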

  11. Time evolution of distribution functions in dissipative environments

    International Nuclear Information System (INIS)

    Hu Li-Yun; Chen Fei; Wang Zi-Sheng; Fan Hong-Yi

    2011-01-01

By introducing the thermal entangled state representation, we investigate the time evolution of distribution functions in dissipative channels by bridging the relation between the initial distribution function and the distribution function at any later time. We find that most of them can be expressed as integrations over a Laguerre-Gaussian function. Furthermore, as applications, we derive the time evolution of the photon-counting distribution, by relating the initial distribution function to the photon-counting distribution at any time, and the time evolution of the R-function characteristic of the nonclassicality depth. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  12. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

Determination of distribution facility locations is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk, which raises new problems, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta, included among the top 5 fast-food restaurant chains by retail sales. The study proceeded in three stages: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center, and the network analysis results show a more efficient process, i.e. a shorter distance in the distribution process.
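Cluster analysis for distribution-center siting is commonly a k-means over outlet coordinates, with each centroid a candidate site. A self-contained sketch; the study's actual clustering settings are not given in the abstract, and initialising with the first k points is a simplification chosen for determinism:

```python
import math

def kmeans(points, k, iters=50):
    """Plain k-means; the final centroids serve as candidate center sites."""
    centroids = [tuple(map(float, p)) for p in points[:k]]  # deterministic init
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids

# Two well-separated groups of outlets yield two candidate sites:
sites = kmeans([(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)], k=2)
```

In practice the number of clusters k would be chosen by comparing solutions, e.g. with an elbow or silhouette criterion.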

  13. Ambient environment analysis by means of perception

    NARCIS (Netherlands)

    Bitterman, M.S.; Ciftcioglu, O.; Bhatt, M.; Schultz, C.

    2013-01-01

Analysis of an ambient environment by means of perception is described. The surveillance of an object by a human, who watches the scene via a monitor showing camera-sensed information, is investigated. Although the camera sensing process is a deterministic process, human perception of a scene via

  14. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  15. Radioactivity Distribution in Malaysian Marine Environment. Chapter 4

    International Nuclear Information System (INIS)

    Zal U'yun Wan Mahmood; Abdul Kadir Ishak; Norfaizal Mohamad; Wo, Y.M.; Kamarudin Samuding

    2015-01-01

The marine radioactivity distribution mapping in Malaysia was developed to illustrate the distribution patterns of both anthropogenic and natural radionuclides in seawater and sediments. The data collected will form the basis for assessing anthropogenic radioactivity.

  16. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study

  17. Data analysis in an object request broker environment

    International Nuclear Information System (INIS)

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  18. Distributed optimization-based control of multi-agent networks in complex environments

    CERN Document Server

    Zhu, Minghui

    2015-01-01

    This book offers a concise and in-depth exposition of specific algorithmic solutions for distributed optimization based control of multi-agent networks and their performance analysis. It synthesizes and analyzes distributed strategies for three collaborative tasks: distributed cooperative optimization, mobile sensor deployment and multi-vehicle formation control. The book integrates miscellaneous ideas and tools from dynamic systems, control theory, graph theory, optimization, game theory and Markov chains to address the particular challenges introduced by such complexities in the environment as topological dynamics, environmental uncertainties, and potential cyber-attack by human adversaries. The book is written for first- or second-year graduate students in a variety of engineering disciplines, including control, robotics, decision-making, optimization and algorithms and with backgrounds in aerospace engineering, computer science, electrical engineering, mechanical engineering and operations research. Resea...

  19. Getting reforms done in inhospitable institutional environments: untying a Gordian Knot in India's power distribution sector

    International Nuclear Information System (INIS)

    Tankha, Sunil; Misal, Annasahed B.; Fuller, Boyd W.

    2010-01-01

    When grand institutional reforms based on idealized models are stalled by the poor institutional environments and difficult politics which often surround large infrastructure systems in developing countries, partial reforms whose design and implementation take into account the different interests of the key stakeholders can provide valuable and immediate benefits while moving these systems from low- towards higher-level equilibria. Strategically negotiated, experimentally partial and purposefully hybrid, these reforms are based on careful stakeholder analysis and strategic coalition building that avoid rigid positions based on idealized models. Our findings are based on a study of power sector reforms in India, where we performed a micro-level and in-depth analysis of a partial and innovative experiment which has allowed private sector participation in electricity distribution within a hostile institutional environment.

  20. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

Full Text Available Separating two probability distributions from a mixture model that is made up of the combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
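The linear combination assumption makes the separation itself one line of algebra: if the observed mixture is M = λR + (1-λ)I with seed irrelevance distribution I, then R = (M - (1-λ)I)/λ. A sketch over term distributions, assuming the mixing weight λ is already known (DSM's contribution includes estimating it, which is omitted here):

```python
def separate(mixture, irrelevance, lam):
    """Recover R from M = lam*R + (1 - lam)*I, given the seed distribution I."""
    r = {t: (mixture[t] - (1 - lam) * irrelevance.get(t, 0.0)) / lam
         for t in mixture}
    r = {t: max(v, 0.0) for t, v in r.items()}   # clip noise-induced negatives
    z = sum(r.values())
    return {t: v / z for t, v in r.items()}       # renormalise to a distribution
```

With an exact mixture the original relevance distribution is recovered up to floating-point error; with an estimated seed, the clipping and renormalisation absorb small inconsistencies.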

  1. Maintaining Traceability in an Evolving Distributed Computing Environment

    Science.gov (United States)

    Collier, I.; Wartel, R.

    2015-12-01

The management of risk is fundamental to the operation of any distributed computing infrastructure. Identifying the cause of incidents is essential to prevent them from re-occurring. In addition, it is a goal to contain the impact of an incident while keeping services operational, and the response to an incident needs to be commensurate with the scale of the problem. The minimum level of traceability for distributed computing infrastructure usage is to be able to identify the source of all actions (executables, file transfers, pilot jobs, portal jobs, etc.) and the individual who initiated them. In addition, sufficiently fine-grained controls, such as blocking the originating user and monitoring to detect abnormal behaviour, are necessary for keeping services operational. It is essential to be able to understand the cause and to fix any problems before re-enabling access for the user. The aim is to be able to answer the basic questions who, what, where, and when concerning any incident. This requires retaining all relevant information, including timestamps and the digital identity of the user, sufficient to identify, for each service instance, every security event including at least the following: connect, authenticate, authorize (including identity changes) and disconnect. In traditional grid infrastructures (WLCG, EGI, OSG, etc.) best practices and procedures for gathering and maintaining the information required to maintain traceability are well established. In particular, sites collect and store the information required to ensure traceability of events at their sites. With the increased use of virtualisation and private and public clouds for HEP workloads, established procedures, which cannot see 'inside' running virtual machines, no longer capture all the information required. Maintaining traceability will at least involve a shift of responsibility from sites to Virtual Organisations (VOs), bringing with it new requirements for their
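The who/what/where/when requirement amounts to emitting one structured record per security event. A minimal sketch of such a record; the field and event names are illustrative, not taken from WLCG/EGI/OSG policy documents:

```python
import json
import time

EVENTS = {"connect", "authenticate", "authorize", "disconnect"}
REQUIRED = ("timestamp", "service", "event", "identity", "source_ip")

def trace_record(service, event, identity, source_ip, **extra):
    """One traceability record: who (identity), what (event),
    where (service, source_ip), when (timestamp)."""
    if event not in EVENTS:
        raise ValueError("unknown security event: " + event)
    rec = {"timestamp": time.time(), "service": service, "event": event,
           "identity": identity, "source_ip": source_ip, **extra}
    return json.dumps(rec, sort_keys=True)  # append-friendly line for a log
```

Appending such lines to tamper-evident storage is what lets an investigator reconstruct an incident after the fact.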

  2. Distributed Scaffolding: Synergy in Technology-Enhanced Learning Environments

    Science.gov (United States)

    Ustunel, Hale H.; Tokel, Saniye Tugba

    2018-01-01

When technology is employed, challenges in learning environments increase. Kim et al. ("Sci Educ" 91(6):1010-1030, 2007) presented a pedagogical framework that provides a valid technology-enhanced learning environment. The purpose of the present design-based study was to investigate the micro context dimension of this framework and to…

  3. DEVELOPMENT OF A HETEROGENIC DISTRIBUTED ENVIRONMENT FOR SPATIAL DATA PROCESSING USING CLOUD TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    A. S. Garov

    2016-06-01

Full Text Available We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop and mobile platforms and combines the volunteer computing model with public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize research and the representation of results on a new technological level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatially distributed information, for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  4. Distribution and characteristics of gamma and cosmic ray dose rate in living environment

    International Nuclear Information System (INIS)

    Nagaoka, Toshi; Moriuchi, Shigeru

    1991-01-01

A series of environmental radiation surveys was carried out to characterize the natural radiation dose rate distribution in living environments, both natural and artificial. Through analysis of the data obtained at a number of places, several aspects of the radiation field in living environments were clarified. That is, the gamma ray dose rate varies due to three dominant causes: 1) the radionuclide concentration of the surrounding materials acting as gamma ray sources, 2) the spatial distribution of the surrounding materials, and 3) the geometrical and shielding conditions between the natural gamma ray sources and the measurement point; the cosmic ray dose rate, by contrast, varies with the thickness of the shielding materials overhead. It was also suggested that in artificial environments the gamma ray dose rate generally tends upward and the cosmic ray dose rate downward. This kind of knowledge is expected to serve as fundamental information for accurate and realistic evaluation of the collective dose in the living environment. (author)

  5. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    Science.gov (United States)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop and mobile platforms and combines the volunteer computing model with public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize research and the representation of results on a new technological level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatially distributed information, for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  6. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
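Determining which burst buffer stores a given key is typically a deterministic hash-based mapping, so any node can route a metadata request without a central directory. A sketch; the names are hypothetical and the patent does not specify the placement function:

```python
import hashlib

def owner_buffer(key, buffers):
    """Deterministically map a metadata key to the burst buffer holding it."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return buffers[int(digest, 16) % len(buffers)]

# Any client computes the same owner for the same key, with no lookup table:
owner = owner_buffer("block-42", ["bb0", "bb1", "bb2"])
```

A production key-value store would use consistent hashing instead of a plain modulus, so that adding a buffer relocates only a fraction of the keys.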

  7. Distribution of cobalt in soil from Kavadarci and the environs

    International Nuclear Information System (INIS)

    Stafilov, Trajche; Shajn, Robert; Boev, Blazho; Cvetkovich, Julujana; Mukaetov, Dushko; Andreevski, Marjan; Lepitkova, Sonja

    2009-01-01

The results of a study of the spatial distribution of cobalt in surface soil and subsoil over the Kavadarci region, Republic of Macedonia, are reported. From the investigated region (360 km²), a total of 344 soil samples were collected from 172 locations. At each sampling point soil samples were collected at two depths, topsoil (0-5 cm) and subsoil (20-30 cm). Inductively coupled plasma mass spectrometry (ICP-MS) was applied for the determination of cobalt. Data analysis and construction of the map were performed using the Paradox (ver. 9), Statistica (ver. 6.1), AutoDesk Map (ver. 2008) and Surfer (ver. 8.09) software. It was found that for both topsoil and subsoil the median and average values are 15 mg/kg, with contents ranging between 6.7 and 58 mg/kg. The highest content of cobalt is present in the soil from the area of Paleozoic and Mesozoic rocks (Pz-Mz) in the western part of the investigated area and from the Flysch (E) Eocene upper flysch zone in the northern part, and the lowest in the soils from the Holocene alluvium of the rivers Crna Reka and Vardar. There are no significant differences between topsoil and subsoil in terms of average quantities. The critically high contents are related primarily to the sampling points in the western part of the investigated region. The contents of cobalt are higher in subsoil than in topsoil, from which it can be concluded that the occurrence is natural.

  8. DISTRIBUTED SYSTEM FOR HUMAN MACHINE INTERACTION IN VIRTUAL ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Abraham Obed Chan-Canche

    2017-07-01

Full Text Available Communication networks built from multiple devices and sensors are becoming more common. These device networks enable the development of human-machine interaction that aims to improve human performance by generating an adaptive environment in response to the information the network provides. The problem addressed in this work is the rapid integration of a device network that allows the development of a flexible immersive environment for different uses.

  9. 'Y' a distributed resource sharing system in nuclear research environment

    International Nuclear Information System (INIS)

    Popescu-Zeletin, R.

    1986-01-01

The paper outlines the rationale for the transition from HMINET-2 to a distributed resource sharing system at the Hahn-Meitner-Institute for Nuclear Research, and presents the architecture and rationale of the planned new distributed resource system (Y). The introduction of a distributed operating system is a prerequisite for a resource-sharing system. Y will provide not only the integration of networks of different qualities (a high-speed backbone, LANs of different technologies, and ports to the national X.25 network and a satellite link) at the hardware level, but also an integrated global user view of the whole system. This will be designed and implemented by decoupling the user view from the hardware topology through a network-wide distributed operating system. (Auth.)

  10. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  11. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    Full Text Available capabilities has focused on the web services approach as exemplified by the OGC's Web Processing Service and by GRID computing. The approach to leveraging distributed computing resources described in this paper uses instead remote objects via RPy...

  12. Developing an Environment for Exploring Distributed Operations: A Wargaming Example

    National Research Council Canada - National Science Library

    Holden, William T., Jr; Smith, Matthew L; Conzelman, Clair E; Smith, Paul G; Lickteig, Carl W; Sanders, William R

    2005-01-01

    Requirements for Future Force operations indicate that planning and wargaming must transition from a collocated, sequential, and staff-centered process to one that is distributed, simultaneous, and commander-centered...

  13. Distributed Diagnosis in Uncertain Environments Using Dynamic Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a distributed Bayesian fault diagnosis scheme for physical systems. Our diagnoser design is based on a procedure for factoring the global system...

  14. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
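    The propagation logic described above - forward traversal for effects, reverse traversal for causes - can be sketched generically; the digraph below is invented for illustration, and FEAT's actual node-description format is not reproduced here:

    ```python
    # Sketch of digraph failure propagation in the spirit of FEAT.
    # Node names and edges are illustrative only, not from FEAT itself.
    from collections import deque

    # Edges point from cause to effect.
    EDGES = {
        "pump_fail": ["no_coolant_flow"],
        "valve_stuck": ["no_coolant_flow"],
        "no_coolant_flow": ["core_overheat"],
        "sensor_fail": ["no_alarm"],
    }

    def _closure(graph, sources):
        """Breadth-first closure of `sources` under the graph's edges."""
        seen, queue = set(sources), deque(sources)
        while queue:
            for nxt in graph.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    def effects(failures):
        """All events that follow if the given failure set occurs."""
        return _closure(EDGES, failures)

    def causes(failure):
        """All events that could have led to the given failure."""
        reverse = {}
        for src, dsts in EDGES.items():
            for dst in dsts:
                reverse.setdefault(dst, []).append(src)
        return _closure(reverse, [failure])

    print(sorted(effects(["pump_fail"])))
    print(sorted(causes("core_overheat")))
    ```

    Forward closure answers "what breaks next?"; the same traversal over the reversed edges answers "what could have caused this?", mirroring FEAT's two query modes.
    
    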

  15. Analyzing coastal environments by means of functional data analysis

    Science.gov (United States)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, to be a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
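    The second vector approach (clr transform followed by PCA) can be sketched with NumPy; the grain-size bins and mass fractions below are invented for illustration, not from the study:

    ```python
    # clr transform of particle-size distributions followed by PCA,
    # a sketch of the vector approach the study compares against FDA.
    # Sample data are invented: rows 0-1 are fine-skewed, rows 2-3 coarse-rich.
    import numpy as np

    # Each row: one sample's mass fractions over 4 grain-size bins (sums to 1).
    psd = np.array([
        [0.10, 0.30, 0.40, 0.20],
        [0.05, 0.25, 0.45, 0.25],
        [0.40, 0.35, 0.15, 0.10],
        [0.45, 0.30, 0.15, 0.10],
    ])

    def clr(x):
        """Centered log-ratio: log of each part minus the log geometric mean."""
        logx = np.log(x)
        return logx - logx.mean(axis=1, keepdims=True)

    def pca_scores(x, n_components=2):
        """Principal-component scores via SVD of the column-centered data."""
        xc = x - x.mean(axis=0)
        u, s, _ = np.linalg.svd(xc, full_matrices=False)
        return u[:, :n_components] * s[:n_components]

    scores = pca_scores(clr(psd))
    print(scores.shape)  # (4, 2): the scores CA would then cluster
    ```

    The clr step frees the compositional data from the unit-sum constraint before PCA; in the paper, cluster analysis is then run on the retained scores.
    
    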

  16. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane Π to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in Π according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  17. Spatial distribution of tritium in the Rawatbhata Rajasthan site environment

    International Nuclear Information System (INIS)

    Gill, Rajpal; Tiwari, S.N.; Gocher, A.K.; Ravi, P.M.; Tripathi, R.M.

    2014-01-01

    Tritium is one of the most environmentally mobile radionuclides and hence has a high potential for migration into the different compartments of the environment. Tritium from nuclear facilities at the RAPS site is released into the environment through 93 m and 100 m high stacks, mainly as tritiated water (HTO). The released tritium undergoes dilution and dispersion and then follows the ecological pathway of the water molecule. The Environmental Survey Laboratory of the Health Physics Division, Bhabha Atomic Research Centre (BARC), located at the Rajasthan Atomic Power Station (RAPS) site, continuously monitors the concentration of tritium in the environment to ensure public safety. Atmospheric tritium activity during the period 2009-2013 was measured regularly around RAPS. The data collected showed a large variation of H-3 concentration in air, fluctuating in the range of 0.43-5.80 Bq·m⁻³ at the site boundary of 1.6 km. This paper presents the results of analyses of tritium in the atmospheric environment covering an area up to a 20 km radius around the RAPS site. A large number of air moisture samples were collected around the RAPS site for estimating tritium in the atmospheric environment, to ascertain atmospheric dispersion and to compute the radiation dose to the public

  18. Distribution of bat-borne viruses and environment patterns.

    Science.gov (United States)

    Afelt, Aneta; Lacroix, Audrey; Zawadzka-Pawlewska, Urszula; Pokojski, Wojciech; Buchy, Philippe; Frutos, Roger

    2018-03-01

    Environmental modifications are leading to biodiversity change, loss and habitat disturbance. This in turn increases contact with wildlife and hence the risk of transmission and emergence of zoonotic diseases. We analyzed the environment and land use, using remote spatial data, around the sampling locations of bats positive for coronavirus (21 sites) and astrovirus (11 sites) collected in 43 sites. A clear association between viruses and hosts was observed. Viruses associated with synanthropic bat genera, such as Myotis or Scotophilus, were associated with highly transformed habitats with human presence, while viruses associated with fruit bat genera were correlated with natural environments with dense forest, grassland areas and regions of high elevation. In particular, group C betacoronaviruses were associated with mosaic habitats found in anthropized environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Distributed energy resources and benefits to the environment

    DEFF Research Database (Denmark)

    Akorede, Mudathir Funsho; Hizam, Hashim; Pouresmaeil, Edris

    2010-01-01

    generation in 2030 would be produced from fossil fuels. This global dependence on fossil fuels is dangerous to our environment in terms of their emissions unless specific policies and measures are put in place. Nevertheless, recent research reveals that a reduction in the emissions of these gases is possible...... on fossil fuels to our environment. The study finally justifies how DG technologies could substantially reduce greenhouse gas emissions when fully adopted; hence, reducing the public concerns over human health risks caused by the conventional method of electricity generation....

  20. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.

  1. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the analysis shows that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  2. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals are used only for parking and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the amount of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The study begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is then determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result ranks the sites, from best to worst: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.
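    The Overlay Analysis step - combining normalized factor grids into one weighted suitability surface - can be sketched with NumPy; the grids and weights below are invented, whereas the paper derives its factor weights from the Likert questionnaire:

    ```python
    # Weighted overlay of factor grids for site suitability, a sketch of
    # the GIS Overlay Analysis step. All grids and weights are invented.
    import numpy as np

    # One normalized suitability grid per factor (0 = worst, 1 = best).
    factors = {
        "accessibility": np.array([[0.9, 0.5, 0.2], [0.7, 0.6, 0.3], [0.4, 0.4, 0.1]]),
        "cost":          np.array([[0.3, 0.8, 0.6], [0.5, 0.9, 0.4], [0.2, 0.7, 0.5]]),
        "time":          np.array([[0.6, 0.4, 0.7], [0.8, 0.5, 0.6], [0.3, 0.2, 0.4]]),
        "environment":   np.array([[0.5, 0.6, 0.9], [0.4, 0.7, 0.8], [0.6, 0.5, 0.7]]),
    }
    # Invented importance weights (sum to 1); the study's come from its survey.
    weights = {"accessibility": 0.4, "cost": 0.3, "time": 0.2, "environment": 0.1}

    # Overlay: cell-wise weighted sum of the factor grids.
    suitability = sum(weights[name] * grid for name, grid in factors.items())

    # The best candidate cell is the one with the highest combined score.
    best = np.unravel_index(np.argmax(suitability), suitability.shape)
    print(best, suitability[best])
    ```

    Ranking candidate sites then amounts to reading the combined score at each site's cell, which is how the four terminals can be ordered.
    
    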

  3. Preview of the Mission Assurance Analysis Protocol (MAAP): Assessing Risk and Opportunity in Complex Environments

    National Research Council Canada - National Science Library

    Alberts, Christopher; Dorofee, Audrey; Marino, Lisa

    2008-01-01

    .... A MAAP assessment provides a systematic, in-depth analysis of the potential for success in distributed, complex, and uncertain environments and can be applied across the life cycle and throughout the supply chain...

  4. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption, and communication overhead. Our results show that the distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  5. Taxing energy to improve the environment : Efficiency and distributional effects

    NARCIS (Netherlands)

    Heijdra, BJ; van der Horst, A

    We study the effects of environmental tax policy in a dynamic overlapping generations model of a small open economy with environmental quality incorporated as a durable consumption good. Raising the energy tax may yield an efficiency gain if agents care enough about the environment. The benefits are

  6. Security in Distributed Collaborative Environments: Limitations and Solutions

    Science.gov (United States)

    Saadi, Rachid; Pierson, Jean-Marc; Brunie, Lionel

    The main goal of establishing collaboration between heterogeneous environments is to create a pervasive context that provides nomadic users with ubiquitous access to digital information and surrounding resources. However, the constraints of mobility and heterogeneity raise a number of crucial issues related to security, especially authentication, access control and privacy. First of all, in this chapter we explore the trust paradigm, especially its transitive capability, to enable trusted peer-to-peer collaboration. In this manner, when each organization sets its own security policy to recognize (authenticate) users who are members of a trusted community and to provide them local access (access control), the trust transitivity between peers allows users to gain broader, controlled access inside the pervasive environment. Next, we study the problem of users' privacy. In pervasive and ubiquitous environments, nomadic users gather and exchange certificates or credentials that grant them rights to access, by transitivity, otherwise unknown yet trusted environments. These signed documents embed an increasing number of attributes that need to be filtered according to the contextual situation. In this chapter, we propose a new morph signature enabling each certificate owner to preserve his privacy by disclosing or blinding sensitive attributes according to the situation at hand.

  7. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  8. Seasonal distribution of organic matter in mangrove environment of Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Jagtap, T.G.

    Water and sediments were studied for the distribution of suspended matter, organic carbon and nitrogen. Suspended matter ranged from 3-373 mg·l⁻¹, while particulate organic carbon (POC) ranged from 0.03-9.94 mg·l⁻¹. POC values showed significant correlation...

  9. Bioprospecting and Functional Analysis of Neglected Environments

    DEFF Research Database (Denmark)

    Vogt, Josef Korbinian

    on Biological Diversity (explained in Chapter 3). Proteolytic enzymes – described in Chapter 4 – are the target for bioprospecting due to their high market value. Section II describes methods used for the analysis of metagenomic and RNA-seq datasets, including Manuscript I, which includes the taxonomic...... of arctic marine metagenomes reveals bacterial strategies for deep sea persistence (Manuscript II). Furthermore, this extreme environment is a fertile ground to mine for novel proteolytic enzymes. Manuscript III presents a bioinformatics approach to identify sequences for potential commercialization...

  10. Social network analysis of study environment

    Directory of Open Access Journals (Sweden)

    Blaženka Divjak

    2010-06-01

    Full Text Available Student working environment influences student learning and achievement level. In this respect, the social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out whether students' academic performance influences their position in different student social networks, and to identify other predictors of this position. In the process of problem solving we use Social Network Analysis (SNA) based on data we collected from students at the Faculty of Organization and Informatics, University of Zagreb. There are two data samples: the basic sample (N=27) and the extended sample (N=52). We collected data on socio-demographic position, academic performance, learning and motivation styles, student status (full-time/part-time), attitudes towards individual and teamwork, as well as informal cooperation. Afterwards, five different networks (exchange of learning materials, teamwork, informal communication, and the basic and aggregated social networks) were constructed. These networks were analyzed with different metrics, the most important being betweenness, closeness and degree centrality. The main results are, firstly, that a student's position in a social network cannot be forecast from academic success alone and, secondly, that part-time students tend to form separate groups that are poorly connected with full-time students. In general, the position of a student in the social networks of the study environment can influence student learning as well as her/his future employability, and is therefore worth investigating.
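    The centrality metrics named above have simple definitions; a minimal sketch over an invented undirected student network, computing degree and closeness centrality from scratch:

    ```python
    # Degree and closeness centrality on a small undirected network,
    # the kinds of SNA metrics used in the study. The network is invented.
    from collections import deque

    edges = [("ana", "ben"), ("ana", "eva"), ("ben", "eva"),
             ("eva", "ivo"), ("ivo", "mia")]
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    n = len(adj)

    def degree_centrality(node):
        """Fraction of the other nodes directly connected to `node`."""
        return len(adj[node]) / (n - 1)

    def closeness_centrality(node):
        """(n-1) divided by the sum of shortest-path distances to all others."""
        dist = {node: 0}
        queue = deque([node])
        while queue:  # breadth-first search gives shortest hop distances
            cur = queue.popleft()
            for nxt in adj[cur]:
                if nxt not in dist:
                    dist[nxt] = dist[cur] + 1
                    queue.append(nxt)
        return (n - 1) / sum(d for v, d in dist.items() if v != node)

    print(degree_centrality("eva"), closeness_centrality("eva"))
    ```

    Betweenness centrality, the third metric in the study, additionally counts how many shortest paths pass through a node and is usually computed with a library such as NetworkX rather than by hand.
    
    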

  11. Distributed Monitoring and Resource Management for Large Cloud Environments

    OpenAIRE

    Wuhib, Fetahi Zebenigus

    2010-01-01

    Over the last decade, the number, size and complexity of large-scale networked systems has been growing fast, and this trend is expected to accelerate. The best known example of a large-scale networked system is probably the Internet, while large datacenters for cloud services are the most recent ones. In such environments, a key challenge is to develop scalable and adaptive technologies for management functions. This thesis addresses the challenge by engineering several protocols  for distri...

  12. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  13. An Embedded Software Platform for Distributed Automotive Environment Management

    Directory of Open Access Journals (Sweden)

    Seepold Ralf

    2009-01-01

    Full Text Available This paper discusses an innovative extension of current vehicle platforms that integrates intelligent environments in order to carry out e-safety tasks, improving driving security. These platforms are dedicated to automotive environments characterized by sensor networks deployed along the vehicles. Since this kind of platform infrastructure is hardly extensible and forms a non-scalable process unit, an embedded OSGi-based UPnP platform extension is proposed in this article. Such an extension deploys a compatible and scalable uniform environment that makes it possible to manage the heterogeneity of vehicle components and to provide plug-and-play support, being compatible with all kinds of devices and sensors located in a car network. Furthermore, the extension allows any kind of external device, wherever it is located, to auto-register, providing the in-vehicle system with additional services and data. The extension also supports service provisioning and connections to external and remote network services using SIP technology.

  14. An ontological framework for requirement change management in distributed environment

    International Nuclear Information System (INIS)

    Khatoon, A.; Hafeez, Y.; Ali, T.

    2014-01-01

    Global Software Development (GSD) is gradually gaining acceptance in the software industry. However, in GSD, multiple and diverse stakeholders are involved in the development of complex software systems, and GSD introduces several challenges: physical distance, time zones, cultural differences and language barriers. Requirements play a significant role in any software development effort, and the greatest challenge in a GSD environment is to maintain a consistent view of the system even when the requirements change; at the same time, a single change in the requirements might affect several other modules. In GSD, different people use different terms and have different ways of expressing concepts, so people at remote sites are unable to achieve uniformity regarding the semantics of the terms. A global environment therefore requires effective communication and coordination. To overcome inconsistencies and ambiguities among team members and to make them aware of a consistent view, a shared and common understanding is required. In this paper an approach beneficial to the software industry is proposed, focusing on changing requirements in a Global Software Development environment. A case study has been used for the evaluation of the proposed approach, and the requirements change management process was improved by applying it. The proposed approach is beneficial to software development organizations where frequent changes occur, and it guides the software industry in providing a common understanding to all development teams residing in remote locations. (author)

  15. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    Science.gov (United States)

    2014-07-01

    AFRL-RI-RS-TR-2014-195. "Cloud" technologies are not appropriate for situation understanding in areas of denial, where computation resources are limited, data not easily...graph matching process. D-SPACE distributes graph exploitation among a network of autonomous computational resources, designs the collaboration policy

  16. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutic treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC-based codes, we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) curves and in transverse sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
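    The PDD curve used for the comparison is simply the depth-dose profile normalized to its maximum; a minimal sketch with invented dose values, including a check against the ±2% agreement quoted above:

    ```python
    # Percentage depth dose (PDD) from a depth-dose array, the quantity
    # compared between measurement and Monte Carlo. All values are invented.
    import numpy as np

    depth_mm = np.array([5, 15, 50, 100, 130])
    dose = np.array([0.92, 1.00, 0.86, 0.65, 0.54])  # arbitrary units

    # Normalize to the dose maximum to obtain PDD in percent.
    pdd = 100.0 * dose / dose.max()

    def within_tolerance(measured, calculated, tol_percent=2.0):
        """True where |measured - calculated| stays within the tolerance."""
        return np.abs(measured - calculated) <= tol_percent

    # An invented "calculated" curve differing by under 2% everywhere.
    calculated = pdd + np.array([0.5, -0.3, 1.1, -0.8, 0.2])
    print(within_tolerance(pdd, calculated).all())
    ```

    In practice the same point-by-point comparison is made along the beam axis and across transverse dose profiles.
    
    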

  17. RAID Unbound: Storage Fault Tolerance in a Distributed Environment

    Science.gov (United States)

    Ritchie, Brian

    1996-01-01

    Mirroring, data replication, backup, and more recently, redundant arrays of independent disks (RAID) are all technologies used to protect and ensure access to critical company data. A new set of problems has arisen as data become more and more geographically distributed. Each of the technologies listed above provides important benefits, but each has failed to adapt fully to the realities of distributed computing. The key to data high availability and protection is to take the technologies' strengths and 'virtualize' them across a distributed network. RAID and mirroring offer high data availability, while data replication and backup provide strong data protection. If we take these concepts at a very granular level (defining user, record, block, file, or directory types) and then liberate them from the physical subsystems with which they have traditionally been associated, we have the opportunity to create highly scalable, network-wide storage fault tolerance. The network becomes the virtual storage space in which the traditional concepts of data high availability and protection are implemented without their corresponding physical constraints.

  18. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn Hyman

    2013-03-01

    Full Text Available Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype by environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remains difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis and matching and regional targeting methods have advanced in parallel to data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  19. Distributed Energy Systems in the Built Environment - European Legal Framework on Distributed Energy Systems in the Built Environment

    NARCIS (Netherlands)

    Pront-van Bommel, S.; Bregman, A.

    2013-01-01

    This paper aims to outline the stimuli provided by European law for promoting integrated planning of distributed renewable energy installations on the one hand and its limitations thereof on the other hand, also with regard to the role of local governments.

  20. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying the survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure model. The division at 4 Earth radii separates small exoplanets from large exoplanets above. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...
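    The survival function applied above is one minus the empirical CDF of planet radius; a minimal sketch with invented radii (not Kepler data):

    ```python
    # Empirical survival function S(r) = fraction of planets with radius > r,
    # the statistic applied to the Kepler radius distribution. Radii invented.
    import numpy as np

    radii = np.array([1.2, 1.8, 2.5, 3.1, 3.9, 4.4, 9.0, 11.0, 12.5])  # Earth radii

    def survival(r):
        """Fraction of the sample with radius strictly greater than r."""
        return np.mean(radii > r)

    # Fractions above the divisions discussed in the abstract:
    print(survival(4.0), survival(10.0))
    ```

    Breaks in the slope of S(r) on a log scale are what mark natural divisions such as the ones reported at 4 and 10 Earth radii.
    
    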

  1. Data distribution method of workflow in the cloud environment

    Science.gov (United States)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing for workflow applications provides the required high-efficiency computation and large storage capacity, but it also brings challenges to the protection of trade secrets and other private data. Because protecting private data increases the data transmission time, this paper presents a new data allocation algorithm, based on the degree of collaborative damage among data items, that improves the existing data allocation strategy in which safety relies on the private cloud and computation on the public cloud. A static allocation method partitions only the non-confidential data in the initial stage, while in the operational phase the data distribution scheme is adjusted dynamically as new data are generated. The experimental results show that the improved method is effective in reducing the data transmission time.

  2. Parallel SN algorithms in shared- and distributed-memory environments

    International Nuclear Information System (INIS)

    Haghighat, Alireza; Hunter, Melissa A.; Mattis, Ronald E.

    1995-01-01

    Different 2-D spatial domain partitioning Sn transport theory algorithms have been developed on the basis of the Block-Jacobi iterative scheme. These algorithms have been incorporated into TWOTRAN-II and tested on a shared-memory CRAY Y-MP C90 and a distributed-memory IBM SP1. For a series of fixed-source r-z geometry homogeneous problems, parallel efficiencies in the range of 50-90% are achieved on the C90 with 6 processors, and lower values (20-60%) are obtained on the SP1. It is demonstrated that better performance is attainable if one addresses issues such as convergence rate, load balancing, and granularity for both architectures, as well as message passing (network bandwidth and latency) for the SP1. (author). 17 refs, 4 figs
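    The Block-Jacobi scheme underlying these algorithms updates each spatial block using only the previous iterate of the other blocks, which is what makes the blocks independent and assignable to separate processors; a serial sketch on a small invented linear system (not the Sn transport equations themselves):

    ```python
    # Block-Jacobi iteration on a 2x2-block linear system, the iterative
    # scheme the parallel Sn algorithms are built on. The matrix is invented.
    import numpy as np

    A = np.array([[4.0, 1.0, 0.5, 0.0],
                  [1.0, 4.0, 0.0, 0.5],
                  [0.5, 0.0, 4.0, 1.0],
                  [0.0, 0.5, 1.0, 4.0]])
    b = np.ones(4)
    blocks = [slice(0, 2), slice(2, 4)]  # the two spatial sub-domains

    x = np.zeros(4)
    for _ in range(50):
        x_new = x.copy()
        for blk in blocks:  # in the parallel version, one block per processor
            # Move the off-block coupling to the right-hand side, then solve
            # the local block system using only the PREVIOUS iterate x.
            rhs = b[blk] - A[blk] @ x + A[blk, blk] @ x[blk]
            x_new[blk] = np.linalg.solve(A[blk, blk], rhs)
        x = x_new  # synchronize: all blocks advance together

    print(np.allclose(A @ x, b, atol=1e-8))
    ```

    Because each block update reads only the previous global iterate, the inner loop parallelizes with one communication (boundary exchange) per outer iteration, at the cost of a convergence rate that degrades with the block coupling strength.
    
    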

  3. 3D Game Content Distributed Adaptation in Heterogeneous Environments

    Directory of Open Access Journals (Sweden)

    Berretty Robert-Paul

    2007-01-01

    Full Text Available Most current multiplayer 3D games can only be played on a single dedicated platform (a particular computer, console, or cell phone), requiring specifically designed content and communication over a predefined network. Below we show how, by using signal processing techniques such as multiresolution representation and scalable coding for all the components of a 3D graphics object (geometry, texture, and animation), we enable online dynamic content adaptation, and thus delivery of the same content over heterogeneous networks to terminals with very different profiles, and its rendering on them. We present quantitative results demonstrating how the best displayed quality versus computational complexity versus bandwidth tradeoffs have been achieved, given the distributed resources available over the end-to-end content delivery chain. Additionally, we use state-of-the-art, standardised content representation and compression formats (MPEG-4 AFX, JPEG 2000, XML), enabling deployment over existing infrastructure, while keeping hooks to well-established practices in the game industry.

  4. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC for processing, VHSIC ASICs for high-speed, reliable inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit for a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  5. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. 
For example, the authors
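The IRAF-compatible parameter interface described above stores parameters in a simple text format. As an illustration only, a minimal reader for the conventional seven-field IRAF parameter-file layout (name, type, mode, value, min, max, prompt) might look like this; it is a hedged sketch, not the ASC implementation:

```python
# Minimal reader for IRAF-style parameter files (comma-separated fields:
# name, type, mode, value, min, max, prompt). Illustrative sketch only.
import csv, io

def read_par(text):
    params = {}
    for row in csv.reader(io.StringIO(text)):
        if not row or row[0].startswith("#"):
            continue  # skip blank lines and comments
        name, ptype, mode, value = row[0], row[1], row[2], row[3]
        params[name] = {"type": ptype, "mode": mode, "value": value}
    return params

sample = 'infile,s,a,"events.fits",,,"Input event file"\n' \
         'nbins,i,h,128,1,4096,"Number of bins"'
p = read_par(sample)
print(p["nbins"]["value"])   # → 128
```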

  6. SERVICES OF FULL-TEXT SEARCHING IN A DISTRIBUTED INFORMATION ENVIRONMENT (PROJECT HUMANITARIANA

    Directory of Open Access Journals (Sweden)

    S. K. Lyapin

    2015-01-01

    Full Text Available Problem statement. We justify the application of full-text search services in both universal and specialized (in terms of resource base) digital libraries for the extraction and analysis of context knowledge in the humanities. The architecture and services of the virtual information and resource center for extracting knowledge from humanitarian texts created by the «Humanitariana» project are described. The resources and services for full-text search are functionally integrated in a distributed decentralized environment, organized in an Internet/Intranet architecture under the control of the client (user) browser accessing a variety of independent servers. An algorithm for implementing a distributed full-text query is described. Methods. A method of combining frequency-ranked and paragraph-oriented full-text queries is used: the former for the preliminary analysis of a subject area or a collection of works (explication of the "vertical" context, or macro context), the latter for the explication of the "horizontal" context, or micro context, within an author's paragraph. The results of the frequency-ranked queries are used to compile paragraph-oriented queries. Results. The results of textual research are shown for the topics "The question of fact in Russian philosophy" and "The question of loneliness in Russian philosophy and culture". About 50 pieces of context knowledge, drawn from a total resource base of about 2,500 full-text resources, have been explicated and briefly described for further expert investigation. Practical significance. The proposed technology (advanced full-text search services in a distributed information environment) can be used for the information support of humanitarian studies and education in the humanities, for the functional integration of resources and services of various organizations, and for carrying out interdisciplinary research.

  7. A Study of Land Surface Temperature Retrieval and Thermal Environment Distribution Based on Landsat-8 in Jinan City

    Science.gov (United States)

    Dong, Fang; Chen, Jian; Yang, Fan

    2018-01-01

    Based on medium-resolution Landsat 8 OLI/TIRS imagery, the land surface temperature distribution in four seasons of the urban area of Jinan City was obtained by using an atmospheric correction method for the retrieval of land surface temperature. Quantitative analyses were made of the spatio-temporal distribution characteristics and development trend of the urban thermal environment, the seasonal variation, and the relationship between land surface temperature and the normalized difference vegetation index (NDVI). The results show that high-temperature areas are concentrated in the urban area of Jinan with a tendency to expand from east to west, and they reveal a negative correlation between the land surface temperature distribution and NDVI. These findings provide theoretical references and a scientific basis for improving the ecological environment of Jinan City, strengthening scientific planning, and making overall plans to address climate change.
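The NDVI used in the study is the standard ratio (NIR - Red) / (NIR + Red); for Landsat 8 OLI these are Band 5 and Band 4. A minimal sketch with invented reflectance values:

```python
# NDVI = (NIR - Red) / (NIR + Red); for Landsat 8 OLI: (B5 - B4) / (B5 + B4).
# The surface-reflectance values below are invented for illustration.

def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(ndvi(0.45, 0.15))  # dense vegetation -> high NDVI (~0.5)
print(ndvi(0.20, 0.18))  # built-up surface -> NDVI near 0
```

The negative LST-NDVI correlation reported above means pixels with low NDVI (built-up surfaces) tend to show the higher retrieved surface temperatures.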

  8. FAME, the Flux Analysis and Modeling Environment

    Directory of Open Access Journals (Sweden)

    Boele Joost

    2012-01-01

    Full Text Available Abstract Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems biologists. Results The Flux Analysis and Modeling Environment (FAME is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.
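The core computation behind a tool like FAME is flux-balance analysis: maximize a target flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy instance, assuming SciPy's `linprog` as the linear solver (FAME itself uses the Python-based PySCeS-CBM):

```python
# Toy flux-balance analysis. Network (hypothetical):
#   R1: -> A (uptake, bounded at 10)   R2: A -> B   R3: B -> (biomass)
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1, 0],    # metabolite A: produced by R1, consumed by R2
              [0, 1, -1]])   # metabolite B: produced by R2, consumed by R3
bounds = [(0, 10), (0, None), (0, None)]
c = [0, 0, -1]               # maximize R3 == minimize -R3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(res.x)                 # steady state forces v1 = v2 = v3, so optimum is 10
```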

  9. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The needs to manage the resources are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA is provided. The integration and interaction with the ATLAS data management system DQ2 into GANGA is a key functionality. An intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports amongst other things tasks of user analysis with reconstructed data and small scale production of Monte Carlo data.
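The "trivial switching" between a local batch test and a Grid run that GANGA provides can be sketched with a plug-in pattern: the job description stays the same and only the backend plug-in changes. The classes below are hypothetical illustrations, not the real GANGA API.

```python
# Plug-in sketch: one job description, interchangeable execution backends.
# All class and field names are invented for illustration.

class LocalBackend:
    def submit(self, job):
        return f"ran {job['app']} locally on {len(job['inputs'])} file(s)"

class GridBackend:
    def __init__(self, grid="LCG/EGEE"):
        self.grid = grid
    def submit(self, job):
        # a real system would broker the job to sites holding the input data
        return f"submitted {job['app']} to {self.grid}"

def run(job, backend):
    # the job description is identical; only the backend plug-in changes
    return backend.submit(job)

job = {"app": "analysis.C", "inputs": ["AOD.root"]}
print(run(job, LocalBackend()))
print(run(job, GridBackend()))
```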

  10. The distribution and transformations of iodine in the environment

    International Nuclear Information System (INIS)

    Whitehead, D.C.

    1984-01-01

    Iodine in the atmosphere is derived largely from seawater. It is probable that the biological production of methyl iodide is important in this transfer. Subsequent photolytic dissociation and oxidation of the methyl iodide, together with other inputs, with partial sorption of the products by aerosols, results in the atmospheric iodine being distributed between various gaseous and particulate forms. Atmospheric iodine is the major source of the iodine in soils, and the process of enrichment continues throughout soil formation and development until ultimately an equilibrium concentration is attained. The atmosphere is also a direct source of iodine for plants, and in some situations may be more important than the soil. Iodine may be lost from soils by leaching, volatilization, and removal in crops. The amounts of iodine reported in groundwaters, and in rivers and lakes remote from human activity, suggest that some leaching of iodine is widespread. Increased amounts of iodine occur in rivers receiving effluent from sewage works. Milk and milk products are now major dietary sources of iodine because their content is often increased by concentrate feedingstuffs supplemented with iodine and/or by the use in dairies of iodophor detergents and sterilants. (author)

  11. Removal and distribution of iodate in mangrove environment

    International Nuclear Information System (INIS)

    Machado, E.C; Machado, W; Sanders, L.F; Bellido, L.F; Patchineelam, S.R; Bellido Jr, A.V

    2003-01-01

    Retention experiments were carried out in the laboratory in order to investigate the removal of the iodate ion from tidal water by the underlying creek sediment, as well as its distribution in the mangrove sediment of Sepetiba Bay, Rio de Janeiro. Local tidal water spiked with ¹³¹IO₃⁻ was added to several cores containing 6-7 cm of mangrove sediment. A sample solution was taken every day for 8-9 days and assayed by means of a high-purity germanium detector coupled to a multichannel analyser. It was found that, after two days, around 56% of the radioiodate was taken up by the sediment. At the end of the experiment, the sediment was sectioned into 1-cm layers and also analyzed by gamma spectrometry. The overall results, in seven cores, showed that 89% of the initial ¹³¹IO₃⁻ activity was transferred to the sediment, and 78% of the total activity was within the first centimetre of the upper layer. The determination of total organic matter content suggests that the organic matter present in Sepetiba's mangrove sediment has a substantial influence on the iodate removal from the water column (author)

  12. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
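For readers unfamiliar with the family studied here, the (Azzalini-type) skew-t density with a linear skewing function is f(x) = 2 t_ν(x) T_{ν+1}(αx √((ν+1)/(x²+ν))), where t_ν and T_ν are the Student's t pdf and cdf. A standard-library-only sketch, evaluating the t cdf by simple numerical integration (accurate enough for illustration):

```python
# Skew-t density (linear skewing function) built from the Student's t pdf/cdf.
import math

def t_pdf(x, nu):
    c = math.exp(math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)) \
        / math.sqrt(nu * math.pi)
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def t_cdf(w, nu, steps=400):
    # T(w) = 1/2 + integral_0^w t_pdf, trapezoid rule (works for w < 0 too)
    if w == 0:
        return 0.5
    h = w / steps
    s = 0.5 * (t_pdf(0, nu) + t_pdf(w, nu))
    s += sum(t_pdf(i * h, nu) for i in range(1, steps))
    return 0.5 + s * h

def skew_t_pdf(x, nu, alpha):
    w = alpha * x * math.sqrt((nu + 1) / (x * x + nu))
    return 2 * t_pdf(x, nu) * t_cdf(w, nu + 1)

# sanity check: the density should integrate to ~1 (here nu=5, alpha=2)
xs = [-30 + 0.05 * i for i in range(1201)]
area = sum(0.05 * skew_t_pdf(x, 5, 2) for x in xs)
print(round(area, 3))   # ~1.0
```

Setting α = 0 recovers the symmetric Student's t, which is why the shape parameter's prior being symmetric around zero is a natural property.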

  13. A cooperative model for IS security risk management in distributed environment.

    Science.gov (United States)

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively.
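The paper's central idea, using a Bayesian network to predict a security risk level from shared security information, can be illustrated with a toy network solved by exact enumeration. The structure and all probabilities below are invented for illustration.

```python
# Toy Bayesian network:  Threat -> Incident <- WeakControls
# Posterior P(Threat | Incident) computed by enumerating the joint distribution.

def p_incident(threat, weak):
    table = {(True, True): 0.9, (True, False): 0.4,
             (False, True): 0.3, (False, False): 0.05}
    return table[(threat, weak)]

P_THREAT, P_WEAK = 0.2, 0.3   # hypothetical priors

def posterior_threat_given_incident():
    # P(threat | incident) = P(threat, incident) / P(incident)
    num, den = 0.0, 0.0
    for threat in (True, False):
        for weak in (True, False):
            p = (P_THREAT if threat else 1 - P_THREAT) * \
                (P_WEAK if weak else 1 - P_WEAK) * p_incident(threat, weak)
            den += p
            if threat:
                num += p
    return num / den

print(round(posterior_threat_given_incident(), 3))   # → 0.524
```

Observing an incident raises the estimated threat probability from the prior 0.2 to about 0.52; in the paper's setting, evidence exchanged by allied organizations plays the role of such observations.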

  14. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes the methods of how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities

  15. CyberWalk : a web-based distributed virtual walkthrough environment.

    OpenAIRE

    Chim, J.; Lau, R. W. H.; Leong, H. V.; Si, A.

    2003-01-01

    A distributed virtual walkthrough environment allows users connected to the geometry server to walk through a specific place of interest, without having to travel physically. This place of interest may be a virtual museum, virtual library or virtual university. There are two basic approaches to distribute the virtual environment from the geometry server to the clients, complete replication and on-demand transmission. Although the on-demand transmission approach saves waiting time and optimize...

  16. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals HELIOS, a distributive Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is made by the collaboration of several, well encapsulated Software Components. This paper presents the architecture retained to allow communication between the different components and focus on the implementation details of the Software ...

  17. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distribution of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids.
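The scaling exponent of a power-law mass distribution is conventionally estimated as the slope of a least-squares fit on log-log data. A minimal sketch with synthetic masses generated from a known exponent so the fit can be checked:

```python
# Fit a power-law scaling exponent by linear regression of log(count) on
# log(mass). The data below are synthetic, not from the meteorite showers.
import math

def fit_exponent(masses, counts):
    xs = [math.log(m) for m in masses]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

masses = [1, 2, 4, 8, 16]
counts = [m ** -1.5 for m in masses]   # exact power law, exponent -1.5
print(fit_exponent(masses, counts))    # → -1.5 (up to float rounding)
```

Multiple scaling regimes, as reported for half the showers, would appear as distinct linear segments with different slopes on the log-log plot.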

  18. Population fluctuation and vertical distribution of meiofauna in the Red Sea interstitial environment.

    Science.gov (United States)

    El-Serehy, Hamed A; Al-Misned, Fahad A; Al-Rasheid, Khaled A

    2015-07-01

    The composition and distribution of the benthic meiofauna assemblages of the Egyptian coasts along the Red Sea are described in relation to abiotic variables. Sediment samples were collected seasonally from three stations chosen along the Red Sea to observe the meiofaunal community structure, its temporal distribution and vertical fluctuation in relation to environmental conditions of the Red Sea marine ecosystem. The temperature, salinity, pH, dissolved oxygen, and redox potential were measured at the time of collection. The water content of the sediments, total organic matter and chlorophyll a values were determined, and sediment samples were subjected to granulometric analysis. A total of 10 meiofauna taxa were identified, with the meiofauna being primarily represented by nematodes (on annual average from 42% to 84%), harpacticoids, polychaetes and ostracods; and the meiofauna abundances ranging from 41 to 167 ind./10 cm². The meiofaunal population density fluctuated seasonally with a peak of 192.52 ind./10 cm² during summer at station II. The vertical zonation in the distribution of the meiofaunal community was significantly correlated with interstitial water, chlorophyll a and total organic matter values. The present study indicates the existence of a well-diversified meiofaunal group which can serve as food for higher trophic levels in the Red Sea interstitial environment.

  19. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may be a diabetes contributor. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets to GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and food environment based on countywide food permit registration data. Linkage between pregnancies and food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code was gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of geospatial distribution of food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Hemoglobin A1c levels in a matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251 P analysis, a relationship of food environment and risk for gestational diabetes was
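The reported effect (adjusted OR 1.63, 95% CI 1.21-2.19) is an odds ratio. An unadjusted odds ratio with its Wald confidence interval can be computed from a 2×2 table as below; the counts are invented for illustration and are not the study's data.

```python
# Odds ratio and 95% Wald CI from a 2x2 table (invented counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed with/without outcome; c,d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# e.g. 4th-quartile fast-food ZIP codes vs 1st quartile (made-up counts)
print(odds_ratio_ci(120, 880, 80, 920))   # OR ≈ 1.57, CI ≈ (1.16, 2.11)
```

The study's adjusted OR additionally controls for confounders via multivariate regression, which this simple 2×2 computation does not do.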

  20. Exploring the Relationship Between Distributed Training, Integrated Learning Environments, and Immersive Training Environments

    Science.gov (United States)

    2007-01-01

    educating and training (O’Keefe IV & McIntyre III, 2006). Topics vary widely from standard educational topics such as teaching kids physics, mechanics...Winn, W., & Yu, R. (1997). The Impact of Three Dimensional Immersive Virtual Environments on Modern Pedagogy : Global Change, VR and Learning

  1. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  2. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better-performing ALICE analysis
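The "sensor" instrumentation idea can be sketched as a wrapper that records wall and CPU time for each analysis task and hands the metrics to a monitoring system. This is an illustrative toy; ALICE's sensors feed a MonALISA-based infrastructure and collect far more than two numbers.

```python
# Minimal "sensor": a decorator that records per-job timing metrics.
import time, functools

METRICS = []   # stand-in for a monitoring back end

def sensor(fn):
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        w0, c0 = time.perf_counter(), time.process_time()
        result = fn(*args, **kwargs)
        METRICS.append({"job": fn.__name__,
                        "wall_s": time.perf_counter() - w0,
                        "cpu_s": time.process_time() - c0})
        return result
    return wrapped

@sensor
def analysis_job(n):
    return sum(i * i for i in range(n))   # stand-in for I/O- or CPU-bound work

analysis_job(100_000)
print(METRICS[0]["job"])
```

Comparing `wall_s` with `cpu_s` across many jobs is one simple way to separate I/O-bound from CPU-bound workloads, as the abstract distinguishes.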

  3. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, electronic commerce in our country has developed at an accelerating pace. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current state of its self-built logistics system, identifies the problems existing in Jingdong Mall's current logistics distribution, and gives appropriate recommendations.

  4. Environmental factors determining ammonia-oxidizing organism distribution and diversity in marine environments.

    Science.gov (United States)

    Bouskill, Nicholas J; Eveillard, Damien; Chien, Diana; Jayakumar, Amal; Ward, Bess B

    2012-03-01

    Ammonia-oxidizing bacteria (AOB) and archaea (AOA) play a vital role in bridging the input of fixed nitrogen, through N-fixation and remineralization, to its loss by denitrification and anammox. Yet the major environmental factors determining AOB and AOA population dynamics are little understood, despite both groups having a wide environmental distribution. This study examined the relative abundance of both groups of ammonia-oxidizing organisms (AOO) and the diversity of AOA across large-scale gradients in temperature, salinity, substrate concentration, and dissolved oxygen. The relative abundance of AOB and AOA varied across environments, with AOB dominating in the freshwater region of the Chesapeake Bay and AOA more abundant in the water column of the coastal and open ocean. The highest abundance of the AOA amoA gene was recorded in the oxygen minimum zones (OMZs) of the Eastern Tropical South Pacific (ETSP) and the Arabian Sea (AS). The ratio of AOA:AOB varied from 0.7 in the Chesapeake Bay to 1600 in the Sargasso Sea. Relative abundance of both groups strongly correlated with ammonium concentrations. AOA diversity, as determined by phylogenetic analysis of clone library sequences and archetype analysis from a functional gene DNA microarray, detected broad phylogenetic differences across the study sites. However, phylogenetic diversity within physicochemically congruent stations was more similar than would be expected by chance. This suggests that the prevailing geochemistry, rather than localized dispersal, is the major driving factor determining OTU distribution. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.

  5. Some Technical Implications of Distributed Cognition on the Design on Interactive Learning Environments.

    Science.gov (United States)

    Dillenbourg, Pierre

    1996-01-01

    Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…

  6. Problem-Based Learning and Problem-Solving Tools: Synthesis and Direction for Distributed Education Environments.

    Science.gov (United States)

    Friedman, Robert S.; Deek, Fadi P.

    2002-01-01

    Discusses how the design and implementation of problem-solving tools used in programming instruction are complementary with both the theories of problem-based learning (PBL), including constructivism, and the practices of distributed education environments. Examines how combining PBL, Web-based distributed education, and a problem-solving…

  7. Instructional Design Issues in a Distributed Collaborative Engineering Design (CED) Instructional Environment

    Science.gov (United States)

    Koszalka, Tiffany A.; Wu, Yiyan

    2010-01-01

    Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…

  8. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    Science.gov (United States)

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphically oriented workflow design system Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be performed effectively in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, the paper demonstrates the usefulness of a graphical user interface system for designing and executing complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insight into the virus's tendency toward geographically based clustering in the phylogenetic tree, and shows the importance of glycan sites in its molecular evolution. The current study thus demonstrates the efficiency and utility of workflow systems, in particular the Quascade platform, in providing a biologist-friendly approach to the analysis of complex biological datasets using high-performance computing.

  9. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Computing electrostatic distributions and structure maps of biopolymers is time-consuming and requires large computational resources. We developed procedures for organizing massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).
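
The record above does not describe its decomposition scheme; as a hedged illustration, the sketch below distributes independent grid-point Coulomb sums across a worker pool. The point charges, the constant, and the use of a thread pool are all assumptions of this sketch (a cluster run of the scale described would use process pools or MPI ranks, but the master-worker decomposition is the same).

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical inputs: point charges (x, y, z, q) standing in for a biopolymer's
# atoms; a real run would read thousands of atoms from a structure file.
CHARGES = [(0.0, 0.0, 0.0, 1.0), (1.0, 0.0, 0.0, -1.0), (0.0, 1.5, 0.0, 0.5)]
COULOMB_K = 332.06  # kcal*A/(mol*e^2); the unit choice is illustrative

def potential_at(point):
    """Direct Coulomb sum at one grid point -- the per-task unit of work."""
    px, py, pz = point
    phi = 0.0
    for x, y, z, q in CHARGES:
        r = ((px - x) ** 2 + (py - y) ** 2 + (pz - z) ** 2) ** 0.5
        phi += COULOMB_K * q / r
    return phi

def potential_map(points, workers=4):
    """Master side: scatter grid points to workers, gather results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(potential_at, points))

grid = [(float(i), 2.0, 2.0) for i in range(-5, 6)]
phi = potential_map(grid)
```

Because each grid point is independent, the problem is embarrassingly parallel, which is what makes the "several thousands of cores" scale-out in the record feasible.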

  10. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-12-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; the best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and a richer economy. The worst natural environment areas, the Northwest and the Tibetan Plateau, are also regions with lower population density and poorer economies.

  11. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Science.gov (United States)

    Yang, Xiaohuan; Ma, Hanqing

    2009-01-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; The best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and richer economy. The worst natural environment areas, Northwest and Tibetan Plateau, are also regions with lower population density and poorer economies. PMID:20049243
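
At its core, the NESI construction described in the two records above is a weighted overlay of normalized factor rasters. A minimal sketch follows, with invented weights standing in for the paper's AHP-derived ones and 2x2 toy rasters in place of the 1 km by 1 km national grids:

```python
# Weighted-overlay sketch of a grid-based suitability index. The layer names
# mirror the NESI inputs; the weights are illustrative stand-ins, not the
# AHP-derived weights from the paper.

WEIGHTS = {"climate": 0.40, "hydrology": 0.25, "terrain": 0.20, "ecology": 0.15}

def suitability(layers, weights=WEIGHTS):
    """Combine normalized (0..1) factor rasters into one index value per cell."""
    first = next(iter(layers.values()))
    rows, cols = len(first), len(first[0])
    index = [[0.0] * cols for _ in range(rows)]
    for name, w in weights.items():
        raster = layers[name]
        for r in range(rows):
            for c in range(cols):
                index[r][c] += w * raster[r][c]
    return index

# Toy rasters: cell [0][0] is a favourable "east coast" cell, [1][1] a harsh one.
layers = {
    "climate":   [[0.9, 0.2], [0.5, 0.1]],
    "hydrology": [[0.8, 0.3], [0.6, 0.2]],
    "terrain":   [[0.7, 0.4], [0.5, 0.3]],
    "ecology":   [[0.6, 0.5], [0.4, 0.2]],
}
nesi = suitability(layers)
```

In a GIS workflow the per-cell loop would be replaced by raster algebra over the full national grid, but the index definition is the same.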

  12. IRST infrared background analysis of bay environments

    CSIR Research Space (South Africa)

    Schwering, PBW

    2008-04-01

    Present-day naval operations take place in coastal environments as well as narrow straits all over the world. Coastal environments around the world are exhibiting a number of threats to naval forces. In particular a large number of asymmetric...

  13. General Electric Company analytical model for loss-of-coolant analysis in accordance with 10CFR50 appendix K, amendment No. 3: effect of steam environment on BWR core spray distribution

    International Nuclear Information System (INIS)

    1977-04-01

    The core spray sparger designs of the BWR/2 through BWR/5 product lines were verified by means of full-scale mock-ups tested in air at various flow conditions. In 1974, an overseas technical partner of General Electric reported that a steam environment changed the individual core spray nozzle patterns when compared to patterns measured in air. This document describes preliminary findings of how a steam environment alters the core spray nozzle pattern, and the actions which General Electric is pursuing to quantify the steam effects

  14. Students’ perception of the learning environment in a distributed medical programme

    Directory of Open Access Journals (Sweden)

    Kiran Veerapen

    2010-09-01

    Background: The learning environment of a medical school has a significant impact on students’ achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. Purpose: To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. Method: The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students’ perceptions of learning, students’ perceptions of teachers, students’ academic self-perceptions, students’ perceptions of the atmosphere, and students’ social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. Results: The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The workload and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Conclusions: Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing

  15. Students' perception of the learning environment in a distributed medical programme.

    Science.gov (United States)

    Veerapen, Kiran; McAleer, Sean

    2010-09-24

    The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing comparative evaluation of the learning environment at the distributed sites and

  16. Performance analysis and optimization of AMGA for the WISDOM environment

    CERN Document Server

    Ahn, Sunil; Lee, Seehoon; Hwang, Soonwook; Breton, Vincent; Koblitz, Birger

    2008-01-01

    AMGA is the gLite metadata catalogue service, designed to offer access to metadata for files stored on the Grid. We evaluated AMGA to analyze whether it is suitable for the WISDOM environment, where thousands of jobs access it simultaneously to get metadata describing docking results and the status of jobs. In this work, we address performance issues in AMGA and propose new techniques to improve AMGA performance in the WISDOM environment. In the WISDOM environment, thousands of job agents distributed on the Grid may access an AMGA server simultaneously (1) to take docking tasks out of the AMGA server to execute on the machines where the agents run, (2) to get the related ligand and target information, and (3) to store the docking results. The docking tasks take about 10 to 30 minutes to finish, depending on the machine on which they run and the docking configuration. We have carried out a performance analysis of the current AMGA implementation. Due to the overhead required to handle GSI/SSL connection on t...

  17. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine impulse responses such as the residence time distribution (RTD), together with the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments.
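
As a hedged illustration of the kind of data treatment such a package performs, the sketch below recovers the mean residence time and variance from a sampled impulse-response curve. The synthetic exponential data stand in for real tracer measurements and are not output of the IAEA package:

```python
import math

def trapz(y, x):
    """Trapezoidal integral of samples y over abscissae x."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0 for i in range(len(x) - 1))

def rtd_moments(t, c):
    """Normalize a tracer response C(t) into E(t) = C(t) / int C dt and return
    the mean residence time (first moment) and variance (second central moment)."""
    area = trapz(c, t)
    e = [ci / area for ci in c]
    mrt = trapz([ti * ei for ti, ei in zip(t, e)], t)
    var = trapz([(ti - mrt) ** 2 * ei for ti, ei in zip(t, e)], t)
    return mrt, var

# Synthetic impulse response of an ideal stirred tank with tau = 2.0 time units:
# E(t) ~ exp(-t/tau), so the analysis should recover MRT = tau, variance = tau^2.
t = [0.01 * i for i in range(4001)]
c = [math.exp(-ti / 2.0) for ti in t]
mrt, var = rtd_moments(t, c)
```

Comparing the recovered moments against those of candidate flow models (plug flow, stirred tanks in series, etc.) is the usual diagnosis step that follows.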

  18. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
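
The four-component task flow described above can be caricatured in a few lines. The queues, names, and file paths below are illustrative stand-ins for this sketch, not the real CRAB3 interfaces: a thin client hands a task to a central queue (standing in for the REST interface), one worker plays the secondary server that runs the task's jobs, and an asynchronous mover stages outputs from temporary site storage to the destination.

```python
import queue
import threading

tasks = queue.Queue()     # client -> primary server (REST stand-in)
stageout = queue.Queue()  # execution site -> async transfer service
final_storage = []

def submit(task_name, n_jobs):
    """Lightweight client: one call, no Grid details exposed to the user."""
    tasks.put((task_name, n_jobs))

def secondary_server():
    """Take a task, 'run' its jobs, leave outputs in temporary site storage."""
    task_name, n_jobs = tasks.get()
    for j in range(n_jobs):
        stageout.put(f"{task_name}/output_{j}.root")
    stageout.put(None)  # end-of-task marker

def async_mover():
    """Asynchronously copy outputs to the user's chosen storage location."""
    while True:
        item = stageout.get()
        if item is None:
            break
        final_storage.append(item)

submit("analysis_task", 3)
threads = [threading.Thread(target=secondary_server), threading.Thread(target=async_mover)]
for th in threads:
    th.start()
for th in threads:
    th.join()
```

The decoupling shown here — job completion never waits on the final transfer — is what makes the asynchronous stage-out component improve robustness and scalability.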

  19. Occurrence and distribution of steroids, hormones and selected pharmaceuticals in South Florida coastal environments.

    Science.gov (United States)

    Singh, Simrat P; Azua, Arlette; Chaudhary, Amit; Khan, Shabana; Willett, Kristine L; Gardinali, Piero R

    2010-02-01

    The common occurrence of human-derived contaminants like pharmaceuticals, steroids and hormones in surface waters has raised awareness of the role played by the release of treated or untreated sewage in the water quality along sensitive coastal ecosystems. South Florida is home to many important protected environments, ranging from wetlands to coral reefs, which are in close proximity to large metropolitan cities. Because large portions of South Florida and most of the Florida Keys population are not served by modern sewage treatment plants and rely heavily on the use of septic systems, a comprehensive survey of selected human waste contamination markers was conducted in three areas to assess water quality with respect to non-traditional micro-constituents. This study documents the occurrence and distribution of fifteen hormones and steroids and five commonly detected pharmaceuticals in surface water samples collected from different near-shore environments along South Florida between 2004 and 2006. The compounds most frequently detected were: cholesterol, caffeine, estrone, DEET, coprostanol, bisphenol-A, beta-estradiol, and triclosan. The concentrations detected for estrone and beta-estradiol were up to 5.2 and 1.8 ng/L, respectively. Concentrations of caffeine (5.5-68 ng/L) and DEET (4.8-49 ng/L) were generally higher and more prevalent than those of the steroids. The distribution of micro-constituents was site specific, likely reflecting a diversity of sources. In addition to chemical analysis, the yeast estrogen screen assay was used to screen the samples for estrogen equivalency. Overall, the results show that water collected from inland canals and restricted-circulation water bodies adjacent to heavily populated areas had high concentrations of multiple steroids, pharmaceuticals, and personal care products, while open bay waters were largely devoid of the target analytes.

  20. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to be connected to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability, and the planning and decision-making framework, to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision-making methodologies and facilitates scenario-based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  1. Decision Criteria for Distributed Versus Non-Distributed Information Systems in the Health Care Environment

    Science.gov (United States)

    McGinnis, John W.

    1980-01-01

    The very same technological advances that support distributed systems have also dramatically increased the efficiency and capabilities of centralized systems, making it more complex for health care managers to select the “right” system architecture to meet their particular needs. How this selection can be made with a reasonable degree of managerial comfort is the focus of this paper. The approach advocated is based on experience in developing the Tri-Service Medical Information System (TRIMIS) program. Along with this, technical standards and configuration management procedures were developed that provided the necessary guidance to implement the selected architecture and to allow it to change in a controlled way over its life cycle.

  2. Buffered Communication Analysis in Distributed Multiparty Sessions

    Science.gov (United States)

    Deniélou, Pierre-Malo; Yoshida, Nobuko

    Many communication-centred systems today rely on asynchronous messaging among distributed peers to make efficient use of parallel execution and resource access. With such asynchrony, the communication buffers can grow without bound over time. This paper proposes a static verification methodology based on multiparty session types which can efficiently compute upper bounds on buffer sizes. Our analysis relies on a uniform causality audit of the entire collaboration pattern - an examination that is not always possible from each end-point type. We extend this method to design algorithms that allocate communication channels in order to optimise the memory requirements of session executions. From these analyses, we propose two refinement methods which respect buffer bounds: a global protocol refinement that automatically inserts confirmation messages to guarantee stipulated buffer sizes, and a local protocol refinement to optimise asynchronous messaging without buffer overflow. Finally, our work is applied to overcome a buffer overflow problem in the multi-buffering algorithm.
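
The paper derives its bounds statically from session types; as a simpler dynamic analogue, one can replay a global trace of sends and receives and record each channel's high-water mark. The sketch below illustrates the quantity being bounded, not the paper's algorithm:

```python
from collections import defaultdict

def peak_buffer_sizes(trace):
    """Replay a global trace of ('send'|'recv', channel) events and return the
    high-water mark of each channel's buffer occupancy."""
    depth = defaultdict(int)
    peak = defaultdict(int)
    for op, chan in trace:
        if op == "send":
            depth[chan] += 1
            peak[chan] = max(peak[chan], depth[chan])
        else:  # "recv"
            assert depth[chan] > 0, f"receive from empty buffer on {chan!r}"
            depth[chan] -= 1
    return dict(peak)

# Sends flood channel 'a' before the consumer drains it, while traffic on 'b'
# strictly alternates, so a size-1 buffer suffices for 'b'.
trace = [("send", "a"), ("send", "a"), ("send", "b"), ("recv", "b"),
         ("send", "a"), ("recv", "a"), ("recv", "a"), ("recv", "a")]
bounds = peak_buffer_sizes(trace)
```

A static analysis must guarantee such bounds over every admissible interleaving rather than one observed trace, which is exactly what the causality audit over the session types provides.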

  3. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    …overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. This approach goes through each node in the phylogeny… with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set… of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented…

  4. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions, and provides guidance on how to apply the tool.

  5. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of data on initial flaw distributions. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions in structures under cyclic load. For the analysis, LEFM theory and Monte Carlo simulation are applied. The results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions.
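
A minimal Monte Carlo sketch of this kind of setup follows, assuming a lognormal initial flaw-depth distribution and Paris-law growth; the distribution parameters and material constants are illustrative choices for this sketch, not values from the paper.

```python
import math
import random

def grow(a0, cycles, block=1000, C=1e-12, m=3.0, dsigma=100.0):
    """Integrate the Paris law da/dN = C * (dK)^m, with dK = dsigma*sqrt(pi*a),
    over load cycles in fixed blocks (a in metres, dsigma in MPa)."""
    a = a0
    for _ in range(cycles // block):
        dk = dsigma * math.sqrt(math.pi * a)  # stress intensity factor range
        a += C * dk ** m * block
    return a

random.seed(42)
# Assumed initial flaw-depth population (lognormal, median 1 mm).
initial = [random.lognormvariate(math.log(1e-3), 0.4) for _ in range(2000)]
# In-service population after 100,000 load cycles.
inservice = [grow(a0, 100_000) for a0 in initial]
```

With these illustrative constants the growth per flaw is small compared with the spread of initial depths, so the in-service distribution largely mirrors the initial one — the qualitative behaviour the abstract reports.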

  6. Distribution tactics for success in turbulent versus stable environments: A complexity theory approach

    Directory of Open Access Journals (Sweden)

    Roger Bruce Mason

    2013-11-01

    This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, using complexity theory to understand such environments is necessary, but it has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company in a turbulent market sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. Suggestions on how marketers can be assisted, and directions for further research, are provided.

  7. The structure and pyrolysis product distribution of lignite from different sedimentary environment

    International Nuclear Information System (INIS)

    Liu, Peng; Zhang, Dexiang; Wang, Lanlan; Zhou, Yang; Pan, Tieying; Lu, Xilan

    2016-01-01

    Highlights: • The carbon structure of three lignites was measured by solid-state ¹³C NMR. • The effect of carbon structure on pyrolysis product distribution was studied. • Tar yield is influenced by aliphatic carbon and oxygen functional groups. • The C1–C4 content of pyrolysis gas is related to the CH₂/CH₃ ratio. - Abstract: Low-temperature pyrolysis is an economically efficient method for lignite to obtain coal tar and improve its combustion calorific value. Research on the distribution of pyrolysis products (especially coal tar yield) plays an important role in energy application and economic development now and in the future. Pyrolysis tests were carried out in a tube reactor at 873 K for 15 min. The structure of the lignite was measured by solid-state ¹³C nuclear magnetic resonance (NMR) and Fourier transform infrared spectroscopy (FTIR). The thermal behaviour was analyzed by a thermo-gravimetric (TG) analyzer. The results show that the pyrolysis product distribution is related to the breakage of branch structures of the aromatic rings in lignites from different sedimentary environments. The gas yield and composition are related to the decomposition of carbonyl groups and the breakage of aliphatic carbon. The tar yield derived from lignite pyrolysis follows the order: Xianfeng lignite (XF, 13.67 wt.%) > Xiaolongtan lignite (XLT, 7.97 wt.%) > Inner Mongolia lignite (IM, 6.30 wt.%), which is mainly influenced by the aliphatic carbon content, the CH₂/CH₃ ratio and the oxygen functional groups in the lignite. The pyrolysis water yield depends on the decomposition of oxygen functional groups. IM has the highest content of oxygen-linked carbon, so the pyrolysis water yield derived from IM is the highest (9.20 wt.%), far more than that from the other two lignites.

  8. Data management in an object-oriented distributed aircraft conceptual design environment

    Science.gov (United States)

    Lu, Zhijie

    In the competitive global marketplace, aerospace companies are forced to deliver the right products to the right market, at the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to resolve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides the major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to their lack of domain flexibility and analysis scalability. Therefore, we are in need of a next-generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is first formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled. To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the
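
An extensible object-oriented data model of the kind the dissertation proposes might be sketched as a component tree with typed parameters. All class and attribute names here are invented for illustration, not taken from the NextADE work:

```python
from dataclasses import dataclass, field

@dataclass
class Parameter:
    """A named, unit-carrying design quantity, so heterogeneous tools agree on units."""
    name: str
    value: float
    units: str

@dataclass
class Component:
    """A node in the design decomposition (aircraft -> wing -> spar, ...)."""
    name: str
    parameters: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def set_param(self, name, value, units):
        self.parameters[name] = Parameter(name, value, units)

    def find(self, name):
        """Depth-first lookup, so any integrated tool can address a node by name."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None

aircraft = Component("aircraft")
wing = Component("wing")
wing.set_param("span", 35.8, "m")
aircraft.children.append(wing)
```

New disciplines extend the model by adding component types and parameters rather than modifying a monolithic schema, which is the extensibility argument made above.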

  9. Merging assistance function with task distribution model to enhance user performance in collaborative virtual environment

    International Nuclear Information System (INIS)

    Khalid, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) fall under Virtual Reality (VR), where two or more users manipulate objects collaboratively. In this paper we conducted experiments in which an assembly is built from constituent parts scattered in a Virtual Environment (VE), based on a task distribution model that uses assistance functions for checking and enhancing user performance. The CVE subjects worked on distinct machines connected via a local area network. In this perspective, we consider the effects of the assistance function with oral communication on collaboration, co-presence and user performance. Twenty subjects collaboratively performed an assembly task under static and dynamic task distribution. We examine the degree to which assistance functions with oral communication, applied through the task distribution model, influence users' performance. The results show that assistance functions with oral communication based on the task distribution model not only increase user performance but also enhance the sense of co-presence and awareness. (author)

  10. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a real distributed reservoir engineering environment based on a software product line model. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and its processes for the reservoir engineers

  11. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    Science.gov (United States)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a real distributed reservoir engineering environment based on a software product line model. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and its processes for the reservoir engineers.

  12. Energy and environment efficiency analysis based on an improved environment DEA cross-model: Case study of complex chemical processes

    International Nuclear Information System (INIS)

    Geng, ZhiQiang; Dong, JunGen; Han, YongMing; Zhu, QunXiong

    2017-01-01

Highlights: •An improved environment DEA cross-model method is proposed. •An energy and environment efficiency analysis framework for complex chemical processes is obtained. •The proposed method is effective for energy saving and emission reduction in complex chemical processes. -- Abstract: The complex chemical process is a high-pollution and high-energy-consumption industrial process. Therefore, it is very important to analyze and evaluate the energy and environment efficiency of such processes. Data Envelopment Analysis (DEA) is used to evaluate the relative effectiveness of decision-making units (DMUs). However, the traditional DEA method usually cannot genuinely distinguish effective from inefficient DMUs due to its extreme or unreasonable weight distribution over input and output variables. Therefore, this paper proposes an energy and environment efficiency analysis method based on an improved environment DEA cross-model (DEACM). The inputs of the complex chemical process are divided into energy and non-energy inputs, while the outputs are divided into desirable and undesirable outputs. The energy and environment performance index (EEPI), based on cross evaluation, is used to represent the overall performance of each DMU. Moreover, the improvement direction for energy saving and carbon emission reduction of each inefficient DMU is obtained quantitatively from the self-evaluation model of the improved environment DEACM. Applied to the ethylene production process, the improved environment DEACM method discriminates effective DMUs better than the original DEA method, and it can identify the energy-saving and carbon-emission-reduction potential of ethylene plants, especially the improvement direction of inefficient DMUs.
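
The cross-evaluation idea described above can be sketched numerically. Below is a minimal input-oriented CCR DEA with cross-efficiency averaging — the classical building block, not the authors' improved environment DEACM (which further separates energy/non-energy inputs and desirable/undesirable outputs). Function names and the toy data are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_weights(X, Y, o):
    """Optimal multiplier weights (u, v) for DMU o, input-oriented CCR:
    maximize u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j."""
    n, m_in = X.shape
    m_out = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m_in)])              # minimize -u.y_o
    A_ub = np.hstack([Y, -X])                                # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(m_out), X[o]])[None, :]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m_out + m_in))
    return res.x[:m_out], res.x[m_out:]

def cross_efficiency(X, Y):
    """E[k, j] = efficiency of DMU j scored with DMU k's optimal weights."""
    n = X.shape[0]
    E = np.zeros((n, n))
    for k in range(n):
        u, v = ccr_weights(X, Y, k)
        E[k] = (Y @ u) / (X @ v)
    return E

X = np.array([[2.0], [4.0], [8.0]])   # one input per DMU (e.g. energy)
Y = np.array([[2.0], [3.0], [4.0]])   # one desirable output per DMU
E = cross_efficiency(X, Y)
mean_cross = E.mean(axis=0)           # average cross-efficiency per DMU
```

The column mean plays the role of an overall performance index: a DMU that only looks efficient under its own hand-picked weights is pulled down by the scores it receives under every other DMU's weights.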

  13. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    International Nuclear Information System (INIS)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A.

    1999-01-01

The distribution of trace elements in water, sediment, soil and cassava plants in the Muria peninsula was evaluated by the NAA method. The nuclear power plant (NPP) and the coal power plant (CPP) will be built on the Muria peninsula, so the peninsula is an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The trace element analysis results serve as baseline data for environmental monitoring before and after the NPP is built. The trace elements in the river-water, sediment, dryland-soil and cassava plant samples were analyzed by the INAA method. It was found that the trace elements were not evenly distributed: 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  14. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    Energy Technology Data Exchange (ETDEWEB)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A. [Yogyakarta Nuclear Research Center, Yogyakarta (Indonesia)

    1999-10-01

The distribution of trace elements in water, sediment, soil and cassava plants in the Muria peninsula was evaluated by the NAA method. The nuclear power plant (NPP) and the coal power plant (CPP) will be built on the Muria peninsula, so the peninsula is an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The trace element analysis results serve as baseline data for environmental monitoring before and after the NPP is built. The trace elements in the river-water, sediment, dryland-soil and cassava plant samples were analyzed by the INAA method. It was found that the trace elements were not evenly distributed: 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  15. Full immersion simulation: validation of a distributed simulation environment for technical and non-technical skills training in Urology.

    Science.gov (United States)

    Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter

    2015-07-01

To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology. To evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologists' performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors. BJU International © 2014 BJU International. Published by John Wiley & Sons Ltd.

  16. Development of pair distribution function analysis

    International Nuclear Information System (INIS)

    Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

    1996-01-01

This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO2 planes of high-Tc superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. Our particular interest is in high-Tc superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and to other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single crystal experiments to look for diffuse scattering. This information augments the information from the PDF.
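
The PDF technique mentioned above rests on a standard sine Fourier transform of the total-scattering structure function, G(r) = (2/pi) ∫ Q [S(Q) - 1] sin(Qr) dQ. The sketch below applies this textbook formula (it is not the project's actual code, and the synthetic S(Q) is illustrative only):

```python
import numpy as np

def pdf_from_sq(q, sq, r):
    """Reduced pair distribution function from S(Q):
        G(r) = (2/pi) * integral_0^Qmax  Q [S(Q) - 1] sin(Q r) dQ
    evaluated by trapezoidal quadrature on the grid q."""
    fq = q * (sq - 1.0)                       # F(Q) = Q [S(Q) - 1]
    dq = np.diff(q)
    g = []
    for ri in r:
        y = fq * np.sin(q * ri)
        g.append((2.0 / np.pi) * np.sum(0.5 * (y[1:] + y[:-1]) * dq))
    return np.array(g)

# Synthetic check: F(Q) = sin(2Q) corresponds to a single pair distance
# near r = 2, so G(r) should peak there (finite-Qmax ripples aside).
q = np.linspace(1e-3, 30.0, 3000)
sq = 1.0 + np.sin(2.0 * q) / q
r = np.linspace(0.5, 4.0, 351)
g = pdf_from_sq(q, sq, r)
peak = r[np.argmax(g)]
```

Because the data extend only to a finite Qmax, the transform broadens each pair distance into a sinc-like peak — which is exactly why the method probes the nm-range "intermediate" length scales discussed above.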

  17. Ethnographic analysis: a study of classroom environments.

    Science.gov (United States)

    Griswold, L A

    1994-05-01

Occupational therapists assess and adapt an environment to enhance clients' abilities to function. Therapists working in schools may assess several classroom environments in a week. Identifying relevant information in an efficient manner is essential yet presents a challenge for school therapists. In this study, ethnographic research methodology was used to analyze the plethora of data gained from observations in eight classrooms. Three major categories were identified to structure observations: activities, people, and communication. These categories were used to compile a Classroom Observation Guide that gives therapists relevant questions to ask in each category. Using the Classroom Observation Guide, occupational therapists can recommend classroom activities that suit a particular teacher's style. For example, working with a teacher who prefers structured activities with clear time and space boundaries for one specific purpose, a therapist might suggest organized sensorimotor games with a distinct purpose to be carried out for a given time period.

  18. Collaborative Virtual Environments as Means to Increase the Level of Intersubjectivity in a Distributed Cognition System

    Science.gov (United States)

    Ligorio, M. Beatrice; Cesareni, Donatella; Schwartz, Neil

    2008-01-01

    Virtual environments are able to extend the space of interaction beyond the classroom. In order to analyze how distributed cognition functions in such an extended space, we suggest focusing on the architecture of intersubjectivity. The Euroland project--a virtual land created and populated by seven classrooms supported by a team of…

  19. Framing and Enhancing Distributed Leadership in the Quality Management of Online Learning Environments in Higher Education

    Science.gov (United States)

    Holt, Dale; Palmer, Stuart; Gosper, Maree; Sankey, Michael; Allan, Garry

    2014-01-01

    This article reports on the findings of senior leadership interviews in a nationally funded project on distributed leadership in the quality management of online learning environments (OLEs) in higher education. Questions were framed around the development of an OLE quality management framework and the situation of the characteristics of…

  20. Instructional Designers' Media Selection Practices for Distributed Problem-Based Learning Environments

    Science.gov (United States)

    Fells, Stephanie

    2012-01-01

    The design of online or distributed problem-based learning (dPBL) is a nascent, complex design problem. Instructional designers are challenged to effectively unite the constructivist principles of problem-based learning (PBL) with appropriate media in order to create quality dPBL environments. While computer-mediated communication (CMC) tools and…

  1. Corroded scale analysis from water distribution pipes

    Directory of Open Access Journals (Sweden)

    Rajaković-Ognjanović Vladana N.

    2011-01-01

Full Text Available The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed, with the aim of improving control of corrosion processes and preventing the impact of corrosion on water quality degradation. The instrumental methods used for corrosion scale characterization were: scanning electron microscopy (SEM) for surface and microstructural investigation of the corroded scales, X-ray diffraction (XRD) for analysis of the solid phases present inside the scales, and the BET adsorption isotherm for surface area determination. Depending on the composition of the water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in drinking water distribution pipes depend on the type of metal and the composition of the aqueous phase. Their formation is probably governed by several factors, including water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity, as well as water treatment practices such as the application of corrosion inhibitors, can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts.
Iron scales have characteristic features that include: corroded floor, porous core that contains

  2. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

Full Text Available Power distribution systems are basic parts of power systems, and their reliability is at present a key issue for power engineering development, requiring special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.

  3. THE ENVIRONMENT OF REGIONAL DEVELOPMENT FINANCIAL ANALYSIS

    OpenAIRE

    Bechis Liviu; MOSCVICIOV Andrei

    2012-01-01

The paper presents the difference between the two concepts of regionalism and regionalization. It also presents three types of regionalism analysis, depending on the dimension and nature of the relations: regionalism at the national level, transnational regionalism, and international regionalism.

  4. On delay adjustment for dynamic load balancing in distributed virtual environments.

    Science.gov (United States)

    Deng, Yunhua; Lau, Rynson W H

    2012-04-01

Distributed virtual environments (DVEs) have become very popular in recent years, due to the rapid growth of applications such as massively multiplayer online games (MMOGs). As the number of concurrent users increases, scalability becomes one of the major challenges in designing an interactive DVE system. One solution to this scalability problem is to adopt a multi-server architecture. While some methods focus on the quality of partitioning the load among the servers, others focus on the efficiency of the partitioning process itself. However, all these methods neglect the effect of network delay among the servers on the accuracy of the load balancing solutions. As we show in this paper, the change in server load during the network delay affects the performance of the load balancing algorithm. In this work, we conduct a formal analysis of this problem and discuss two efficient delay adjustment schemes to address it. Our experimental results show that our proposed schemes can significantly improve the performance of the load balancing algorithm with negligible computation overhead.
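
The core problem — load reports are already stale when the partitioner uses them — can be illustrated with a simple linear-extrapolation compensation followed by a greedy rebalance. This is a hypothetical sketch, not the two schemes proposed in the paper; all names are illustrative.

```python
def delay_adjusted_load(history, delay):
    """Estimate a server's load 'now' from delayed reports.

    history: list of (timestamp, reported_load), oldest first; the newest
    report is `delay` seconds old by the time it reaches the balancer.
    Uses a linear trend from the last two reports (illustrative scheme).
    """
    (t0, l0), (t1, l1) = history[-2], history[-1]
    rate = (l1 - l0) / (t1 - t0)      # load change per second
    return l1 + rate * delay          # extrapolate across the network delay

def rebalance(loads, capacity):
    """Greedy pass: move excess load from overloaded to underloaded servers.
    Returns a list of (from_server, to_server, amount) moves."""
    excess = {s: l - capacity for s, l in loads.items() if l > capacity}
    spare = {s: capacity - l for s, l in loads.items() if l < capacity}
    moves = []
    for s, e in excess.items():
        for t in list(spare):
            take = min(e, spare[t])
            if take > 0:
                moves.append((s, t, take))
                spare[t] -= take
                e -= take
            if e <= 0:
                break
    return moves

# A server reporting rising load (10 -> 12 over 1 s) with a 0.5 s delay
# is likely already at ~13 by the time the balancer acts on the report.
est = delay_adjusted_load([(0.0, 10.0), (1.0, 12.0)], delay=0.5)
moves = rebalance({"s1": 130.0, "s2": 60.0, "s3": 90.0}, capacity=100.0)
```

Without the adjustment, the balancer would shed load based on the stale value 12 and systematically under-correct servers whose load is trending upward.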

  5. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  6. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  7. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end specific parameters from a single interface and to run the same custom user analysis, unchanged, in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  8. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    Science.gov (United States)

    2017-12-01

Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment, by Justin K. Davis, Lieutenant.

  9. Scene analysis in the natural environment

    DEFF Research Database (Denmark)

    Lewicki, Michael S; Olshausen, Bruno A; Surlykke, Annemarie

    2014-01-01

    The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to important insights into problems of scene analysis, but not all of these insights are widely appreciated, and there remain critical shortcomings in current approaches th...... ill-posed problems, (2) the ability to integrate and store information across time and modality, (3) efficient recovery and representation of 3D scene structure, and (4) the use of optimal motor actions for acquiring information to progress toward behavioral goals....

  10. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    Science.gov (United States)

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.
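
The lottery-competition-on-a-network model described above can be caricatured in a few lines: each site's recruit is drawn either locally or from neighbouring sites (dispersal), with a fitness bonus for the species matched to the local environment (species sorting). This is a deliberately minimal sketch, not the authors' simulation; all names and parameters are illustrative.

```python
import random

def lottery_step(patches, env, graph, m, w):
    """One generation of lottery competition for space.

    patches: node -> species id currently holding that site (one site/patch).
    env:     node -> locally adapted species; that species gets weight 1 + w
             in the recruitment lottery (species sorting).
    graph:   node -> list of neighbouring nodes (dispersal network).
    m:       probability the recruit is drawn from neighbours (dispersal)
             rather than from the local occupant (mass effect vs. drift).
    """
    new = {}
    for node in patches:
        if random.random() < m and graph[node]:
            pool = [patches[nb] for nb in graph[node]]   # immigrant recruits
        else:
            pool = [patches[node]]                       # local recruit
        weights = [1.0 + (w if sp == env[node] else 0.0) for sp in pool]
        new[node] = random.choices(pool, weights=weights, k=1)[0]
    return new

# Two sites, each occupied by its locally adapted species:
no_dispersal = lottery_step({0: "A", 1: "B"}, {0: "A", 1: "B"},
                            {0: [1], 1: [0]}, m=0.0, w=0.5)
```

Iterating this step while varying `m` and the network topology is enough to reproduce the qualitative effect reported above: high dispersal into a central node keeps pushing maladapted migrants in, so occupancy there decouples from the environment.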

  11. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    Directory of Open Access Journals (Sweden)

    Dexiecuo Ai

Full Text Available The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.

  12. silicon bipolar distributed oscillator design and analysis

    African Journals Online (AJOL)

digital and analogue market, wired or wireless, is making it necessary to operate ... is generally high; this additional power is supplied by the external dc source. ... distributed oscillator consists of a pair of transmission lines with characteristic ...

  13. FAME, the flux analysis and modelling environment

    NARCIS (Netherlands)

    Boele, J.; Olivier, B.G.; Teusink, B.

    2012-01-01

    Background: The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our

  14. Distributed energy store railguns experiment and analysis

    International Nuclear Information System (INIS)

    Holland, L.D.

    1984-01-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. The distributed energy store railgun used multiple current sources connected to the rails of a railgun at points distributed along the bore. These current sources (energy stores) are turned on in sequence as the projectile moves down the bore so that current is fed to the railgun from behind the armature. In this system the length of the rails that carry the full armature current is less than the total length of the railgun. If a sufficient number of energy stores is used, this removes the limitation on the length of a railgun. An additional feature of distributed energy store type railguns is that they can be designed to maintain a constant pressure on the projectile being accelerated. A distributed energy store railgun was constructed and successfully operated. In addition to this first demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed
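
The claimed benefit — only the rail span behind the armature back to the most recently fired store carries the full current — can be seen in a toy lumped-parameter simulation. The constant-current assumption and all parameter values below are illustrative, not taken from the experiment.

```python
import bisect

def simulate_railgun(tap_positions, I=300e3, L_prime=0.4e-6, R_prime=1e-4,
                     mass=0.1, length=1.0, dt=1e-6):
    """Toy model of a distributed energy store railgun.

    The active store holds armature current I constant, giving a constant
    accelerating force F = 0.5 * L' * I**2; only the rail between the
    armature and the last store it passed carries full current and
    dissipates I**2 * R.  Returns (muzzle_velocity m/s, ohmic_loss J).
    tap_positions: sorted store locations along the bore, starting at 0.0.
    """
    x, v, loss = 0.0, 0.0, 0.0
    force = 0.5 * L_prime * I**2
    while x < length:
        i = bisect.bisect_right(tap_positions, x) - 1   # last store passed
        live_rail = x - tap_positions[i]                # span at full current
        loss += I**2 * (2.0 * R_prime * live_rail) * dt  # two rails
        v += (force / mass) * dt                        # semi-implicit Euler
        x += v * dt
    return v, loss

v1, e1 = simulate_railgun([0.0])                       # single breech store
v4, e4 = simulate_railgun([0.0, 0.25, 0.5, 0.75])      # four distributed stores
```

Both configurations reach the same muzzle velocity (the force model is identical), but the distributed-store version dissipates a substantially smaller rail loss because the full-current span is reset at every tap — the scaling argument made in the abstract.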

  15. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  16. Distribution coefficients for plutonium and americium on particulates in aquatic environments

    International Nuclear Information System (INIS)

    Sanchez, A.L.; Schell, W.R.; Sibley, T.H.

    1982-01-01

The distribution coefficients of two transuranic elements, plutonium and americium, were measured experimentally in laboratory systems representing selected freshwater, estuarine, and marine environments, using the gamma-ray emitting isotopes 237Pu and 241Am. The desorption Kd values were significantly greater than the sorption Kd values, suggesting some irreversibility in the sorption of these radionuclides onto sediments. The effects of pH and of sediment concentration on the distribution coefficients were also investigated; there were significant changes in the Kd values as these parameters were varied. Experiments using sterilized and nonsterilized samples for some of the sediment/water systems indicate possible bacterial effects on Kd values. (author)
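
For reference, the distribution coefficient itself is a simple ratio, conventionally Kd = (activity per gram of sediment) / (activity per millilitre of water). The batch-experiment sketch below uses the standard formula; the numbers are hypothetical, not the paper's data.

```python
def distribution_coefficient(c_solid, c_liquid):
    """Kd in mL/g: (activity per g sediment) / (activity per mL water)."""
    return c_solid / c_liquid

def kd_from_totals(a_total, a_liquid_final, mass_g, volume_ml):
    """Kd from a batch sorption experiment: total activity added to the
    system, activity remaining in solution at equilibrium, sediment mass,
    and solution volume.  Assumes all missing activity is on the sediment."""
    sorbed_per_g = (a_total - a_liquid_final) / mass_g   # activity on solid
    dissolved_per_ml = a_liquid_final / volume_ml        # activity in water
    return distribution_coefficient(sorbed_per_g, dissolved_per_ml)

# 1000 units of activity, 100 left in 100 mL of water over 1 g of sediment:
kd = kd_from_totals(1000.0, 100.0, mass_g=1.0, volume_ml=100.0)
```

Sorption/desorption irreversibility of the kind reported above shows up as a desorption Kd (measured by resuspending loaded sediment in clean water) exceeding this sorption Kd.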

  17. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model during which traces of events and states are automatically generated and evaluated against the requirements....

  18. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    Science.gov (United States)

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  19. Analysis of mixed mode microwave distribution manifolds

    International Nuclear Information System (INIS)

    White, T.L.

    1982-09-01

The 28-GHz microwave distribution manifold used in the ELMO Bumpy Torus-Scale (EBT-S) experiments consists of a toroidal metallic cavity, whose dimensions are much greater than a wavelength, fed by a source of microwave power. Equalization of the mixed mode power distribution to the 24 cavities of EBT-S is accomplished by empirically adjusting the coupling irises which are equally spaced around the manifold. The performance of the manifold to date has been very good, yet no analytical models exist for optimizing manifold transmission efficiency or for scaling this technology to the EBT-P manifold design. The present report develops a general model for mixed mode microwave distribution manifolds based on isotropic plane wave sources of varying amplitudes that are distributed toroidally around the manifold. The calculated manifold transmission efficiency for the most recent EBT-S coupling iris modification is 90%, which agrees with the average measured transmission efficiency. The model also predicts the coupling iris areas required to balance the distribution of microwave power while maximizing transmission efficiency, and losses in the waveguide feeds connecting the irises to the cavities of EBT are calculated using an approach similar to the calculation of manifold losses. The model will be used to evaluate EBT-P manifold designs.

  20. Widening the scope of incident analysis in complex work environments

    NARCIS (Netherlands)

    Vuuren, van W.; Kanse, L.; Manser, T.

    2003-01-01

Incident analysis is commonly used as a tool to provide information on how to improve health and safety in complex work environments. The effectiveness of this tool, however, depends on how the analysis is actually carried out. The traditional view on incident analysis highlights the role of human

  1. Distribution

    Science.gov (United States)

    John R. Jones

    1985-01-01

    Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....

  2. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  3. Distributed energy store railguns experiment and analysis

    Science.gov (United States)

    Holland, L. D.

    1984-02-01

    Electromagnetic acceleration of projectiles holds the potential for achieving higher velocities than yet achieved by any other means. A railgun is the simplest form of electromagnetic macroparticle accelerator and can generate the highest sustained accelerating force. The practical length of conventional railguns is limited by the impedance of the rails because current must be carried along the entire length of the rails. A railgun and power supply system called the distributed energy store railgun was proposed as a solution to this limitation. A distributed energy storage railgun was constructed and successfully operated. In addition to this demonstration of the distributed energy store railgun principle, a theoretical model of the system was also constructed. A simple simulation of the railgun system based on this model, but ignoring frictional drag, was compared with the experimental results. During the process of comparing results from the simulation and the experiment, the effect of significant frictional drag of the projectile on the sidewalls of the bore was observed.
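The simple simulation mentioned above (lumped-parameter railgun model, frictional drag ignored) can be sketched as follows. All parameters here are hypothetical round numbers, not values from the experiment; the accelerating force on the armature is the standard F = ½ L′I², with L′ the inductance gradient of the rails and a constant-current approximation standing in for the staged energy stores.

```python
# Minimal lumped-parameter railgun sketch (hypothetical parameters,
# frictional drag ignored, constant-current approximation).
# Accelerating force on the armature: F = 0.5 * L' * I^2.

def simulate_railgun(mass=0.01, L_prime=0.4e-6, current=300e3,
                     rail_length=2.0, dt=1e-6):
    """Integrate x'' = 0.5 * L' * I^2 / m until the projectile exits."""
    x, v, t = 0.0, 0.0, 0.0
    a = 0.5 * L_prime * current**2 / mass   # constant acceleration here
    while x < rail_length:
        v += a * dt
        x += v * dt
        t += dt
    return v, t

v_exit, t_exit = simulate_railgun()
print(f"muzzle velocity ~ {v_exit:.0f} m/s after {t_exit * 1e3:.2f} ms")
```

With these illustrative numbers the constant-current model gives a muzzle velocity of a few km/s; a real distributed-store simulation would switch the current profile as each energy store is triggered.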

  4. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and luminosity increase. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of hybrid waves HE{sub 1} and HM{sub 1}. Criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with numerical simulations.

  5. Field distribution analysis in deflecting structures

    International Nuclear Information System (INIS)

    Paramonov, V.V.

    2013-02-01

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, bunch diagnostics and luminosity increase. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear additions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of hybrid waves HE₁ and HM₁. Criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with numerical simulations.

  6. Stock-environment recruitment analysis for Namibian Cape hake ...

    African Journals Online (AJOL)

    Stock-environment recruitment analysis for Namibian Cape hake Merluccius capensis. ... The factors modulating recruitment success of Cape hake Merluccius capensis in Namibian waters are still unresolved. ... AJOL African Journals Online.

  7. Analysis Of Aspects Of Messages Hiding In Text Environments

    Directory of Open Access Journals (Sweden)

    Afanasyeva Olesya

    2015-09-01

    Full Text Available In the work are researched problems, which arise during hiding of messages in text environments, being transmitted by electronic communication channels and the Internet. The analysis of selection of places in text environment (TE, which can be replaced by word from the message is performed. Selection and replacement of words in the text environment is implemented basing on semantic analysis of text fragment, consisting of the inserted word, and its environment in TE. For implementation of such analysis is used concept of semantic parameters of words coordination and semantic value of separate word. Are used well-known methods of determination of values of these parameters. This allows moving from quality level to quantitative level analysis of text fragments semantics during their modification by word substitution. Invisibility of embedded messages is ensured by providing preset values of the semantic cooperation parameter deviations.

  8. Statistical analysis of partial reduced width distributions

    International Nuclear Information System (INIS)

    Tran Quoc Thuong.

    1973-01-01

    The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a chi-squared (χ²) law, and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model. [fr]
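The method-of-moments estimate of ν can be sketched numerically. For reduced widths normalized to unit mean, a χ² law with ν degrees of freedom has variance 2/ν, so ν̂ = 2 / (sample variance). The sketch below uses synthetic Porter-Thomas data (ν = 1, widths are squares of standard normal amplitudes), not experimental widths:

```python
import random
import statistics

# Method-of-moments estimate of the chi-squared degrees of freedom (nu)
# from a sample of reduced widths. For widths normalized to unit mean,
# a chi-squared law with nu degrees of freedom has variance 2/nu.

def estimate_nu(widths):
    mean = statistics.fmean(widths)
    normalized = [w / mean for w in widths]
    return 2.0 / statistics.variance(normalized)

# Porter-Thomas case (nu = 1): a reduced width is the square of a
# standard normal amplitude (synthetic data, fixed seed).
random.seed(0)
sample = [random.gauss(0.0, 1.0) ** 2 for _ in range(20000)]
print(f"estimated nu = {estimate_nu(sample):.2f}")  # close to 1
```

A maximum-likelihood fit of the gamma shape parameter would refine this, at the cost of an iterative solve; the moments estimator already recovers ν ≈ 1 for this sample size.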

  9. Distributed service-based approach for sensor data fusion in IoT environments.

    Science.gov (United States)

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments.
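The fusion step itself can be illustrated with a common rule, inverse-variance weighting, which is a stand-in here and not necessarily the composition model of the paper. Each sensor "service" reports a value with a variance; the fusion node combines them (all readings below are hypothetical, and real IoT nodes would deliver them over the network):

```python
# Sketch of a fusion step: combine sensor readings by inverse-variance
# weighting. Each reading is a (value, variance) pair; the fused variance
# is the harmonic combination of the individual variances.

def fuse(readings):
    """readings: list of (value, variance) -> fused (value, variance)."""
    weights = [1.0 / var for _, var in readings]
    fused_var = 1.0 / sum(weights)
    fused_val = fused_var * sum(w * v for w, (v, _) in zip(weights, readings))
    return fused_val, fused_var

# Three temperature sensors in the same room, different accuracies.
value, var = fuse([(21.0, 0.5), (22.0, 1.0), (20.5, 0.25)])
print(f"fused temperature = {value:.2f} +/- {var ** 0.5:.2f}")
```

The more accurate sensor (variance 0.25) dominates the fused estimate, and the fused variance is smaller than any single sensor's, which is the point of fusing in the first place.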

  10. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea , Mihaela; Iliadis , Lazaros

    2011-01-01

    Part 20: Informatics and Intelligent Systems Applications for Quality of Life information Services (ISQLIS) Workshop; International audience; The paper describes an environment quality analysis system based on a combination of artificial intelligence techniques: artificial neural networks and rule-based expert systems. Two case studies of the system's use are discussed: air pollution analysis and flood forecasting, with their impact on the environment and on population health. The syste...

  11. RECOMMENDATIONS FOR THE DISTRIBUTION STRATEGY IN CHANGING MARKET ENVIRONMENT : Case: Belgian Brewery Van Honsebrouck in Russia

    OpenAIRE

    Louckx, Yulia

    2014-01-01

    The efficient formulation of a distribution strategy becomes vital to the success and survival of any organization, especially one involved in international trade. Today’s world is particularly challenging due to rapidly changing market conditions. Therefore, in order to be able to compete, satisfy customers, and meet the needs of other stakeholders profitably, it is crucial for any company to make profound market environment analyses, react to changes in the market and adjust strategies accor...

  12. Design for Distributed Moroccan Hospital Pharmacy Information Environment with Service Oriented Architecture

    OpenAIRE

    Omrana, Hajar; Nassiri, Safae; Belouadha, Fatima-Zahra; Roudiés, Ounsa

    2012-01-01

    In the last five years, Moroccan e-health system has focused on improving the quality of patient care services by making use of advanced Information and Communications Technologies (ICT) solutions. In actual fact, achieving runtime and efficient information sharing, through large-scale distributed environments such as e-health system, is not a trivial task. It seems to present many issues due to the heterogeneity and complex nature of data resources. This concerns, in particular, Moroccan Hos...

  13. Investigation on Oracle GoldenGate Veridata for Data Consistency in WLCG Distributed Database Environment

    OpenAIRE

    Asko, Anti; Lobato Pardavila, Lorena

    2014-01-01

    Abstract In a distributed database environment, data divergence can be an important problem: if it is not discovered and correctly identified, incorrect data can lead to poor decision making, service errors and operational errors. Oracle GoldenGate Veridata is a product to compare two sets of data and to identify and report on data that is out of synchronization. IT DB is providing a replication service between databases at CERN and other computer centers worldwide as a par...

  14. Acquisitions Everywhere: Modeling an Acquisitions Data Standard to Connect a Distributed Environment

    OpenAIRE

    Hanson, Eric M.; Lightcap, Paul W.; Miguez, Matthew R.

    2016-01-01

    Acquisitions functions remain operationally crucial in providing access to paid information resources, but data formats and workflows utilized within library acquisitions remain primarily within the traditional integrated library system (ILS). As libraries have evolved to use distributed systems to manage information resources, so too must acquisitions functions adapt to an environment that may include the ILS, e‐resource management systems (ERMS), institutional repositories (IR), and other d...

  15. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). The WND is motivated by the CLT in R³ and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
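A Monte Carlo draw in the spirit of the "CLT in R³ mapped to SO(3)" construction can be sketched as follows: sample a rotation vector from an isotropic normal law in R³ and map it to a rotation matrix by the exponential map (Rodrigues' formula). The spread parameter eps is a hypothetical choice, and this is an illustration of the mapping, not the paper's exact WND algorithm:

```python
import math
import random

# Draw a random rotation matrix: rotation vector ~ N(0, eps^2 I) in R^3,
# mapped to SO(3) via the axis-angle exponential map (Rodrigues' formula).

def random_rotation(eps=0.3):
    wx, wy, wz = (random.gauss(0.0, eps) for _ in range(3))
    theta = math.sqrt(wx * wx + wy * wy + wz * wz)
    if theta < 1e-12:
        return [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    kx, ky, kz = wx / theta, wy / theta, wz / theta
    c, s, v = math.cos(theta), math.sin(theta), 1 - math.cos(theta)
    return [  # Rodrigues' rotation formula
        [c + kx * kx * v,     kx * ky * v - kz * s, kx * kz * v + ky * s],
        [ky * kx * v + kz * s, c + ky * ky * v,     ky * kz * v - kx * s],
        [kz * kx * v - ky * s, kz * ky * v + kx * s, c + kz * kz * v],
    ]

random.seed(1)
R = random_rotation()
# Sanity check: rotation matrices have determinant +1.
det = sum(R[0][i] * (R[1][(i + 1) % 3] * R[2][(i + 2) % 3]
                     - R[1][(i + 2) % 3] * R[2][(i + 1) % 3])
          for i in range(3))
print(f"det(R) = {det:.6f}")
```

Repeating the draw many times yields samples concentrated near the identity for small eps, which is the qualitative behaviour expected of a normal law on SO(3).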

  16. Distributed crack analysis of ceramic inlays

    NARCIS (Netherlands)

    Peters, M.C.R.B.; Vree, de J.H.P.; Brekelmans, W.A.M.

    1993-01-01

    In all-ceramic restorations, crack formation and propagation phenomena are of major concern, since they may result in intra-oral fracture. The objective of this study was calculation of damage in porcelain MOD inlays by utilization of a finite-element (FE) implementation of the distributed crack

  17. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA; GENTON, MARC G.; LISEO, BRUNERO

    2012-01-01

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric

  18. Analysis of refrigerant mal-distribution

    DEFF Research Database (Denmark)

    Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    to be two straight tubes. The refrigerant mal-distribution is then induced in the evaporator by varying the vapor quality at the inlet to each tube and the air flow across each tube. Finally, it is shown that mal-distribution can be compensated by an intelligent distributor that ensures equal superheat...

  19. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    International Nuclear Information System (INIS)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-01-01

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures which allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. The win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment

  20. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-05-17

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures which allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. The win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.

  1. Distance Determination Method for Normally Distributed Obstacle Avoidance of Mobile Robots in Stochastic Environments

    Directory of Open Access Journals (Sweden)

    Jinhong Noh

    2016-04-01

    Full Text Available Obstacle avoidance methods require knowledge of the distance between a mobile robot and obstacles in the environment. However, in stochastic environments, distance determination is difficult because objects have position uncertainty. The purpose of this paper is to determine the distance between a robot and obstacles represented by probability distributions. Distance determination for obstacle avoidance should consider position uncertainty, computational cost and collision probability. The proposed method considers all of these conditions, unlike conventional methods. It determines the obstacle region using the collision probability density threshold. Furthermore, it defines a minimum distance function to the boundary of the obstacle region with a Lagrange multiplier method. Finally, it computes the distance numerically. Simulations were executed in order to compare the performance of the distance determination methods. Our method demonstrated a faster and more accurate performance than conventional methods. It may help overcome position uncertainty issues pertaining to obstacle avoidance, such as low accuracy sensors, environments with poor visibility or unpredictable obstacle motion.
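One concrete realization of the distance determination described above, for an axis-aligned Gaussian obstacle, can be sketched as follows. Thresholding the collision probability density yields an ellipsoidal obstacle region; the Lagrange condition x_i = mu_i + (p_i - mu_i)·a_i²/(a_i² + λ) reduces the minimum-distance problem to a one-dimensional root-find in the multiplier λ. All numbers are hypothetical and this is a sketch of the idea, not the paper's implementation:

```python
import math

# Minimum distance from robot position p to the boundary of an ellipsoidal
# obstacle region (center mu, semi-axes a_i) obtained by thresholding a
# Gaussian position density. Solved via the Lagrange multiplier condition.

def distance_to_ellipsoid(p, mu, semi_axes):
    d = [pi - mi for pi, mi in zip(p, mu)]
    if sum((di / ai) ** 2 for di, ai in zip(d, semi_axes)) <= 1.0:
        return 0.0  # already inside the obstacle region
    g = lambda lam: sum((di * ai / (ai * ai + lam)) ** 2
                        for di, ai in zip(d, semi_axes)) - 1.0
    lo, hi = 0.0, 1.0
    while g(hi) > 0.0:          # bracket the root in lam
        hi *= 2.0
    for _ in range(80):         # bisection on the multiplier
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) > 0.0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    closest = [mi + di * ai * ai / (ai * ai + lam)
               for mi, di, ai in zip(mu, d, semi_axes)]
    return math.dist(p, closest)

# Circular region of radius 1 centered at the origin: for p = (3, 0)
# the distance to the boundary must be |p| - 1 = 2.
dist = distance_to_ellipsoid((3.0, 0.0), (0.0, 0.0), (1.0, 1.0))
print(f"{dist:.4f}")
```

The circular test case has a closed-form answer, which makes it a convenient check; for anisotropic semi-axes the same bisection applies unchanged.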

  2. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There is a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent the power system operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles are presented. The analysis is based on the conditions for both a radial and a meshed distribution network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden.

  3. Economic analysis of efficient distribution transformer trends

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; McConnell, B.W.; Barnes, P.R.; Hadley, S.W.; Van Dyke, J.W.

    1998-03-01

    This report outlines an approach that will account for uncertainty in the development of evaluation factors used to identify transformer designs with the lowest total owning cost (TOC). The TOC methodology is described and the most highly variable parameters are discussed. The model is developed to account for uncertainties as well as statistical distributions for the important parameters. Sample calculations are presented. The TOC methodology is applied to data provided by two utilities in order to test its validity.
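The TOC approach with uncertainty can be sketched with a Monte Carlo calculation. A standard form of the total owning cost is TOC = purchase price + A·(no-load loss) + B·(load loss), with A and B evaluation factors in $/W; here they are drawn from assumed normal distributions instead of fixed point values. All prices, losses and distribution parameters below are hypothetical illustrations, not the report's data:

```python
import random
import statistics

# Monte Carlo total owning cost: TOC = price + A * no-load loss + B * load
# loss, with the A and B evaluation factors ($/W) drawn from assumed
# distributions to represent parameter uncertainty.

def toc_samples(price, no_load_w, load_w, n=10000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        A = rng.gauss(3.0, 0.5)   # $/W of no-load (core) loss, assumed
        B = rng.gauss(1.0, 0.2)   # $/W of load (winding) loss, assumed
        samples.append(price + A * no_load_w + B * load_w)
    return samples

# Cheap-but-lossy design vs. efficient-but-more-expensive design.
lossy = toc_samples(price=500.0, no_load_w=100.0, load_w=400.0)
efficient = toc_samples(price=700.0, no_load_w=60.0, load_w=250.0)
print(f"mean TOC: lossy={statistics.fmean(lossy):.0f}, "
      f"efficient={statistics.fmean(efficient):.0f}")
```

Beyond the means, the sample lists give full TOC distributions, so one can also compare, say, the probability that one design ends up cheaper than the other under the assumed uncertainty.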

  4. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel-framed regular building structures. The emphasis in this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria - Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case-study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment-resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response, and that it is conservative to ignore the effects of distributed plasticity in determining the peak displacement response under the notionally removed column.

  5. The role of distributed generation (DG) in a restructured utility environment

    International Nuclear Information System (INIS)

    Feibus, H.

    1999-01-01

    A major consequence of the restructuring of the electric utility industry is disintegration, by which the traditional integrated utility is spinning off its generation business and becoming a power distribution company, or distco. This company will be the remaining entity of the traditional electric utility that continues to be regulated. The world in which the distco functions is becoming a very different place. The distco will be called upon to deliver not only power, but a range of ancillary services, defined by the Federal Energy Regulatory Commission, including spinning reserves, voltage regulation, reactive power, energy imbalance and network stability, some of which may be obtained from the independent system operator, and some of which may be provided by the distco. In this environment the distco must maintain system reliability and provide service to the customer at the least cost. Meanwhile, restructuring is spawning a new generation of unregulated energy service companies that threaten to win the most attractive customers from the distco. Fortunately there is a new emerging generation of technologies, distributed resources, that provide options to the distco to help retain prime customers, by improving reliability and lowering costs. Specifically, distributed generation and storage systems if dispersed into the distribution system can provide these benefits, if generators with the right characteristics are selected, and the integration into the distribution system is done skillfully. The Electric Power Research Institute has estimated that new distributed generation may account for 30% of new generation. This presentation will include the characteristics of several distributed resources and identify potential benefits that can be obtained through the proper integration of distributed generation and storage systems

  6. The clinical learning environment in nursing education: a concept analysis.

    Science.gov (United States)

    Flott, Elizabeth A; Linden, Lois

    2016-03-01

    The aim of this study was to report an analysis of the clinical learning environment concept. Nursing students are evaluated in clinical learning environments where skills and knowledge are applied to patient care. These environments affect achievement of learning outcomes, and have an impact on preparation for practice and student satisfaction with the nursing profession. Providing clarity of this concept for nursing education will assist in identifying antecedents, attributes and consequences affecting student transition to practice. The clinical learning environment was investigated using Walker and Avant's concept analysis method. A literature search was conducted using WorldCat, MEDLINE and CINAHL databases using the keywords clinical learning environment, clinical environment and clinical education. Articles reviewed were written in English and published in peer-reviewed journals between 1995-2014. All data were analysed for recurring themes and terms to determine possible antecedents, attributes and consequences of this concept. The clinical learning environment contains four attribute characteristics affecting student learning experiences. These include: (1) the physical space; (2) psychosocial and interaction factors; (3) the organizational culture and (4) teaching and learning components. These attributes often determine achievement of learning outcomes and student self-confidence. With better understanding of attributes comprising the clinical learning environment, nursing education programmes and healthcare agencies can collaborate to create meaningful clinical experiences and enhance student preparation for the professional nurse role. © 2015 John Wiley & Sons Ltd.

  7. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Full Text Available Modern power systems consist of many interconnected synchronous generators with different inertia constants, connected to a large transmission network, with an ever-increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services, which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.
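At the core of Pre-Fault/During-Fault/Post-Fault and Swing Curve services of this kind is the classical swing equation. A minimal single-machine-infinite-bus sketch is given below, with simple Euler integration and different peak power transfer before, during and after the fault; all per-unit values are hypothetical and this illustrates the physics, not the paper's JAX-RPC implementation:

```python
import math

# Swing-curve sketch: single machine vs. infinite bus. The electrical power
# is Pmax * sin(delta), where Pmax changes at fault inception and clearing.
# M = H / (pi * f) is the inertia coefficient; Pm is the mechanical input.

def swing_curve(Pm=0.8, Pmax_pre=2.0, Pmax_fault=0.5, Pmax_post=1.5,
                t_clear=0.15, H=5.0, f=50.0, dt=1e-3, t_end=1.0):
    M = H / (math.pi * f)
    delta = math.asin(Pm / Pmax_pre)   # pre-fault operating angle
    omega = 0.0
    curve, t = [], 0.0
    while t <= t_end:
        Pmax = Pmax_fault if t < t_clear else Pmax_post
        accel = (Pm - Pmax * math.sin(delta)) / M
        omega += accel * dt
        delta += omega * dt
        curve.append((t, delta))
        t += dt
    return curve

curve = swing_curve()
max_delta = max(d for _, d in curve)
print(f"max rotor angle = {math.degrees(max_delta):.1f} deg")
```

For these values the rotor angle swings up after the fault and then oscillates below 90 degrees, i.e. the case is transiently stable; lengthening t_clear past the critical clearing time would make the angle diverge instead.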

  8. The physics analysis environment of the ZEUS experiment

    International Nuclear Information System (INIS)

    Bauerdick, L.A.T.; Derugin, O.; Gilkinson, D.; Kasemann, M.; Manczak, O.

    1995-12-01

    The ZEUS Experiment has over the last three years developed its own model of the central computing environment for physics analysis. This model has been designed to provide ZEUS physicists with powerful and user friendly tools for data analysis as well as to be truly scalable and open. (orig.)

  9. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and extensibility for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
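The tile / process / stitch pattern itself, independent of IQLib, can be sketched in a few lines: split a raster into fixed-size tiles, run a per-tile operation on a worker pool, then stitch the results back into place. The per-tile threshold below is a trivial stand-in for a real geospatial operator, and the raster is a small synthetic grid:

```python
from concurrent.futures import ThreadPoolExecutor

# Divide-and-conquer raster processing: tile, process in parallel, stitch.
# The raster is a list of rows; each tile carries its (row, col) origin so
# the results can be stitched back regardless of completion order.

def split_into_tiles(raster, tile):
    rows, cols = len(raster), len(raster[0])
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            yield r, c, [row[c:c + tile] for row in raster[r:r + tile]]

def process_tile(job):
    r, c, data = job   # stand-in operator: binary threshold at 5
    return r, c, [[1 if v > 5 else 0 for v in row] for row in data]

def stitch(shape, results):
    out = [[0] * shape[1] for _ in range(shape[0])]
    for r, c, data in results:
        for i, row in enumerate(data):
            out[r + i][c:c + len(row)] = row
    return out

raster = [[row * 4 + col for col in range(4)] for row in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_tile, split_into_tiles(raster, 2)))
binary = stitch((4, 4), results)
print(binary)
```

In a real deployment the workers would be remote nodes and the tiles large raster chunks, but the bookkeeping (tile origin travels with the tile so stitching is order-independent) is the same.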

  10. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and extensibility for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  11. A Java based environment to control and monitor distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.

    1997-01-01

    Distributed processing systems are considered to solve the challenging requirements of triggering and data acquisition systems for future HEP experiments. The aim of this work is to present a software environment to control and monitor large-scale parallel processing systems, based on a distributed client-server approach developed in Java. One server task may control several processing nodes, switching elements or controllers for different sub-systems. Servers are designed as multi-threaded applications for efficient communication with other objects. Servers communicate among themselves by using Remote Method Invocation (RMI) in a peer-to-peer mechanism. This distributed server layer has to provide dynamic and transparent access from any client to all the resources in the system. The graphical user interface programs, which are platform independent, may be transferred to any client via the HTTP protocol. In this scheme the control and monitor tasks are distributed among servers, and the network controls the flow of information among servers and clients, providing a flexible mechanism for monitoring and controlling large heterogeneous distributed systems. (author)
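The control/monitor idea can be sketched in miniature: each "server" watches a group of processing nodes and publishes their status into a shared registry that any client can query. In the sketch below, threads stand in for the distributed Java RMI servers of the environment described above, and the node names are hypothetical:

```python
import threading

# Minimal monitor sketch: server threads poll their nodes and report
# status into a lock-protected registry; clients read a snapshot.

class MonitorRegistry:
    def __init__(self):
        self._lock = threading.Lock()
        self._status = {}

    def report(self, node, state):
        with self._lock:
            self._status[node] = state

    def snapshot(self):
        with self._lock:
            return dict(self._status)

def server_task(registry, nodes):
    for node in nodes:   # poll each node once; real servers would loop
        registry.report(node, "OK")

registry = MonitorRegistry()
servers = [threading.Thread(target=server_task,
                            args=(registry,
                                  [f"node{g}-{i}" for i in range(3)]))
           for g in range(2)]
for s in servers:
    s.start()
for s in servers:
    s.join()
print(sorted(registry.snapshot()))
```

The lock around the registry plays the role that RMI's serialized remote calls play in the Java design: concurrent servers never observe a half-updated status table.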

  12. Distributed open environment for data retrieval based on pattern recognition techniques

    International Nuclear Information System (INIS)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A.

    2010-01-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation hides unnecessary details about the installation, distribution, and configuration of all these components, while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the unique requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
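The core of a "find similar waveforms" query can be sketched with a sliding-window search: slide the query pattern across a longer signal and rank windows by Euclidean distance. This is a plain stand-in for the structural pattern-recognition techniques the environment actually uses, and the signals below are synthetic:

```python
import math

# Rank the windows of a signal by Euclidean distance to a query pattern
# and return the closest matches as (distance, start_index) pairs.

def most_similar(signal, query, top=3):
    m = len(query)
    scored = []
    for start in range(len(signal) - m + 1):
        window = signal[start:start + m]
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(window, query)))
        scored.append((dist, start))
    return sorted(scored)[:top]

signal = [math.sin(0.2 * i) for i in range(200)]
query = [math.sin(0.2 * (i + 100)) for i in range(20)]  # segment near i=100
matches = most_similar(signal, query)
print(matches[0])  # best match starts at index 100 with distance ~0
```

Production systems index structural features instead of scanning every offset, but the ranking-by-similarity contract is the same as in this brute-force version.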

  13. Distributed Open Environment for Data Retrieval based on Pattern Recognition Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A. [Association Euratom/CIEMAT para Fusion, Madrid (Spain)]

    2009-07-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE, which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components, which are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. Data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation hides unnecessary details about the installation, distribution, and configuration of all these components while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module extends the scope of the installed database engine by providing traditional Java database network connections (JDBC over TCP/IP). This avoids scattering several database engines: a single embedded engine defines the rules for accessing the distributed data. Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to search for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is achieved with HTTP requests and by generating dynamic content (servlets) in response to these client requests. (authors)

  14. Distributed open environment for data retrieval based on pattern recognition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A., E-mail: augusto.pereira@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain); Vega, J.; Castro, R.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain)

    2010-07-15

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components, which are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. Data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation hides unnecessary details about the installation, distribution, and configuration of all these components while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module extends the scope of the installed database engine by providing traditional Java database network connections (JDBC over TCP/IP). This avoids scattering several database engines: a single embedded engine defines the rules for accessing the distributed data. Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to search for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is achieved with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
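    The embedded-engine design described above — the relational engine running inside the server process, hiding installation and configuration details from users — can be illustrated with Python's sqlite3 standing in for Apache Derby (Derby itself runs on the JVM and is not shown here):

```python
import sqlite3

# Embedded engine: the database lives inside the application process,
# so no separately installed server needs configuring.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shots (id INTEGER PRIMARY KEY, waveform TEXT)")
conn.executemany("INSERT INTO shots (waveform) VALUES (?)",
                 [("signal_a",), ("signal_b",)])
count = conn.execute("SELECT COUNT(*) FROM shots").fetchone()[0]
conn.close()
```

Derby additionally offers a network module that exposes the same embedded engine over JDBC-TCP/IP; sqlite3 has no such module, so only the embedded half of the architecture is sketched.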

  15. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    Science.gov (United States)

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before, creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high-performance computing and next-generation Internet2, embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequently avoiding them. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial-intelligence rules-based virtual simulations. The virtual reality patient is programmed to change dynamically over time and respond to the learner's manipulations. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multicasting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in solving problems and managing a simulated patient with a closed head injury in the VRE, dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. The VRE created higher performance expectations and some anxiety among users. VRE orientation was adequate, but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully

  16. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  17. Non-uniform distribution pattern for differentially expressed genes of transgenic rice Huahui 1 at different developmental stages and environments.

    Directory of Open Access Journals (Sweden)

    Zhi Liu

    DNA microarray analysis is an effective method to detect unintended effects in the safety assessment of genetically modified (GM) crops by identifying differentially expressed genes (DEGs). With the aim of revealing the distribution of DEGs of GM crops under different conditions, we performed DNA microarray analysis using transgenic rice Huahui 1 (HH1) and its non-transgenic parent Minghui 63 (MH63) at different developmental stages and under different environmental conditions. A considerable number of DEGs was identified in each group of HH1 under the different conditions. The number of DEGs differed between groups; however, many common DEGs were shared between different groups of HH1. These findings suggest that both DEGs and common DEGs are adequate for the investigation of unintended effects. Furthermore, a number of significantly changed pathways were found in all groups of HH1, indicating that the genetic modification caused lasting changes to the plants. To our knowledge, our study is the first to describe the non-uniform distribution pattern of DEGs of GM crops at different developmental stages and environments. Our results also suggest that DEGs identified in GM plants at a specific developmental stage and environment can serve as useful clues for the further evaluation of unintended effects of GM plants.

  18. Telefacturing Based Distributed Manufacturing Environment for Optimal Manufacturing Service by Enhancing the Interoperability in the Hubs

    Directory of Open Access Journals (Sweden)

    V. K. Manupati

    2017-01-01

    Recent developments in the manufacturing sector have led to intense progress towards effective distributed collaborative manufacturing environments. This evolving collaborative manufacturing not only focuses on the digitalisation of the environment but also necessitates a service-dependent manufacturing system that offers uninterrupted access to a number of diverse, complicated, dynamic manufacturing operations management systems at a common workplace (hub). This research presents a novel telefacturing-based distributed manufacturing environment for recommending manufacturing services based on user preferences. The first step in this direction is to deploy advanced tools and techniques, namely the ontology editor Protégé 5.0 for transforming the large store of knowledge/information into Web Ontology Language (OWL) documents, and Integration of Process Planning and Scheduling (IPPS) for multiple jobs in a collaborative manufacturing system. Thereafter, we investigate the possibilities of allocating skilled workers to the best feasible operation sequence. In this context, a mathematical model is formulated for the considered objectives, that is, minimization of the makespan and of the total training cost of the workers. With an evolutionary algorithm and a developed heuristic algorithm, the performance of the proposed manufacturing system has been improved. Finally, to demonstrate the capability of the proposed approach, an illustrative example from a real manufacturing industry is validated for optimal service recommendation.

  19. Determination of hydraulic conductivity from grain-size distribution for different depositional environments

    KAUST Repository

    Rosas, Jorge

    2013-06-06

    Over 400 unlithified sediment samples were collected from four different depositional environments in global locations and the grain-size distribution, porosity, and hydraulic conductivity were measured using standard methods. The measured hydraulic conductivity values were then compared to values calculated using 20 different empirical equations (e.g., Hazen, Carman-Kozeny) commonly used to estimate hydraulic conductivity from grain-size distribution. It was found that most of the hydraulic conductivity values estimated from the empirical equations correlated very poorly to the measured hydraulic conductivity values, with errors ranging to over 500%. To improve the empirical estimation methodology, the samples were grouped by depositional environment and subdivided into subgroups based on lithology and mud percentage. The empirical methods were then analyzed to assess which methods best estimated the measured values. Modifications of the empirical equations, including changes to special coefficients and addition of offsets, were made to produce modified equations that considerably improve the hydraulic conductivity estimates from grain-size data for beach, dune, offshore marine, and river sediments. Estimated hydraulic conductivity errors were reduced to 6 to 7.1 m/day for the beach subgroups, 3.4 to 7.1 m/day for dune subgroups, and 2.2 to 11 m/day for offshore sediment subgroups. Improvements were made for river environments, but these still produced high errors, between 13 and 23 m/day. © 2013, National Ground Water Association.
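    For concreteness, the Hazen formula — one of the 20 empirical equations evaluated — estimates hydraulic conductivity from the effective grain size d10 alone. A minimal sketch; the coefficient C and the unit conventions vary across the literature (C = 1.0 with d10 in mm and K in cm/s is a common choice), and the values below are illustrative, not those fitted in the study:

```python
def hazen_k(d10_mm, c=1.0):
    """Hazen estimate: K [cm/s] = C * d10^2, with d10 in mm (C ~ 0.4-1.2)."""
    return c * d10_mm ** 2

def cm_s_to_m_day(k_cm_s):
    # 1 cm/s = 0.01 m/s * 86400 s/day = 864 m/day
    return k_cm_s * 864.0

# Medium sand with d10 = 0.2 mm: 0.04 cm/s, i.e. about 34.6 m/day
k_m_day = cm_s_to_m_day(hazen_k(0.2))
```

The record's point is that such one-size-fits-all coefficients perform poorly; regrouping samples by depositional environment and refitting the coefficients and offsets is what reduced the errors.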

  20. Interaction Control Protocols for Distributed Multi-user Multi-camera Environments

    Directory of Open Access Journals (Sweden)

    Gareth W Daniel

    2003-10-01

    Video-centred communication (e.g., video conferencing, multimedia online learning, traffic monitoring, and surveillance) is becoming a customary activity in our lives. The management of interactions in such an environment is a complicated HCI issue. In this paper, we present our study on a collection of interaction control protocols for distributed multi-user multi-camera environments. These protocols facilitate different approaches to managing a user's entitlement to control a particular camera. We describe a web-based system that allows multiple users to manipulate multiple cameras in varying remote locations. The system was developed using the Java framework, and all protocols discussed have been incorporated into the system. Experiments were designed and conducted to evaluate the effectiveness of these protocols, and to enable the identification of various human factors in a distributed multi-user multi-camera environment. This work provides an insight into the complexity associated with interaction management in video-centred communication. It can also serve as a conceptual and experimental framework for further research in this area.
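    As a concrete illustration of one such interaction control protocol, a minimal exclusive-control scheme with FIFO queueing can be sketched: the camera has a single controller at a time, and contending users are granted control in request order. This is a hypothetical sketch, not one of the protocols actually compared in the paper:

```python
from collections import deque

class CameraController:
    """Exclusive camera control with FIFO hand-over between users."""
    def __init__(self):
        self.holder = None          # user currently entitled to control
        self.waiting = deque()      # users queued for control, in order

    def request(self, user):
        if self.holder is None:
            self.holder = user
            return "granted"
        if user == self.holder:
            return "already-holder"
        if user not in self.waiting:
            self.waiting.append(user)
        return "queued"

    def release(self, user):
        if user != self.holder:
            return "not-holder"
        # Hand the camera to the longest-waiting user, if any
        self.holder = self.waiting.popleft() if self.waiting else None
        return "released"
```

Alternative entitlement policies (priority-based pre-emption, time-sliced control) would change only the `request`/`release` rules, which is what makes such protocols easy to compare experimentally.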

  1. Determination of hydraulic conductivity from grain-size distribution for different depositional environments

    KAUST Repository

    Rosas, Jorge; Lopez Valencia, Oliver Miguel; Missimer, Thomas M.; Coulibaly, Kapo M.; Dehwah, Abdullah; Sesler, Kathryn; Rodríguez, Luis R. Lujan; Mantilla, David

    2013-01-01

    Over 400 unlithified sediment samples were collected from four different depositional environments in global locations and the grain-size distribution, porosity, and hydraulic conductivity were measured using standard methods. The measured hydraulic conductivity values were then compared to values calculated using 20 different empirical equations (e.g., Hazen, Carman-Kozeny) commonly used to estimate hydraulic conductivity from grain-size distribution. It was found that most of the hydraulic conductivity values estimated from the empirical equations correlated very poorly to the measured hydraulic conductivity values, with errors ranging to over 500%. To improve the empirical estimation methodology, the samples were grouped by depositional environment and subdivided into subgroups based on lithology and mud percentage. The empirical methods were then analyzed to assess which methods best estimated the measured values. Modifications of the empirical equations, including changes to special coefficients and addition of offsets, were made to produce modified equations that considerably improve the hydraulic conductivity estimates from grain-size data for beach, dune, offshore marine, and river sediments. Estimated hydraulic conductivity errors were reduced to 6 to 7.1 m/day for the beach subgroups, 3.4 to 7.1 m/day for dune subgroups, and 2.2 to 11 m/day for offshore sediment subgroups. Improvements were made for river environments, but these still produced high errors, between 13 and 23 m/day. © 2013, National Ground Water Association.

  2. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the distributed computing environment based on DIRAC for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources available to their collaborations. In order to minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This provides DIRAC as a service for the community of those experiments. To support multiple VOs, the system architecture of BESDIRAC was adjusted for scalability. The VOMS and DIRAC servers are reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system is designed for users' massive job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered, to ease system administration and VO-related resource usage accounting. (paper)

  3. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room” and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to supporting activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent contradiction between maintaining the privacy of medical data and showing it in a public display environment can be mitigated by the use of CLINICAL SURFACES.

  4. A distributed reasoning engine ecosystem for semantic context-management in smart environments.

    Science.gov (United States)

    Almeida, Aitor; López-de-Ipiña, Diego

    2012-01-01

    To be able to react adequately, a smart environment must be aware of its context and of changes to it. Modeling the context allows applications to better understand it and to adapt to its changes. In order to do this, an appropriate formal representation method is needed, and ontologies have proven to be one of the best tools for the task. Semantic inference provides a powerful framework for reasoning over context data, but there are some problems with this approach: inference over semantic context information can be cumbersome when working with a large amount of data. This situation has become more common in modern smart environments, where many sensors and devices are available. In order to tackle this problem, we have developed a mechanism to divide the context reasoning problem into smaller parts so as to reduce the inference time. In this paper we describe a distributed peer-to-peer agent architecture of context consumers and context providers. We explain how this inference sharing process works, partitioning the context information according to the interests of the agents, their location and a certainty factor. We also discuss the system architecture, analyzing the negotiation process between the agents. Finally, we compare distributed reasoning with centralized reasoning, analyzing the situations in which each approach is more suitable.

  5. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications in which a large number of computing nodes are running. We show that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
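    The record does not detail the proposed protocol, but the classic marker-based (Chandy-Lamport-style) snapshot scheme that such protocols build on can be sketched: a process records its local state on the first marker it sees, then records each incoming channel until a marker arrives on that channel. A toy two-process simulation over FIFO channels (all names and values are illustrative):

```python
from collections import deque

class Process:
    def __init__(self, pid, peers):
        self.pid, self.peers = pid, peers
        self.balance = 100          # local state (toy "account balance")
        self.recorded = None        # recorded local state
        self.recording = {}         # incoming channels currently being recorded
        self.chan_state = {}        # recorded in-transit messages per channel

    def start_snapshot(self, net):
        self.recorded = self.balance
        for p in self.peers:                 # record every incoming channel
            self.recording[p] = []
            net.send(p, self.pid, ("MARKER",))

    def on_message(self, net, src, msg):
        if msg[0] == "MARKER":
            if self.recorded is None:        # first marker: record own state
                self.start_snapshot(net)
            # Marker closes the recording of channel src -> self
            self.chan_state[src] = self.recording.pop(src, [])
        else:                                # application message (transfer)
            self.balance += msg[1]
            if src in self.recording:
                self.recording[src].append(msg)

class Net:
    """Single global FIFO queue: preserves per-channel FIFO order."""
    def __init__(self, procs):
        self.q, self.procs = deque(), procs
    def send(self, dst, src, msg):
        self.q.append((dst, src, msg))
    def run(self):
        while self.q:
            dst, src, msg = self.q.popleft()
            self.procs[dst].on_message(self, src, msg)

p0, p1 = Process(0, [1]), Process(1, [0])
net = Net({0: p0, 1: p1})
p0.balance -= 30                 # transfer 30 from p0 to p1 ...
net.send(1, 0, ("XFER", 30))     # ... message still in flight
p0.start_snapshot(net)           # snapshot begins with money in transit
net.run()

in_transit = sum(m[1] for p in (p0, p1)
                 for msgs in p.chan_state.values() for m in msgs)
total = p0.recorded + p1.recorded + in_transit
```

The recorded local states plus the recorded in-transit messages reconstruct a consistent global total (here 200) even though the snapshot began while a transfer was in flight, which is exactly the consistency property a global snapshot must guarantee.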

  6. E-Commerce Strategic Business Environment Analysis in Indonesia

    OpenAIRE

    Aribawa, Dwitya

    2016-01-01

    This research aims to identify important factors in the external business environment that tend to influence a company's ability to achieve its objectives. It was conducted on several e-commerce companies in Indonesia. These companies operate in several industries, such as grocery and household retail, fashion retail, electronics and gadget retail, and travel agency booking. We conducted a thematic analysis with a quad-helix stakeholder approach. It was found that the firms face several external environment factors ...

  7. Design of a distributed simulation environment for building control applications based on systems engineering methodology

    NARCIS (Netherlands)

    Yahiaoui, Azzedine

    2018-01-01

    The analysis of innovative designs that distribute control of buildings over a network is currently a challenging task, as existing building performance simulation tools do not offer sufficient capabilities and flexibility to fully respond to the complexity of Automated Buildings (ABs). For

  8. Pigeon interaction mode switch-based UAV distributed flocking control under obstacle environments.

    Science.gov (United States)

    Qiu, Huaxin; Duan, Haibin

    2017-11-01

    Unmanned aerial vehicle (UAV) flocking control is a serious and challenging problem due to local interactions and changing environments. In this paper, a pigeon flocking model and a pigeon coordinated obstacle-avoiding model are proposed, based on the behavior whereby pigeon flocks switch between hierarchical and egalitarian interaction modes at different flight phases. Owing to the essential similarity between bird flocks and UAV swarms, a distributed flocking control algorithm based on the proposed pigeon flocking and coordinated obstacle-avoiding models is designed to coordinate a heterogeneous UAV swarm to fly through obstacle environments with few informed individuals. Comparative simulation results are elaborated to show the feasibility, validity and superiority of the proposed algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  10. Family size, the physical environment, and socioeconomic effects across the stature distribution.

    Science.gov (United States)

    Carson, Scott Alan

    2012-04-01

    A neglected area in historical stature studies is the relationship between stature and family size. Using robust statistics and a large 19th century data set, this study documents a positive relationship between stature and family size across the stature distribution. The relationship between material inequality and health is the subject of considerable debate, and there was a positive relationship between stature and wealth and an inverse relationship between stature and material inequality. After controlling for family size and wealth variables, the paper reports a positive relationship between the physical environment and stature. Copyright © 2012 Elsevier GmbH. All rights reserved.

  11. Multi-objective optimization with estimation of distribution algorithm in a noisy environment.

    Science.gov (United States)

    Shim, Vui Ann; Tan, Kay Chen; Chia, Jun Yong; Al Mamun, Abdullah

    2013-01-01

    Many real-world optimization problems are subject to uncertainties that may be characterized by the presence of noise in the objective functions. The estimation of distribution algorithm (EDA), which models the global distribution of the population for searching tasks, is one of the evolutionary computation techniques that deal with noisy information. This paper studies the potential of EDAs, particularly an EDA based on restricted Boltzmann machines, for handling multi-objective optimization problems in a noisy environment. Noise is introduced to the objective functions in the form of a Gaussian distribution. In order to reduce the detrimental effect of noise, a likelihood correction feature is proposed to tune the marginal probability distribution of each decision variable. The EDA is subsequently hybridized with a particle swarm optimization algorithm in a discrete domain to improve its search ability. The effectiveness of the proposed algorithm is examined via eight benchmark instances with different characteristics and shapes of the Pareto optimal front. The scalability, hybridization, and computational time are rigorously studied. Comparative studies show that the proposed approach outperforms other state-of-the-art algorithms.
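    The core EDA loop — sample candidates from a probability model, select the fittest under the noisy objective, refit the model — can be sketched with a simple univariate Gaussian model (UMDA-style). This is a deliberate simplification of the restricted Boltzmann machine model the paper studies, single-objective rather than multi-objective, and all parameters below are illustrative:

```python
import random
import statistics

def noisy_sphere(x, rng, noise=0.1):
    """Sphere objective with additive Gaussian observation noise."""
    return sum(v * v for v in x) + rng.gauss(0.0, noise)

def umda(dim=3, pop=80, elite=20, gens=60, seed=7):
    rng = random.Random(seed)
    mu, sd = [3.0] * dim, [2.0] * dim           # initial search distribution
    for _ in range(gens):
        xs = [[rng.gauss(mu[i], sd[i]) for i in range(dim)]
              for _ in range(pop)]
        xs.sort(key=lambda x: noisy_sphere(x, rng))   # ranking under noise
        best = xs[:elite]
        for i in range(dim):                    # refit each marginal model
            col = [x[i] for x in best]
            mu[i] = statistics.fmean(col)
            sd[i] = max(statistics.pstdev(col), 0.05)  # keep exploring
    return mu

centre = umda()   # should approach the optimum at the origin
```

The paper's likelihood-correction feature addresses exactly the weakness visible here: noisy ranking biases the refitted marginals, so the model's distribution is tuned to compensate.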

  12. A two-population sporadic meteoroid bulk density distribution and its implications for environment models

    Science.gov (United States)

    Moorhead, Althea V.; Blaauw, Rhiannon C.; Moser, Danielle E.; Campbell-Brown, Margaret D.; Brown, Peter G.; Cooke, William J.

    2017-12-01

    The bulk density of a meteoroid affects its dynamics in space, its ablation in the atmosphere, and the damage it does to spacecraft and lunar or planetary surfaces. Meteoroid bulk densities are also notoriously difficult to measure, and we are typically forced to assume a density or attempt to measure it via a proxy. In this paper, we construct a density distribution for sporadic meteoroids based on existing density measurements. We considered two possible proxies for density: the KB parameter introduced by Ceplecha and the Tisserand parameter, TJ. Although KB is frequently cited as a proxy for meteoroid material properties, we find that it is poorly correlated with ablation-model-derived densities. We therefore follow the example of Kikwaya et al. in associating density with the Tisserand parameter. We fit two density distributions: one to meteoroids originating from Halley-type comets (TJ < 2) and one to those with TJ > 2; the resulting two-population density distribution is the most detailed sporadic meteoroid density distribution justified by the available data. Finally, we discuss the implications for meteoroid environment models and spacecraft risk assessments. We find that correcting for density increases the fraction of meteoroid-induced spacecraft damage produced by the helion/antihelion source.
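    The Tisserand parameter with respect to Jupiter has a closed form, T_J = a_J/a + 2 cos(i) sqrt((a/a_J)(1 - e^2)), which makes the population split at T_J = 2 easy to compute. A quick sketch using approximate textbook orbital elements (assumed values, not the paper's data):

```python
import math

A_JUPITER = 5.20   # semi-major axis of Jupiter [au] (approximate)

def tisserand(a, e, inc_deg, a_j=A_JUPITER):
    """Tisserand parameter T_J for semi-major axis a [au],
    eccentricity e, and inclination inc_deg [degrees]."""
    i = math.radians(inc_deg)
    return a_j / a + 2.0 * math.cos(i) * math.sqrt((a / a_j) * (1.0 - e * e))

t_jup = tisserand(5.20, 0.0, 0.0)          # Jupiter's own orbit: exactly 3
t_halley = tisserand(17.8, 0.967, 162.3)   # retrograde Halley-like orbit
```

Halley's retrograde, eccentric orbit yields T_J well below 2, which is what places Halley-type orbits in the low-T_J population used for the two-distribution fit.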

  13. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The generated vibration patterns are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally occurring distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
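    The envelope-analysis idea can be sketched end to end: rectify the vibration signal, smooth away the high-frequency carrier, and look for periodicity at the fault rate in the resulting envelope. The synthetic signal and all parameters below are made up for illustration; real diagnosis would use measured vibrations and spectral comparison as described above:

```python
import math

FS = 2000                          # sample rate [Hz] (assumed)
F_FAULT, F_CARRIER = 10.0, 400.0   # fault (modulation) and resonance rates

# Fault-modulated vibration: resonance amplitude varies at the fault rate
x = [(1 + math.cos(2 * math.pi * F_FAULT * n / FS))
     * math.sin(2 * math.pi * F_CARRIER * n / FS) for n in range(FS)]

# Envelope: rectify, then moving-average out the 400 Hz carrier ripple
W = 50                             # 25 ms smoothing window
rect = [abs(v) for v in x]
env = [sum(rect[n:n + W]) / W for n in range(len(rect) - W)]
mean = sum(env) / len(env)
env = [v - mean for v in env]      # remove the DC offset

def autocorr(sig, lag):
    return sum(a * b for a, b in zip(sig, sig[lag:]))

# The strongest periodicity in the envelope should sit near the fault
# period, FS / F_FAULT = 200 samples
lag_best = max(range(100, 301), key=lambda L: autocorr(env, L))
```

In practice the envelope spectrum (FFT of the envelope) is inspected instead of the autocorrelation, and the presence and shape of fault-rate components is what separates fault-free, localized and distributed conditions.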

  14. Analysis of the moral habitability of the nursing work environment.

    Science.gov (United States)

    Peter, Elizabeth H; Macfarlane, Amy V; O'Brien-Pallas, Linda L

    2004-08-01

    Following health reform, nurses have experienced the tremendous stress of heavy workloads, long hours and difficult professional responsibilities. In recognition of these problems, a study was conducted that examined the impact of the working environment on the health of nurses. After conducting focus groups across Canada with nurses and others well acquainted with nursing issues, it became clear that the difficult work environments described had significant ethical implications. The aim of this paper is to report the findings of research that examined the moral habitability of the nursing working environment. A secondary analysis was conducted using the theoretical work of Margaret Urban Walker. Moral practices and responsibilities from Walker's perspective cannot be extricated from other social roles, practices and divisions of labour. Moral-social orders, such as work environments in this research, must be made transparent to examine their moral habitability. Morally habitable environments are those in which differently situated people experience their responsibilities as intelligible and coherent. They also foster recognition, cooperation and shared benefits. Four overarching categories were developed through the analysis of the data: (1) oppressive work environments; (2) incoherent moral understandings; (3) moral suffering and (4) moral influence and resistance. The findings clearly indicate that participants perceived the work environment to be morally uninhabitable. The social and spatial positioning of nurses left them vulnerable to being overburdened by and unsure of their responsibilities. Nevertheless, nurses found meaningful ways to resist and to influence the moral environment. We recommend that nurses develop strong moral identities, make visible the inseparability of their proximity to patients and moral accountability, and further identify what forms of collective action are most effective in improving the moral habitability of their work

  15. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  16. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    AKOREDE et al: TOOL FOR POWER FLOW ANALYSIS AND DISTRIBUTED GENERATION OPTIMISATION. 23 ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  17. Analysis of color environment in nuclear power plants

    International Nuclear Information System (INIS)

    Natori, Kazuyuki; Akagi, Ichiro; Souma, Ichiro; Hiraki, Tadao; Sakurai, Yukihiro.

    1996-01-01

    This article reports the results of a color and psychological analysis of the exterior appearance of nuclear power plants and the visual environments inside the plants. Study one comprised color measurements of the plant exteriors and of the visual environments inside the plants. Study two was a survey of impressions of the visual environments of nuclear plants obtained from observers, together with interviews of the workers. Through these analyses, we identified the present state of, and the problems with, the color environments of the nuclear plants. In the next step, we designed recommended color environments for the inside and outside of the nuclear plants (the inside designs covered the fuel handling room, the operation floor of the turbine building, observers' pathways, the central control room and the rest room for the operators). Study three was a survey of impressions of our designs for the inside and outside of the nuclear plants. Nuclear plant observers, residents in Osaka city, residents near the nuclear plants, the operators, employees of a subsidiary company and the PR center guides rated their impressions of the designs. Study four was a survey about the design of the rest room for the operators controlling the plants. From the results of the four studies, we propose some guidelines and note problems for future planning of the visual environments of nuclear power plants. (author)

  18. Virtual environments for scene of crime reconstruction and analysis

    Science.gov (United States)

    Howard, Toby L. J.; Murta, Alan D.; Gibson, Simon

    2000-02-01

    This paper describes research conducted in collaboration with Greater Manchester Police (UK) to evaluate the utility of Virtual Environments for scene of crime analysis, forensic investigation, and law enforcement briefing and training. We present an illustrated case study of the construction of a high-fidelity virtual environment, intended to match a particular real-life crime scene as closely as possible. We describe and evaluate the combination of several approaches, including: the use of the Manchester Scene Description Language for constructing complex geometrical models; the application of a radiosity rendering algorithm with several novel features based on human perceptual considerations; texture extraction from forensic photography; and experiments with interactive walkthroughs and large-screen stereoscopic display of the virtual environment implemented using the MAVERIK system. We also discuss the potential applications of Virtual Environment techniques in the Law Enforcement and Forensic communities.

  19. Gamma-ray Burst Formation Environment: Comparison of Redshift Distributions of GRB Afterglows

    Directory of Open Access Journals (Sweden)

    Sung-Eun Kim

    2005-12-01

    Since gamma-ray bursts (GRBs) first became known to the scientific community in 1973, many scientists have been involved in their study. Observations of GRB afterglows provide us with much information on the environment in which the observed GRBs are born. The study of GRB afterglows deals with longer-timescale emissions in lower energy bands (e.g., months or even up to years) than the prompt emission in gamma-rays. Not all bursts are accompanied by afterglows across the whole range of wavelengths. Suggested reasons include, for instance, that radio and/or X-ray afterglows go unrecorded mainly due to the lower sensitivity of detectors, and optical afterglows due to extinction in intergalactic media or self-extinction within the host galaxy itself. Based on the idea that these facts may also provide information on the GRB environment, we analyze the statistical properties of GRB afterglows. We first select samples of redshift-known GRBs according to the wavelength of the afterglow they accompanied. We then compare their distributions as a function of redshift, using statistical methods. As a result, we find that the distribution of the GRBs with X-ray afterglows is consistent with that of the GRBs with optical afterglows. We therefore conclude that the lower detection rate of optical afterglows is not due to extinction in intergalactic media.

  20. On the Moments and the Distribution of Aggregate Discounted Claims in a Markovian Environment

    Directory of Open Access Journals (Sweden)

    Shuanming Li

    2018-05-01

    This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is also investigated. Numerical results are also presented for a two-state Markov-modulated model case.

  1. Distribution of hydrocarbon-degrading bacteria in the soil environment and their contribution to bioremediation.

    Science.gov (United States)

    Fukuhara, Yuki; Horii, Sachie; Matsuno, Toshihide; Matsumiya, Yoshiki; Mukai, Masaki; Kubo, Motoki

    2013-05-01

    A real-time PCR quantification method for indigenous hydrocarbon-degrading bacteria (HDB) carrying the alkB gene in the soil environment was developed to investigate their distribution in soil. The detection limit of indigenous HDB by the method was 1 × 10⁶ cells/g-soil. The indigenous HDB were widely distributed throughout the soil environment and ranged from 3.7 × 10⁷ to 5.0 × 10⁸ cells/g-soil, and the ratio to total bacteria was 0.1-4.3%. The dynamics of total bacteria, indigenous HDB, and Rhodococcus erythropolis NDKK6 (carrying alkB R2) during bioremediation were analyzed. During bioremediation with an inorganic nutrient treatment, the numbers of these bacteria were slightly increased. The numbers of HDB (both indigenous bacteria and strain NDKK6) were gradually decreased from the middle stage of bioremediation. Meanwhile, the numbers of these bacteria were highly increased and were maintained during bioremediation with an organic nutrient. The organic treatment led to activation of not only the soil bacteria but also the HDB, so an efficient bioremediation was carried out.

  2. Comparison of indoor air distribution and thermal environment for different combinations of radiant heating systems with mechanical ventilation systems

    DEFF Research Database (Denmark)

    Wu, Xiaozhou; Fang, Lei; Olesen, Bjarne W.

    2018-01-01

    A hybrid system combining a radiant heating system with a mechanical ventilation system, regarded as an advanced heating, ventilation and air-conditioning (HVAC) system, has been applied in many modern buildings worldwide. To date, almost no studies have focused on a comparative analysis of the indoor air distribution and the thermal environment for all combinations of radiant heating systems with mechanical ventilation systems. Therefore, in this article, the indoor air distribution and the thermal environment were comparatively analyzed in a room with floor heating (FH) or ceiling heating (CH) and mixing ventilation (MV) or displacement ventilation (DV) when the supply air temperature ranged from 15.0°C to 19.0°C. The results showed that the temperature effectiveness values were 1.05–1.16 and 0.95–1.02 for MV + FH and MV + CH, respectively, and they were 0.78–0.91 and 0.51–0.67 for DV + FH and DV...
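    The temperature effectiveness figures quoted above follow the conventional ventilation definition, the ratio of the exhaust-to-supply temperature difference to the occupied-zone-to-supply difference. The abstract does not give the formula, so the sketch below assumes that standard definition, with purely illustrative temperatures:

```python
def temperature_effectiveness(t_exhaust, t_occupied, t_supply):
    """Ventilation temperature effectiveness: (Te - Ts) / (Tz - Ts).

    Values near 1 indicate well-mixed conditions; values above 1 mean the
    exhaust air is warmer than the occupied zone (stratification)."""
    return (t_exhaust - t_supply) / (t_occupied - t_supply)

# Hypothetical temperatures in degrees C, for illustration only
print(round(temperature_effectiveness(24.0, 23.0, 16.0), 2))
```

    With these invented temperatures the effectiveness exceeds 1, the regime the article reports for MV + FH.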

  3. ANALYSIS OF VIRTUAL ENVIRONMENT BENEFIT IN E-LEARNING

    Directory of Open Access Journals (Sweden)

    NOVÁK, Martin

    2013-06-01

    This article analyses the benefits of a virtual environment for improving the e-learning process. The virtual environment was created within the project 'Virtualization' at the Faculty of Economics and Administration, University of Pardubice. The aim of this project was to eliminate the disproportion in free access to licensed software between groups of part-time and full-time students. The research was carried out in selected subjects of the study program System Engineering and Informatics. The subjects covered informatics, applied informatics, control and decision making. Student subject results, student feedback from an electronic questionnaire, and data from the log file of virtual server usage were compared and analysed. Based on an analysis of virtualization options, the virtual environment was implemented through Microsoft Terminal Server.

  4. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic in...... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engines results....

  5. The distribution of helium isotopes of natural gas and tectonic environment

    International Nuclear Information System (INIS)

    Sun Mingliang; Tao Mingxin

    1993-01-01

    Based on the ³He/⁴He data of the main oil-gas bearing basins in continental China, a systematic study has been made for the first time of the relations between the spatial distribution of the helium isotopes of natural gas and the tectonic environment. The average ³He/⁴He value R̄ in eastern China, bordering the Pacific Ocean, is 2.08 × 10⁻⁶ > Ra, which is dualistic mixed helium containing mantle-source helium. The R̄ of central and western China is 4.96 × 10⁻⁸, which is mainly crust-source radiogenic helium. The R̄ of the Huabei and Zhongyuan oil-gas fields is 8.00 × 10⁻⁷, a transitional helium intermediate between the eastern region and the central-western region of China. On the whole, the ³He/⁴He values decrease gradually with distance from the subduction zone of the western Pacific Ocean. The results show that the spatial distribution of the helium isotopes is controlled by the tectonic environment: in an environment of tensile rifting, especially in the neighborhood of deep megafractures advantageous to the rise of mantle-source helium, the ³He/⁴He value is the highest; in the most stable craton basins, the value is the lowest and the helium is a typical crust-source radiogenic one. Between the active (rift) areas and the stable areas there is transitional helium with values of order 10⁻⁷, as is the case in the Huabei-Zhongyuan oil-gas fields.
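    The mantle-versus-crust reading of these numbers rests on normalizing the measured ³He/⁴He ratio R to the atmospheric ratio Ra. The abstract does not state the constant, so the sketch below assumes the commonly used Ra ≈ 1.39 × 10⁻⁶:

```python
RA = 1.39e-6  # atmospheric 3He/4He reference ratio (assumed, not from the record)

def r_over_ra(r):
    """Measured 3He/4He normalized to the atmospheric ratio.

    Values near or above 1 suggest a mantle helium contribution;
    values far below 1 indicate crustal radiogenic helium."""
    return r / RA

# The three regional averages quoted in the abstract
for region, r in [("eastern China", 2.08e-6),
                  ("central/western China", 4.96e-8),
                  ("Huabei-Zhongyuan fields", 8.00e-7)]:
    print(f"{region}: R/Ra = {r_over_ra(r):.2f}")
```

    The three regimes (above 1, near zero, and intermediate) mirror the abstract's mantle, crustal, and transitional classifications.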

  6. Separating the effects of environment and space on tree species distribution: from population to community.

    Science.gov (United States)

    Lin, Guojun; Stralberg, Diana; Gong, Guiquan; Huang, Zhongliang; Ye, Wanhui; Wu, Linfang

    2013-01-01

    Quantifying the relative contributions of environmental conditions and spatial factors to species distribution can help improve our understanding of the processes that drive diversity patterns. In this study, based on tree inventory, topography and soil data from a 20-ha stem-mapped permanent forest plot in Guangdong Province, China, we evaluated the influence of different ecological processes at different spatial scales using canonical redundancy analysis (RDA) at the community level and multiple linear regression at the species level. At the community level, the proportion of explained variation in species distribution increased with grid-cell size, primarily due to a monotonic increase in the explanatory power of environmental variables. At the species level, neither environmental nor spatial factors were important determinants of overstory species' distributions at small cell sizes. However, purely spatial variables explained most of the variation in the distributions of understory species at fine and intermediate cell sizes. Midstory species showed patterns that were intermediate between those of overstory and understory species. At the 20-m cell size, the influence of spatial factors was stronger for more dispersal-limited species, suggesting that much of the spatial structuring in this community can be explained by dispersal limitation. Comparing environmental factors, soil variables had higher explanatory power for species distribution than did topography. However, both topographic and edaphic variables were highly spatially structured. Our results suggest that dispersal limitation has an important influence on species distribution at fine to intermediate scales (from several to tens of meters), while environmental variability facilitates species distribution at intermediate (from ten to tens of meters) and broad (from tens to hundreds of meters) scales.
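    The variation partitioning behind statements like these can be sketched with three regressions (environment only, space only, both), in the style of Borcard's classic partitioning. This is a toy NumPy illustration on synthetic data, not the RDA of the study:

```python
import numpy as np

def r2(X, y):
    """R-squared of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1.0 - (y - X1 @ beta).var() / y.var()

def partition(env, space, y):
    """Partition explained variation into pure-environment, pure-space and
    shared fractions from three fits: env only, space only, and both."""
    r2_e, r2_s = r2(env, y), r2(space, y)
    r2_both = r2(np.column_stack([env, space]), y)
    return r2_both - r2_s, r2_both - r2_e, r2_e + r2_s - r2_both

# Synthetic "abundance" driven by two environmental axes and one spatial
# axis, plus noise (all variables are invented for the illustration)
rng = np.random.default_rng(0)
env = rng.normal(size=(200, 2))
space = rng.normal(size=(200, 2))
y = env @ np.array([1.0, 0.5]) + 0.8 * space[:, 0] + rng.normal(scale=0.5, size=200)

pure_env, pure_space, shared = partition(env, space, y)
print(f"pure env {pure_env:.2f}, pure space {pure_space:.2f}, shared {shared:.2f}")
```

    The three fractions sum to the R² of the full model, which is how studies like this one attribute explained variation to environment, space, and their overlap.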

  7. POPULATION STRUCTURE AND SPATIAL DISTRIBUTION OF Ceratozamia mexicana BRONGN. (ZAMIACEAE) IN PRESERVED AND DISTURBED ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Andrés Rivera-Fernández

    2012-11-01

    Plant populations are affected by biotic and abiotic factors that influence regeneration processes. The aims of this study were to describe the population structure of Ceratozamia mexicana under two contrasting conditions (a conserved site and a disturbed site) and to determine whether the sexual structure, population density and spatial distribution of C. mexicana are modified by disturbance. Eight plots of 25 m² within each site (conserved and disturbed) were used. The structure and spatial distribution of the sites were determined. Methods included analysis of variance and spatial distribution indexes, with climatic and edaphic factors determined by conventional methods for comparison. The conserved site showed a demographic structure of an inverted "J", while the disturbed site varied slightly, with a more discontinuous distribution. Population density was 0.78 individuals/m² in the conserved site and 0.26 individuals/m² in the disturbed site. The spatial distribution for all development stages of the plant was random, with the exception of the seedling stage, which was aggregated. The results showed that perturbation decreases the density of plants and removes reproductive individuals, which threatens the persistence of the population.

  8. Spatial Distribution of Viruses Associated with Planktonic and Attached Microbial Communities in Hydrothermal Environments

    Science.gov (United States)

    Nunoura, Takuro; Kazama, Hiromi; Noguchi, Takuroh; Inoue, Kazuhiro; Akashi, Hironori; Yamanaka, Toshiro; Toki, Tomohiro; Yamamoto, Masahiro; Furushima, Yasuo; Ueno, Yuichiro; Yamamoto, Hiroyuki; Takai, Ken

    2012-01-01

    Viruses play important roles in marine surface ecosystems, but little is known about viral ecology and virus-mediated processes in deep-sea hydrothermal microbial communities. In this study, we examined virus-like particle (VLP) abundances in planktonic and attached microbial communities, which occur in physical and chemical gradients in both deep and shallow submarine hydrothermal environments (mixing waters between hydrothermal fluids and ambient seawater and dense microbial communities attached to chimney surface areas or macrofaunal bodies and colonies). We found that viruses were widely distributed in a variety of hydrothermal microbial habitats, with the exception of the interior parts of hydrothermal chimney structures. The VLP abundance and VLP-to-prokaryote ratio (VPR) in the planktonic habitats increased as the ratio of hydrothermal fluid to mixing water increased. On the other hand, the VLP abundance in attached microbial communities was significantly and positively correlated with the whole prokaryotic abundance; however, the VPRs were always much lower than those for the surrounding hydrothermal waters. This is the first report to show VLP abundance in the attached microbial communities of submarine hydrothermal environments, which presented VPR values significantly lower than those in planktonic microbial communities reported before. These results suggested that viral lifestyles (e.g., lysogenic prevalence) and virus interactions with prokaryotes are significantly different among the planktonic and attached microbial communities that are developing in the submarine hydrothermal environments. PMID:22210205

  9. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    Science.gov (United States)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    The developments in computer technology in the last decade have changed the ways computers are used. The emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially their user interfaces, a challenging and time-consuming task. We propose a model-based approach that allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  10. Neutron activation analysis applied to energy and environment

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1975-01-01

    Neutron activation analysis was applied to a number of problems concerned with energy production and the environment. Burning of fossil fuel, the search for new sources of uranium, possible presence of toxic elements in food and water, and the relationship of trace elements to cardiovascular disease are some of the problems in which neutron activation was used. (auth)

  11. PROTONEGOCIATIONS - SALES FORECAST AND COMPETITIVE ENVIRONMENT ANALYSIS METHOD

    Directory of Open Access Journals (Sweden)

    Lupu Adrian Gelu

    2009-05-01

    Protonegotiation management, as a part of the successful negotiations of organizations, is an extremely important issue of analysis for today's managers in the confrontations generated by changes in the environment during the period of transition to the market

  12. Estimation of Speciation and Distribution of ¹³¹I in Urban and Natural Environments

    Energy Technology Data Exchange (ETDEWEB)

    Hormann, Volker; Fischer, Helmut W. [University of Bremen, Institute of Environmental Physics, Otto-Hahn-Allee 1, 28359 Bremen (Germany)

    2014-07-01

    ¹³¹I is a radionuclide that may be introduced into natural and urban environments via several pathways. As a result of nuclear accidents it may be washed out from air or settle onto the ground by dry deposition. In urban landscapes this is again washed out by rain, partly introduced into the sewer system and thus transported to the next wastewater plant, where it may accumulate in certain compartments. In rural landscapes it may penetrate the soil and be more or less available to plant uptake depending on chemical and physical conditions. On a regular basis, ¹³¹I is released into the urban sewer system in the course of therapeutic and diagnostic treatment of patients with thyroid diseases. The speciation of iodine in the environment is complex. Depending on redox state and biological activity, it may appear as I⁻, IO₃⁻, I₂ or bound to organic molecules (e.g. humic acids). Moreover, some of these species are bound to surfaces of particles suspended in water or present in soil, e.g. hydrous ferric oxides (HFO). It is to be expected that the speciation and solid-liquid distribution of iodine strongly depend on environmental conditions. In this study, the speciation and solid-liquid distribution of iodine in environmental samples such as waste water, sewage sludge and soil are estimated with the help of the geochemical code PHREEQC. The calculations are carried out using chemical equilibrium and sorption data from the literature and chemical analyses of the media. We present the results of these calculations and compare them with experimental results of medical ¹³¹I in waste water and sewage sludge. The output of this study will be used in future work where transport and distribution of iodine in wastewater treatment plants and in irrigated agricultural soils will be modeled. (authors)
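    PHREEQC solves full chemical-equilibrium systems; as a far simpler illustration of the solid-liquid distribution concept, a linear-sorption (Kd) model gives the dissolved fraction directly. The Kd and suspended-solids values below are hypothetical, not taken from the study:

```python
def dissolved_fraction(kd_l_per_kg, solids_kg_per_l):
    """Fraction of a nuclide remaining in solution under a linear sorption
    isotherm: total = aqueous * (1 + Kd * solids concentration)."""
    return 1.0 / (1.0 + kd_l_per_kg * solids_kg_per_l)

# Hypothetical values: Kd = 4 L/kg for iodide on sludge solids,
# 0.025 kg/L suspended solids (illustrative only)
print(round(dissolved_fraction(4.0, 0.025), 3))  # 0.909
```

    A real speciation code additionally resolves which iodine species (I⁻, IO₃⁻, organically bound) carries the activity, which a single Kd cannot capture.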

  13. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on the measure and size distribution of flaws in these materials. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows the Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative failure probability as a function of material strength, the cumulative probability of failure versus fracture stress, and the cumulative reliability of the material were calculated. Statistical criteria supporting the strength analysis of the silicon nitride material were computed using MATLAB. (author)
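    The two-parameter Weibull form underlying such an analysis gives the cumulative failure probability as P_f = 1 − exp(−(σ/σ₀)^m). A minimal sketch with hypothetical scale and shape parameters (not the paper's silicon nitride values):

```python
import math

def weibull_failure_probability(stress, scale, shape):
    """Cumulative failure probability of a brittle material under `stress`,
    for a two-parameter Weibull law with characteristic strength `scale`
    (sigma_0) and Weibull modulus `shape` (m)."""
    return 1.0 - math.exp(-((stress / scale) ** shape))

# Hypothetical parameters for illustration: sigma_0 = 400 MPa, m = 10
for stress_mpa in (300.0, 400.0, 500.0):
    p = weibull_failure_probability(stress_mpa, 400.0, 10.0)
    print(f"{stress_mpa:.0f} MPa -> P_f = {p:.3f}")
```

    By construction, the failure probability at the characteristic strength σ₀ is 1 − 1/e ≈ 0.632, and a larger modulus m makes the transition from safe to failed stresses sharper (i.e. a narrower strength distribution).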

  14. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML) as a framework and supporting the configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  15. Mosquito distribution in a saltmarsh: determinants of eggs in a variable environment.

    Science.gov (United States)

    Rowbottom, Raylea; Carver, Scott; Barmuta, Leon A; Weinstein, Philip; Allen, Geoff R

    2017-06-01

    Two saltmarsh mosquitoes dominate the transmission of Ross River virus (RRV, Togaviridae: Alphavirus), one of Australia's most prominent mosquito-borne diseases. Ecologically, saltmarshes vary in their structure, including habitat types, hydrological regimes, and diversity of aquatic fauna, all of which drive mosquito oviposition behavior. Understanding the distribution of vector mosquitoes within saltmarshes can inform early warning systems, surveillance, and management of vector populations. The aim of this study was to identify the distribution of Ae. camptorhynchus, a known vector for RRV, across a saltmarsh and to investigate the influence that other invertebrate assemblages might have on Ae. camptorhynchus egg dispersal. We demonstrate that vegetation is a strong indicator of Ae. camptorhynchus egg distribution, and this was not correlated with elevation or other invertebrates located at this saltmarsh. Also, habitats within this marsh are less frequently inundated, resulting in drier conditions. We conclude that this information can be applied in vector surveillance and monitoring of temperate saltmarsh environments, and that it also provides a baseline for future investigations into mosquito vector habitat requirements. © 2017 The Society for Vector Ecology.

  16. [Relationships of bee population fluctuation and distribution with natural environment in Anhui province].

    Science.gov (United States)

    Yu, Linsheng; Zou, Yunding; Bi, Shoudong; Wu, Houzhang; Cao, Yifeng

    2006-08-01

    From 2002 to 2004, an investigation was made of bee population dynamics and their relationships with the ecological environment in four ecological regions of Anhui Province. The results indicated that in the mountainous areas of south and west Anhui there were 46 and 37 species of nectariferous plants, and the distribution density of the Apis cerana cerana population was 2.01 and 1.95 colonies/km², respectively. In the Jianghuai area and the Huaibei plain there were 17 and 12 species of nectariferous plants, which had concentrated, short flowering periods suited to the breeding and production of Apis mellifera ligustica, and the distribution density of the Apis cerana cerana population was 0.06 and 0.02 colonies/km², respectively. Bee population fluctuation and distribution were also affected by wasp predation. The proportion of Apis cerana cerana in the local Apis population was 41.5%, 36.8%, 3.1% and 1.1%, and that of Apis mellifera ligustica was 58.5%, 63.2%, 96.9% and 98.9%, in the mountainous areas of south and west Anhui, the Jianghuai area, and the Huaibei plain, respectively.

  17. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector Collaborations of the HERA Experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include Distributed Mass Storage Management Systems in a Distributed and Heterogeneous Computing Environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology, with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA Experiments at DESY, including Symmetric Multiprocessor Machines (84 processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focussing on the High Performance Interconnect technology, details will be provided about the HIPPI-based 'backplane' configured around a 20 Gigabit/s Multi Media Router, and the performance and efficiency of the related computer interfaces.

  18. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

    power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  19. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  20. GALAXY ENVIRONMENTS OVER COSMIC TIME: THE NON-EVOLVING RADIAL GALAXY DISTRIBUTIONS AROUND MASSIVE GALAXIES SINCE z = 1.6

    International Nuclear Information System (INIS)

    Tal, Tomer; Van Dokkum, Pieter G.; Leja, Joel; Franx, Marijn; Wake, David A.; Whitaker, Katherine E.

    2013-01-01

    We present a statistical study of the environments of massive galaxies in four redshift bins between z = 0.04 and z = 1.6, using data from the Sloan Digital Sky Survey and the NEWFIRM Medium Band Survey. We measure the projected radial distribution of galaxies in cylinders around a constant number density selected sample of massive galaxies and utilize a statistical subtraction of contaminating sources. Our analysis shows that massive primary galaxies typically live in group halos and are surrounded by 2-3 satellites with masses more than one-tenth of the primary galaxy mass. The cumulative stellar mass in these satellites roughly equals the mass of the primary galaxy itself. We further find that the radial number density profile of galaxies around massive primaries has not evolved significantly in either slope or overall normalization in the past 9.5 Gyr. A simplistic interpretation of this result can be taken as evidence for a lack of mergers in the studied groups and as support for a static evolution model of halos containing massive primaries. Alternatively, there exists a tight balance between mergers and accretion of new satellites such that the overall distribution of galaxies in and around the halo is preserved. The latter interpretation is supported by a comparison to a semi-analytic model, which shows a similar constant average satellite distribution over the same redshift range.
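    The measurement described, counting galaxies in projected radial annuli around each primary and statistically subtracting a background of contaminating sources, can be sketched as follows. The radii, bin edges and background level are invented for illustration:

```python
import math

def radial_density_profile(radii, edges, background_per_area):
    """Projected surface number density of neighbours in annuli around a
    primary, with a constant background surface density subtracted."""
    profile = []
    for r_in, r_out in zip(edges[:-1], edges[1:]):
        area = math.pi * (r_out ** 2 - r_in ** 2)
        count = sum(1 for r in radii if r_in <= r < r_out)
        profile.append(count / area - background_per_area)
    return profile

# Hypothetical projected radii (Mpc) of sources around one primary galaxy
radii = [0.05, 0.08, 0.12, 0.20, 0.35, 0.60, 0.90]
print(radial_density_profile(radii, [0.0, 0.1, 0.3, 1.0], background_per_area=0.5))
```

    In the actual analysis the background is estimated statistically from the survey and the profiles are stacked over many primaries; the sketch keeps only the annulus-counting and subtraction step.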

  1. Prototype for a generic thin-client remote analysis environment for CMS

    International Nuclear Information System (INIS)

    Steenberg, C.D.; Bunn, J.J.; Hickey, T.M.; Holtman, K.; Legrand, I.; Litvin, V.; Newman, H.B.; Samar, A.; Singh, S.; Wilkinson, R.

    2001-01-01

    The multi-tiered architecture of the highly-distributed CMS computing systems necessitates a flexible data distribution and analysis environment. The authors describe a prototype analysis environment which functions efficiently over wide area networks, using a server installed at the Caltech/UCSD Tier 2 prototype to analyze CMS data stored at various locations using a thin client. The analysis environment is based on existing HEP (Anaphe) and CMS (CARF, ORCA, IGUANA) software technology on the server, accessed from a variety of clients. A Java Analysis Studio (JAS, from SLAC) plug-in is being developed as a reference client. The server is operated as a 'black box' on the proto-Tier2 system. ORCA Objectivity databases (e.g. an existing large CMS Muon sample) are hosted on the master and slave nodes, and remote clients can request processing of queries across the server nodes and have the histogram results returned and rendered in the client. The server is implemented in pure C++ and uses XML-RPC as a language-neutral transport. This has several benefits, including much better scalability, better integration with CARF-ORCA and, importantly, it makes the work directly useful to other non-Java general-purpose analysis and presentation tools such as Hippodraw, Lizard, or ROOT.
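    The language-neutral transport choice above is easy to illustrate. The sketch below is a hypothetical stand-in, not the actual CMS server API (the method name `run_query` and the histogram payload are invented); it shows the request/response pattern a thin XML-RPC client uses against an analysis server, here with Python's standard library:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def run_query(cut_expression):
    """Pretend to scan events server-side and return histogram bin counts."""
    bins = [5, 12, 30, 22, 7]   # a real server would query its worker nodes
    return {"cut": cut_expression, "bins": bins}

# Server side: register the callable and serve in the background.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.register_function(run_query)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Thin-client side: any language with an XML-RPC library can make this call.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.run_query("pt > 20")
print(result["bins"])
server.shutdown()
```

    Because the wire format is plain XML-RPC, the same call could be issued from Java, C++, or ROOT bindings, which is the interoperability benefit the abstract cites.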

  2. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution. Goodness of fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness of fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness of fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness of fit tests.
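    The "distance" idea behind these tests can be sketched directly. The example below is a simplified illustration rather than the paper's procedure: it computes the Kolmogorov-Smirnov distance of synthetic data to two fitted candidate models (simple moment fits) and ranks them by that distance:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
data = np.sort(rng.exponential(scale=3.0, size=2000))
n = data.size
ecdf_hi = np.arange(1, n + 1) / n   # empirical CDF just after each point
ecdf_lo = np.arange(0, n) / n       # empirical CDF just before each point

def ks_distance(model_cdf):
    """Kolmogorov-Smirnov distance between the ECDF and a model CDF."""
    return max(np.max(ecdf_hi - model_cdf), np.max(model_cdf - ecdf_lo))

mu, sigma = data.mean(), data.std()
expon_cdf = 1.0 - np.exp(-data / mu)                        # fitted exponential
norm_cdf = np.array([0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))
                     for x in data])                        # fitted normal

d_expon, d_norm = ks_distance(expon_cdf), ks_distance(norm_cdf)
print(d_expon < d_norm)  # True: the exponential model is closer to the data
```

    Ordering candidate models by this statistic is exactly the comparison feature the abstract describes; the threshold test would compare each distance against a critical value for the sample size.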

  3. WCET Analysis of Java Bytecode Featuring Common Execution Environments

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian

    2011-01-01

    We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs, called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET where the Java program, the JVM, and the hardware...

  4. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance is driven to its boundary of zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.
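    The boundary effect described here can be illustrated with a toy one-parameter analogue (not the oligogenic variance component model itself): when a parameter is constrained to be non-negative and its true value is zero, the likelihood-ratio statistic behaves like a 50:50 mixture of a point mass at zero and a chi-square with one degree of freedom, rather than a plain chi-square:

```python
import random

random.seed(1)
n_sims = 200_000
# One-sided MLE of a non-negative parameter whose true value is 0:
# the estimate is max(Z, 0), so the LRT statistic is max(Z, 0) ** 2.
lrt = [max(random.gauss(0.0, 1.0), 0.0) ** 2 for _ in range(n_sims)]
share_zero = sum(s == 0.0 for s in lrt) / n_sims
print(round(share_zero, 2))  # close to 0.50: half the null statistics are exactly 0
```

    Using plain chi-square(1) critical values against such a mixture misstates the significance of a lod score, which is why the empirical null distribution matters.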

  5. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing and it is important that gas distributors design new distribution systems to accommodate this growth, considering the financial constraints of the company, as well as local legislation and regulation. In this study some steps of developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models, as well as accurate connectivity and equipment attributes. GIS systems are often used as a repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model gathered from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  6. Heavy-tailed distribution of the SSH Brute-force attack duration in a multi-user environment

    Science.gov (United States)

    Lee, Jae-Kook; Kim, Sung-Jun; Park, Chan Yeol; Hong, Taeyoung; Chae, Huiseung

    2016-07-01

    Quite a number of cyber-attacks take place against supercomputers that provide high-performance computing (HPC) services to public researchers. In particular, although the secure shell protocol (SSH) brute-force attack is one of the traditional attack methods, it is still being used. Because stealth attacks that feign regular access may occur, they are even harder to detect. In this paper, we introduce methods to detect SSH brute-force attacks by analyzing the server's unsuccessful access logs and the firewall's drop events in a multi-user environment. Then, we analyze the durations of the SSH brute-force attacks that are detected by applying these methods. The results of an analysis of about 10 thousand attack source IP addresses show that the behaviors of abnormal users mounting SSH brute-force attacks follow the human-dynamics characteristics of a typical heavy-tailed distribution.
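    A quick way to see what "heavy-tailed" means for duration data is to compare extreme quantiles. The sketch below uses synthetic durations, not the paper's logs: a Pareto (power-law) sample stands in for the heavy-tailed attack durations, against an exponential sample with the same mean:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "attack durations": a Pareto (power-law) sample is heavy-tailed,
# an exponential sample with the same mean is not.
pareto = rng.pareto(a=1.5, size=50_000) + 1.0
expo = rng.exponential(scale=pareto.mean(), size=50_000)

def tail_ratio(x, q_lo=0.99, q_hi=0.999):
    """Ratio of extreme quantiles; heavy tails give much larger ratios."""
    return np.quantile(x, q_hi) / np.quantile(x, q_lo)

print(tail_ratio(pareto) > tail_ratio(expo))  # True: the power-law tail dominates
```

    On log-log axes the survival function of the Pareto sample is near-linear while the exponential one bends down sharply, which is the standard visual diagnosis for heavy tails.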

  7. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  8. Multiuser Diversity with Adaptive Modulation in Non-Identically Distributed Nakagami Fading Environments

    KAUST Repository

    Rao, Anlei

    2012-09-08

    In this paper, we analyze the performance of adaptive modulation with single-cell multiuser scheduling over independent but not identically distributed (i.n.i.d.) Nakagami fading channels. Closed-form expressions are derived for the average channel capacity, spectral efficiency, and bit-error-rate (BER) for both constant-power variable-rate and variable-power variable-rate uncoded/coded M-ary quadrature amplitude modulation (M-QAM) schemes. We also study the impact of time delay on the average BER of adaptive M-QAM. Selected numerical results show that multiuser diversity brings considerably better performance even over i.n.i.d. fading environments.
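    The multiuser diversity effect can be checked numerically. The Monte Carlo sketch below is illustrative, not the paper's closed-form analysis; the per-user Nakagami parameters and the SNR are invented example values. A Nakagami-m power gain is Gamma-distributed with shape m and scale Ω/m, and the scheduler serves whichever user has the best instantaneous channel:

```python
import numpy as np

rng = np.random.default_rng(7)
m = np.array([0.8, 1.5, 3.0])       # per-user Nakagami shape (i.n.i.d.)
omega = np.array([1.0, 2.0, 0.5])   # per-user mean power
snr = 10 ** (10.0 / 10)             # 10 dB average SNR

# Nakagami-m power gain is Gamma(shape=m, scale=omega/m); one row per trial.
gains = rng.gamma(shape=m, scale=omega / m, size=(200_000, 3))
per_user_capacity = np.log2(1 + snr * gains).mean(axis=0)          # bits/s/Hz
scheduled_capacity = np.log2(1 + snr * gains.max(axis=1)).mean()

print(scheduled_capacity > per_user_capacity.max())  # True: selection diversity gain
```

    The scheduled average capacity exceeds even the best individual user's, which is the multiuser diversity gain the abstract reports for i.n.i.d. channels.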

  9. Patient identity management for secondary use of biomedical research data in a distributed computing environment.

    Science.gov (United States)

    Nitzlnader, Michael; Schreier, Günter

    2014-01-01

    Dealing with data from different source domains is of increasing importance in today's large scale biomedical research endeavours. Within the European Network for Cancer research in Children and Adolescents (ENCCA) a solution to share such data for secondary use will be established. In this paper the solution arising from the aims of the ENCCA project and regulatory requirements concerning data protection and privacy is presented. Since the details of secondary biomedical dataset utilisation are often not known in advance, data protection regulations are met with an identity management concept that facilitates context-specific pseudonymisation and a way of data aggregation using a hidden reference table later on. Phonetic hashing is proposed to prevent duplicated patient registration and re-identification of patients is possible via a trusted third party only. Finally, the solution architecture allows for implementation in a distributed computing environment, including cloud-based elements.
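    The two mechanisms named above can be sketched in a few lines. This is a hedged illustration, not the ENCCA implementation: classic Soundex stands in for the unspecified phonetic hashing algorithm, and the keys and identifiers are invented:

```python
import hashlib
import hmac

def pseudonym(patient_id: str, context_key: bytes) -> str:
    """Context-specific pseudonym: different contexts yield unlinkable IDs."""
    return hmac.new(context_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

def soundex(name: str) -> str:
    """Classic Soundex: spelling variants of a name collide to one code."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4", **dict.fromkeys("mn", "5"),
             "r": "6"}
    name = name.lower()
    out, last = name[0].upper(), codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != last:
            out += code
        if ch not in "hw":
            last = code
    return (out + "000")[:4]

# Same patient, two research contexts -> two unlinkable pseudonyms.
assert pseudonym("PAT-001", b"trial-A") != pseudonym("PAT-001", b"trial-B")
# Spelling variants collide phonetically -> duplicate registration is flagged.
print(soundex("Meyer"), soundex("Maier"))
```

    Only a trusted third party holding the context keys (the "hidden reference table" role) could map pseudonyms back to patients, while the phonetic code catches re-registrations under name variants without storing the clear-text name.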

  10. Implementing GIS regression trees for generating the spatial distribution of copper in Mediterranean environments

    DEFF Research Database (Denmark)

    Bou Kheir, Rania; Greve, Mogens Humlekrog; Deroin, Jean-Paul

    2013-01-01

    Soil contamination by heavy metals has become a widespread and dangerous problem in many parts of the world, including the Mediterranean environments. This is closely related to the increased irrigation with waste waters, to the uncontrolled application of sewage sludge, industrial effluents, pesticides...... and fertilizers, to the rapid urbanization, to the atmospheric deposition of dust and aerosols, to vehicular emissions and to many other negative human activities. In this context, this paper predicts the spatial distribution and concentration level of copper (Cu) in the 195 km2 of the Nahr el-Jawz watershed......H, hydraulic conductivity, organic matter, stoniness ratio, soil depth, slope gradient, slope aspect, slope curvature, land cover/use, distance to drainage line, proximity to roads, nearness to cities, and surroundings to waste areas) were generated from satellite imageries, Digital Elevation Models (DEMs...

  11. MODELING OF WATER DISTRIBUTION SYSTEM PARAMETERS AND THEIR PARTICULAR IMPORTANCE IN ENVIRONMENT ENGINEERING PROCESSES

    Directory of Open Access Journals (Sweden)

    Agnieszka Trębicka

    2016-05-01

    The object of this study is to present a mathematical model of a water-supply network and an analysis of the basic parameters of a water distribution system with a digital model. The reference area is the village of Kleosin, municipality of Juchnowiec Kościelny in Podlaskie province, located at the border with Białystok. The study focused on the significance of every change related to the quality and quantity of water delivered to the WDS, through modeling the basic parameters of the water distribution system under different operating variants, in order to specify new, more rational modes of exploitation (a decrease in pressure value) and to define conditions for the development and modernization of the water-supply network, with special analysis of the scheme and identification of the most critical points in the network. The analyzed processes are based on reproducing and developing the existing state of the water distribution sub-system (the WDS) with the use of mathematical modeling that draws on the newest available computer techniques.
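    One of the basic parameters such a model computes is the head (pressure) loss along each pipe. The sketch below uses the standard SI form of the Hazen-Williams formula; the pipe dimensions and roughness coefficient are made-up example values, not data from the Kleosin network:

```python
def hazen_williams_headloss(q_m3s, length_m, diameter_m, c=130.0):
    """Head loss in metres of water column (SI Hazen-Williams form).

    q_m3s: flow in m^3/s; c: Hazen-Williams roughness coefficient.
    """
    return 10.67 * length_m * q_m3s ** 1.852 / (c ** 1.852 * diameter_m ** 4.8704)

# Example pipe: 500 m long, 200 mm diameter, carrying 50 L/s.
h = hazen_williams_headloss(q_m3s=0.05, length_m=500.0, diameter_m=0.2)
print(round(h, 2))  # metres of head lost over this pipe
```

    Summing such losses around loops and balancing flows at nodes is what a full network solver does for every operating variant the abstract mentions.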

  12. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  13. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  14. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  15. Relationships between the Raindrop Size Distribution and Properties of the Environment and Clouds Inferred from TRMM

    Science.gov (United States)

    Munchak, Stephen Joseph; Kummerow, Christian; Elsaesser, Gregory

    2013-01-01

    Variability in the raindrop size distribution (DSD) has long been recognized as a source of uncertainty in relationships between radar reflectivity Z and rain rate R. In this study, we analyze DSD retrievals from two years of data gathered by the Tropical Rainfall Measuring Mission (TRMM) satellite and processed with a combined radar-radiometer retrieval algorithm over the global oceans equatorward of 35°. Numerous variables describing properties of each reflectivity profile, large-scale organization, and the background environment are examined for relationships to the reflectivity-normalized median drop diameter, ε_DSD. In general, we find that higher freezing levels and relative humidities are associated with smaller ε_DSD. Within a given environment, the mesoscale organization of precipitation and the vertical profile of reflectivity are associated with DSD characteristics. In the tropics, the smallest ε_DSD values are found in large but shallow convective systems, where warm rain formation processes are thought to be predominant, whereas larger sizes are found in the stratiform regions of organized deep convection. In the extratropics, the largest ε_DSD values are found in the scattered convection that occurs when cold, dry continental air moves over the much warmer ocean after the passage of a cold front. The geographical distribution of the retrieved DSDs is consistent with many of the observed regional Z-R relationships found in the literature, as well as with discrepancies between the TRMM radar-only and radiometer-only precipitation products. In particular, mid-latitude and tropical regions near land tend to have larger drops for a given reflectivity, whereas the smallest drops are found in the eastern Pacific Intertropical Convergence Zone.
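    The practical impact of DSD variability on Z-R relationships is easy to quantify with a worked example. Empirical relations take the form Z = a·R^b; Marshall-Palmer (a = 200, b = 1.6) is the classic reference, while the second coefficient pair below is purely illustrative of a small-drop regime, not a value from this study:

```python
dbz = 40.0
z_linear = 10 ** (dbz / 10)          # reflectivity in mm^6 m^-3

def rain_rate(z, a, b):
    """Invert Z = a * R**b for the rain rate R in mm/h."""
    return (z / a) ** (1.0 / b)

r_mp = rain_rate(z_linear, a=200.0, b=1.6)          # Marshall-Palmer
r_small_drops = rain_rate(z_linear, a=120.0, b=1.4) # illustrative small-drop regime
print(round(r_mp, 1), round(r_small_drops, 1))
```

    The same 40 dBZ echo maps to a roughly twofold different rain rate under the two coefficient pairs, which is why regional DSD regimes show up as regional Z-R relationships.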

  16. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures....... For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent and organizational communications are highly affected by the environment. This entry examines the origin...... and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities....

  17. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. In the paper, fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being a manual or recipe for constructing such a model.

  18. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented in clinical practice at the Royal Hobart Hospital (RHH) since mid 2006 for treating patients with head and neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established on the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. Gamma distribution histograms were generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck is well under the normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive in identifying 'false fails' but can be further tightened up for smaller fields, while for the larger fields found in both H and N and prostate cases, the criteria were correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors in the variation in gamma distribution among clinical cases. A criterion derived from clinical statistics is superior to and more accurate than a single-valued criterion for IMRT QA acceptance procedures. (author)
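    For readers unfamiliar with the gamma index underlying such QA criteria, the following is a minimal 1-D sketch (the actual RHH analysis used MapCheck's 2-D implementation): each measured point searches the planned profile for the best combined dose-difference / distance-to-agreement score. The 3%/3 mm tolerances are a common clinical choice, and the dose profiles here are synthetic:

```python
import numpy as np

def gamma_1d(planned, measured, positions, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma index: min combined dose/distance score per measured point."""
    ref_dose = planned.max()
    gammas = []
    for xm, dm in zip(positions, measured):
        dd = (planned - dm) / (dose_tol * ref_dose)   # dose-difference term
        dx = (positions - xm) / dist_tol              # distance-to-agreement term
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)

x = np.linspace(0, 40, 81)                           # positions in mm
planned = np.exp(-((x - 20) / 8.0) ** 2)             # synthetic dose profile
measured = 1.01 * np.exp(-((x - 20.5) / 8.0) ** 2)   # 1% scaled, 0.5 mm shifted
g = gamma_1d(planned, measured, x)
pass_rate = (g <= 1.0).mean()
print(pass_rate)  # a small shift/scale passes comfortably at 3%/3 mm
```

    The per-case histogram of these gamma values is exactly the distribution the abstract fits with a positive-half normal model.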

  19. Temporal distribution of Pu and Am in the Marine Environment of Southern Coast of Spain

    Energy Technology Data Exchange (ETDEWEB)

    Iranzo, E.; Gasco, C.; Romero, L.; Martinez, A.; Mingarro, E.; Rivas, P.

    1989-07-01

    The distribution of transuranides and 137Cs in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb, surface contamination: 12-120 α Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between Cape of Palos and Cape of Gata, as well as surface sediments, were raised. The profiles of transuranides distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author)

  20. Temporal distribution of Pu and Am in the Marine Environment of Southern Coast of Spain

    International Nuclear Information System (INIS)

    Iranzo, E.; Gasco, C.; Romero, L.; Martinez, A.; Mingarro, E.; Rivas, P.

    1989-01-01

    The distribution of transuranides and 137Cs in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb, surface contamination: 12-120 α Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between Cape of Palos and Cape of Gata, as well as surface sediments, were raised. The profiles of transuranides distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author)

  1. Temporal distribution of Pu and Am in the marine environment of southern coast of Spain

    International Nuclear Information System (INIS)

    Iranzo, E.; Gascon, C.; Romero, L.; Martinez, M.; Mingarro, E.; Rivas, P.

    1989-01-01

    The distribution of transuranides and Cs-137 in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb, surface contamination: 12-120 alpha Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between Cape of Palos and Cape of Gata, as well as surface sediments, were raised. The profiles of transuranides distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author). 15 figs.; 27 refs

  2. Proceedings of the scientific meeting on 'behavior and distributions of trace substances in the environment'

    International Nuclear Information System (INIS)

    Fukui, M.; Matsuzuru, H.

    1998-02-01

    The scientific meeting was held at the Research Reactor Institute, Kyoto University on December 11-12, 1997. This single report covers all aspects of the behavior of trace substances such as pesticides/herbicides, organic chemicals and radionuclides in the environment. The purpose of the meeting was to describe the distribution and behavior of trace substances, with emphasis on the dynamic interaction between the soil-sediment-water system and the contaminants. The Chernobyl accident raised attention to the fate of radionuclides released into the environment and stimulated many scientists to carry out large-scale 'field experiments' without using a tracer in a laboratory. Of course, fundamental laboratory studies are necessary to give direction to, and to understand observations from, field studies. These activities have brought much knowledge and understanding towards revealing a part of the complexity of the transport processes. It is hoped that the assembled experts will not only dwell on distinct scientific issues, but also be able to draw firm conclusions with respect to the effective environmental management of the ecological aspects of hazardous materials. The 25 presented papers are indexed individually. (J.P.N.)

  3. Study on distribution and behavior of long-lived radionuclides in surface soil environment

    International Nuclear Information System (INIS)

    Morita, Shigemitsu; Watanabe, Hitoshi; Katagiri, Hiromi; Akatsu, Yasuo; Ishiguro, Hideharu

    1996-01-01

    Technetium-99 ( 99 Tc) and Neptunium-237 ( 237 Np) are important radionuclides for environmental assessment around nuclear fuel cycle facilities, because they have long half-lives and relatively high mobility in the environment. Therefore, we have studied the determination, distribution and behavior of such long-lived radionuclides in the surface soil environment. A new analytical technique using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) was applied to the determination of long-lived radionuclides in environmental samples. The determination method consists of dry ashing, anion exchange and solvent extraction to eliminate the interfering elements, followed by ICP-MS measurement. The sensitivity of this method was 10 to 100,000 times higher, and the counting time 300 to 100,000 times shorter, than the conventional radioanalytical methods. Soil samples were collected at nine points, and a core soil sample was collected with an electric core sampler at one point. The core soil sample was divided into eight layers. The depth profiles showed that more than 90% of 99 Tc and 237 Np were retained in the surface layer, up to 10 cm in depth, which contained a large amount of organic material. The results suggest that the content of organic material in soil is related to the adsorption of 99 Tc and 237 Np onto soil. (author)

  4. Distribution and behavior of tritium in the environment near its sources

    International Nuclear Information System (INIS)

    Fukui, Masami

    1994-01-01

    Of the long-lived radionuclides migrating globally through the environment, tritium was investigated because it is a latent source of detectable pollution in the Kyoto University Research Reactor Institute (KURRI). Moreover, mass transport in an in situ situation could be studied as well. Discharge rates of tritium from the operation of reactors and reprocessing plants throughout the world are given, together with data on the background level of tritium in the environment and the resulting collective effective dose equivalent commitment. The behavior of tritium that has leaked from a heavy water facility in the Kyoto University Research Reactor (KURR) is dealt with by modeling its transport between air, water, and building materials. The spatial distributions of tritium in the air within the facilities of the KURR and KUCA (Kyoto University Critical Assembly), as measured by a convenient monitoring method that uses small water basins as passive samplers, are also shown. Tritium discharged as liquid effluents into a reservoir for monitoring and a retention pond at the KURRI site was monitored, and its behavior clarified by use of a box model and/or a classical dispersion equation. The purpose of this research is to keep radiation exposure to workers and the public as low as reasonably achievable, taking into account economic and social considerations (the ALARA concept). (author)

  5. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost-effective transportation and logistics play a vital role in the supply chains of the modern world's manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems.

  6. Scientific Visualization for Atmospheric Data Analysis in Collaborative Virtual Environments

    Science.gov (United States)

    Engelke, Wito; Flatken, Markus; Garcia, Arturo S.; Bar, Christian; Gerndt, Andreas

    2016-04-01

    1 INTRODUCTION The three-year European research project CROSS DRIVE (Collaborative Rover Operations and Planetary Science Analysis System based on Distributed Remote and Interactive Virtual Environments) started in January 2014. The research and development within this project is motivated by three use case studies: landing site characterization, atmospheric science and rover target selection [1]. Currently the implementation for the second use case is in its final phase [2]. Here, the requirements were generated based on the domain experts' input and led to the development and integration of appropriate methods for visualization and analysis of atmospheric data. The methods range from volume rendering, interactive slicing and iso-surface techniques to interactive probing. All visualization methods are integrated into DLR's Terrain Rendering application. With this, the high-resolution surface data visualization can be enriched with additional methods appropriate for atmospheric data sets. The result is an integrated virtual environment in which scientists can interactively explore their data sets directly within the correct context. The data sets include volumetric data of the Martian atmosphere, precomputed two-dimensional maps and vertical profiles. In most cases the surface data as well as the atmospheric data have global coverage and are time dependent. Furthermore, all interaction is synchronized between different connected application instances, allowing for collaborative sessions between distant experts. 2 VISUALIZATION TECHNIQUES Although the application is currently used for visualization of data sets related to Mars, the techniques can be applied to other data sets as well. Currently the prototype is capable of handling 2D and 2.5D surface data as well as 4D atmospheric data. 
Specifically, the surface data is presented using an LoD approach which is based on the HEALPix tessellation of a sphere [3, 4, 5] and can handle data sets in the order of

  7. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  8. The Usability Analysis of An E-Learning Environment

    Directory of Open Access Journals (Sweden)

    Fulya TORUN

    2015-10-01

    Full Text Available In this research, an e-learning environment was developed for teacher candidates taking the course on Scientific Research Methods. The course contents were adapted to one of the constructivist approach models, referred to as 5E, and expert opinion was obtained on the compliance of this model. A usability analysis was also performed to determine the usability of the e-learning environment. The participants of the research comprised 42 teacher candidates. The mixed method was used in the research. Three different data collection tools were used in order to measure the three basic dimensions of usability analysis: effectiveness, efficiency and satisfaction. Two of the data collection tools were scales developed by other researchers and were applied with the approval of the researchers involved. The third, a usability test, was prepared by the researchers who conducted this study to determine the participants’ success in handling the twelve tasks assigned to them with respect to the use of the e-learning environment, the seconds they spent in that environment and the number of clicks they performed. Based on the analyses of the data obtained, the usability of the developed e-learning environment proved to be at a high level.

  9. Balancing Uplink and Downlink under Asymmetric Traffic Environments Using Distributed Receive Antennas

    Science.gov (United States)

    Sohn, Illsoo; Lee, Byong Ok; Lee, Kwang Bok

    Recently, multimedia services are increasing with the widespread use of various wireless applications such as web browsers, real-time video, and interactive games, which results in traffic asymmetry between the uplink and downlink. Hence, time division duplex (TDD) systems, which provide advantages in efficient bandwidth utilization under asymmetric traffic environments, have become one of the most important issues in future mobile cellular systems. It is known that two types of intercell interference, referred to as crossed-slot interference, additionally arise in TDD systems; the performances of the uplink and downlink transmissions are degraded by BS-to-BS crossed-slot interference and MS-to-MS crossed-slot interference, respectively. The resulting performance unbalance between the uplink and downlink makes network deployment severely inefficient. Previous works have proposed intelligent time slot allocation algorithms to mitigate the crossed-slot interference problem. However, they require centralized control, which causes large signaling overhead in the network. In this paper, we propose to change the shape of the cellular structure itself. The conventional cellular structure is easily transformed into the proposed cellular structure with distributed receive antennas (DRAs). We set up a statistical Markov chain traffic model and analyze the bit error performances of the conventional cellular structure and proposed cellular structure under asymmetric traffic environments. Numerical results show that the uplink and downlink performances of the proposed cellular structure become balanced with the proper number of DRAs and thus the proposed cellular structure is notably cost-effective in network deployment compared to the conventional cellular structure. As a result, extending the conventional cellular structure into the proposed cellular structure with DRAs is a remarkably cost-effective solution to support asymmetric traffic environments in future mobile cellular

  10. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    International Nuclear Information System (INIS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-01-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built in a way that is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.

  11. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built in a way that is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
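    The schema-mapping idea behind such a synchronization layer can be sketched in a few lines (the field names and helper functions below are illustrative placeholders, not the actual GOC-TX API):

    ```python
    # Per-system field maps translate a native ticket into a common schema,
    # so one sync loop can bridge any pair of web-service ticketing backends.
    # The field names here are hypothetical, chosen only for illustration.
    FIELD_MAPS = {
        "ggus": {"ticket_id": "request_id", "summary": "subject", "status": "state"},
        "rt":   {"ticket_id": "id",         "summary": "Subject", "status": "Status"},
    }

    def to_common(system, native):
        """Map a native ticket dict into the common schema."""
        return {field: native[theirs] for field, theirs in FIELD_MAPS[system].items()}

    def from_common(system, common):
        """Map a common-schema ticket back into a system's native fields."""
        return {theirs: common[field] for field, theirs in FIELD_MAPS[system].items()}

    # A GGUS update propagated into RT without any email parsing
    ggus_ticket = {"request_id": "98765", "subject": "CE down", "state": "open"}
    rt_ticket = from_common("rt", to_common("ggus", ggus_ticket))
    ```

    Adding support for another ticketing backend then only requires one more entry in the field map, which is what makes such an interface easy to extend.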

  12. A Distributed Control System Prototyping Environment to Support Control Room Modernization

    Energy Technology Data Exchange (ETDEWEB)

    Lew, Roger Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas Anthony [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs; however, the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft’s Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glass-top digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design: it provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS)-specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently place less emphasis on providing functionality to support novel interaction paradigms. Because of WPF’s large user base, Microsoft can provide an extremely mature tool. Within process control applications, WPF is

  13. Distribution of chlorpyrifos in rice paddy environment and its potential dietary risk.

    Science.gov (United States)

    Fu, Yan; Liu, Feifei; Zhao, Chenglin; Zhao, Ying; Liu, Yihua; Zhu, Guonian

    2015-09-01

    Chlorpyrifos is one of the most extensively used insecticides in China. The distribution and residues of chlorpyrifos in a paddy environment were characterized under field and laboratory conditions. The half-lives of chlorpyrifos in the two conditions were 0.9-3.8 days (field) and 2.8-10.3 days (laboratory), respectively. The initial distribution of chlorpyrifos followed an increasing order beginning with water. The dietary intake of chlorpyrifos due to rice consumption was rather low compared to the acceptable daily intake (ADI = 0.01 mg/kg bw). The chronic exposure risk from chlorpyrifos in rice grain was 5.90% and 1.30% of the ADI from field and laboratory results, respectively. Concerning acute dietary exposure, the intake estimated for the highest chlorpyrifos level did not exceed the acute reference dose (ARfD = 0.1 mg/kg bw). The estimated short-term intakes (ESTIs) were 0.78% and 0.25% of the ARfD for chlorpyrifos. The results showed that the use of chlorpyrifos in rice paddies was fairly safe with respect to consumption of rice grain by consumers. Copyright © 2015. Published by Elsevier B.V.
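    The risk percentages quoted above are simple ratios of estimated intake to a toxicological limit. A minimal sketch of that arithmetic (the intake values below are back-calculated from the reported percentages for illustration, not taken from the paper's tables):

    ```python
    def percent_of_limit(intake_mg_per_kg_bw, limit_mg_per_kg_bw):
        """Express an estimated daily intake as a percentage of a toxicological
        limit such as the ADI (chronic) or the ARfD (acute)."""
        return 100.0 * intake_mg_per_kg_bw / limit_mg_per_kg_bw

    ADI, ARFD = 0.01, 0.1  # mg/kg bw, the chlorpyrifos limits cited in the abstract

    # A chronic intake of 0.00059 mg/kg bw/day corresponds to 5.9% of the ADI,
    # matching the reported field result; 0.00078 mg/kg bw corresponds to the
    # reported 0.78% of the ARfD for short-term exposure.
    chronic_pct = percent_of_limit(0.00059, ADI)
    acute_pct = percent_of_limit(0.00078, ARFD)
    ```

    Values well below 100% of the ADI/ARfD are the basis for the abstract's conclusion that the residues are fairly safe for consumers.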

  14. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    Science.gov (United States)

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    Increase in the complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets amongst multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of overall simulation time for neuronal networks. In NEURON, the Message Passing Interface (MPI) is used for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Although increasing the number of processors yields concurrency and better performance, it inversely affects MPI_Allgather, which increases communication time between processors. This necessitates improving the communication methodology to decrease spike-exchange time over distributed memory systems. This work improves the MPI_Allgather method using Remote Memory Access (RMA) by moving two-sided communication to one-sided communication; the use of a recursive doubling mechanism facilitates efficient communication between processors in precise steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulation of large neuronal network models.
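    The recursive doubling pattern referred to above can be illustrated with a small pure-Python simulation (the function and buffer names are hypothetical; a real implementation would use one-sided MPI RMA operations rather than in-process dictionaries):

    ```python
    def recursive_doubling_allgather(blocks):
        """Simulate an allgather over p ranks using recursive doubling.

        Each rank starts with only its own spike block; in each round every
        rank exchanges everything it has with the partner whose rank differs
        in one bit (rank XOR step). Assumes p is a power of two.
        """
        p = len(blocks)
        assert p > 0 and p & (p - 1) == 0, "simulation assumes a power-of-two rank count"
        # buf[r] maps originating rank -> block currently known to rank r
        buf = [{r: blocks[r]} for r in range(p)]
        step = 1
        while step < p:
            snapshot = [dict(b) for b in buf]        # the pairwise exchange is simultaneous
            for rank in range(p):
                partner = rank ^ step                # XOR picks this round's partner
                buf[rank].update(snapshot[partner])  # receive partner's accumulated blocks
            step <<= 1
        return [[b[r] for r in sorted(b)] for b in buf]

    # With 8 ranks, every rank holds all 8 blocks after only 3 exchange rounds
    result = recursive_doubling_allgather(list("abcdefgh"))
    ```

    Each round doubles the amount of data a rank holds, so p ranks finish in log2(p) rounds instead of the p-1 steps of a naive ring exchange, which is the source of the communication savings claimed in the abstract.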

  15. Inputs and distributions of synthetic musk fragrances in an estuarine and coastal environment; a case study

    Energy Technology Data Exchange (ETDEWEB)

    Sumner, Nicola R. [School of Earth, Ocean and Environmental Science, University of Plymouth, Drake Circus, Plymouth PL4 8AA (United Kingdom); Plymouth Marine Laboratory, Prospect Place, The Hoe, Plymouth PL1 3DH (United Kingdom); Guitart, Carlos, E-mail: guitart.carlos@gmail.co [Plymouth Marine Laboratory, Prospect Place, The Hoe, Plymouth PL1 3DH (United Kingdom); Fuentes, Gustavo [Instituto Universitario de Tecnologia del Mar (IUTEMAR), Fundacion La Salle de Ciencias Naturales, Margarita Island (Venezuela, Bolivarian Republic of); Readman, James W. [Plymouth Marine Laboratory, Prospect Place, The Hoe, Plymouth PL1 3DH (United Kingdom)

    2010-01-15

    Synthetic musks are ubiquitous contaminants in the environment. Compartmental distributions (dissolved, suspended particle associated and sedimentary) of the compounds throughout an axial estuarine transect and in coastal waters are reported. High concentrations of Galaxolide (HHCB) and Tonalide (AHTN) (987-2098 ng/L and 55-159 ng/L, respectively) were encountered in final effluent samples from sewage treatment plants (STPs) discharging into the Tamar and Plym Estuaries (UK), with lower concentrations of Celestolide (ADBI) (4-13 ng/L), Phantolide (AHMI) (6-9 ng/L), musk xylene (MX) (4-7 ng/L) and musk ketone (MK) (18-30 ng/L). Rapid dilution from the outfalls is demonstrated with resulting concentrations of HHCB spanning from 5 to 30 ng/L and those for AHTN from 3 to 15 ng/L. The other musks were generally not detected in the estuarine and coastal waters. The suspended particulate matter (SPM) and sedimentary profiles and compositions (HHCB:AHTN ratios) generally reflect the distribution in the water column with highest concentrations adjacent to sewage outfalls. - Synthetic musks were determined in coastal environmental compartments along an estuarine transect indicating their ubiquitous occurrence in transitional waters.

  16. Meteorological and Land Surface Properties Impacting Sea Breeze Extent and Aerosol Distribution in a Dry Environment

    Science.gov (United States)

    Igel, Adele L.; van den Heever, Susan C.; Johnson, Jill S.

    2018-01-01

    The properties of sea breeze circulations are influenced by a variety of meteorological and geophysical factors that interact with one another. These circulations can redistribute aerosol particles and pollution and therefore can play an important role in local air quality, as well as impact remote sensing. In this study, we select 11 factors that have the potential to impact either the sea breeze circulation properties and/or the spatial distribution of aerosols. Simulations are run to identify which of the 11 factors have the largest influence on the sea breeze properties and aerosol concentrations and to subsequently understand the mean response of these variables to the selected factors. All simulations are designed to be representative of conditions in coastal subtropical environments and are thus relatively dry; as such, they do not support deep convection associated with the sea breeze front. For this dry sea breeze regime, we find that the background wind speed was the most influential factor for sea breeze propagation, with the soil saturation fraction also being important. For the spatial aerosol distribution, the most important factors were the soil moisture, the sea-air temperature difference, and the initial boundary layer height. The importance of these factors seems to be strongly tied to the development of the surface-based mixed layer both ahead of and behind the sea breeze front. This study highlights potential avenues for further research regarding sea breeze dynamics and the impact of sea breeze circulations on pollution dispersion and remote sensing algorithms.

  17. Inputs and distributions of synthetic musk fragrances in an estuarine and coastal environment; a case study

    International Nuclear Information System (INIS)

    Sumner, Nicola R.; Guitart, Carlos; Fuentes, Gustavo; Readman, James W.

    2010-01-01

    Synthetic musks are ubiquitous contaminants in the environment. Compartmental distributions (dissolved, suspended particle associated and sedimentary) of the compounds throughout an axial estuarine transect and in coastal waters are reported. High concentrations of Galaxolide (HHCB) and Tonalide (AHTN) (987-2098 ng/L and 55-159 ng/L, respectively) were encountered in final effluent samples from sewage treatment plants (STPs) discharging into the Tamar and Plym Estuaries (UK), with lower concentrations of Celestolide (ADBI) (4-13 ng/L), Phantolide (AHMI) (6-9 ng/L), musk xylene (MX) (4-7 ng/L) and musk ketone (MK) (18-30 ng/L). Rapid dilution from the outfalls is demonstrated with resulting concentrations of HHCB spanning from 5 to 30 ng/L and those for AHTN from 3 to 15 ng/L. The other musks were generally not detected in the estuarine and coastal waters. The suspended particulate matter (SPM) and sedimentary profiles and compositions (HHCB:AHTN ratios) generally reflect the distribution in the water column with highest concentrations adjacent to sewage outfalls. - Synthetic musks were determined in coastal environmental compartments along an estuarine transect indicating their ubiquitous occurrence in transitional waters.

  18. DLTAP: A Network-efficient Scheduling Method for Distributed Deep Learning Workload in Containerized Cluster Environment

    Directory of Open Access Journals (Sweden)

    Qiao Wei

    2017-01-01

    Full Text Available Deep neural networks (DNNs) have recently yielded strong results on a range of applications. Training these DNNs using a cluster of commodity machines is a promising approach since training is time consuming and compute-intensive. Furthermore, putting DNN tasks into containers on clusters would enable broader and easier deployment of DNN-based algorithms. Toward this end, this paper addresses the problem of scheduling DNN tasks in a containerized cluster environment. Efficiently scheduling data-parallel computation jobs like DNN training over containerized clusters is critical for job performance, system throughput, and resource utilization, and it becomes even more challenging with complex workloads. We propose a scheduling method called Deep Learning Task Allocation Priority (DLTAP), which makes scheduling decisions in a distributed manner; each decision takes the aggregation degree of parameter-server and worker tasks into account, in particular to reduce cross-node network transmission traffic and, correspondingly, decrease DNN training time. We evaluate the DLTAP scheduling method using a state-of-the-art distributed DNN training framework on 3 benchmarks. The results show that the proposed method reduces cross-node network traffic by 12% on average and decreases DNN training time even on a cluster of low-end servers.
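    The aggregation-degree idea can be sketched as a greedy placement rule (all names and data structures below are hypothetical illustrations; the abstract does not specify DLTAP's actual priority function):

    ```python
    def pick_node(task, capacity, placements):
        """Place a task on the feasible node that already hosts the most tasks
        of the same DNN job (its aggregation degree). Co-locating a job's
        parameter-server and worker tasks keeps gradient/parameter exchange
        on-node and reduces cross-node network traffic."""
        def aggregation_degree(node):
            return sum(1 for t in placements[node] if t["job"] == task["job"])

        feasible = [n for n in capacity if capacity[n] > 0]
        best = max(feasible, key=aggregation_degree)
        placements[best].append(task)
        capacity[best] -= 1
        return best

    # A worker task joins its job's parameter server already running on node-a,
    # so the parameter traffic between them never crosses the network.
    capacity = {"node-a": 2, "node-b": 3}
    placements = {"node-a": [{"job": "resnet", "role": "ps"}], "node-b": []}
    chosen = pick_node({"job": "resnet", "role": "worker"}, capacity, placements)
    ```

    A production scheduler would combine this score with resource fit and fairness terms, but the sketch shows why favoring aggregation degree shortens training time on commodity networks.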

  19. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems

    Directory of Open Access Journals (Sweden)

    Danish Shehzad

    2016-01-01

    Full Text Available Increase in the complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets amongst multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of overall simulation time for neuronal networks. In NEURON, the Message Passing Interface (MPI) is used for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Although increasing the number of processors yields concurrency and better performance, it inversely affects MPI_Allgather, which increases communication time between processors. This necessitates improving the communication methodology to decrease spike-exchange time over distributed memory systems. This work improves the MPI_Allgather method using Remote Memory Access (RMA) by moving two-sided communication to one-sided communication; the use of a recursive doubling mechanism facilitates efficient communication between processors in precise steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulation of large neuronal network models.

  20. SoS Notebook: An Interactive Multi-Language Data Analysis Environment.

    Science.gov (United States)

    Peng, Bo; Wang, Gao; Ma, Jun; Leong, Man Chong; Wakefield, Chris; Melott, James; Chiu, Yulun; Du, Di; Weinstein, John N

    2018-05-22

    Complex bioinformatic data analysis workflows involving multiple scripts in different languages can be difficult to consolidate, share, and reproduce. An environment that streamlines the entire process of data collection, analysis, visualization and reporting of such multi-language analyses is currently lacking. We developed Script of Scripts (SoS) Notebook, a web-based notebook environment that allows the use of multiple scripting languages in a single notebook, with data flowing freely within and across languages. SoS Notebook enables researchers to perform sophisticated bioinformatic analyses using the most suitable tools for different parts of the workflow, without the limitations of a particular language or the complications of cross-language communication. SoS Notebook is hosted at http://vatlab.github.io/SoS/ and is distributed under a BSD license. bpeng@mdanderson.org.

  1. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments, which has more recently been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analysis is that the organizational aspects of the courses are found to be related to the way in which the input elements for the teaching and learning process are constructed.

  2. Decision analysis for cleanup strategies in an urban environment

    International Nuclear Information System (INIS)

    Sinkko, K.; Ikaeheimonen, T.K.

    2000-01-01

    The societal values entering into decisions on protective actions are multidimensional. People have strong feelings and beliefs about these values, some of which are not numerically quantified and do not exist in monetary form. Decision analysis is applied to planning the recovery operations to clean up an urban environment in the event of a hypothetical nuclear power plant accident, assisting in rendering explicit and apparent all factors involved and in evaluating their relative importance. (author)

  3. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    This presentation describes the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.

  4. Thermal-hydraulic analysis of Ignalina NPP compartments response to group distribution header rupture using RALOC4 code

    International Nuclear Information System (INIS)

    Urbonavicius, E.

    2000-01-01

    The Accident Localisation System (ALS) of Ignalina NPP is a containment of the pressure suppression type designed to protect the environment from the dangerous impact of radioactivity. Failure of the ALS could lead to contamination of the environment, and the prescribed public radiation doses could be exceeded. The purpose of the presented analysis is to perform a long-term thermal-hydraulic analysis of the compartments' response to a Group Distribution Header rupture and to verify that the design pressure values are not exceeded. (authors)

  5. EXPLORATORY ANALYSIS OF ORGANIZATIONAL CULTURE IN GALATI COUNTY BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2014-06-01

    Full Text Available Based on data collected as part of the COMOR Project, developed by The Scientific Society of Management from Romania for the analysis of organizational culture in the Romanian business environment, we have initiated an exploration using Business Intelligence tools. The purpose of this analysis is to find relevant information about organizational culture in Galati County, starting from the results obtained in this project, and, using data mining techniques, to investigate relationships and links between responses to different survey questions in this "mine" of data gathered through the project team's effort.

  6. Evaluation of Occupational Cold Environments: Field Measurements and Subjective Analysis

    Science.gov (United States)

    OLIVEIRA, A. Virgílio M.; GASPAR, Adélio R.; RAIMUNDO, António M.; QUINTELA, Divo A.

    2014-01-01

    The present work is dedicated to the study of occupational cold environments in food distribution industrial units. Field measurements and a subjective assessment based on an individual questionnaire were considered. The survey was carried out in 5 Portuguese companies. The field measurements include 26 workplaces, while a sample of 160 responses was considered for the subjective assessment. In order to characterize the level of cold exposure, the Required Clothing Insulation Index (IREQ) was adopted. The IREQ index highlights that in the majority of the workplaces the clothing ensembles worn are inadequate, namely in the freezing chambers where the protection provided by clothing is always insufficient. The questionnaires results show that the food distribution sector is characterized by a female population (70.6%), by a young work force (60.7% are less than 35 yr old) and by a population with a medium-length professional career (80.1% in this occupation for less than 10 yr). The incidence of health effects which is higher among women, the distribution of protective clothing (50.0% of the workers indicate one garment) and the significant percentage of workers (>75%) that has more difficulties in performing the activity during the winter represent other important results of the present study. PMID:24583510

  8. Respiratory quinones in Archaea: phylogenetic distribution and application as biomarkers in the marine environment.

    Science.gov (United States)

    Elling, Felix J; Becker, Kevin W; Könneke, Martin; Schröder, Jan M; Kellermann, Matthias Y; Thomm, Michael; Hinrichs, Kai-Uwe

    2016-02-01

    The distribution of respiratory quinone electron carriers among cultivated organisms provides clues on both the taxonomy of their producers and the redox processes these are mediating. Our study of the quinone inventories of 25 archaeal species belonging to the phyla Eury-, Cren- and Thaumarchaeota facilitates their use as chemotaxonomic markers for ecologically important archaeal clades. Saturated and monounsaturated menaquinones with six isoprenoid units forming the alkyl chain may serve as chemotaxonomic markers for Thaumarchaeota. Other diagnostic biomarkers are thiophene-bearing quinones for Sulfolobales and methanophenazines as functional quinone analogues of the Methanosarcinales. The ubiquity of saturated menaquinones in the Archaea in comparison to Bacteria suggests that these compounds may represent an ancestral and diagnostic feature of the Archaea. Overlap between quinone compositions of distinct thermophilic and halophilic archaea and bacteria may indicate lateral gene transfer. The biomarker potential of thaumarchaeal quinones was demonstrated using a water column profile of the Black Sea. Both thaumarchaeal quinones and membrane lipids showed similar distributions with maxima at the chemocline. Quinone distributions indicate that Thaumarchaeota dominate respiratory activity at a narrow interval in the chemocline, while they contribute only 9% to the microbial biomass at this depth, as determined by membrane lipid analysis. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.

  9. Spatial and historical distribution of organic phosphorus driven by environment conditions in lake sediments.

    Science.gov (United States)

    Lü, Changwei; He, Jiang; Wang, Bing

    2018-02-01

    The chemistry of sedimentary organic phosphorus (OP) and its fraction distribution in sediments are greatly influenced by environmental conditions such as terrestrial inputs and runoffs. The linkage of OP with environmental conditions was analyzed on the basis of OP spatial and historical distributions in lake sediments. The redundancy analysis and OP spatial distribution results suggested that both NaOH-OP (OP extracted by NaOH) and Re-OP (residual OP) in surface sediments from the selected 13 lakes reflected the gradient effects of environmental conditions and the autochthonous and/or allochthonous inputs driven by latitude zonality in China. The lake level and salinity of Lake Hulun and the runoff and precipitation of its drainage basin were reconstructed on the basis of the geochemistry index. This work showed that a gradient in weather conditions presented by the latitude zonality in China impacts the OP accumulation through multiple drivers and in many ways. The drivers are mainly precipitation and temperature, governing organic matter (OM) production, degradation rate and transportation in the watershed. Over a long temporal dimension (4000 years), the vertical distributions of Re-OP and NaOH-OP based on a dated sediment profile from HLH were largely regulated by the autochthonous and/or allochthonous inputs, which depended on the environmental and climate conditions and anthropogenic activities in the drainage basin. This work provides useful environmental geochemistry information to understand the inherent linkage of OP fractionation with environmental conditions and lake evolution. Copyright © 2017. Published by Elsevier B.V.

  10. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is a programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface through which users can fairly execute requested tasks with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multi-user workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework substantially improves time performance compared with the original package.
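The MapReduce programming model the record builds on can be illustrated with a minimal in-memory sketch (not the Hadoop API, and not MC-Framework itself): a mapper emits key/value pairs, the framework groups them by key, and a reducer aggregates each group.

```python
from collections import defaultdict

def map_phase(records, mapper):
    # Apply the user-supplied mapper to every input record;
    # each call emits a list of (key, value) pairs.
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def shuffle(pairs):
    # Group emitted values by key (Hadoop does this between map and reduce).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    # Apply the user-supplied reducer to each key's list of values.
    return {key: reducer(key, values) for key, values in groups.items()}

# Word count, the canonical MapReduce example
lines = ["a b a", "b c"]
counts = reduce_phase(
    shuffle(map_phase(lines, lambda line: [(w, 1) for w in line.split()])),
    lambda key, values: sum(values),
)
# counts == {'a': 2, 'b': 2, 'c': 1}
```

In a real cluster the map and reduce calls run in parallel across machines; the sequential loops here only show the data flow.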

  11. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
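The polynomial method mentioned in the abstract can be illustrated as repeated convolution of elemental isotope-abundance vectors binned at 1 Da. The sketch below is not MIDAs's implementation; the isotope abundances are standard natural values, rounded:

```python
import numpy as np

# Each element's isotope abundances act as polynomial coefficients indexed by
# nominal mass offset from the lightest isotope; raising the polynomial to the
# atom count is repeated convolution.
ISOTOPES = {
    "C": [0.9893, 0.0107],             # 12C, 13C
    "H": [0.999885, 0.000115],         # 1H, 2H
    "N": [0.99636, 0.00364],           # 14N, 15N
    "O": [0.99757, 0.00038, 0.00205],  # 16O, 17O, 18O
}

def isotopic_distribution(formula):
    """Coarse-grained isotopic distribution for a formula given as
    a dict of element -> atom count, e.g. {'C': 6, 'H': 12, 'O': 6}."""
    dist = np.array([1.0])
    for elem, count in formula.items():
        single = np.array(ISOTOPES[elem])
        for _ in range(count):
            dist = np.convolve(dist, single)
    return dist / dist.sum()

# Glucose C6H12O6: relative abundances of the M+0, M+1, M+2, ... peaks
d = isotopic_distribution({"C": 6, "H": 12, "O": 6})
```

For fine-grained (high-resolution) distributions the bins must resolve exact isotope masses rather than 1 Da offsets, which is where the more elaborate algorithms benchmarked in the paper come in.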

  12. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    International Nuclear Information System (INIS)

    Read, A; Taga, A; O-Saada, F; Pajchel, K; Samset, B H; Cameron, D

    2008-01-01

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation

  13. Complete distributed computing environment for a HEP experiment: experience with ARC-connected infrastructure for ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Read, A; Taga, A; O-Saada, F; Pajchel, K; Samset, B H; Cameron, D [Department of Physics, University of Oslo, P.b. 1048 Blindern, N-0316 Oslo (Norway)], E-mail: a.l.read@fys.uio.no

    2008-07-15

    Computing and storage resources connected by the Nordugrid ARC middleware in the Nordic countries, Switzerland and Slovenia are a part of the ATLAS computing Grid. This infrastructure is being commissioned with the ongoing ATLAS Monte Carlo simulation production in preparation for the commencement of data taking in 2008. The unique non-intrusive architecture of ARC, its straightforward interplay with the ATLAS Production System via the Dulcinea executor, and its performance during the commissioning exercise is described. ARC support for flexible and powerful end-user analysis within the GANGA distributed analysis framework is also shown. Whereas the storage solution for this Grid was earlier based on a large, distributed collection of GridFTP-servers, the ATLAS computing design includes a structured SRM-based system with a limited number of storage endpoints. The characteristics, integration and performance of the old and new storage solutions are presented. Although the hardware resources in this Grid are quite modest, it has provided more than double the agreed contribution to the ATLAS production with an efficiency above 95% during long periods of stable operation.

  14. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    Directory of Open Access Journals (Sweden)

    Qi Liu

    2016-08-01

    Full Text Available Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates the allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were examined, and a detailed analysis report was made. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
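The abstract does not spell out the TPR method's details. One common reading of two-phase regression, sketched below under that assumption, fits two linear segments with a breakpoint chosen to minimize total squared error; function names are illustrative:

```python
import numpy as np

def two_phase_fit(x, y):
    """Fit y as piecewise-linear in x with one breakpoint.
    Tries each interior split and keeps the one with the lowest SSE."""
    order = np.argsort(x)
    x, y = np.asarray(x, float)[order], np.asarray(y, float)[order]
    best = None
    for k in range(2, len(x) - 2):  # at least 2 points per phase
        fit1 = np.polyfit(x[:k], y[:k], 1)   # phase-1 line
        fit2 = np.polyfit(x[k:], y[k:], 1)   # phase-2 line
        sse = (np.sum((np.polyval(fit1, x[:k]) - y[:k]) ** 2)
               + np.sum((np.polyval(fit2, x[k:]) - y[k:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, x[k], fit1, fit2)
    return best  # (sse, breakpoint, phase-1 coeffs, phase-2 coeffs)

def predict(model, xq):
    """Predict finishing-time values using the fitted two-phase model."""
    _, bp, fit1, fit2 = model
    xq = np.asarray(xq, float)
    return np.where(xq < bp, np.polyval(fit1, xq), np.polyval(fit2, xq))
```

In the task-estimation setting, x would be elapsed progress (or input size) and y the observed task time; the two phases capture a regime change such as map versus reduce behaviour.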

  15. Establishing Functional Relationships between Abiotic Environment, Macrophyte Coverage, Resource Gradients and the Distribution of Mytilus trossulus in a Brackish Non-Tidal Environment.

    Directory of Open Access Journals (Sweden)

    Jonne Kotta

    Full Text Available Benthic suspension-feeding mussels are an important functional guild in coastal and estuarine ecosystems. To date we lack information on how various environmental gradients and biotic interactions separately and interactively shape the distribution patterns of mussels in non-tidal environments. In contrast to tidal environments, mussels inhabit solely the subtidal zone in non-tidal waterbodies, and thereby the driving factors for mussel populations are expected to differ from those in tidal areas. In the present study, we used boosted regression tree modelling (BRT), an ensemble method combining statistical techniques and machine learning, in order to explain the distribution and biomass of the suspension-feeding mussel Mytilus trossulus in the non-tidal Baltic Sea. BRT models suggested that (1) distribution patterns of M. trossulus are largely driven by separate effects of direct environmental gradients and partly by interactive effects of resource gradients with direct environmental gradients. (2) Within its suitable habitat range, however, resource gradients had an important role in shaping the biomass distribution of M. trossulus. (3) Contrary to tidal areas, mussels were not competitively superior over macrophytes, with patterns indicating either facilitative interactions between mussels and macrophytes or co-variance due to a common stressor. To conclude, direct environmental gradients seem to define the distribution pattern of M. trossulus, and within the favourable distribution range, resource gradients in interaction with direct environmental gradients are expected to set the biomass level of mussels.

  16. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  17. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    OpenAIRE

    S. Rahmani; M. Omidvari

    2016-01-01

    Introduction: Electrical industries are among high-risk industries. The present study aimed to assess safety risk in electricity distribution processes using the ET&BA technique and to compare it with both the VIKOR and TOPSIS methods in fuzzy environments.   Material and Methods: The present research is a descriptive study, and the ET&BA worksheet is the main data-collection tool. Both fuzzy TOPSIS and fuzzy VIKOR methods were used for the worksheet analysis.   Result: Findi...

  18. Diving into the consumer nutrition environment: A Bayesian spatial factor analysis of neighborhood restaurant environment.

    Science.gov (United States)

    Luan, Hui; Law, Jane; Lysy, Martin

    2018-02-01

    Neighborhood restaurant environment (NRE) plays a vital role in shaping residents' eating behaviors. While NRE 'healthfulness' is a multi-faceted concept, most studies evaluate it based only on restaurant type, thus largely ignoring variations of in-restaurant features. In the few studies that do account for such features, healthfulness scores are simply averaged over accessible restaurants, thereby concealing any uncertainty attributable to neighborhood size or spatial correlation. To address these limitations, this paper presents a Bayesian spatial factor analysis for assessing NRE healthfulness in the city of Kitchener, Canada. Several in-restaurant characteristics are included. By treating NRE healthfulness as a spatially correlated latent variable, the adopted modeling approach can: (i) identify specific indicators most relevant to NRE healthfulness, (ii) provide healthfulness estimates for neighborhoods without accessible restaurants, and (iii) readily quantify uncertainties in the healthfulness index. Implications of the analysis for intervention program development and community food planning are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Gerald: a general environment for radiation analysis and design

    International Nuclear Information System (INIS)

    Boyle, Ch.; Oliveira, P.I.E. de; Oliveira, C.R.E. de; Adams, M.L.; Galan, J.M.

    2005-01-01

    Full text of publication follows: This paper describes the status of the GERALD interactive workbench for the analysis of radiation transport problems. GERALD guides the user through the various steps that are necessary to solve a radiation transport problem, and is aimed at education, research and industry. The advantages of such a workbench are many: quality assurance of problem setup, interaction of the user with the problem solution, preservation of theory and legacy research codes, and rapid prototyping and testing of new methods. The environment is of general applicability, catering for analytical, deterministic and stochastic analysis of radiation problems, and is not tied to one specific solution method or code. However, GERALD is being developed as a portable, modular, open source framework which lends itself quite naturally to the coupling of existing computational tools through specifically developed plug-ins. By offering a common route for setting up, solving and analyzing radiation transport problems, GERALD offers the possibility of method intercomparison and validation. Such a flexible radiation transport environment will also facilitate the coupling of radiation physics methods to other physical phenomena and their application to other areas such as medical physics and the environment. (authors)

  20. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design of high frequency silicon bipolar oscillator using common emitter (CE) with distributed output and analysis is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillators were reviewed. Transmission line design was carried out using Butterworth LC ...

  1. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  2. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
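The THD figure computed after harmonic propagation follows the standard definition: the RMS of the harmonic components divided by the fundamental. A minimal sketch of that final step (not the paper's propagation method):

```python
import math

def thd(v):
    """Total harmonic distortion of a bus voltage.
    v: harmonic magnitudes [V1, V2, V3, ...], where V1 is the fundamental."""
    fundamental, harmonics = v[0], v[1:]
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Example: 5% fifth harmonic and 3% seventh harmonic on a 1 p.u. fundamental
print(round(thd([1.0, 0, 0, 0, 0.05, 0, 0.03]) * 100, 2))  # THD in percent -> 5.83
```

The per-harmonic magnitudes themselves come from solving the propagation problem at each harmonic order, which is where the paper's relationship matrices replace the full admittance-matrix inverse.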

  3. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminthofauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminthofauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode, 1 cestode, 3 acanthocephalans and 9 nematodes.

  4. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
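The kriging predictor the abstract refers to solves a small linear system of spatial weights per query point. A minimal ordinary-kriging sketch with an exponential variogram (the variogram model and its parameters are illustrative, not taken from the paper):

```python
import numpy as np

def ordinary_kriging(xy, z, xq, sill=1.0, rng=10.0):
    """Ordinary kriging predictions.
    xy: (n, 2) sample locations, z: (n,) observed values (e.g. thrips counts),
    xq: (m, 2) query locations. sill/rng are illustrative variogram parameters."""
    xy, z, xq = (np.asarray(a, float) for a in (xy, z, xq))

    def gamma(h):  # exponential variogram model
        return sill * (1.0 - np.exp(-3.0 * h / rng))

    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system: variogram matrix bordered by ones enforces the
    # unbiasedness constraint sum(weights) = 1 via a Lagrange multiplier.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    out = np.empty(len(xq))
    for i, q in enumerate(xq):
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(xy - q, axis=1))
        w = np.linalg.solve(A, b)[:n]
        out[i] = w @ z
    return out
```

Because the weights honor the spatial correlation encoded in the variogram, kriging is an exact interpolator at the sampled locations, unlike analyses that treat counts over space and time as independent.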

  5. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC) software was written to synthesize and display artificial data for use in developing the methodology of wave distribution analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  6. Distributed optical fibre temperature measurements in a low dose rate radiation environment based on Rayleigh backscattering

    Science.gov (United States)

    Faustov, A.; Gussarov, A.; Wuilpart, M.; Fotiadi, A. A.; Liokumovich, L. B.; Kotov, O. I.; Zolotovskiy, I. O.; Tomashuk, A. L.; Deschoutheete, T.; Mégret, P.

    2012-04-01

    On-line monitoring of environmental conditions in nuclear facilities is becoming a more and more important problem. Standard electronic sensors are not the ideal solution due to radiation sensitivity and difficulties in installing multiple sensors. In contrast, radiation-hard optical fibres can sustain very high radiation doses and also naturally offer multi-point or distributed monitoring of external perturbations. Multiple local electro-mechanical sensors can be replaced by just one measuring fibre. At present, there are over four hundred operational nuclear power plants (NPPs) in the world. Operating experience has shown that ineffective control of the ageing degradation of major NPP components can threaten plant safety and also plant life. Among those elements, cables are vital components of I&C systems in NPPs. To ensure their safe operation and predict remaining life, environmental monitoring is necessary. In particular, temperature and radiation dose are considered to be the two most important parameters. The aim of this paper is to assess experimentally the feasibility of optical fibre temperature measurements in a low dose-rate radiation environment, using a commercially available reflectometer based on Rayleigh backscattering. Four different fibres were installed in the Sub-Pile Room of the BR2 Material Testing nuclear reactor in Mol, Belgium. This place is man-accessible during the reactor shut-down, allowing easy fibre installation. When the reactor operates, the dose-rates in the room are in the range 0.005-5 Gy/h with temperatures of 40-60 °C, depending on the location. Such surroundings are not much different from some "hot" environments in NPPs, where I&C cables are located.

  7. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to

  8. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and further cause substantial drops in performance. At the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly or inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.

  9. Analysis on Space Environment from the Anomalies of Geosynchronous Satellites

    Directory of Open Access Journals (Sweden)

    Jaejin Lee

    2009-12-01

    Full Text Available While it is well known that the space environment can produce spacecraft anomalies, defining space environment effects for each anomaly is difficult. This is because spacecraft anomalies show various symptoms and reproducing them is impossible. In this study, we try to find the conditions under which spacecraft failures happen more frequently and to give satellite operators useful information. In particular, our study focuses on geosynchronous satellites, whose cost is high and which require high reliability. We used satellite anomaly data given by Satellite News Digest, an internet newspaper providing space-industry news. In our analysis, 88 anomaly cases that occurred from 1997 to 2008 show poor correlation with the Kp index. Satellite malfunctions were likely to happen in spring and fall and at local times from midnight to dawn. In addition, we found that the probability of anomaly increases when the high-energy electron flux is high. This appeared more clearly in solar minimum than in solar maximum periods.

  10. Activation analysis in the environment: Science and technology

    International Nuclear Information System (INIS)

    Lenihan, J.

    1989-01-01

    Science is disciplined curiosity. Activation analysis was created more than 50 yr ago by Hevesy's curiosity and Levi's experimental skill. Technology is the exploitation of machines and materials for the fulfillment of human needs or wants. The early history of neutron activation analysis (NAA) was greatly influenced by military requirements. Since then the technique has found applications in many disciplines, including materials science, medicine, archaeology, geochemistry, agriculture, and forensic science. More recently, neutron activation analysts, responding to increasing public interest and concern, have made distinctive contributions to the study of environmental problems. Activation analysis, though it uses some procedures derived from physics, is essentially a chemical technique. The chemical study of the environment may be reviewed under many headings; three are discussed here: (1) occupational medicine, (2) health of the general public, and (3) environmental pollution.

  11. Statistical analysis of environmental dose data for Trombay environment

    International Nuclear Information System (INIS)

    Kale, M.S.; Padmanabhan, N.; Rekha Kutty, R.; Sharma, D.N.; Iyengar, T.S.; Iyer, M.R.

    1993-01-01

    The microprocessor-based environmental dose logging system has been functioning at six stations at Trombay for the past couple of years. The site emergency control centre (SECC) at the modular laboratory receives telemetered data every five minutes from the main guard house (South Site), Bhabha Point (top of the hill), the Cirus reactor, the Mod Lab terrace, Hall No. 7 and the Training School Hostel. The data collected are stored in dBASE III+ format for easy processing on a PC. Various statistical parameters and distributions of environmental gamma dose are determined from the hourly dose data. On the basis of the reactor operation status, an attempt has been made to separate the natural background and the gamma dose contribution due to the operating research reactors at each of these monitoring stations. Similar investigations are being carried out for the Tarapur environment. (author). 2 refs., 3 tabs., 2 figs

  12. Distribution of living larger benthic foraminifera in littoral environments of the United Arab Emirates

    Science.gov (United States)

    Fiorini, Flavia; Lokier, Stephen W.

    2015-04-01

    The distribution of larger benthic foraminifera in Recent littoral environments of the United Arab Emirates (Abu Dhabi and Western regions) was investigated with the aim of understanding the response of those foraminifera to an increase in water salinity. For this purpose, 100 sediment samples from nearshore shelf, beach-front, channel, lagoon, and intertidal environments were collected. Sampling was undertaken at water depths shallower than 15 m, in water with a temperature of 22 to 35˚C, a salinity ranging from 40 to 60‰ and a pH of 8. Samples were stained with rose Bengal at the moment of sample collection in order to identify living specimens. The most abundant epiphytic larger benthic foraminifera in the studied area were Peneroplis pertusus and P. planatus, with less common Spirolina areatina, S. aciculata and Sorites marginalis. Living specimens of the above-mentioned species with normal test growth were particularly abundant in the nearshore shelf and lagoonal samples collected on seaweed. Dead specimens were concentrated in the coarser sediments of the beach-front, probably transported from nearby environments. Shallow coastal ponds located in the upper intertidal zone have a maximum salinity of 60‰ and contain abundant detached seagrass. Samples collected from these ponds possess a living foraminiferal assemblage dominated by Peneroplis pertusus and P. planatus. High percentages (up to 50% of the stained assemblage) of Peneroplis presented abnormalities in test growth, such as the presence of multiple apertures of reduced size, deformation of the general shape of the test, irregular suture lines and abnormal coiling. The high percentage of abnormal tests reflects natural environmental stress mainly caused by high and variable salinity. The exclusive presence of living epiphytic species suggests that epiphytic foraminifera may be transported into the ponds together with seagrass and continue to live there. This hypothesis is supported by

  13. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from birth to death, of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides the information needed to prevent, detect, and correct defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and the associated reliability activities. Reliability data of Systems, Structures and Components (SSCs) in Nuclear Power Plants is a key input to probabilistic safety assessment (PSA), reliability centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to fail, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in Nuclear Power Plants. An example is given in the paper to present the result of the new method. The Weibull distribution fits reliability data of mechanical equipment in nuclear power plants very well; it is a widely used mathematical model for reliability analysis. The methods in common use are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better. It can reflect the reliability characteristics of the equipment and is closer to the actual situation. (author)
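
    As a rough illustration of the comparison described above (not the paper's own code or data, which are not published), a two-parameter versus three-parameter Weibull fit can be sketched with SciPy on synthetic failure times:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic times-to-failure (hours), standing in for SSC reliability data.
# The true distribution has a nonzero location, so a three-parameter fit
# should describe it better than a fit with the location fixed at zero.
true_shape, true_loc, true_scale = 1.8, 100.0, 500.0
data = stats.weibull_min.rvs(true_shape, loc=true_loc, scale=true_scale,
                             size=500, random_state=rng)

# Two-parameter fit (location fixed at zero) vs. three-parameter fit.
shape2, _, scale2 = stats.weibull_min.fit(data, floc=0)
shape3, loc3, scale3 = stats.weibull_min.fit(data)

# Compare goodness of fit via log-likelihood.
ll2 = np.sum(stats.weibull_min.logpdf(data, shape2, loc=0, scale=scale2))
ll3 = np.sum(stats.weibull_min.logpdf(data, shape3, loc=loc3, scale=scale3))
print(f"2-parameter: shape={shape2:.2f}, log-likelihood={ll2:.1f}")
print(f"3-parameter: shape={shape3:.2f}, loc={loc3:.1f}, log-likelihood={ll3:.1f}")
```

    The higher log-likelihood of the three-parameter fit mirrors the paper's conclusion that the extra location parameter pays off when failures cannot occur before some minimum life.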

  14. Providing more informative projections of climate change impact on plant distribution in a mountain environment

    Science.gov (United States)

    Randin, C.; Engler, R.; Pearman, P.; Vittoz, P.; Guisan, A.

    2007-12-01

    Due to their conic shape and the reduction of area with increasing elevation, mountain ecosystems were identified early as potentially very sensitive to global warming. Moreover, mountain systems may experience unprecedented rates of warming during the next century, two or three times higher than the rates recorded during the 20th century. In this context, species distribution models (SDM) have become important tools for rapid assessment of the impact of accelerated land use and climate change on the distribution of plant species. In this study, we developed and tested new predictor variables for species distribution models (SDM), specific to current and future geographic projections of plant species in a mountain system, using the Western Swiss Alps as model region. Since meso- and micro-topography are relevant to explaining geographic patterns of plant species in mountain environments, we assessed the effect of scale on predictor variables and geographic projections of SDM. We also developed a methodological framework of space-for-time evaluation to test the robustness of SDM when projected into a future changing climate. Finally, we used a cellular automaton to run dynamic simulations of plant migration under climate change in a mountain landscape, including realistic distances of seed dispersal. Results of future projections for the 21st century were also discussed in the perspective of vegetation changes monitored during the 20th century. Overall, we showed in this study that, based on the most severe A1 climate change scenario and realistic simulations of plant dispersal, species extinctions in the Western Swiss Alps could affect nearly one third (28.5%) of the 284 species modeled by 2100. With the less severe B1 scenario, only 4.6% of species are predicted to become extinct. However, even with B1, 54% (153 species) may still lose more than 80% of their initial surface. Results of monitoring of past vegetation changes suggested that plant species can react quickly to the

  15. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE) which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of First Passage Time (FPT) with exponential delay kernel, the model has been transformed to a system of coupled Stochastic Differential Equations (SDEs) in two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the Coefficient of Variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay with a memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with weak delay kernel, the decay of damped oscillations is found to be slower for the model with strong delay kernel.
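
    The Jensen-Shannon divergence measure mentioned above can be sketched generically as follows; this is not the authors' code, and the two "competing models" below are invented stand-ins (an exponential versus a gamma ISI distribution) binned into histograms on common edges:

```python
import numpy as np

def jensen_shannon_divergence(p, q):
    """JSD between two discrete distributions (natural log, so 0 <= JSD <= ln 2)."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0          # 0 * log(0) contributes nothing
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical ISI histograms from two competing neuron models (same bin edges).
rng = np.random.default_rng(0)
isi_model_a = np.histogram(rng.exponential(1.0, 10000), bins=30, range=(0, 6))[0]
isi_model_b = np.histogram(rng.gamma(2.0, 0.5, 10000), bins=30, range=(0, 6))[0]

jsd = jensen_shannon_divergence(isi_model_a, isi_model_b)
print(f"JSD = {jsd:.4f}")  # 0 for identical distributions, ln(2) for disjoint ones
```

    The model whose simulated ISI histogram has the smaller divergence from the empirical one would be preferred.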

  16. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner - with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called the Power Flow Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the optimal DG size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained, which are impressive and computationally efficient, were evaluated against an existing similar package cited in the literature.

  17. Central automatic control or distributed occupant control for better indoor environment quality in the future

    Energy Technology Data Exchange (ETDEWEB)

    Toftum, Joern [International Centre for Indoor Environment and Energy, Department of Civil Engineering, Technical University of Denmark, DTU, Building 402, DK-2800 Lyngby (Denmark)

    2010-01-15

    Based on a database accumulated from several recent surveys of office buildings located in a temperate climate (Denmark), the effect on occupant perceptions and symptom prevalence was compared in buildings with natural and with mechanical ventilation in which earlier studies have shown a discrepancy in the degree of perceived control. The database was composed of 1272 responses obtained in 24 buildings of which 15 had mechanical ventilation (997 responses) and nine had natural ventilation (275 responses). The number of occupant-reported control opportunities was higher in buildings with natural ventilation. Analysis of occupant responses, after grouping according to categories determined by the degree of satisfaction with the perceived control, showed that it was more likely the degree of control satisfaction that affected the prevalence of adverse perceptions and symptoms. Thus, the degree of control, as perceived by occupants, seemed more important for the prevalence of adverse symptoms and building-related symptoms than the ventilation mode per se. This result indicates that even though the development and application of new indoor environment sensors and HVAC control systems may allow for fully automated IEQ control, such systems should not compromise occupants' perception of having some degree of control of their indoor environment. (author)

  18. Electronic structure and partial charge distribution of Doxorubicin in different molecular environments.

    Science.gov (United States)

    Poudel, Lokendra; Wen, Amy M; French, Roger H; Parsegian, V Adrian; Podgornik, Rudolf; Steinmetz, Nicole F; Ching, Wai-Yim

    2015-05-18

    The electronic structure and partial charge of doxorubicin (DOX) in three different molecular environments-isolated, solvated, and intercalated in a DNA complex-are studied by first-principles density functional methods. It is shown that the addition of solvating water molecules to DOX, together with the proximity to and interaction with DNA, has a significant impact on the electronic structure as well as on the partial charge distribution. Significant improvement in estimating the DOX-DNA interaction energy is achieved. The results are further elucidated by resolving the total density of states and surface charge density into different functional groups. It is concluded that the presence of the solvent and the details of the interaction geometry matter greatly in determining the stability of DOX complexation. Ab initio calculations on realistic models are an important step toward a more accurate description of the long-range interactions in biomolecular systems.

  19. Uranium distribution and sandstone depositional environments: oligocene and upper Cretaceous sediments, Cheyenne basin, Colorado

    International Nuclear Information System (INIS)

    Nibbelink, K.A.; Ethridge, F.G.

    1984-01-01

    Wyoming-type roll-front uranium deposits occur in the Upper Cretaceous Laramie and Fox Hills sandstones in the Cheyenne basin of northeastern Colorado. The location, geometry, and trend of specific depositional environments of the Oligocene White River and the Upper Cretaceous Laramie and Fox Hills formations are important factors that control the distribution of uranium in these sandstones. The Fox Hills Sandstone consists of up to 450 ft (140 m) of nearshore marine wave-dominated delta and barrier island-tidal channel sandstones which overlie offshore deposits of the Pierre Shale and which are overlain by delta-plain and fluvial deposits of the Laramie Formation. Uranium, which probably originated from volcanic ash in the White River Formation, was transported by groundwater through the fluvial-channel deposits of the White River into the sandstones of the Laramie and Fox Hills formations where it was precipitated. Two favorable depositional settings for uranium mineralization in the Fox Hills Sandstone are: (1) the landward side of barrier-island deposits where barrier sandstones thin and interfinger with back-barrier organic mudstones, and (2) the intersection of barrier-island and tidal channel sandstones. In both settings, sandstones were probably reduced during early burial by diagenesis of contained and adjacent organic matter. The change in permeability trends between the depositional strike-oriented barrier sandstones and the dip-oriented tidal-channel sandstones provided sites for dispersed groundwater flow and, as demonstrated in similar settings in other depositional systems, sites for uranium mineralization

  20. Environmental distributions and the practical utilisation of detection limited environment measurement data

    International Nuclear Information System (INIS)

    Daniels, W.M.; Higgins, N.A.

    2002-01-01

    This study investigated methods of providing summary statistics for measurements of radioactive contamination in food when the available measurements are incomplete. Several techniques to calculate, for instance, the mean level of contamination when a significant number of samples are found to have less than the minimum level reported by measurements are discussed. To support the estimation of summary statistics, the study identifies physical processes that give rise to observed distributions, e.g. lognormal for the range of radioactivity levels found in environmental and food samples. The improved estimates made possible by application of the methods reviewed will allow the Food Standards Agency to gain a better understanding of the levels of radioactivity in the environment and, if required, to direct effort towards minimising the most significant uncertainties in these estimates. The Ministry of Agriculture, Fisheries and Food, Radiological Safety and Nutrition Division (now part of the Food Standards Agency) funded this study under contract RP0342. This work was undertaken under the Environmental Assessments and Emergency Response Group's Quality Management System, which has been approved by Lloyd's Register Quality Assurance to the Quality Management Standards ISO 9001:2000 and TickIT Guide Issue 5, certificate number 956546. (author)
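
    One standard technique of the kind this study reviews, maximum-likelihood estimation of a lognormal mean when some results are only reported as "below the limit of detection", can be sketched as follows. This is an illustrative reconstruction, not the report's own code; the LOD and the sample values are invented:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Synthetic activity concentrations (Bq/kg), lognormally distributed.
mu_true, sigma_true = 1.0, 0.8
x = rng.lognormal(mu_true, sigma_true, 400)

lod = 2.0                      # limit of detection
detected = x[x >= lod]
n_censored = np.sum(x < lod)   # only the count of "<LOD" results is known

def neg_log_likelihood(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parametrise so sigma stays positive
    # Detected values contribute their density on the log scale
    # (the Jacobian term is constant in the parameters, so it is dropped);
    # censored values contribute the probability of falling below the LOD.
    ll = np.sum(stats.norm.logpdf(np.log(detected), mu, sigma))
    ll += n_censored * stats.norm.logcdf(np.log(lod), mu, sigma)
    return -ll

res = optimize.minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)   # mean of the lognormal
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}, mean = {mean_hat:.2f} Bq/kg")
```

    Unlike the crude substitution of LOD/2 for censored values, this estimator uses the censoring information itself and stays unbiased even when a large fraction of samples is below the detection limit.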

  1. 'Everything is everywhere: but the environment selects': ubiquitous distribution and ecological determinism in microbial biogeography.

    Science.gov (United States)

    O'Malley, Maureen A

    2008-09-01

    Recent discoveries of geographical patterns in microbial distribution are undermining microbiology's exclusively ecological explanations of biogeography and their fundamental assumption that 'everything is everywhere: but the environment selects'. This statement was generally promulgated by Dutch microbiologist Martinus Wilhelm Beijerinck early in the twentieth century and specifically articulated in 1934 by his compatriot, Lourens G. M. Baas Becking. The persistence of this precept throughout twentieth-century microbiology raises a number of issues in relation to its formulation and widespread acceptance. This paper will trace the conceptual history of Beijerinck's claim that 'everything is everywhere' in relation to a more general account of its theoretical, experimental and institutional context. His principle also needs to be situated in relationship to plant and animal biogeography, which, this paper will argue, forms a continuum of thought with microbial biogeography. Finally, a brief overview of the contemporary microbiological research challenging 'everything is everywhere' reveals that philosophical issues from Beijerinck's era of microbiology still provoke intense discussion in twenty-first century investigations of microbial biogeography.

  2. Virtual memory support for distributed computing environments using a shared data object model

    Science.gov (United States)

    Huang, F.; Bacon, J.; Mapp, G.

    1995-12-01

    Conventional storage management systems provide one interface for accessing memory segments and another for accessing secondary storage objects. This hinders application programming and affects overall system performance due to mandatory data copying and user/kernel boundary crossings, which in the microkernel case may involve context switches. Memory-mapping techniques may be used to provide programmers with a unified view of the storage system. This paper extends such techniques to support a shared data object model for distributed computing environments in which good support for coherence and synchronization is essential. The approach is based on a microkernel, typed memory objects, and integrated coherence control. A microkernel architecture is used to support multiple coherence protocols and the addition of new protocols. Memory objects are typed and applications can choose the most suitable protocols for different types of object to avoid protocol mismatch. Low-level coherence control is integrated with high-level concurrency control so that the number of messages required to maintain memory coherence is reduced and system-wide synchronization is realized without severely impacting system performance. Together, these features constitute a novel approach to supporting flexible coherence under application control.

  3. Concentration and distribution of 14C in aquatic environment around Qinshan nuclear power plant

    International Nuclear Information System (INIS)

    Wang Zhongtang; Guo Qiuju; Hu Dan; Xu Hong

    2015-01-01

    In order to study the concentration and distribution of 14C in the aquatic environment in the vicinity of Qinshan Nuclear Power Plant (NPP) after twenty years' operation, an apparatus extracting dissolved inorganic carbon from water was set up and applied to pretreat the water samples collected around Qinshan NPP. The 14C concentration was measured by accelerator mass spectrometry (AMS). The results show that the 14C specific activities in surface seawater samples range from 196.8 to 206.5 Bq/kg ((203.4 ± 5.6) Bq/kg on average), which are close to the background. The 14C concentrations in cooling water discharged from Qinshan NPP are close to the 14C values in near-shore seawater samples outside the liquid radioactive effluent discharge period. It can be further concluded that the 14C discharged previously is diluted and diffused well, and no 14C enrichment in seawater is found. Also, no obvious increment in the 14C specific activities of surface water and underground water samples is found between the Qinshan NPP region and the reference region. (authors)

  4. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to check to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.

  5. Central automatic control or distributed occupant control for better indoor environment quality in the future

    DEFF Research Database (Denmark)

    Toftum, Jørn

    2008-01-01

    of adverse symptoms and building related symptoms than the ventilation mode per se. This result indicates that even though the development and application of new indoor environment sensors and HVAC control systems may allow for fully automated IEQ control, such systems should not compromise occupants...... in the degree of perceived control. The database was composed of 1353 responses obtained in 25 buildings of which 15 had mechanical ventilation (997 responses) and 9 had natural ventilation (275 responses). Analysis of occupant responses, after grouping according to categories determined by the degree...... of satisfaction with the perceived control, showed that the degree of control satisfaction, but rarely building category (natural vs. mechanical ventilation), affected the prevalence of adverse perceptions and symptoms. Thus, the degree of control, as perceived by occupants, was more important for the prevalence...

  6. Strengthening the weak link: Built Environment modelling for loss analysis

    Science.gov (United States)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value are typically known. Exactly what that risk is in terms of detailed property descriptors, including structure type and rebuild cost, and therefore its vulnerability to loss, is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industrial exposures. Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution

  7. RANGE AND DISTRIBUTION OF TECHNETIUM KD VALUES IN THE SRS SUBSURFACE ENVIRONMENT

    International Nuclear Information System (INIS)

    Kaplan, D.

    2008-01-01

    Performance assessments (PAs) are risk calculations used to estimate the amount of low-level radioactive waste that can be disposed of at DOE sites. Distribution coefficients (Kd values) are input parameters used in PA calculations to provide a measure of radionuclide sorption to sediment; the greater the Kd value, the greater the sorption and the slower the estimated movement of the radionuclide through sediment. Understanding and quantifying Kd value variability is important for estimating the uncertainty of PA calculations. Without this information, it is necessary to make overly conservative estimates about the possible limits of Kd values, which in turn may increase disposal costs. Finally, technetium is commonly found to be among the radionuclides posing potential risk at waste disposal locations because it is believed to be highly mobile in its anionic form (pertechnetate, TcO4-), it exists in relatively high concentrations in SRS waste, and it has a long half-life (213,000 years). The objectives of this laboratory study were to determine, under SRS environmental conditions: (1) whether and to what extent TcO4- sorbs to sediments, (2) the range of Tc Kd values, (3) the distribution (normal or log-normal) of Tc Kd values, and (4) how strongly Tc sorbs to SRS sediments, through desorption experiments. Objective 3, identifying the Tc Kd distribution, is important because it provides a statistical description that influences stochastic modeling of estimated risk. The approach taken was to collect 26 sediment samples from a non-radioactive sediment core collected from E-Area, measure Tc Kd values and then perform statistical analysis to describe the measured Tc Kd values. The mean Kd value was 3.4 ± 0.5 mL/g and ranged from -2.9 to 11.2 mL/g. The data did not have a normal distribution (as defined by the Shapiro-Wilk statistic) and had a 95-percentile range of 2.4 to 4.4 mL/g. The E-Area subsurface is subdivided into three hydrostratigraphic
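
    The statistical description reported above (mean with standard error, Shapiro-Wilk normality test, 95-percentile range) can be reproduced in outline with SciPy. The Kd values below are synthetic stand-ins, not the measured E-Area data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical Kd measurements (mL/g) for 26 sediment samples; illustrative only.
kd = rng.normal(3.4, 2.5, 26)

mean = kd.mean()
sem = kd.std(ddof=1) / np.sqrt(len(kd))        # standard error of the mean
w_stat, p = stats.shapiro(kd)                  # Shapiro-Wilk normality test
lo, hi = stats.t.interval(0.95, len(kd) - 1, loc=mean, scale=sem)

print(f"mean Kd = {mean:.1f} +/- {sem:.1f} mL/g")
print(f"Shapiro-Wilk W = {w_stat:.3f}, p = {p:.3f}"
      f" -> {'reject' if p < 0.05 else 'cannot reject'} normality at 5%")
print(f"95% confidence interval for the mean: {lo:.1f} to {hi:.1f} mL/g")
```

    When, as in the study, the Shapiro-Wilk test rejects normality, the same analysis would typically be repeated on log-transformed values to check a log-normal description before feeding the distribution into stochastic PA modeling.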

  8. Nuclear reactor decommissioning: an analysis of the regulatory environments

    International Nuclear Information System (INIS)

    Cantor, R.

    1986-08-01

    In the next several decades, the electric utility industry will be faced with the retirement of 50,000 megawatts (MW) of nuclear capacity. Responsibility for the financial and technical burdens this activity entails has been delegated to the utilities operating the reactors. However, the operators will have to perform the tasks of reactor decommissioning within the regulatory environment dictated by federal, state and local regulations. The purpose of this study was to highlight some of the current and likely trends in regulations and regulatory practices that will significantly affect the costs, technical alternatives and financing schemes encountered by the electric utilities and their customers. To identify significant trends and practices among regulatory bodies and utilities, a review of these factors was undertaken at various levels in the regulatory hierarchy. The technical policies were examined in reference to their treatment of allowed technical modes, restoration of the plant site including any specific recognition of residual radioactivity levels, and planning requirements. The financial policies were examined for specification of acceptable financing arrangements, mechanisms which adjust for changes in the important parameters used to establish the fund, tax and rate-base treatments of the payments to and earnings on the fund, and whether or not escalation and/or discounting were considered in the estimates of decommissioning costs. The attitudes of regulators toward financial risk, the tax treatment of the decommissioning fund, and the time distribution of the technical mode were found to have the greatest effect on the discounted revenue requirements. Under plausible assumptions, the cost of a highly restricted environment is about seven times that of the minimum revenue requirement environment for the plants that must be decommissioned in the next three decades

  9. Fractal analysis of urban environment: land use and sewer system

    Science.gov (United States)

    Gires, A.; Ochoa Rodriguez, S.; Van Assel, J.; Bruni, G.; Murla Tulys, D.; Wang, L.; Pina, R.; Richard, J.; Ichiba, A.; Willems, P.; Tchiguirinskaia, I.; ten Veldhuis, M. C.; Schertzer, D. J. M.

    2014-12-01

    Land use distributions are usually obtained by automatic processing of satellite and airborne pictures. The complexity of the obtained patterns, which are furthermore scale dependent, is enhanced in urban environments. This scale dependency is even more visible in a rasterized representation where only a unique class is assigned to each pixel. A parameter commonly analysed in urban hydrology is the coefficient of imperviousness, which reflects the proportion of rainfall that will be immediately active in the catchment response. This coefficient is strongly scale dependent in a rasterized representation. This complex behaviour is well grasped with the help of the scale-invariant notion of fractal dimension, which makes it possible to quantify the space occupied by a geometrical set (here the impervious areas) not only at a single scale but across all scales. This fractal dimension is also compared to the ones computed on the representation of the catchments used by operational semi-distributed models. Fractal dimensions of the corresponding sewer systems are also computed and compared with values found in the literature for natural river networks. This methodology is tested on 7 pilot sites of the European NWE Interreg IV RainGain project located in France, Belgium, the Netherlands, the United Kingdom and Portugal. Results are compared across all the case studies, which exhibit different physical features (slope, level of urbanisation, population density...).
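
    The box-counting estimate of fractal dimension that underlies this kind of analysis can be sketched as follows; this is a generic implementation on toy rasters, not the project's code:

```python
import numpy as np

def box_counting_dimension(grid, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary raster by box counting."""
    counts = []
    n = grid.shape[0]
    for s in sizes:
        # number of s-by-s boxes containing at least one occupied pixel
        c = sum(grid[i:i + s, j:j + s].any()
                for i in range(0, n, s) for j in range(0, n, s))
        counts.append(c)
    # slope of log(count) versus log(1/size) gives the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy "impervious area" rasters: a thin line should give D ~ 1,
# a completely filled square D ~ 2; real urban masks fall in between.
n = 64
line = np.zeros((n, n), dtype=bool); line[n // 2, :] = True
square = np.ones((n, n), dtype=bool)

print(f"line:   D = {box_counting_dimension(line):.2f}")
print(f"square: D = {box_counting_dimension(square):.2f}")
```

    Applied to an impervious-area mask, the estimated dimension summarises how the coefficient of imperviousness behaves across raster resolutions rather than at one arbitrary pixel size.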

  10. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  11. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2-10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in this case neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types.
This change of exponents can be driven by the material strength, which
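    A hedged sketch of how such a power-law exponent can be estimated by maximum likelihood, using synthetic volumes rather than the rockfall catalogs analyzed in the paper. For a density p(v) ~ v^-alpha above a threshold v_min, the cumulative distribution N(>v) ~ v^-b has exponent b = alpha - 1.

```python
import math
import random

def powerlaw_mle(volumes, v_min):
    """MLE of the density exponent alpha in p(v) ~ v^-alpha for v >= v_min.

    The cumulative distribution N(>v) ~ v^-b then has exponent b = alpha - 1.
    """
    tail = [v for v in volumes if v >= v_min]
    return 1.0 + len(tail) / sum(math.log(v / v_min) for v in tail)

# Synthetic volumes with cumulative exponent b = 0.5, drawn by inverse-
# transform sampling: v = v_min * u**(-1/b) with u uniform on (0, 1].
random.seed(0)
b_true, v_min = 0.5, 1.0
volumes = [v_min * (1.0 - random.random()) ** (-1.0 / b_true) for _ in range(20000)]
b_hat = powerlaw_mle(volumes, v_min) - 1.0  # recovers a value close to 0.5
```

    Real catalogs additionally require choosing v_min, typically where the empirical distribution starts to follow the power law.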

  12. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the distribution of healthcare facilities. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and that pharmacy stores tend to cluster around hospitals along the road network. Correlation analysis between the different categories of hospitals and street centrality shows that the distribution of these hospitals correlates highly with street centrality, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help assess the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  13. Communication and cooperation in networked environments: an experimental analysis.

    Science.gov (United States)

    Galimberti, C; Ignazi, S; Vercesi, P; Riva, G

    2001-02-01

    Interpersonal communication and cooperation do not happen exclusively face to face. In work contexts, as in private life, there are more and more situations of mediated communication and cooperation in which new online tools are used. However, understanding how to use the Internet to support collaborative interaction presents a substantial challenge for the designers and users of this emerging technology. First, collaborative Internet environments serve a purpose, and so must be designed with the intended users' tasks and goals explicitly considered. Second, in cooperative activities the key content of communication is the interpretation of the situations in which the actors are involved, so the most effective way of clarifying the meaning of messages is to connect them to a shared context of meaning. However, this is more difficult on the Internet than in other computer-based activities. This paper investigates the characteristics of cooperative activities in networked environments--shared 3D virtual worlds--through two studies. The first used the analysis of conversations to explore the characteristics of the interaction during the cooperative task; the second analyzed whether and how the level of immersion in the networked environment influenced performance and the interactional process. The results are analyzed to identify the psychosocial roots used to support cooperation in digital interactive communication.

  14. How is an analysis of the enterprise economic environment done?

    Directory of Open Access Journals (Sweden)

    Carlos Parodi Trece

    2015-09-01

    Full Text Available The proper interpretation of the evolution of the economy is a useful tool for improving corporate decision-making. The aim of this paper is to explain, with examples applied to the current economic reality, how an analysis of the economic environment is carried out and how it serves corporate strategic planning. To do this, after explaining the overall conceptual framework, gross domestic product, inflation and the external deficit are defined as key indicators in the "language" used by analysts. These are related economic indicators that depend on domestic policy and on exogenous shocks, defined as events that are out of the hands of economic policy designers but that influence the three variables, such as the international financial crisis. Next, the indicators are formalized through macroeconomic identities, in order to finally explain through them "how the economy is doing" and the relationship between the internal and the external economic environment. Bringing the economy closer to business should be a priority in an increasingly integrated context, characterized by the fact that both the positive and the negative aspects of what happens in the world economy are transmitted, through various channels, to companies located in different countries. Hence, companies should broaden their vision, as the economic environment includes not only what happens within the country but also the future of the world economy.

  15. Modular Environment for Graph Research and Analysis with a Persistent

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-18

    The MEGRAPHS software package provides a front-end to graphs and vectors residing on special-purpose computing resources. It allows these data objects to be instantiated, destroyed, and manipulated. A variety of primitives needed for typical graph analyses are provided. An example program illustrating how MEGRAPHS can be used to implement a PageRank computation is included in the distribution. The MEGRAPHS software package is targeted towards developers of graph algorithms. Programmers using MEGRAPHS would write graph analysis programs in terms of high-level graph and vector operations. These computations are transparently executed on the Cray XMT compute nodes.
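    For reference, the kind of computation the bundled PageRank example performs can be sketched with plain power iteration. This is an illustrative single-machine version, not the MEGRAPHS API or its Cray XMT execution model.

```python
def pagerank(out_links, damping=0.85, iters=100):
    """Plain power-iteration PageRank over an adjacency dict."""
    nodes = list(out_links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, targets in out_links.items():
            if targets:
                share = damping * rank[v] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank uniformly
                for t in nodes:
                    new[t] += damping * rank[v] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
# total rank is conserved (~1.0); "c", with the most in-links, ranks highest
```

    A graph framework like MEGRAPHS would express the same iteration as bulk matrix-vector operations rather than per-node Python loops.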

  16. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  17. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools that help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  18. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for the connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient because it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks.
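    To make the minimal-cutset idea concrete, here is a brute-force enumeration on a toy source-to-sink network. This exhaustive search is for illustration only; the paper's algorithm is precisely about avoiding it for larger networks.

```python
from itertools import combinations

def connected(edges, src, dst):
    """Simple reachability search over undirected edges."""
    seen, stack = {src}, [src]
    while stack:
        v = stack.pop()
        for a, b in edges:
            for u, w in ((a, b), (b, a)):
                if u == v and w not in seen:
                    seen.add(w)
                    stack.append(w)
    return dst in seen

def minimal_cutsets(edges, src, dst):
    """Enumerate minimal edge sets whose removal disconnects src from dst."""
    cutsets = []
    for k in range(1, len(edges) + 1):
        for subset in combinations(edges, k):
            remaining = [e for e in edges if e not in subset]
            if not connected(remaining, src, dst):
                # minimality: discard supersets of an already-found cutset
                if not any(set(c) <= set(subset) for c in cutsets):
                    cutsets.append(subset)
    return cutsets

# Two parallel feeders S-A-T and S-B-T: every minimal cutset has two edges.
edges = [("S", "A"), ("S", "B"), ("A", "T"), ("B", "T")]
cuts = minimal_cutsets(edges, "S", "T")
# yields the four minimal cutsets {SA,SB}, {AT,BT}, {SA,BT}, {SB,AT}
```

    The cutset list is the starting point for reliability evaluation: the system fails exactly when every edge of some minimal cutset has failed.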

  19. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  20. Cost Analysis In A Multi-Mission Operations Environment

    Science.gov (United States)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights, to International Space Station (ISS) payloads, to small, short-duration missions, and have included long-duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with the related issue of weighing the value of detailed metrics or data to the cost model against the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and sizes and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology but will be executed using future technology.
Finally, the cost analysis needed to consider how to validate the resulting cost models, taking into account the non-homogeneous nature of the available cost data and the

  1. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    OpenAIRE

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment Programme (UNEP), Nairobi, Kenia; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050. The study was carried out in support of the Agenda 21 interim evaluation, five years after 'Rio' and ten years after 'Brundtland'. The scenario analysis is based on only one scenario, Conventional...

  2. Cross-platform validation and analysis environment for particle physics

    Science.gov (United States)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    2017-11-01

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.

  3. R AS THE ENVIRONMENT FOR DATA ANALYSIS IN PSYCHOLOGICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    Ana María Ruiz-Ruano

    2016-01-01

    Full Text Available R is a free computing environment for statistical data analysis and graph creation. It is becoming a key tool in a wide range of knowledge domains. The interaction with the software is mainly based on a command line interface but efforts are currently being made to develop friendlier graphical user interfaces that will help novice users to become familiar with this programming language. R is a flexible and powerful system thanks to the development community that is working together to improve its capabilities. As a result, it is the chosen statistical software in many applied and basic contexts. This paper highlights the potential usefulness of R for psychological assessment and related areas. Additionally, the relevance of statistical data analysis is emphasised as an instrument that will boost the progress of psychology under the umbrella of the scientific method.

  4. TREEFINDER: a powerful graphical analysis environment for molecular phylogenetics

    Directory of Open Access Journals (Sweden)

    von Haeseler Arndt

    2004-06-01

    Full Text Available Abstract Background Most analysis programs for inferring molecular phylogenies are difficult to use, in particular for researchers with little programming experience. Results TREEFINDER is an easy-to-use, integrative, platform-independent analysis environment for molecular phylogenetics. In this paper the main features of TREEFINDER (version of April 2004) are described. TREEFINDER is written in ANSI C and Java and implements powerful statistical approaches for inferring gene trees and related analyses. In addition, it provides a user-friendly graphical interface and a phylogenetic programming language. Conclusions TREEFINDER is a versatile framework for analyzing phylogenetic data across different platforms that is suited for both exploratory and advanced studies.

  5. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates; even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian output files. It then automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from other programs performing PED analysis.

  6. The ventilation influence on the spatial distribution of Rn-222 and its decay products in human inhabited environments

    International Nuclear Information System (INIS)

    Munoz, S.N.M.; Hadler, J.C.; Paulo, S.R.

    1996-01-01

    For the determination of the influence of ventilation (a directional air flux induced by a fan) on the spatial distribution of Rn-222 and its decay products (daughters) in human-inhabited environments, a set of experimental results was obtained by means of the nuclear tracks left by α-particles on suitable plastic detectors (CR-39). The detectors were exposed in a closed environment, considering the influence of ventilation at different angles, velocities and distances from the fan. The results show that a relative quantity of Rn-222 daughters is removed from the environment due to the effects of ventilation and plate-out.

  7. Data analysis and mapping of the mountain permafrost distribution

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

    In Alpine environments mountain permafrost is defined as a thermal state of the ground: any lithospheric material that remains at or below 0°C for at least two years. Its degradation potentially leads to increasing rockfall activity, rock glacier acceleration and higher sediment transfer rates. During the last 15 years, knowledge of this phenomenon has significantly increased thanks to many studies and monitoring projects, which revealed an extremely heterogeneous and complex spatial distribution. As a consequence, modelling the potential extent of mountain permafrost has recently become a very important task. Although existing statistical models generally offer a good overview at a regional scale, they are not always able to reproduce the strong spatial discontinuity of permafrost at the micro scale. To overcome this limitation, this study proposes an alternative modelling approach using three classification algorithms from statistics and machine learning: logistic regression (LR), support vector machines (SVM) and random forests (RF). The first is a linear parametric classifier commonly used as a benchmark before employing more complex classifiers. The non-linear SVM is a non-parametric learning algorithm belonging to the so-called kernel methods. RF is an ensemble learning method based on bootstrap aggregating that offers an embedded measure of variable importance. Permafrost evidence was selected in a 588 km2 area of the Western Swiss Alps to serve as training examples. It was mapped from field data (thermal and geoelectrical data) and ortho-image interpretation (rock glacier inventorying). The dataset was completed with environmental predictors such as altitude, mean annual air temperature, aspect, slope, potential incoming solar radiation, normalized difference vegetation index and planar, profile and combined terrain curvature indices. Aiming at predicting
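    Of the three classifiers named in the abstract, the logistic regression benchmark is simple enough to sketch directly. The example below trains on synthetic, hypothetical "evidence" points with two standardized predictors (labeled here as altitude and radiation purely for illustration); it does not reproduce the paper's dataset, nor its SVM and random-forest models.

```python
import math
import random

def train_logistic(xs, ys, lr=0.5, epochs=100):
    """Per-sample gradient descent on the logistic log-loss (two features)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x0, x1), y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w[0] * x0 + w[1] * x1 + b)))
            g = p - y  # gradient of the loss with respect to the logit
            w[0] -= lr * g * x0
            w[1] -= lr * g * x1
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Synthetic evidence: label 1 ("permafrost") where altitude outweighs radiation.
random.seed(1)
xs = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(400)]
ys = [1 if alt - rad > 0 else 0 for alt, rad in xs]
w, b = train_logistic(xs, ys)
accuracy = sum(predict(w, b, x) == y for x, y in zip(xs, ys)) / len(xs)
```

    A linear model like this draws a single plane through predictor space, which is exactly why the abstract turns to SVM and RF for the strong spatial discontinuities it describes.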

  8. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context.

  9. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect of modern power system planning, design, and operation. The ascendance of the smart grid concept has raised hopes of developing an intelligent, self-healing network able to overcome the interruption problems that face utilities and cost them tens of millions in repairs and losses. To address reliability concerns, power utilities and interested parties have spent extensive time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection point between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system, measured by changes in the system reliability indices SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
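    The reliability indices named in the abstract follow directly from an outage log. This sketch uses IEEE 1366-style definitions on a hypothetical log; it is not output of the DISREL tool or the IEEE 34 node feeder study.

```python
def reliability_indices(outages, customers_served):
    """outages: list of (customers_interrupted, duration_hours, unserved_kwh)."""
    total_ci = sum(ci for ci, _, _ in outages)                 # customer interruptions
    total_cmi = sum(ci * dur for ci, dur, _ in outages)        # customer-hours out
    saifi = total_ci / customers_served    # avg interruptions per customer served
    saidi = total_cmi / customers_served   # avg outage hours per customer served
    eue = sum(kwh for _, _, kwh in outages)  # expected unserved energy, kWh
    return saifi, saidi, eue

outages = [
    (120, 1.5, 300.0),   # 120 customers out for 1.5 h, 300 kWh unserved
    (40, 0.5, 60.0),
]
saifi, saidi, eue = reliability_indices(outages, customers_served=1000)
# SAIFI = 0.16 interruptions/customer, SAIDI = 0.2 h/customer, EUE = 360 kWh
```

    Placing an automatic switch that shrinks the interrupted-customer count of an outage reduces all three indices at once, which is the effect the thesis quantifies.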

  10. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is in question. Their purpose is to maintain supply to an area in the distribution network in the event of a failure somewhere else. Two phrases, "potential customer outages" and "in the event of a failure," identify uncertainty

  11. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauge data on its own as input carries great uncertainties regarding runoff estimation, especially when the area is large and rainfall is measured and recorded at irregularly spaced gauging stations. Spatial interpolation is therefore the key to obtaining a continuous and orderly rainfall distribution at ungauged points as input to the rainfall-runoff process in distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river, so a good knowledge of rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, together with the average rainfall distribution. A sensitivity analysis of the distance and elevation parameters was conducted to see the variation produced. The accuracy of the interpolated datasets was examined using cross-validation assessment.
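    The IDW step can be sketched in a few lines. This minimal version uses synthetic gauge coordinates and omits the elevation term that distinguishes the IDEW variant used in the paper.

```python
def idw(stations, x, y, power=2.0):
    """Inverse-distance weighting. stations: list of (sx, sy, rainfall_mm)."""
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value  # the query point sits exactly on a gauge
        w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

# Three hypothetical gauges on a unit grid (coordinates, rainfall in mm).
gauges = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
r = idw(gauges, 0.5, 0.0)  # a point between the first two gauges
```

    The `power` parameter is exactly the distance sensitivity explored in the paper's sensitivity analysis: larger values make the surface hug the nearest gauge more tightly.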

  12. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve the existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. Sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine-particle size fractions below 2 µm equivalent aerodynamic diameter (EAD). Furthermore, it turned out that a substantial amount of the analytes associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine-particle mode suggests that a substantial fraction of the elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine-particle modes and one or two coarse-particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse-particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse-particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  13. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Full Text Available Economists have long been interested in measuring the distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  14. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  15. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies depend on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  16. Distribution of Hanford reactor produced radionuclides in the marine environment, 1961-73

    International Nuclear Information System (INIS)

    Seymour, A.H.

    1980-01-01

    At Hanford (U.S.A.), the plutonium-producing reactors were in operation during 1944-1971. The period of maximum reactor operation was 1955-1965, when eight reactors were in operation. The reactor deactivation programme began in 1965 and the last reactor was deactivated in 1971. All these reactors were cooled by Columbia River water, which passed through the reactors and was then discharged to the river and ultimately to the North Pacific. The Laboratory of Radiation Ecology (LRE) of the University of Washington started an environmental survey programme in 1965 and continued it up to 1973, i.e., even after the last plutonium-producing reactor was deactivated. The programme objectives were: (1) to find the geographical distribution and concentration of Hanford-produced radionuclides in water, sediments and biota of the marine environment, (2) to relate the operation of the Hanford reactors during the period of deactivation to the concentration of radionuclides in marine organisms, and (3) to observe the rate at which the marine organisms cleansed themselves of 65Zn after the primary source had been removed. An account of the programme and highlights of the observations are reported. Most of the radioactivity entering the river water and marine organisms was due to 51Cr, 65Zn and 32P, of which 65Zn was found to be the most abundant radionuclide in the biological samples. Radioactivity entered the ocean from the river water at a rate of about 1000 curies per day and did not produce any observable effects on populations of marine organisms. The internal dose to man from 65Zn via seafoods was only a small fraction of the permissible dose for individual members of the population. (M.G.B.)

  17. A survey on distribution and toxigenicity of Aspergillus flavus from indoor and outdoor hospital environments.

    Science.gov (United States)

    Sepahvand, Asghar; Shams-Ghahfarokhi, Masoomeh; Allameh, Abdolamir; Jahanshiri, Zahra; Jamali, Mojdeh; Razzaghi-Abyaneh, Mehdi

    2011-11-01

    In the present study, genetic diversity and mycotoxin profiles of Aspergillus flavus isolated from air (indoors and outdoors), surfaces, and soils of five hospitals in Southwest Iran were examined. From a total of 146 Aspergillus colonies, 63 isolates were finally identified as A. flavus by a combination of colony morphology, microscopic criteria, and mycotoxin profiles. No Aspergillus parasiticus was isolated from the examined samples. Chromatographic analyses of A. flavus isolates cultured on yeast extract-sucrose broth by the tip culture method showed that approximately 10% and 45% of the isolates were able to produce aflatoxin B1 (AFB1) and cyclopiazonic acid (CPA), respectively. Around 40% of the isolates produced sclerotia on Czapek-Dox agar. The isolates were classified into four chemotypes based on the ability to produce AF and CPA, the majority of which (55.5%) belonged to chemotype IV, comprising non-mycotoxigenic isolates. Random amplified polymorphic DNA (RAPD) profiles generated by a combination of four selected primers were used to assess genetic relatedness of 16 selected toxigenic and non-toxigenic isolates. The resulting dendrogram demonstrated the formation of two separate clusters for A. flavus, each comprising both mycotoxigenic and non-toxigenic isolates in a random distribution. The results obtained in this study showed that RAPD profiling is a promising and efficient tool to determine intra-specific genetic variation among A. flavus populations from hospital environments. A. flavus isolates, either toxigenic or non-toxigenic, should be considered as potential threats for hospitalized patients due to their obvious role in the etiology of nosocomial aspergillosis.

  18. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    Full Text Available The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. In studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  19. Measurement of distribution coefficients of U series radionuclides on soils under shallow land environment (2). pH dependence of distribution coefficients

    International Nuclear Information System (INIS)

    Sakamoto, Yoshiaki; Takebe, Shinichi; Ogawa, Hiromichi; Inagawa, Satoshi; Sasaki, Tomozou

    2001-01-01

    In order to study the sorption behavior of U series radionuclides (Pb, Ra, Th, Ac, Pa and U) under an aerated zone environment (loam-rain water system) and an aquifer environment (sand-groundwater system) for the safety assessment of U-bearing waste, the pH dependence of the distribution coefficient of each element has been obtained. The pH dependence of the distribution coefficients of Pb, Ra, Th, Ac and U was analyzed by model calculation based on the aqueous speciation of each element and the soil surface charge characteristics, which comprise a cation exchange capacity and surface hydroxyl groups. From the model calculation, the sorption behavior of Pb, Ra, Th, Ac and U could be described by a combination of a cation exchange reaction and a surface complexation model. (author)
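    The pH dependence described above can be illustrated with a toy model: a pH-independent cation-exchange contribution plus a surface-complexation contribution that grows with the fraction of deprotonated surface hydroxyl sites. All parameter values (kd_ex, k_sc, pKa) below are invented for illustration and are not taken from the study.

```python
def kd_total(pH, kd_ex=10.0, k_sc=500.0, pKa=6.5):
    """Toy pH-dependent distribution coefficient (mL/g):
    a pH-independent cation-exchange term plus a surface-complexation
    term proportional to the deprotonated surface-site fraction."""
    frac_deprotonated = 1.0 / (1.0 + 10 ** (pKa - pH))  # acid-base site balance
    return kd_ex + k_sc * frac_deprotonated

# Kd rises with pH as more surface sites deprotonate and bind cations.
kd_acidic, kd_alkaline = kd_total(4.0), kd_total(8.0)
```

    A real model would also include the aqueous speciation of each element (carbonate and hydroxide complexes suppress sorption at high pH), which this sketch omits.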

  20. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of Particle Physics, collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered computing hierarchy adopted at the LHC comprises Tier-0 (CERN) and Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained using the DA system and GANGA in Top physics analysis will be described. (Author)

  1. Model for teaching distributed computing in a distance-based educational environment

    CSIR Research Space (South Africa)

    le Roux, P

    2010-10-01

    Full Text Available Due to the prolific growth in connectivity, the development and implementation of distributed systems receives a lot of attention. Several technologies and languages exist for the development and implementation of such distributed systems; however...

  2. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features and an application to a real Nuclear Power Plant (NPP of an Integrated Software Environment (in the following referred to as “platform” developed at University of Pisa (UNIPI to perform Pressurized Thermal Shock (PTS analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2 during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE code. The last step of the methodology is the Fracture Mechanics (FM analysis, using weight functions, aimed at evaluating the stress intensity factor (KI at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  3. Knowledge exchange and learning from failures in distributed environments: The role of contractor relationship management and work characteristics

    International Nuclear Information System (INIS)

    Gressgård, Leif Jarle; Hansen, Kåre

    2015-01-01

    Learning from failures is vital for improvement of safety performance, reliability, and resilience in organizations. In order for such learning to take place in distributed environments, knowledge has to be shared among organizational members at different locations and units. This paper reports on a study conducted in the context of drilling and well operations on the Norwegian Continental Shelf, which represents a high-risk distributed organizational environment. The study investigates the relationships between organizations' abilities to learn from failures, knowledge exchange within and between organizational units, quality of contractor relationship management, and work characteristics. The results show that knowledge exchange between units is the most important predictor of perceived ability to learn from failures. Contractor relationship management, leadership involvement, role clarity, and empowerment are also important factors for failure-based learning, both directly and through increased knowledge exchange. The results of the study enhance our understanding of how abilities to learn from failures can be improved in distributed environments where similar work processes take place at different locations and involve employees from several companies. Theoretical contributions and practical implications are discussed. - Highlights: • We investigate factors affecting failure-based learning in distributed environments. • Knowledge exchange between units is the most important predictor. • Contractor relationship management is positively related to knowledge exchange. • Leadership involvement, role clarity, and empowerment are significant variables. • Respondents from an operator firm and eight contractors are included in the study

  4. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, and also other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
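    The server-side aggregation and classification step described above can be sketched as follows. The log format, honeypot names, and the SIP-method-to-attack mapping are all invented for illustration; the paper does not specify its record layout or classification rules.

```python
from collections import Counter

# Hypothetical log records as a central collector might receive them over
# SSH tunnels: "honeypot_id;src_ip;sip_method" -- the field layout is assumed.
raw = [
    "hp-alpha;203.0.113.7;REGISTER",
    "hp-alpha;203.0.113.7;INVITE",
    "hp-beta;198.51.100.2;OPTIONS",
    "hp-alpha;203.0.113.9;REGISTER",
]

def classify(method):
    """Very rough VoIP attack classification by SIP method (illustrative only)."""
    return {
        "REGISTER": "registration scan",
        "INVITE": "call attempt / toll fraud probe",
        "OPTIONS": "device fingerprinting",
    }.get(method, "other")

# Aggregate attack counts across all honeypots at the centralized point.
per_type = Counter(classify(line.split(";")[2]) for line in raw)
```

    A real deployment would of course authenticate each honeypot and persist the records before classification; the Counter here stands in for the statistical reports served by the web interface.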

  5. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  6. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2012-09-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  7. A Scheduling Algorithm for the Distributed Student Registration System in Transaction-Intensive Environment

    Science.gov (United States)

    Li, Wenhao

    2011-01-01

    Distributed workflow technology has been widely used in modern education and e-business systems. Distributed web applications have shown cross-domain and cooperative characteristics to meet the need of current distributed workflow applications. In this paper, the author proposes a dynamic and adaptive scheduling algorithm PCSA (Pre-Calculated…

  8. Study of the distribution characteristics of rare earth elements in Solanum lycocarpum from different tropical environments in Brazil by neutron activation analysis; Estudo das caracteristicas de distribuicao de elementos terras raras em Solanum lycocarpum em diferentes ambientes tropicais do Brasil por ativacao neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Maria, Sheila Piorino

    2001-07-01

    In this work, the concentration of eight rare earth elements (REE), La, Ce, Nd, Sm, Eu, Tb, Yb and Lu, was determined by neutron activation analysis (INAA) in plant leaves of Solanum lycocarpum. This species is a typical Brazilian 'cerrado' plant, widely distributed in Brazil. The analysis of the plant reference materials CRM Pine Needles (NIST 1575) and Spruce Needles (BCR 101) proved that the methodology applied was sufficiently accurate and precise for the determination of REE in plants. In order to better evaluate the uptake of the REE from the soil to the plant, the host soil was also analyzed by INAA. The studied areas were Salitre, MG, Serra do Cipo, MG, Lagoa da Pampulha and Mangabeiras, in Belo Horizonte, MG, and Cerrado de Emas, in Pirassununga, SP. The results were analyzed through the calculation of soil-plant transfer factors and by using diagrams normalized to chondrites. The data obtained showed different transfer factors from soil to plant as the substrate changes. Similar distribution patterns for the soil and the plant were obtained at all the studied sites, presenting an enrichment of the light REE (La to Sm), in contrast to the heavy REE (Eu to Lu), which were less absorbed. These results indicate that the light REE remain available to the plant in the more superficial soil layers. The similarity between the distribution patterns indicates a typical REE absorption by this species, in spite of the significant differences in the substratum. (author)
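    The two evaluation tools named above, soil-plant transfer factors and chondrite-normalized diagrams, are simple ratio calculations. The sketch below illustrates both; all concentration values and the chondrite reference numbers are invented placeholders, not data from the study.

```python
# Hypothetical REE concentrations (mg/kg) in plant leaves and host soil,
# plus chondrite reference values -- all numbers are illustrative only.
chondrite = {"La": 0.237, "Ce": 0.613, "Sm": 0.148, "Yb": 0.161}
soil = {"La": 20.0, "Ce": 45.0, "Sm": 4.0, "Yb": 1.8}
plant = {"La": 0.50, "Ce": 1.00, "Sm": 0.06, "Yb": 0.01}

# Transfer factor: concentration in plant divided by concentration in soil.
transfer = {el: plant[el] / soil[el] for el in soil}

# Chondrite normalization: divide each concentration by the reference value,
# which flattens the natural odd-even abundance zigzag and exposes enrichment.
norm_plant = {el: plant[el] / chondrite[el] for el in plant}

# Light-REE enrichment appears as a higher normalized La value than Yb.
lree_enriched = norm_plant["La"] > norm_plant["Yb"]
```

    Plotting norm_plant against atomic number for both soil and plant would reproduce the kind of parallel, LREE-enriched patterns the abstract describes.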

  9. An Analysis of University Students' Attitudes towards Personalized Learning Environments

    Science.gov (United States)

    Sahin, Muhittin; Kisla, Tarik

    2016-01-01

    The aim of this research is to analyze university students' attitudes towards personalized learning environments with respect to the independent variables of gender, age, university, year of study, knowledge about the environment, participation in the environment and being willing to participate in the environment. The correlative survey model is…

  10. Designing Sustainable Systems for Urban Freight Distribution through techniques of Multicriteria Decision Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Muerza, V.; Larrode, E.; Moreno-Jimenez, J.M.

    2016-07-01

    This paper focuses on the analysis and selection of the parameters that have a major influence on the optimization of an urban freight distribution system using sustainable means of transport, such as electric vehicles. In addition, a procedure is studied to identify the alternatives that may exist for establishing the best urban freight distribution system, suited to the scenario considered, using the most appropriate means of transportation available. To do this, the Analytic Hierarchy Process, one of the tools of multicriteria decision analysis, has been used. In order to establish adequate planning of an urban freight distribution system using electric vehicles, three hypotheses are necessary: (i) the strategic planning of the distribution process must be established by defining the relative importance of the strategic objectives of the process of distributing goods in the urban environment, in economic and technical as well as social and environmental terms; (ii) operational planning must be established that allows the achievement of the strategic objectives with the most optimized allocation of available resources; and (iii) the optimal architecture of the vehicle must be determined that best suits the operating conditions in which it will work and ensures optimum energy efficiency in operation. (Author)

  11. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  12. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Erdmann, M; Fischer, R; Glaser, C; Klingebiel, D; Müller, G; Rieger, M; Urban, M; Winchen, T; Komm, M; Steggemann, J

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  13. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and a...

  14. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    Science.gov (United States)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models, issue aircraft icing warnings, and support aircraft field campaigns. Next generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution. The volume of data is expected to increase approximately tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands to satisfy NRT requirements. In addition, these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both the AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  15. Distribution and importance of microplastics in the marine environment: A review of the sources, fate, effects, and potential solutions.

    Science.gov (United States)

    Auta, H S; Emenike, C U; Fauziah, S H

    2017-05-01

    The presence of microplastics in the marine environment poses a great threat to the entire ecosystem and has received much attention lately, as their presence has greatly impacted oceans, lakes, seas, rivers, coastal areas and even the Polar Regions. Microplastics are found in many commonly utilized products (primary microplastics), or may originate from the fragmentation of larger plastic debris (secondary microplastics). The material enters the marine environment through terrestrial and land-based activities, especially via runoff, and is known to have great impact on marine organisms, as studies have shown that large numbers of marine organisms have been affected by microplastics. Microplastic particles have been found distributed in large numbers in Africa, Asia, Southeast Asia, India, South Africa, North America, and Europe. This review describes the sources and global distribution of microplastics in the environment, and their fate and impact on marine biota, especially the food chain. Furthermore, the control measures discussed are those mapped out by both national and international environmental organizations for combating the impact of microplastics. Identifying the main sources of microplastic pollution in the environment and creating awareness through education in the public, private, and government sectors will go a long way towards reducing the entry of microplastics into the environment. Also, knowing the associated behavioral mechanisms will enable better understanding of the impacts on the marine environment. However, a more promising and environmentally safe approach could be provided by exploiting the potential of microorganisms, especially those of marine origin, that can degrade microplastics. The concentration, distribution, sources and fate of microplastics in the global marine environment are discussed, as is the impact of microplastics on a wide range of marine biota. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Distribution of Aleutian mink disease virus contamination in the environment of infected mink farms.

    Science.gov (United States)

    Prieto, A; Fernández-Antonio, R; Díaz-Cao, J M; López, G; Díaz, P; Alonso, J M; Morrondo, P; Fernández, G

    2017-05-01

    Control and eradication of Aleutian Mink Disease Virus (AMDV) are a major concern for fur-bearing animal production. Despite notably reducing disease prevalence, current control programs are unable to prevent the reinfection of farms, and environmental AMDV persistence seems to play a major role regarding this issue. In this study 114 samples from different areas and elements of seven infected mink farms were analyzed by qPCR in order to evaluate the environmental distribution of AMDV load. Samples were classified into nine categories, depending on the type of sample and degree of proximity to the animals, the main source of infection. Two different commercial DNA extraction kits were employed in parallel for all samples. qPCR analysis showed 69.3% positive samples with one kit and 81.6% with the other, and significant differences between the two DNA extraction methods were found regarding AMDV DNA recovery. Regarding sample categorization, all categories showed a high percentage of AMDV positive samples (31%-100%). Quantification of positive samples showed a decrease in AMDV load from animal barns to the periphery of the farm. In addition, those elements in direct contact with animals, the street clothes and vehicles of farm workers and personal protective equipment used for sampling showed a high viral load, and statistical analysis revealed significant differences in AMDV load between the first and last categories. These results indicate high environmental contamination of positive farms, which is helpful for future considerations about cleaning and disinfection procedures and biosecurity protocols. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

The Information Service is a fundamental component of a grid environment. It has to meet many requirements, such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, and fault tolerance, in order to achieve better performance, scalability, security and extensibility. Currently there are two major approaches: one is based on a directory infrastructure, and a newer one exploits a relational DBMS. In this paper we present a performance comparison between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), also providing information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  18. Augmented reality environment for temporomandibular joint motion analysis.

    Science.gov (United States)

    Wagner, A; Ploder, O; Zuniga, J; Undt, G; Ewers, R

    1996-01-01

    The principles of interventional video tomography were applied for the real-time visualization of temporomandibular joint movements in an augmented reality environment. Anatomic structures were extracted in three dimensions from planar cephalometric radiographic images. The live-image fusion of these graphic anatomic structures with real-time position data of the mandible and the articular fossa was performed with a see-through, head-mounted display and an electromagnetic tracking system. The dynamic fusion of radiographic images of the temporomandibular joint to anatomic temporomandibular joint structures in motion created a new modality for temporomandibular joint motion analysis. The advantages of the method are its ability to accurately examine the motion of the temporomandibular joint in three dimensions without restraining the subject and its ability to simultaneously determine the relationship of the bony temporomandibular joint and supporting structures (ie, occlusion, muscle function, etc) during movement before and after treatment.

  19. Privacy-Preserving k-Means Clustering under Multiowner Setting in Distributed Cloud Environments

    Directory of Open Access Journals (Sweden)

    Hong Rong

    2017-01-01

With the advent of the big data era, clients who lack computational and storage resources tend to outsource data mining tasks to cloud service providers in order to improve efficiency and reduce costs. It is also increasingly common for clients to perform collaborative mining to maximize profits. However, due to the rise of privacy leakage issues, the data contributed by clients should be encrypted using their own keys. This paper focuses on privacy-preserving k-means clustering over joint datasets encrypted under multiple keys. Unfortunately, existing outsourced k-means protocols are impractical: not only are they restricted to a single-key setting, but they are also inefficient and nonscalable for distributed cloud computing. To address these issues, we propose a set of privacy-preserving building blocks and an outsourced k-means clustering protocol under the Spark framework. Theoretical analysis shows that our scheme protects the confidentiality of the joint database and mining results, as well as access patterns, under the standard semihonest model with relatively small computational overhead. Experimental evaluations on real datasets also demonstrate its efficiency improvements compared with existing approaches.
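The protocol itself operates over encrypted data, but the computation it emulates is ordinary Lloyd's k-means over the clients' joint dataset. As a point of reference, here is a minimal plaintext sketch in NumPy; the two-client split and all parameter values are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain (non-private) Lloyd's k-means: the reference computation that a
    privacy-preserving protocol emulates over the encrypted joint dataset."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)].copy()
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # update step: mean of each cluster (keep old center if cluster empty)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# two hypothetical "clients" contribute data that is clustered jointly
a = np.random.default_rng(1).normal(0.0, 0.3, (50, 2))
b = np.random.default_rng(2).normal(5.0, 0.3, (50, 2))
centers, labels = kmeans(np.vstack([a, b]), k=2)
```

In the multi-key setting, the distance and averaging steps above are the operations that must be carried out on ciphertexts, which is what the paper's building blocks provide.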

  20. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    Directory of Open Access Journals (Sweden)

    Vinothkumar Muthurajan

    2016-01-01

Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources adversely causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved with the Elliptic Curve based Schnorr scheme. It proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. HCSA-based auditing improves the prediction of malicious activity during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes, so this paper uses the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. A comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) scheme regarding execution time, computational overhead, and auditing time with varying auditing requests and servers confirms the effectiveness of HCSA in creating the cloud security model.

  1. A network-based distributed, media-rich computing and information environment

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  2. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    Science.gov (United States)

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources adversely causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved with the Elliptic Curve based Schnorr scheme. It proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. HCSA-based auditing improves the prediction of malicious activity during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes, so this paper uses the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. A comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) scheme regarding execution time, computational overhead, and auditing time with varying auditing requests and servers confirms the effectiveness of HCSA in creating the cloud security model.
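The deduplication idea above rests on a standard Bloom filter: a compact bit array that never misses an inserted item and only rarely reports a false positive. A minimal sketch follows; the bit-array size, hash count, and SHA-256-based hashing are illustrative assumptions, not the paper's parameters:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: probabilistic set membership with no false
    negatives, used here to flag likely-duplicate data blocks."""
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        # derive k independent bit positions from salted SHA-256 digests
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item: bytes) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

bf = BloomFilter()
bf.add(b"block-1")
```

Because a Bloom filter has no false negatives, a negative answer safely skips any deduplication work; a positive answer is then verified exactly against the stored blocks.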

  3. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

Solid-state drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. SSDs also have lower energy consumption and higher vibration tolerance than hard disk drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We discuss our experience with SSDs in a PROOF environment and compare the performance of HDDs with SSDs in I/O-intensive analysis scenarios. In particular, we discuss how PROOF system performance scales with the number of simultaneously running analysis jobs.

  4. Back analysis of fault-slip in burst prone environment

    Science.gov (United States)

    Sainoki, Atsushi; Mitri, Hani S.

    2016-11-01

In deep underground mines, stress redistribution induced by mining activities can cause fault-slip. Seismic waves arising from fault-slip occasionally induce rock ejection when they hit the boundary of mine openings, and severe damage can result. In general, it is difficult to estimate fault-slip-induced ground motion in the vicinity of mine openings because of the complexity of the dynamic response of faults and the presence of geological structures. In this paper, a case study is conducted for a Canadian underground mine, herein called "Mine-A", which is known for its seismic activity. Using a microseismic database collected from the mine, a back analysis of fault-slip is carried out with mine-wide three-dimensional numerical modeling in order to estimate the physical and mechanical properties of the causative fracture or shear zones. One large fault-slip-related seismic event was selected for the back analysis. In the back analysis, the shear zone properties are estimated with respect to the moment magnitude of the seismic event and the peak particle velocity (PPV) recorded by a strong ground motion sensor. The estimated properties are then validated through comparison with peak ground acceleration recorded by accelerometers. Lastly, ground motion in active mining areas is estimated by conducting dynamic analysis with the estimated values. The present study implies that it would be possible to estimate the magnitude of seismic events that might occur in the near future by applying the estimated properties to the numerical model. Although the case study is conducted for a specific mine, the developed methodology can be equally applied to other mines suffering from fault-slip-related seismic events.

  5. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, an irregular welded surface and the weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

  6. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, an irregular welded surface and the weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement: the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints, helping to develop more accurate numerical tools for fatigue life prediction.

  7. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present, and users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  8. Distribution and air-sea exchange of mercury (Hg) in polluted marine environments

    Science.gov (United States)

    Bagnato, E.; Sprovieri, M.; Bitetto, M.; Bonsignore, M.; Calabrese, S.; Di Stefano, V.; Oliveri, E.; Parello, F.; Mazzola, S.

    2012-04-01

Mercury (Hg) is emitted to the atmosphere by anthropogenic and natural sources, the latter accounting for one third of total emissions. Since the pre-industrial age, the atmospheric deposition of mercury has increased notably, while ocean emissions have doubled owing to the re-emission of anthropogenic mercury. Exchange between the atmosphere and ocean plays an important role in the cycling and transport of mercury. We present preliminary results from a study on the distribution and evasion flux of mercury at the atmosphere/sea interface in the Augusta basin (SE Sicily, southern Italy), a semi-enclosed marine area affected by a high degree of contamination (heavy metals and PAH) due to the oil refineries located inside its commercial harbor. The intense industrial activity of the past has led to high Hg pollution in the bottom sediments of the basin, with concentrations far above the background mercury values found in most Sicily Strait sediments. The release of mercury into the harbor seawater and its dispersion by diffusion from the sediments to the surface make the Augusta basin a potential supplier of mercury both to the Mediterranean Sea and to the atmosphere. Based on these considerations, the mercury concentration and flux at the air-sea interface of the bay were estimated using a real-time atomic absorption spectrometer (LUMEX RA-915+) and a home-made accumulation chamber, respectively. Estimated total gaseous mercury (TGM) concentrations during the cruise on the bay were in the range of 1-3 ng/m3, with a mean value of about 1.4 ng/m3. These data agree well with the background atmospheric Hg concentrations measured on land (1-2 ng/m3, this work) and, more generally, with the background atmospheric TGM levels found in the Northern Hemisphere (1.5-1.7 ng/m3). Moreover, our measurements are in the range of those reported for other important polluted marine areas. The mercury evasion flux at the air-sea interface

  9. Componential distribution analysis of food using near infrared ray image

    Science.gov (United States)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot resolve measurements spatially, and it is time consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is its ability to visualize invisible components. Many components in food have characteristic absorption and reflection of light in the IR range, so the component content can be measured using subtraction between images taken at two wavelengths of near IR light. In this paper, we describe a method to measure food components using near IR image processing, and we show an application that visualizes the saccharose in a pumpkin.
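The two-wavelength subtraction described above can be sketched in NumPy as follows; the normalization scheme and the synthetic "absorbing patch" image are illustrative assumptions, not the paper's actual wavelengths or data:

```python
import numpy as np

def component_map(img_abs_band, img_ref_band):
    """Two-wavelength difference imaging (hedged sketch): the band where the
    target component absorbs, subtracted from a nearby reference band, cancels
    illumination and surface effects and leaves component-related contrast."""
    a = img_abs_band.astype(float)
    r = img_ref_band.astype(float)
    diff = r - a                      # stronger absorption -> larger difference
    diff -= diff.min()                # normalize to [0, 1] for display
    return diff / diff.max() if diff.max() > 0 else diff

# synthetic example: a component-rich patch absorbing at the target band
ref = np.full((64, 64), 200.0)        # reference-wavelength image
absb = ref.copy()
absb[20:40, 20:40] -= 80.0            # absorption lowers reflected intensity
cmap = component_map(absb, ref)
```

The resulting map is brightest where the component absorbs most, which is the kind of two-dimensional distribution the paper visualizes.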

  10. An environment for operation planning of electrical energy distribution systems; Um ambiente para planejamento da operacao de sistemas de distribuicao de energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Kleber

    1997-07-01

The aim of operation planning of electrical energy distribution systems is to define a group of interventions in the distribution network so as to reach a global optimum. Both normal operation and emergency conditions are considered, including scheduled interventions in the network. Under normal conditions, aspects such as voltage drop and loss minimization are considered; under emergency conditions, the minimization of non-distributed energy and interruption time is taken into account. This work proposes the development of a computer environment for operation planning studies. The design and implementation of a specific operation planning database, based on a relational database model, was carried out so that new information modules could be easily included. A statistical approach was applied to the load model, including consumption habits and seasonal information for each consumer category. A group of operation planning functions is proposed, which includes simulation modules responsible for system analysis; these functions and the database were designed together. The developed computer environment allows the inclusion of new functions, thanks to the standardized query language used to access the database. Two functions have been implemented for illustration. The development of the proposed environment constitutes a step towards an open operation planning system, integration with other information systems, and an easy-to-use system based on a graphical interface. (author)

  11. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has vast potential for applications in economics and management. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation of our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which use only centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in the PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
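For contrast with the proposed method, the "representative-type" baseline it improves on amounts to classical PCA applied to the distribution centers alone. A minimal sketch of that baseline follows (the paper's full symbolic-data covariance, which also propagates each observation's internal variance, is not reproduced here):

```python
import numpy as np

def pca_on_centers(means, n_components=2):
    """Classical PCA applied only to the distribution means -- the
    'representative-type' baseline that symbolic-data PCA improves on
    by also using each observation's internal variance."""
    X = means - means.mean(axis=0)            # center the data
    cov = np.cov(X, rowvar=False)             # sample covariance of centers
    vals, vecs = np.linalg.eigh(cov)          # eigh returns ascending order
    order = np.argsort(vals)[::-1][:n_components]
    return X @ vecs[:, order], vals[order]    # scores and explained variances

rng = np.random.default_rng(0)
means = rng.normal(size=(100, 5))             # hypothetical distribution means
scores, variances = pca_on_centers(means)
```

Any within-observation variance (the "distribution-valued" part of the data) is simply discarded here, which is exactly the information loss the paper's approach avoids.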

  12. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast make it hard to observe clearly their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the axonal growth spatial orientation, namely the angle of deviation of the growing direction. The developed algorithms quantify this orientation automatically, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
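A granulometric size distribution is computed by opening the binary image with structuring elements of increasing size and recording the area that survives each opening; the size at which area drops indicates how much of the image is made of structures of that scale. The disc-shaped elements and synthetic test image below are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np
from scipy import ndimage

def granulometric_size_distribution(binary_img, max_radius=8):
    """Granulometry: area remaining after morphological opening with discs
    of increasing radius r = 0..max_radius."""
    areas = []
    for r in range(max_radius + 1):
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        disc = (x * x + y * y) <= r * r          # disc structuring element
        opened = ndimage.binary_opening(binary_img, structure=disc)
        areas.append(int(opened.sum()))
    return np.array(areas)

# synthetic image: one thin, axon-like bar and one thicker blob
img = np.zeros((64, 64), dtype=bool)
img[10:13, 5:60] = True        # 3-pixel-wide elongated structure
img[30:41, 30:41] = True       # 11-pixel blob
sizes = granulometric_size_distribution(img)
```

Thin structures such as the bar vanish at small disc radii while compact blobs persist, so the drop pattern of `sizes` separates axon-like shapes from other objects; using oriented (line-shaped) structuring elements instead of discs gives the orientation information the paper exploits.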

  13. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method and the diffusive scheme. Each HM formulation is reviewed together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, so the Courant number differs from pipe to pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). Regarding numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, which makes them recommendable for the analysis of water hammer in water distribution systems.
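The MOC benchmark against which such hybrid methods are compared can be illustrated on the simplest case: a single frictionless pipe between a constant-head reservoir and a valve that closes instantaneously. A hedged sketch follows; all parameter values are hypothetical, and friction is omitted for clarity:

```python
import numpy as np

def moc_valve_closure(H0=50.0, a=1000.0, L=1000.0, D=0.5, Q0=0.2,
                      n_nodes=11, n_steps=40):
    """Frictionless single-pipe Method of Characteristics sketch:
    upstream reservoir at head H0 [m], wave speed a [m/s], instantaneous
    downstream valve closure. Returns the head history at the valve."""
    g = 9.81
    A = np.pi * D**2 / 4.0
    B = a / (g * A)                       # characteristic impedance
    H = np.full(n_nodes, H0)              # steady state: flat head (no friction)
    Q = np.full(n_nodes, Q0)
    valve_head = []
    for _ in range(n_steps):              # one step = dx / a (Courant = 1)
        Hn, Qn = H.copy(), Q.copy()
        for i in range(1, n_nodes - 1):
            Cp = H[i - 1] + B * Q[i - 1]  # C+ characteristic from upstream
            Cm = H[i + 1] - B * Q[i + 1]  # C- characteristic from downstream
            Hn[i] = 0.5 * (Cp + Cm)
            Qn[i] = (Cp - Cm) / (2 * B)
        # upstream boundary: reservoir holds the head fixed
        Cm = H[1] - B * Q[1]
        Hn[0], Qn[0] = H0, (H0 - Cm) / B
        # downstream boundary: closed valve forces zero flow
        Cp = H[-2] + B * Q[-2]
        Hn[-1], Qn[-1] = Cp, 0.0
        H, Q = Hn, Qn
        valve_head.append(H[-1])
    return np.array(valve_head)

heads = moc_valve_closure()
```

At Courant number 1 this reproduces the classic Joukowsky square wave, dH = a*Q0/(g*A) above and below the steady head; the hybrid schemes in the paper are judged by how little extra attenuation they add to this reference solution.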

  14. Distribution pattern of radon and daughters in an urban environment and determination of organ-dose frequency distributions

    International Nuclear Information System (INIS)

    Steinhaeusler, F.; Hofmann, W.; Pohl, E.; Pohl-Rueling, J.

    1980-01-01

In a normal urban environment, inhaled radon and its decay products cause an important part of the human radiation burden, not only for the respiratory tract but also for several other organs. A study was made with 729 test persons in Salzburg, Austria. Each person was questioned about their living activity, i.e., the mean times spent in individual sleeping, living, and working places and the corresponding physical activities, which strongly influence the respiratory minute volume and consequently the inhaled radioactivity. At all these places, the annual means of the external gamma radiation as well as of the air content of Rn-222 and its short-lived decay products were determined. From these more than 8000 measurements and the information obtained, mean annual organ doses were calculated for each test person. The doses due to cosmic radiation, K-40, and other incorporated radionuclides were added. The range of the mean annual doses in millirems is only 73 to 126 for the gonads and 70 to 336 for the kidneys, but reaches 117 to 10,700 for the basal cells of the bronchial epithelium.

  15. Content and distribution of trace metals in pristine permafrost environments of Northeastern Siberia, Russia

    Science.gov (United States)

    Antcibor, I.; Eschenbach, A.; Kutzbach, L.; Bolshiyanov, D.; Pfeiffer, E.-M.

    2012-04-01

middle floodplain. Correlation analysis between element concentrations, grain-size distribution and carbon content revealed a direct dependence of the element distribution within all soil profiles on the mineralogical composition. Based on the obtained results, we suggest that atmospheric deposition caused by human activity is negligible at the investigated site. This dataset can therefore provide a point of comparison for assessing man-made influences on permafrost-affected landscapes and on similar pristine areas in the Arctic region.

  16. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high: in every experiment, up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences with distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality: in combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of the data, following the ATLAS computing model. GANGA supports user analysis of reconstructed data and small-scale production of Monte Carlo data.

  17. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction describing the framework of structure formation, tracing the history of the Universe from the Planck time, t_p = 10^-43 s, corresponding to a temperature of 10^19 GeV, to the present epoch. The most common statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four outlines some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, such as correlations, percolation, the void probability function and counts in cells; the same scale-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and discusses its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and presents some observational work on nearby and distant galaxy clusters. In particular, it shows the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme galaxy redshift survey in the south galactic pole region, to which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author)

  18. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

Several statistical distributions are used to model various reliability and maintainability parameters; the appropriate distribution depends on the nature of the data being analyzed. This paper deals with the analysis of some statistical distributions used in reliability, in order to find the best-fitting distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC), whereas the Exponential distribution is found to be the best fit for modeling the failure rate.
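Best-fit selection of this kind can be sketched by fitting each candidate distribution by maximum likelihood and ranking the fits by a goodness-of-fit statistic. The sketch below uses SciPy and the Kolmogorov-Smirnov statistic as an illustrative stand-in for the Relex 2009 workflow; the candidate list and synthetic failure times are assumptions:

```python
import numpy as np
from scipy import stats

def best_fit(data, candidates=("weibull_min", "expon", "lognorm", "gamma")):
    """Rank candidate lifetime distributions by Kolmogorov-Smirnov statistic
    after maximum-likelihood fitting (smaller KS = better fit)."""
    results = []
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(data, floc=0)      # fix location at 0 for lifetimes
        ks = stats.kstest(data, name, args=params).statistic
        results.append((name, ks, params))
    return sorted(results, key=lambda t: t[1])

# synthetic failure times drawn from a Weibull with shape parameter 1.8
rng = np.random.default_rng(42)
data = stats.weibull_min.rvs(1.8, scale=100.0, size=500, random_state=rng)
ranking = best_fit(data)
```

On such data the Weibull family should rank well ahead of the Exponential, mirroring the paper's finding that the fitting distribution depends on which quantity (reliability vs. failure rate) is being modeled.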

  19. Distribution and transfer of radiocesium and radioiodine in the environment following the Fukushima nuclear accident - Distribution and transfer of radiocesium and radioiodine in the environment of Fukushima Prefecture following the nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Yasuyuki; Ohno, Takeshi; Sugiyama, Midori [Gakushuin University, Toshima-ku, Tokyo, 171-8588 (Japan); Sato, Mamoru [Fukushima Agricultural Technology Centre, Koriyama, Fukushima 963-0531 (Japan); Matsuzaki, Hiroyuki [The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2014-07-01

    Large quantities of radioiodine and radiocesium were released from the accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP) in March 2011. We have carried out intensive studies on the distribution and behaviour of these nuclides in the environment following the accident. Two topics from our studies are presented. (1) Retrospective estimation of I-131 deposition through the analysis of I-129 in soil: It is necessary to obtain deposition data of radioiodine in Fukushima Prefecture for the assessment of thyroid doses due to the accident. However, the short half-life of I-131 (8 days) made it impossible to obtain adequate sample coverage that would permit direct determination of the regional deposition patterns of I-131 within the prefecture and surrounding areas. On the other hand, I-129 released simultaneously during the accident still remains in soil, owing to its long half-life of 1.57x10^7 years. In order to reconstruct the I-131 deposition, we determined I-129 concentrations by accelerator mass spectrometry (AMS). A good correlation was found between the measured concentrations of I-131 and I-129 in soils collected in the vicinity of FDNPP. We have analyzed I-129 in more than 500 soil samples collected systematically from Fukushima Prefecture. Using the obtained results, the I-131 deposition was calculated in different areas and a deposition map for I-131 was constructed. We also studied the vertical distribution of I-129 in soil. (2) Peculiar accumulation of radiocesium in some plants and mushrooms: The radioactivity levels in agricultural crops decreased markedly in the months following the accident, and their concentrations became lower than the Japanese guideline for foodstuffs (500 Bq/kg in 2011, and 100 Bq/kg after 2012). However, some agricultural products such as tea leaves and citrus fruits showed relatively higher values. Our analytical results on the distribution of radiocesium in tea trees show that the root uptake

  20. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  1. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is an important topic, and the huge number of images transferred via transmission media in daily life has made image analysis a vital research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both colour and grey images; the final step compares the results obtained in the different cases of the test phase. Statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean through the implementation of the system. The major issue addressed in this work is the brightness distribution, studied via statistical measures under different types of lighting.
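The two statistics the paper relies on can be sketched in a few lines (synthetic grey-level pixel data, not from the study):

```python
import random

def brightness_stats(pixels):
    """Mean and (population) standard deviation of a flat list of grey levels."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return mean, var ** 0.5

random.seed(0)
# synthetic 64x64 grey image: mid-grey with mild Gaussian noise, clipped to 0..255
img = [min(255, max(0, int(random.gauss(128, 20)))) for _ in range(64 * 64)]
mean, std = brightness_stats(img)
print(round(mean, 1), round(std, 1))
```

For a colour image the same measures would be computed per channel; comparing the means under different lighting conditions gives the brightness-distribution characterization the paper describes.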

  2. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    Science.gov (United States)

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
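The nonparametric Kaplan-Meier estimator used in the paper can be sketched directly. The mission record below is invented for illustration; `observed=True` marks a distance at which a vehicle was lost, `False` a censored (successfully completed) mission:

```python
def kaplan_meier(events):
    """Kaplan-Meier survival curve from (distance, observed) pairs.
    Censored entries reduce the risk set without a survival step."""
    events = sorted(events)
    n_at_risk = len(events)
    survival, s = [], 1.0
    for dist, observed in events:
        if observed:                      # a loss: step the survival estimate down
            s *= (n_at_risk - 1) / n_at_risk
            survival.append((dist, s))
        n_at_risk -= 1                    # either way, one fewer vehicle at risk
    return survival

# hypothetical AUV mission record: distances in km, True = vehicle lost
record = [(12.0, False), (30.0, True), (45.0, False), (80.0, True), (100.0, False)]
curve = kaplan_meier(record)
print(curve)  # -> [(30.0, 0.75), (80.0, 0.375)]
```

The paper additionally weights each fault by the expert-elicited probability of loss; this sketch shows only the unweighted estimator for untied event distances.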

  3. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...

  4. A distributed data base management facility for the CAD/CAM environment

    Science.gov (United States)

    Balza, R. M.; Beaudet, R. W.; Johnson, H. R.

    1984-01-01

    Current/PAD research in the area of distributed data base management considers facilities for supporting CAD/CAM data management in a heterogeneous network of computers encompassing multiple data base managers supporting a variety of data models. These facilities include coordinated execution of multiple DBMSs to provide for administration of and access to data distributed across them.

  5. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    Science.gov (United States)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

    An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds Averaged Navier- Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions and solutions were seen to be well converged. It was found that no oscillatory instability existed, only that of divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  6. Continuous Distributed Top-k Monitoring over High-Speed Rail Data Stream in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Hanning Wang

    2013-01-01

    In the cloud computing environment, real-time mass data about high-speed rail, produced by intense monitoring with large-scale sensing equipment, provides strong support for the safety and maintenance of high-speed rail. In this paper, we focus on continuous distributed Top-k monitoring over multi-source distributed data streams for high-speed rail. Specifically, we formalize a Top-k monitoring model for high-speed rail and propose DTMR, a Top-k monitoring algorithm for random, continuous, or strictly monotone aggregation functions. DTMR is validated by extensive experiments.
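A common pattern behind distributed Top-k monitoring (a generic sketch, not the DTMR algorithm itself) is that each node forwards only its k largest local readings, and a coordinator merges the candidate sets; for a max-style aggregation the global Top-k is guaranteed to lie among them:

```python
import heapq

def node_topk(stream, k):
    """Each monitoring node keeps only its local k largest readings."""
    return heapq.nlargest(k, stream)

def coordinator_topk(partial_results, k):
    """Merge the per-node candidate sets; the global Top-k is contained in them."""
    return heapq.nlargest(k, (x for part in partial_results for x in part))

# hypothetical sensor readings from three rail-monitoring nodes
nodes = [[3, 41, 7, 19], [56, 2, 8], [11, 23, 30, 5]]
k = 3
global_topk = coordinator_topk([node_topk(s, k) for s in nodes], k)
print(global_topk)  # -> [56, 41, 30]
```

Continuous monitoring algorithms such as DTMR refine this by installing thresholds at the nodes so that communication happens only when a local change could alter the global Top-k.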

  7. Advanced analysis of metal distributions in human hair

    International Nuclear Information System (INIS)

    Kempson, Ivan M.; Skinner, William M.

    2006-01-01

    A variety of techniques (secondary electron microscopy with energy dispersive X-ray analysis, time-of-flight-secondary ion mass spectrometry, and synchrotron X-ray fluorescence) were utilized to distinguish metal contamination occurring in hair arising from endogenous uptake from an individual exposed to a polluted environment, in this case a lead smelter. Evidence was sought for elements less affected by contamination and potentially indicative of biogenic activity. The unique combination of surface sensitivity, spatial resolution, and detection limits used here has provided new insight regarding hair analysis. Metals such as Ca, Fe, and Pb appeared to have little representative value of endogenous uptake and were mainly due to contamination. Cu and Zn, however, demonstrate behaviors worthy of further investigation into relating hair concentrations to endogenous function.

  8. High excitation rovibrational molecular analysis in warm environments

    Science.gov (United States)

    Zhang, Ziwei; Stancil, Phillip C.; Cumbee, Renata; Ferland, Gary J.

    2017-06-01

    Inspired by advances in infrared observation (e.g., Spitzer, Herschel and ALMA), we investigate rovibrational emission of CO and SiO in warm astrophysical environments. With recent innovations in collisional rate coefficients and rescaling methods, we are able to construct more comprehensive collisional data with high rovibrational states (vibration up to v=5 and rotation up to J=40) and multiple colliders (H2, H and He). These comprehensive data sets are used in spectral simulations with the radiative transfer codes RADEX and Cloudy. We obtained line-ratio diagnostic plots and line spectra for both near- and far-infrared emission lines over a broad range of density and temperature for the case of a uniform medium. Considering the importance of both molecules in probing the conditions and activities of UV-irradiated interstellar gas, we model rovibrational emission in photodissociation regions (PDRs) and AGB star envelopes (such as VY Canis Majoris, IK Tau and IRC +10216) with Cloudy. Rotational diagrams, energy distribution diagrams, and spectra are produced to examine relative state abundances, line emission intensity, and other properties. With these diverse models, we expect to gain a better understanding of PDRs and to expand our scope in the chemical architecture and evolution of AGB stars and other UV-irradiated regions. The soon-to-be-launched James Webb Space Telescope (JWST) will provide high-resolution observations at near- to mid-infrared wavelengths, opening a new window on molecular vibrational emission and calling for more detailed chemical modeling and comprehensive laboratory astrophysics data on more molecules. This work was partially supported by NASA grants NNX12AF42G and NNX15AI61G. We thank Benhui Yang, Kyle Walker, Robert Forrey, and N. Balakrishnan for collaborating on the collisional data adopted in the current work.

  9. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.

  10. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine-independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programming

  11. Designing distributed user interfaces for ambient intelligent environments using models and simulations

    OpenAIRE

    LUYTEN, Kris; VAN DEN BERGH, Jan; VANDERVELPEN, Chris; CONINX, Karin

    2006-01-01

    There is a growing demand for design support to create interactive systems that are deployed in ambient intelligent environments. Unlike traditional interactive systems, the wide diversity of situations in which these types of user interfaces need to work requires tool support that is close to the environment of the end-user on the one hand and provides a smooth integration with the application logic on the other hand. This paper shows how the model-based user interface development methodology can be ...

  12. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and is generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
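The core of a PDF computation — histogramming all interatomic pair distances — can be sketched directly. The toy cluster below is 8 atoms on the corners of a unit cube, not a decahedral gold model, and normalization and instrument effects are omitted:

```python
import math
from itertools import combinations

def pdf_histogram(coords, r_max, dr):
    """Histogram of interatomic pair distances (an unnormalized G(r))."""
    nbins = int(r_max / dr)
    hist = [0] * nbins
    for p, q in combinations(coords, 2):  # every unordered atom pair once
        b = int(math.dist(p, q) / dr)
        if b < nbins:
            hist[b] += 1
    return hist

# toy cluster: 8 atoms on the corners of a unit cube
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
hist = pdf_histogram(cube, r_max=2.0, dr=0.25)
print(hist)  # -> [0, 0, 0, 0, 12, 12, 4, 0]
```

The three peaks are the 12 edges (r = 1), 12 face diagonals (r = sqrt(2)) and 4 body diagonals (r = sqrt(3)); for a real nanoparticle model the same pair sum underlies the Debye scattering equation the paper applies.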

  13. On the relevance of efficient, integrated computer and network monitoring in HEP distributed online environment

    CERN Document Server

    Carvalho, D F; Delgado, V; Albert, J N; Bellas, N; Javello, J; Miere, Y; Ruffinoni, D; Smith, G

    1996-01-01

    Large Scientific Equipment is controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer ...

  14. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    Science.gov (United States)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large Scientific Equipment is controlled by Computer Systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general purpose Multi-layer System.

  15. A Real-Time Fault Management Software System for Distributed Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — DyMA-FM (Dynamic Multivariate Assessment for Fault Management) is a software architecture for real-time fault management. Designed to run in a distributed...

  16. Multiuser Diversity with Adaptive Modulation in Non-Identically Distributed Nakagami Fading Environments

    KAUST Repository

    Rao, Anlei; Alouini, Mohamed-Slim

    2012-01-01

    In this paper, we analyze the performance of adaptive modulation with single-cell multiuser scheduling over independent but not identically distributed (i.n.i.d.) Nakagami fading channels. Closed-form expressions are derived for the average channel

  17. An Environment for Incremental Development of Distributed Extensible Asynchronous Real-time Systems

    Science.gov (United States)

    Ames, Charles K.; Burleigh, Scott; Briggs, Hugh C.; Auernheimer, Brent

    1996-01-01

    Incremental parallel development of distributed real-time systems is difficult. Architectural techniques and software tools developed at the Jet Propulsion Laboratory's (JPL's) Flight System Testbed make feasible the integration of complex systems in various stages of development.

  18. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users submitted jobs in 2011, and more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper reviews the state of the DA tools and services, summarizes the past year of distributed analysis activity, and presents the directions for future improvements to the system.

  19. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
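The idea behind FDA — bookkeeping the individual pairwise force terms instead of only the net force per atom — can be sketched with a toy harmonic model (a hypothetical 1D chain, not the Gromacs implementation):

```python
def pairwise_forces(positions, bonds, k=1.0, r0=1.0):
    """Scalar harmonic force on each bonded pair: F = -k * (r - r0).
    FDA tracks these per-pair terms, which cancel in the net force per atom
    but reveal where strain is concentrated."""
    forces = {}
    for i, j in bonds:
        r = abs(positions[j] - positions[i])
        forces[(i, j)] = -k * (r - r0)
    return forces

# hypothetical 1D chain of four atoms; the middle bond is stretched by 0.3
pos = [0.0, 1.0, 2.3, 3.3]
bonds = [(0, 1), (1, 2), (2, 3)]
forces = pairwise_forces(pos, bonds)
print(forces)
```

Only the stretched middle bond carries a nonzero force term, even though a relaxed structure could have zero net force on every atom; comparing such per-pair maps between two states (e.g. with and without a ligand) is what exposes strain-propagation pathways.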

  20. Mercury speciation and distribution in a glacierized mountain environment and their relevance to environmental risks in the inland Tibetan Plateau.

    Science.gov (United States)

    Sun, Xuejun; Zhang, Qianggong; Kang, Shichang; Guo, Junming; Li, Xiaofei; Yu, Zhengliang; Zhang, Guoshuai; Qu, Dongmei; Huang, Jie; Cong, Zhiyuan; Wu, Guangjian

    2018-08-01

    Glacierized mountain environments can preserve and release mercury (Hg) and play an important role in regional Hg biogeochemical cycling. However, the behavior of Hg in glacierized mountain environments and its environmental risks remain poorly constrained. In this research, glacier meltwater, runoff and wetland water were sampled in Zhadang-Qugaqie basin (ZQB), a typical glacierized mountain environment in the inland Tibetan Plateau, to investigate Hg distribution and its relevance to environmental risks. The total mercury (THg) concentrations ranged from 0.82 to 6.98 ng·L⁻¹, and non-parametric pairwise multiple comparisons of the THg concentrations among the three different water samples showed that the THg concentrations were comparable. The total methylmercury (TMeHg) concentrations ranged from 0.041 to 0.115 ng·L⁻¹, and non-parametric pairwise multiple comparisons of the TMeHg concentrations showed a significant difference. Both the THg and MeHg concentrations of water samples from the ZQB were comparable to those of other remote areas, indicating that Hg concentrations in the ZQB watershed are equivalent to the global background level. Particulate Hg was the predominant form of Hg in all runoff samples, and was significantly correlated with the total suspended particle (TSP) and not correlated with the dissolved organic carbon (DOC) concentration. The distribution of mercury in the wetland water differed from that of the other water samples. THg exhibited a significant correlation with DOC as well as TMeHg, whereas neither THg nor TMeHg was associated with TSP. Based on the above findings and the results from previous work, we propose a conceptual model illustrating the four Hg distribution zones in glacierized environments. We highlight that wetlands may enhance the potential hazards of Hg released from melting glaciers, making them a vital zone for investigating the environmental effects of Hg in glacierized environments and beyond. Copyright © 2018