WorldWideScience

Sample records for distributed analysis environment

  1. The CMS Analysis Chain In a Distributed Environment

    CERN Document Server

    De Filippis, N; Barrass, T; Bonacorsi, D; Corvo, M; Ciraolo, G; Innocente, V; Donvito, G; Maggi, M; Pierro, A; Silvestris, L; Faina, L; Spiga, D; Fanfani, A; Fanzago, F; Grandi, C; Lacaprara, S; Taylor, L; Tuura, L; Wildish, T

    2006-01-01

    The CMS collaboration is undertaking a major effort to define the analysis model and to develop software tools for analyzing several million simulated and real data events by a large number of people at many geographically distributed sites. From the computing point of view, one of the most complex issues in remote analysis is data discovery and access. Software tools were developed to move data, make them available to the full international community and validate them for subsequent analysis. Batch analysis processing is performed with purpose-built workload management tools, which are mainly responsible for job preparation and job submission. Job monitoring and output management are implemented as the last part of the analysis chain. Grid tools provided by the LCG project are evaluated to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the analysis jobs. An overview of the ...

  2. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    Full Text Available are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave with a reduced error between actual and estimated distributions. The improved probability density function (PDF) representation...
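
    The K-distribution fit mentioned above can be illustrated with a method-of-moments estimate of the shape parameter from intensity samples. The sketch below is a generic illustration, not the paper's procedure: it relies on the standard relation E[I²]/E[I]² = 2(1 + 1/ν) for K-distributed intensity, and the simulated data and segment handling are invented assumptions.

```python
import numpy as np

def k_shape_from_moments(intensity):
    # Normalised second moment of K-distributed intensity:
    # E[I^2] / E[I]^2 = 2 * (1 + 1/nu)  =>  nu = 2 / (m - 2)
    m = np.mean(intensity**2) / np.mean(intensity)**2
    if m <= 2.0:
        return np.inf  # degenerates towards exponential intensity (Rayleigh amplitude)
    return 2.0 / (m - 2.0)

# Hypothetical usage: estimate nu separately for each wave segment,
# mimicking the segmented fitting idea described in the abstract.
rng = np.random.default_rng(0)
nu_true, n = 1.5, 100_000
# Compound representation: exponential speckle with gamma-distributed texture.
texture = rng.gamma(shape=nu_true, scale=1.0 / nu_true, size=n)
speckle = rng.exponential(scale=1.0, size=n)
print(k_shape_from_moments(texture * speckle))  # close to 1.5
```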

  2. CRAB: the CMS Tool to Allow Data Analysis in a Distributed Environment

    CERN Document Server

    Fanzago, Federica; Codispoti, Giuseppe; Corvo, Marco; Fanfani, Alessandra; Farina, Fabio; Kavka, Carlos; Lacaprara, Stefano; Spiga, Daniele; Vaandering, Eric

    2008-01-01

    The CMS collaboration is developing a tool to allow physicists to access and analyze data stored in geographically distributed sites, simplifying data discovery and hiding the details of analysis job creation, execution and monitoring in the Grid environment. With this presentation we would like to show the progress of our work and some statistics about its usage. The CMS experiment will produce a few PBytes of data each year, to be distributed and stored in many computing centres spread across the countries participating in the CMS collaboration and made available for analysis to physicists distributed world-wide. CMS will use a distributed architecture based on Grid infrastructure to analyze data stored at remote sites, to assure data access only to authorized users and to ensure remote resource availability. Data analysis in a distributed environment requires knowing which data are available, where they are stored and how to access them. To simplify analysis job creation and management the CMS collabor...

  4. A spatial pattern analysis of the halophytic species distribution in an arid coastal environment.

    Science.gov (United States)

    Badreldin, Nasem; Uria-Diez, J; Mateu, J; Youssef, Ali; Stal, Cornelis; El-Bana, Magdy; Magdy, Ahmed; Goossens, Rudi

    2015-05-01

    Obtaining information about the spatial distribution of desert plants is considered a serious challenge for ecologists and environmental modelers, owing to the intensive field work and infrastructure required in harsh and remote arid environments. A new method was applied for assessing the spatial distribution of the halophytic species (HS) in an arid coastal environment. This method was based on object-based image analysis of a high-resolution Google Earth satellite image. The integration of image processing techniques and field work provided accurate information about the spatial distribution of HS. The extracted objects were based on assumptions that explained the plant-pixel relationship. Three different types of digital image processing techniques were implemented and validated to obtain an accurate HS spatial distribution. A total of 2703 individuals of the HS community were found in the case study, and approximately 82% were located above an elevation of 2 m. The micro-topography exhibited a significant negative relationship with pH and EC (r = -0.79 and -0.81, respectively, p < 0.001). The spatial structure was modeled using stochastic point processes, in particular a hybrid family of Gibbs processes. A new model is proposed that uses a hard-core structure at very short distances, together with a cluster structure at short-to-medium distances and a Poisson structure at larger distances. This model was found to fit the data perfectly well.
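
    The hybrid model sketched in the abstract can be written down schematically. The unnormalised density below follows the usual hybrid-Gibbs construction with a hard core and a Geyer-type saturation (cluster) term; the ranges r_1 < r_2, the parameters β and γ > 1, and the saturation threshold s are illustrative symbols, not the paper's fitted values or exact functional form.

```latex
% Hybrid Gibbs density: hard core below r_1, clustering between r_1 and r_2,
% and (with no interaction term beyond r_2) Poisson-like behaviour at larger
% distances.
f(\mathbf{x}) \;\propto\; \beta^{\,n(\mathbf{x})}
  \prod_{i<j} \mathbf{1}\{\lVert x_i - x_j\rVert > r_1\}
  \;\cdot\; \gamma^{\sum_i \min\{s,\; t(x_i, r_2, \mathbf{x})\}}, \qquad \gamma > 1
```

    Here n(x) is the number of points, the indicator enforces the hard core, and t(x_i, r_2, x) counts the other points within distance r_2 of x_i.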

  5. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    Science.gov (United States)

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid. It is, thus, beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information theory methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, the MDI-QKD protocol is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results is 16% in the presence of eavesdropping. Moreover, the probability that the relay outputs successful results is higher in the presence of eavesdropping than without it. The paper validates that the MDI-QKD protocol has good robustness.

  6. Distributed Research Center for Analysis of Regional Climatic Changes and Their Impacts on Environment

    Science.gov (United States)

    Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.

    2016-12-01

    Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. The prototype will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, exporting of data

  7. Analysis of Ecological Distribution and Genomic Content from a Clade of Bacteroidetes Endemic to Sulfidic Environments

    Science.gov (United States)

    Zhou, K.; Sylvan, J. B.; Hallam, S. J.

    2017-12-01

    The Bacteroidetes are a ubiquitous phylum of bacteria found in a wide variety of habitats. Marine Bacteroidetes are known to utilize complex carbohydrates and have a potentially important role in the global carbon cycle through processing these compounds, which are not digestible by many other microbes. Some members of the phylum are known to perform denitrification and are facultative anaerobes, but Bacteroidetes are not known to participate in sulfur redox cycling. Recently, it was shown that a clade of uncultured Bacteroidetes, including the VC2.1_Bac22 group, appears to be endemic to sulfidic environments, including hydrothermal vent sulfide chimneys, sediments and marine water column oxygen minimum zones (OMZs). This clade, dubbed the Sulfiphilic Bacteroidetes, is not detected in 16S rRNA amplicon studies from non-sulfidic environments. To test the hypothesis that the Sulfiphilic Bacteroidetes are involved in sulfur redox chemistry, we updated our meta-analysis of the clade using 16S rRNA sequences from public databases and employed single-cell genomics to survey their genomic potential using 19 single amplified genomes (SAGs) isolated from Saanich Inlet, a seasonally anoxic basin in British Columbia. Initial analysis of these SAGs indicates the Sulfiphilic Bacteroidetes may perform sulfur redox reactions using a three-gene psrABC operon encoding the polysulfide reductase enzyme complex, with a thiosulfate sulfurtransferase (rhodanese), which putatively uses cyanide to convert thiosulfate to sulfite, just upstream. Interestingly, this is the same configuration as discovered recently in some Marine Group A bacteria. Further aspects of the Sulfiphilic Bacteroidetes' genomic potential will be presented in light of their presence in sulfidic environments.

  8. Biosensor for laboratory and lander-based analysis of benthic nitrate plus nitrite distribution in marine environments

    DEFF Research Database (Denmark)

    Revsbech, N. P.; Glud, Ronnie Nøhr

    2009-01-01

    We present a psychrophilic bacteria–based biosensor that can be used in low-temperature seawater for the analysis of nitrate + nitrite (NOx–). The sensor can be used to resolve concentrations below 1 µmol L–1 at low temperature, at seawater salinity (35‰), and in situ in the deep sea...

  9. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks

    International Nuclear Information System (INIS)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-01-01

    Highlights: • Thirty-eight antibiotics were systematically investigated in the marine environment. • The distribution of antibiotics was significantly correlated with COD and NO3–N. • Untreated domestic sewage was the primary source of antibiotics. • Fluoroquinolones showed a strong sorption capacity onto sediments. • Oxytetracycline, norfloxacin and erythromycin–H2O indicated high risks. - Abstract: In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected in the seawater, discharged effluent and sediment samples, respectively, with concentrations ranging from <0.08 (clarithromycin) to 15,163 ng/L (oxytetracycline), 2.12 (methacycline) to 1318 ng/L (erythromycin–H2O), and <1.95 (ciprofloxacin) to 184 ng/g (chlortetracycline). The concentrations of antibiotics in the water phase were positively correlated with chemical oxygen demand and nitrate. Source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed a strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin–H2O posed high risks to aquatic organisms
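
    The pseudo-partitioning coefficient invoked above is conventionally the ratio of the sediment concentration to the dissolved concentration. A minimal sketch, with invented numbers rather than the study's data:

```python
def pseudo_partition_coefficient(c_sediment_ng_per_g, c_water_ng_per_L):
    """Kd' = C_sediment / C_water; with sediment in ng/g and water in ng/L,
    the factor of 1000 (g -> kg) gives Kd' in L/kg."""
    return 1000.0 * c_sediment_ng_per_g / c_water_ng_per_L

# Hypothetical fluoroquinolone-like case: a high Kd' signals strong sorption.
print(pseudo_partition_coefficient(184.0, 15.2))  # ~12105 L/kg
```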

  10. Team Cognition in Distributed Mission Environments

    National Research Council Canada - National Science Library

    Cooke, Nancy

    2003-01-01

    ... in the context of military team environments. This part of the effort focuses on the increasingly common 'network centric' military environment in which individuals who are distributed in space communicate, share information, and make critical...

  11. Structural Optimization in a Distributed Computing Environment

    National Research Council Canada - National Science Library

    Voon, B. K.; Austin, M. A.

    1991-01-01

    ...) optimization algorithm customized to a Distributed Numerical Computing environment (DNC). DNC utilizes networking technology and an ensemble of loosely coupled processors to compute structural analyses concurrently...

  12. Spatial distribution and ecological environment analysis of great gerbil in Xinjiang Plague epidemic foci based on remote sensing

    International Nuclear Information System (INIS)

    Gao, Mengxu; Wang, Juanle; Li, Qun; Cao, Chunxiang

    2014-01-01

    Yersinia pestis (the plague bacterium) was isolated from great gerbils in 2005 in the Dzungarian Basin of Xinjiang, confirming the presence of a plague epidemic focus. This study analysed the spatial distribution and suitable habitat of the great gerbil based on monitoring data from the Chinese Center for Disease Control and Prevention, together with ecological environment variables obtained from remote sensing products. The results showed that: (1) 88.5% (277/313) of great gerbil records were distributed at elevations between 200 and 600 meters. (2) All positive points were located in areas with a slope of 0–3 degrees, with no obvious preference for sunny aspects. (3) All 313 positive points fell in areas with an average annual temperature of 5 to 11 °C, 165 of them in areas with an average annual temperature of 7 to 9 °C. (4) 72.8% (228/313) of great gerbil records occurred in areas with an annual precipitation of 120–200 mm. (5) The number of positive points increased with NDVI, but no positive points were found where NDVI exceeded 0.521, indicating the range of vegetation suitable for the great gerbil. This study explored a broad and important application for the monitoring and prevention of plague using remote sensing and geographic information systems.
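
    The habitat criteria enumerated above amount to a conjunction of thresholds over raster layers. The sketch below combines them into a boolean suitability mask; the thresholds are taken from the abstract, while the raster arrays and grid are hypothetical placeholders.

```python
import numpy as np

def gerbil_suitability(elev, slope, tmean, precip, ndvi):
    # Thresholds from the abstract: elevation 200-600 m, slope 0-3 degrees,
    # mean annual temperature 5-11 degC, precipitation 120-200 mm, NDVI <= 0.521.
    return ((elev >= 200) & (elev <= 600)
            & (slope >= 0) & (slope <= 3)
            & (tmean >= 5) & (tmean <= 11)
            & (precip >= 120) & (precip <= 200)
            & (ndvi > 0) & (ndvi <= 0.521))

rng = np.random.default_rng(1)
shape = (100, 100)  # hypothetical grid
mask = gerbil_suitability(
    rng.uniform(0, 1000, shape),  # elevation (m)
    rng.uniform(0, 10, shape),    # slope (degrees)
    rng.uniform(0, 15, shape),    # mean annual temperature (degC)
    rng.uniform(50, 300, shape),  # annual precipitation (mm)
    rng.uniform(0, 1, shape),     # NDVI
)
print(f"{mask.mean():.1%} of cells classed as suitable")
```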

  13. Advanced Analysis Environments - Summary

    International Nuclear Information System (INIS)

    Panacek, Suzanne

    2001-01-01

    This is a summary of the panel discussion on Advanced Analysis Environments. Rene Brun, Tony Johnson, and Lassi Tuura shared their insights about the trends and challenges in analysis environments. This paper contains the initial questions, a summary of the speakers' presentations, and the questions asked by the audience.

  14. Rare earth elements determination and distribution patterns in sediments of a polluted marine environment by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Akyil, S.; Yusof, A.M.; Wood, A.K.H.

    2001-01-01

    Sediment core samples taken from a fairly polluted marine environment were analyzed for rare earth element (REE) contents to determine the concentrations of La, Ce, Sm, Eu, Tb, Dy and Yb using instrumental neutron activation analysis. Core samples were divided into strata at 2 to 3 cm intervals and prepared in powdered form before irradiation in a TRIGA Mk.II reactor. Down-core concentration profiles of La, Ce, Sm, Eu, Tb, Dy and Yb were obtained for three cores from three sites. The shale-normalized REE pattern from each site was examined and later used to explain the history of sedimentation by natural processes, such as shoreline erosion and weathering products deposited on the seabed, furnishing baseline data and/or pollution trends occurring within the study area

  15. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
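
    The "network striped disk array" idea is easy to picture: a logical object is cut into fixed-size blocks laid out round-robin across storage servers, so reads can proceed in parallel. The sketch below is a toy illustration of striping only; the block size, server names and layout policy are assumptions, not the DPSS protocol.

```python
BLOCK_SIZE = 64 * 1024  # illustrative block size

def stripe(data: bytes, servers: list[str]) -> dict[str, list[tuple[int, bytes]]]:
    """Assign consecutive blocks to servers round-robin."""
    layout = {s: [] for s in servers}
    for i in range(0, len(data), BLOCK_SIZE):
        index = i // BLOCK_SIZE
        layout[servers[index % len(servers)]].append((index, data[i:i + BLOCK_SIZE]))
    return layout

layout = stripe(b"x" * (BLOCK_SIZE * 5), ["dpss1", "dpss2", "dpss3"])
print({s: [idx for idx, _ in blocks] for s, blocks in layout.items()})
# {'dpss1': [0, 3], 'dpss2': [1, 4], 'dpss3': [2]}
```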

  16. Problem solving environment for distributed interactive applications

    NARCIS (Netherlands)

    Rycerz, K.; Bubak, M.; Sloot, P.; Getov, V.; Gorlatch, S.; Bubak, M.; Priol, T.

    2008-01-01

    Interactive Problem Solving Environments (PSEs) offer an integrated approach for constructing and running complex systems, such as distributed simulation systems. To achieve efficient execution of High Level Architecture (HLA)-based distributed interactive simulations on the Grid, we introduce a PSE

  17. Rare earth elements determination and distribution patterns in sediments of a polluted marine environment by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Yusof, A.M.

    2001-01-01

    Sediment core samples taken from a fairly polluted marine environment were analyzed for rare earth element (REE) contents to determine the concentrations of La, Ce, Sm, Eu, Tb, Dy and Yb using instrumental neutron activation analysis. Core samples were divided into strata at 2 to 3 cm intervals and prepared in powdered form before being irradiated in a neutron flux of about 5.0 × 10¹² n cm⁻² s⁻¹ in a Triga Mark II reactor. Down-core concentration profiles of La, Ce, Sm, Eu, Tb, Dy and Yb were obtained for three cores from three sites. The shale-normalized REE pattern from each site was examined and later used to explain the history of sedimentation by natural processes, such as shoreline erosion and weathering products deposited on the seabed, furnishing baseline data and/or pollution trends occurring within the study area. The shale-normalized REE patterns also showed that the LREE in the sediment samples are enriched relative to the HREE, with La and Sm in particular showing enrichment compared to their ratios in shale. REE concentrations of 124 μg/g at the surface of sediments collected at two of the three sites were found to decrease to 58 and 95 μg/g, respectively. This was of particular interest when used to explain the anomalies occurring in the marine sediment as a result of geochemical processes over a long period of time. Changes in concentrations from the surface to the bottom of the sediments, ratioed to Sm concentrations, and the correlation between the concentrations of Sm and the other elements were also investigated, and correlation coefficients were calculated for all REEs and sites. Validation of the method was done using a Soil-7 SRM. (author)
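
    Shale normalization, as used in both REE records above, is a per-element division by a reference shale composite; LREE enrichment then shows up as a normalized La/Yb ratio above 1. The sketch below uses approximate PAAS literature values purely for illustration; the paper does not state which shale composite it used, and the sample numbers are invented.

```python
# Approximate PAAS shale values (ppm), for illustration only.
PAAS_PPM = {"La": 38.2, "Ce": 79.6, "Sm": 5.55, "Eu": 1.08,
            "Tb": 0.774, "Dy": 4.68, "Yb": 2.82}

def shale_normalise(sample_ppm):
    return {el: c / PAAS_PPM[el] for el, c in sample_ppm.items()}

# Hypothetical core-top sample (ug/g):
sample = {"La": 30.0, "Ce": 60.0, "Sm": 5.0, "Eu": 1.0,
          "Tb": 0.7, "Dy": 4.0, "Yb": 1.5}
norm = shale_normalise(sample)
print(round(norm["La"] / norm["Yb"], 2))  # (La/Yb)_N > 1 indicates LREE enrichment
```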

  18. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giuseppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several PBytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  19. Sensors in Distributed Mixed Reality Environments

    Directory of Open Access Journals (Sweden)

    Felix Hamza-Lup

    2005-04-01

    Full Text Available A distributed mixed-reality (MR) or virtual reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With the advances in sensors and computer networks we have seen an increase in the number of potential MR/VR applications that require large amounts of information from the real world collected through sensors (e.g. position and orientation tracking sensors). These sensors collect data from the real environment in real-time at different locations, and a distributed environment connecting them must assure data distribution among collaborative sites at interactive speeds. With the advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes. This paper proposes a new architecture for sensor-based interactive distributed MR/VR environments that falls in-between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants that manipulate the sensors attached to these nodes.

  1. Control and Observation in Distributed Environments

    Science.gov (United States)

    Smith, Warren; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the ideas surrounding the control and observation of a distributed computing environment based on the design of the electrical power grid. The implementation of a reliable control system for the management of the computational grid is crucial. Different architectures have been suggested.

  2. RESOURCE DISTRIBUTION MODEL IN CLOUD ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Ramil I. Khantimirov

    2015-01-01

    Full Text Available A new approach to load distribution in computer clouds is proposed, based on analysis of the evenness of resource usage and on resource-usage forecasting using intelligent algorithms.

  3. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  4. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  5. Distributed analysis in ATLAS

    CERN Document Server

    Dewhurst, Alastair; The ATLAS collaboration

    2015-01-01

    The ATLAS experiment accumulated more than 140 PB of data during the first run of the Large Hadron Collider (LHC) at CERN. The analysis of such an amount of data for the distributed physics community is a challenging task. The Distributed Analysis (DA) system of the ATLAS experiment is an established and stable component of the ATLAS distributed computing operations. About half a million user jobs are daily running on DA resources, submitted by more than 1500 ATLAS physicists. The reliability of the DA system during the first run of the LHC and the following shutdown period has been high thanks to the continuous automatic validation of the distributed analysis sites and the user support provided by a dedicated team of expert shifters. During the LHC shutdown, the ATLAS computing model has undergone several changes to improve the analysis workflows, including the re-design of the production system, a new analysis data format and event model, and the development of common reduction and analysis frameworks. We r...

  6. Distribution-free discriminant analysis

    Energy Technology Data Exchange (ETDEWEB)

    Burr, T.; Doak, J.

    1997-05-01

    This report describes our experience in implementing a non-parametric (distribution-free) discriminant analysis module for use in a wide range of pattern recognition problems. Issues discussed include performance results on both real and simulated data sets, comparisons to other methods, and the computational environment. In some cases, this module performs better than other existing methods. Nearly all cases can benefit from the application of multiple methods.
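
    One common distribution-free discriminant is k-nearest-neighbour classification, which assumes nothing parametric about the class densities. The sketch below is a generic stand-in for the kind of method the report describes, not the module itself; the simulated two-class data are invented.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 300
# Non-Gaussian classes (a ring and a blob), where parametric linear
# discriminants struggle but k-NN does well.
theta = rng.uniform(0, 2 * np.pi, n)
ring = np.c_[np.cos(theta), np.sin(theta)] * rng.uniform(0.8, 1.2, (n, 1))
blob = rng.normal(0.0, 0.3, (n, 2))
X = np.vstack([ring, blob])
y = np.r_[np.zeros(n), np.ones(n)]

clf = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(clf, X, y, cv=5).mean())  # typically well above 0.9
```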

  7. Dynamic, Interactive and Visual Analysis of Population Distribution and Mobility Dynamics in an Urban Environment Using the Mobility Explorer Framework

    Directory of Open Access Journals (Sweden)

    Jan Peters-Anders

    2017-05-01

    Full Text Available This paper investigates the extent to which a mobile data source can be utilised to generate new information intelligence for decision-making in smart city planning processes. In this regard, the Mobility Explorer framework is introduced and applied to the City of Vienna (Austria by using anonymised mobile phone data from a mobile phone service provider. This framework identifies five necessary elements that are needed to develop complex planning applications. As part of the investigation and experiments a new dynamic software tool, called Mobility Explorer, has been designed and developed based on the requirements of the planning department of the City of Vienna. As a result, the Mobility Explorer enables city stakeholders to interactively visualise the dynamic diurnal population distribution, mobility patterns and various other complex outputs for planning needs. Based on the experiences during the development phase, this paper discusses mobile data issues, presents the visual interface, performs various user-defined analyses, demonstrates the application’s usefulness and critically reflects on the evaluation results of the citizens’ motion exploration that reveal the great potential of mobile phone data in smart city planning but also depict its limitations. These experiences and lessons learned from the Mobility Explorer application development provide useful insights for other cities and planners who want to make informed decisions using mobile phone data in their city planning processes through dynamic visualisation of Call Data Record (CDR data.

  8. Analysis of Impact of Geographical Environment and Socio-economic Factors on the Spatial Distribution of Kaohsiung Dengue Fever Epidemic

    Science.gov (United States)

    Hsu, Wei-Yin; Wen, Tzai-Hung; Yu, Hwa-Lung

    2013-04-01

    Taiwan is located in subtropical and tropical regions with high temperature and high humidity in the summer. This climatic condition is a hotbed for the propagation and spread of the dengue vector mosquito. Kaohsiung City has been the city worst affected by dengue fever epidemics in Taiwan. During the study period, from January 1998 to December 2011, the Taiwan CDC recorded 7071 locally acquired dengue cases in Kaohsiung City, plus 118 imported cases. Our research uses quantile regression to analyze the correlation between the spatial distribution of the dengue epidemic and geographic environmental and socio-economic factors in Kaohsiung. According to our statistics, agriculture and natural forest have a positive relation to dengue fever (5.5~34.39 and 3.91~15.52). The epidemic rises as the ratio of agriculture and natural forest increases. The residential ratio has a negative relation for quantiles 0.1 to 0.4 (-0.005~-0.78) and a positive relation for quantiles 0.5 to 0.9 (0.01~18.0). Mean income is also a significant socio-economic factor, with a negative relation to dengue fever (-0.01~-0.04). We conclude that the main factor affecting the severity of dengue fever in predilection areas is the residential proportion, while the ratio of agriculture and natural forest plays an important role in non-predilection areas. Moreover, the serious epidemic areas located by the regression model match the actual conditions in Kaohsiung. This model can be used to predict serious dengue epidemic areas and provide references for health agencies.
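
    Quantile regression, as applied above, fits a separate coefficient set at each conditional quantile, which is how a covariate can flip sign between low and high quantiles. A minimal sketch with statsmodels; the variable names and simulated data are illustrative stand-ins for the study's district-level land-use and income covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 200
df = pd.DataFrame({
    "agri_ratio": rng.uniform(0, 1, n),        # agriculture + natural forest share
    "residential_ratio": rng.uniform(0, 1, n),
    "mean_income": rng.uniform(20, 80, n),
})
df["dengue_cases"] = (10 * df["agri_ratio"] - 0.05 * df["mean_income"]
                      + rng.exponential(2, n))

model = smf.quantreg("dengue_cases ~ agri_ratio + residential_ratio + mean_income", df)
for q in (0.1, 0.5, 0.9):
    res = model.fit(q=q)
    print(q, round(float(res.params["agri_ratio"]), 2))
```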

  9. Distributed data analysis in LHCb

    CERN Document Server

    Paterson, S K

    2008-01-01

    The LHCb distributed data analysis system consists of the Ganga job submission front-end and the DIRAC Workload and Data Management System (WMS). Ganga is jointly developed with ATLAS and allows LHCb users to submit jobs to several backends, including various batch systems, LCG and DIRAC. The DIRAC API provides a transparent and secure way for users to run jobs on the Grid and is the default submission mode for the LHCb Virtual Organisation (VO). This is exploited by Ganga to perform distributed user analysis for LHCb. This system provides LHCb with a consistent, efficient and simple user experience in a variety of heterogeneous environments and facilitates the incremental development of user analysis from local test jobs to the Worldwide LHC Computing Grid. With a steadily increasing number of users, the LHCb distributed analysis system has been tuned and enhanced over the past two years. This paper will describe the recent developments to support distributed data analysis for the LHCb experiment on the WLCG.

  10. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner-Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner-Ville distribution in the field of signal processing, using the Scilab environment.
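
    The construction behind the Wigner-Ville distribution is a Fourier transform, over the lag variable, of the instantaneous autocorrelation x[n+m]·x*[n−m]. A minimal NumPy sketch of the discrete version follows (the paper uses Scilab; normalisation conventions vary, and this one is illustrative):

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution; rows = frequency bins, cols = time."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)       # largest lag that stays in-bounds
        r = np.zeros(N, dtype=complex)
        for m in range(-mmax, mmax + 1):
            r[m % N] = x[n + m] * np.conj(x[n - m])  # instantaneous autocorrelation
        W[:, n] = np.fft.fft(r).real   # FFT over the lag variable
    return W

# Linear chirp test: the column-wise energy ridge should rise with time.
t = np.arange(256)
chirp = np.exp(1j * np.pi * (0.05 + 0.001 * t) * t)
W = wigner_ville(chirp)
print(W.shape)  # (256, 256)
```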

  11. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
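
    The core scheduling idea described above can be caricatured as a reconcile loop: compare the VM images demanded by waiting jobs with the VMs already running, and boot the difference across the available clouds. The sketch below is a toy illustration; the cloud names, job and VM records, and the boot policy are hypothetical, not Cloud Scheduler's actual interfaces.

```python
from collections import Counter
from itertools import cycle

def reconcile(waiting_jobs, running_vms, clouds, max_per_cloud=10):
    """Return (cloud, vm_image) boot requests to cover unmet demand."""
    demand = Counter(job["vm_image"] for job in waiting_jobs)
    supply = Counter(vm["vm_image"] for vm in running_vms)
    load = Counter(vm["cloud"] for vm in running_vms)
    boots, targets = [], cycle(clouds)
    for image, needed in demand.items():
        for _ in range(max(0, needed - supply[image])):
            cloud = next(c for c in targets if load[c] < max_per_cloud)
            load[cloud] += 1
            boots.append((cloud, image))
    return boots

jobs = [{"vm_image": "atlas-sl5"}] * 3 + [{"vm_image": "astro-ubuntu"}]
vms = [{"cloud": "cloud-a", "vm_image": "atlas-sl5"}]
print(reconcile(jobs, vms, ["cloud-a", "cloud-b", "cloud-c"]))
# [('cloud-a', 'atlas-sl5'), ('cloud-b', 'atlas-sl5'), ('cloud-c', 'astro-ubuntu')]
```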

  12. Research computing in a distributed cloud environment

    Energy Technology Data Exchange (ETDEWEB)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J, E-mail: fransham@uvic.ca

    2010-11-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.

  13. Research computing in a distributed cloud environment

    Science.gov (United States)

    Fransham, K.; Agarwal, A.; Armstrong, P.; Bishop, A.; Charbonneau, A.; Desmarais, R.; Hill, N.; Gable, I.; Gaudet, S.; Goliath, S.; Impey, R.; Leavett-Brown, C.; Ouellete, J.; Paterson, M.; Pritchet, C.; Penfold-Brown, D.; Podaima, W.; Schade, D.; Sobie, R. J.

    2010-11-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.

  14. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  15. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  16. CHIME: A Metadata-Based Distributed Software Development Environment

    National Research Council Canada - National Science Library

    Dossick, Stephen E; Kaiser, Gail E

    2005-01-01

    We introduce CHIME, the Columbia Hypermedia IMmersion Environment, a metadata-based information environment, and describe its potential applications for internet and intranet-based distributed software development...

  17. Distributed autonomous mapping of indoor environments

    Science.gov (United States)

    Rogers, J.; Paluri, M.; Cunningham, A.; Christensen, H. I.; Michael, N.; Kumar, V.; Ma, J.; Matthies, L.

    2011-06-01

    This paper describes the results of a Joint Experiment performed on behalf of the MAST CTA. The system developed for the Joint Experiment makes use of three robots which work together to explore and map an unknown environment. Each of the robots used in this experiment is equipped with a laser scanner for measuring walls and a camera for locating doorways. Information from both of these types of structures is concurrently incorporated into each robot's local map using a graph based SLAM technique. A Distributed-Data-Fusion algorithm is used to efficiently combine local maps from each robot into a shared global map. Each robot computes a compressed local feature map and transmits it to neighboring robots, which allows each robot to merge its map with the maps of its neighbors. Each robot caches the compressed maps from its neighbors, allowing it to maintain a coherent map with a common frame of reference. The robots utilize an exploration strategy to efficiently cover the unknown environment which allows collaboration on an unreliable communications channel. As each new branching point is discovered by a robot, it broadcasts the information about where this point is along with the robot's path from a known landmark to the other robots. When the next robot reaches a dead-end, new branching points are allocated by auction. In the event of communication interruption, the robot which observed the branching point will eventually explore it; therefore, the exploration is complete in the face of communication failure.
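
    The auction step described above can be reduced to a simple cost-based assignment: each robot bids its path cost to a branching point, and the cheapest bidder wins. The sketch below uses straight-line distance as a stand-in for path cost; the robot names, coordinates and the one-point-per-robot rule are illustrative assumptions.

```python
import math

def auction(branch_points, robot_positions):
    """Greedily award each branching point to the lowest-bidding free robot."""
    assignment, free = {}, dict(robot_positions)
    for bp in branch_points:
        if not free:
            break
        winner = min(free, key=lambda rid: math.dist(free[rid], bp))
        assignment[winner] = bp
        del free[winner]  # one task per robot in this toy version
    return assignment

points = [(5.0, 2.0), (1.0, 9.0)]
robots = {"r1": (0.0, 0.0), "r2": (4.0, 4.0), "r3": (2.0, 8.0)}
print(auction(points, robots))  # {'r2': (5.0, 2.0), 'r3': (1.0, 9.0)}
```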

  18. Adaptive Distributed Environment for Procedure Training (ADEPT)

    Science.gov (United States)

    Domeshek, Eric; Ong, James; Mohammed, John

    2013-01-01

    ADEPT (Adaptive Distributed Environment for Procedure Training) is designed to provide more effective, flexible, and portable training for NASA systems controllers. When creating a training scenario, an exercise author can specify a representative rationale structure using the graphical user interface, annotating the results with instructional texts where needed. The author's structure may distinguish between essential and optional parts of the rationale, and may also include "red herrings" - hypotheses that are essential to consider, until evidence and reasoning allow them to be ruled out. The system is built from pre-existing components, including Stottler Henke's SimVentive instructional simulation authoring tool and runtime. To that, a capability was added to author and exploit explicit control decision rationale representations. ADEPT uses SimVentive's Scalable Vector Graphics (SVG)-based interactive graphic display capability as the basis of the tool for quickly noting aspects of decision rationale in graph form. The ADEPT prototype is built in Java, and will run on any computer using Windows, MacOS, or Linux. No special peripheral equipment is required. The software enables a style of student/tutor interaction focused on the reasoning behind systems control behavior that better mimics proven Socratic human tutoring behaviors for highly cognitive skills. It supports fast, easy, and convenient authoring of such tutoring behaviors, allowing specification of detailed scenario-specific, but content-sensitive, high-quality tutor hints and feedback. The system places relatively light data-entry demands on the student to enable its rationale-centered discussions, and provides a support mechanism for fostering coherence in the student/tutor dialog by including focusing, sequencing, and utterance tuning mechanisms intended to better fit tutor hints and feedback into the ongoing context.

  19. The distributed development environment for SDSS software

    International Nuclear Information System (INIS)

    Berman, E.; Gurbani, V.; Mackinnon, B.; Newberg, H.; Nicinski, T.; Petravick, D.; Pordes, R.; Sergey, G.; Stoughton, C.; Lupton, R.

    1994-04-01

    The authors present an integrated science software development environment, code maintenance and support system for the Sloan Digital Sky Survey (SDSS), now being actively used throughout the collaboration.

  20. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect distributed computing environments from certain types of malicious impacts.

  1. Reducing Kernel Development Complexity in Distributed Environments

    OpenAIRE

    Lèbre, Adrien; Lottiaux, Renaud; Focht, Erich; Morin, Christine

    2008-01-01

    Setting up generic and fully transparent distributed services for clusters implies complex and tedious kernel developments. More flexible approaches such as user-space libraries are usually preferred with the drawback of requiring application recompilation. A second approach consists in using specific kernel modules (such as FUSE in Gnu/Linux system) to transfer kernel complexity into user space. In this paper, we present a new way to design and implement kernel distributed services for clust...

  2. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1996-01-01

    This paper discusses the growing needs for distributed system monitoring and compares them to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address Local Area Network (LAN), Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring. (author)

  3. Study of the distribution characteristics of rare earth elements in Solanum lycocarpum from different tropical environments in Brazil by neutron activation analysis

    International Nuclear Information System (INIS)

    Maria, Sheila Piorino

    2001-01-01

    In this work, the concentrations of eight rare earth elements (REE), La, Ce, Nd, Sm, Eu, Tb, Yb and Lu, were determined by instrumental neutron activation analysis (INAA) in plant leaves of Solanum lycocarpum. This species is a typical Brazilian 'cerrado' plant, widely distributed in Brazil. Analysis of the plant reference materials CRM Pine Needles (NIST 1575) and Spruce Needles (BCR 101) proved that the methodology applied was sufficiently accurate and precise for the determination of REE in plants. In order to better evaluate the uptake of REE from soil to plant, the host soil was also analyzed by INAA. The studied areas were Salitre, MG; Serra do Cipó, MG; Lagoa da Pampulha and Mangabeiras, in Belo Horizonte, MG; and Cerrado de Emas, in Pirassununga, SP. The results were analyzed through the calculation of soil-plant transfer factors and by using diagrams normalized to chondrites. The data obtained showed different transfer factors from soil to plant as the substrate changes. Similar distribution patterns for the soil and the plant were obtained at all the studied sites, presenting an enrichment of the light REE (La to Sm), in contrast to the heavy REE (Eu to Lu), which are less absorbed. These results indicate that the light REE remain available to the plant in the more superficial soil layers. The similarity between the distribution patterns indicates a typical REE absorption by this species, in spite of significant differences in the substrate. (author)
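
    The soil-plant transfer factor used above is simply the ratio of the concentration in plant tissue to that in the host soil, in the same units. A minimal sketch with invented numbers (not the study's results); light REE giving larger transfer factors than heavy REE would match the enrichment pattern reported:

```python
def transfer_factors(plant_ppm, soil_ppm):
    """TF(element) = C_plant / C_soil, both e.g. in ug/g dry weight."""
    return {el: plant_ppm[el] / soil_ppm[el]
            for el in plant_ppm if el in soil_ppm}

plant = {"La": 1.2, "Ce": 2.1, "Sm": 0.20, "Yb": 0.02}  # hypothetical leaf values
soil = {"La": 40.0, "Ce": 85.0, "Sm": 6.5, "Yb": 2.5}   # hypothetical soil values
print({el: round(tf, 3) for el, tf in transfer_factors(plant, soil).items()})
# {'La': 0.03, 'Ce': 0.025, 'Sm': 0.031, 'Yb': 0.008}
```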

  4. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1995-11-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), network services and applications, the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring

  5. ENVIRONMENT AND AMPHIBIAN DISTRIBUTION IN ZULULAND ...

    African Journals Online (AJOL)

    knowledge shows that this area is populated by an essentially East African assemblage (e.g. Poynton 196...). Succession of amphibian faunas along the eastern and southern coastal strip of Southern Africa. of distribution in ... accompanied by physiognomic changes that could affect the diversity and availability of.

  6. Designing Distributed Learning Environments with Intelligent Software Agents

    Science.gov (United States)

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  7. PIXE analysis of work environment aerosols

    International Nuclear Information System (INIS)

    Akabayashi, Hideo; Fujimoto, Fuminori; Komaki, Kenichiro; Ootuka, Akio; Kobayashi, Koichi; Yamashita, Hiroshi

    1988-01-01

    In work environments, the quantity of chemical substances in the air is greater, and their kinds are more diversified, than in the general home environment. It is well known that some substances contained in workplace aerosols (dust floating in the atmosphere), such as asbestos and hexavalent chromium, can cause serious injuries such as respiratory cancer. In order to identify the harmful substances to which workers are exposed and to take measures for removing them, it is necessary to investigate in detail the many factors related to the effect of aerosols on human bodies, such as elemental composition, chemical state, concentration, dust particle size, and temporal and spatial distributions. For this purpose, sampling and analysis must be carried out so that as much information as possible can be extracted from a minute amount of sample. Particle induced X-ray emission (PIXE) analysis is very effective for this application. In this paper, the development of a PIXE analysis system and the knowledge obtained by sampling and measuring aerosols in an indoor work environment are reported. The environment selected was the workshop of the Department of Liberal Arts, University of Tokyo. Sampling, the experimental apparatus, the method of data analysis and the results of the analysis are described. (Kako, I.)

  8. Distributed intelligent urban environment monitoring system

    Science.gov (United States)

    Du, Jinsong; Wang, Wei; Gao, Jie; Cong, Rigang

    2018-02-01

    Current environmental pollution and destruction have developed into a worldwide social problem that threatens human survival and development. Environmental monitoring is the prerequisite and basis of environmental governance, but overall the current environmental monitoring system faces a series of problems. Based on electrochemical sensors, this paper designs a small, low-cost, easy-to-deploy urban environmental quality monitoring terminal; multiple terminals constitute a distributed network. The system has been used in small-scale demonstration applications, which have confirmed that it is suitable for large-scale deployment.

  9. Flexible session management in a distributed environment

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Zach; /Wisconsin U., Madison; Bradley, Dan; /Wisconsin U., Madison; Tannenbaum, Todd; /Wisconsin U., Madison; Sfiligoi, Igor; /Fermilab

    2010-01-01

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.
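
    The layering argument above (authenticate once, then reuse a cached session key for cheap per-message integrity) can be illustrated generically. The sketch below shows the idea with an HMAC-protected session cache; it illustrates the layering CEDAR argues for, not CEDAR's actual wire protocol or API.

```python
import hashlib
import hmac
import os
import time

class SessionCache:
    """Session keys established once, after (expensive) authentication."""
    def __init__(self, lifetime_s: float = 3600.0):
        self.lifetime_s = lifetime_s
        self._sessions = {}  # session id -> (key, expiry)

    def establish(self):
        sid, key = os.urandom(8).hex(), os.urandom(32)
        self._sessions[sid] = (key, time.time() + self.lifetime_s)
        return sid, key

    def mac(self, sid: str, payload: bytes) -> bytes:
        key, expiry = self._sessions[sid]
        if time.time() > expiry:
            raise KeyError("session expired; re-authenticate")
        return hmac.new(key, payload, hashlib.sha256).digest()

cache = SessionCache()
sid, key = cache.establish()
tag = cache.mac(sid, b"job status?")
# A UDP datagram could carry (sid, payload, tag); the receiver verifies the
# tag against its cached copy of the session key.
expected = hmac.new(key, b"job status?", hashlib.sha256).digest()
print(hmac.compare_digest(tag, expected))  # True
```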

  10. The comprehensive intermodal planning system in distributed environment

    Directory of Open Access Journals (Sweden)

    Olgierd Dziamski

    2013-06-01

    Full Text Available Background: Goods distribution by container has opened new opportunities in the global supply chain network. Standardization of transportation units allows transport logistics operations to be simplified and enables the integration of different types of transport. Currently, container transport is the most popular means of transport by sea, but recent studies show an increase in its popularity in land transport. The paper presents the concept of a comprehensive intermodal planning system in a distributed environment, which enables more efficient use of containers in global transport. The aim of this paper was to formulate and develop a new concept of an Internet-based intermodal planning system in a distributed environment, supporting the common interests of consignors and logistics service providers by integrating transportation modes, routing, logistics operations and scheduling in order to create intermodal transport plans. Methods: The paper presents and discusses a new approach to the management of logistics resources used in international intermodal transport. A detailed analysis of the proposed classification of the variety of transportation means has been carried out, along with their specific properties, which may affect the time and cost of transportation. A modular web-based distribution planning system supporting container transportation planning is presented. An analysis of the main planning process was conducted, together with a discussion of the effectiveness of the new web-based container transport planning system. Results and conclusions: This paper presents a new concept of a distributed container transport planning system integrating the logistics service providers available on the market with a freight exchange. New transport categories specific to container planning have been identified, which simplify and facilitate the management of the intermodal planning process. The paper presents a modular structure of Integrated

  11. Mobile agent location in distributed environments

    Science.gov (United States)

    Fountoukis, S. G.; Argyropoulos, I. P.

    2012-12-01

    An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they are capable of both behaving autonomously and showing an elementary decision ability regarding movement and some specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Agents are also used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id, to discriminate itself from other agents, and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for the purpose of position location of mobile agents. The most basic of all employs a fixed computing node, which acts as an agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. The location method investigated in this paper attempts to exploit this model.
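
    The basic repository scheme described in the abstract admits a concrete illustration: a fixed node keeps a mapping from agent id to hosting node and answers position queries. This is a generic sketch with hypothetical names, not the authors' pair/triple method.

        class PositionRepository:
            """Fixed node that records and reports mobile agent positions."""
            def __init__(self):
                self._positions = {}                 # agent_id -> node address

            def report_move(self, agent_id, node_address):
                # Called by an agent each time it migrates to a new node.
                self._positions[agent_id] = node_address

            def locate(self, agent_id):
                # Answer position queries from users, nodes, or other agents.
                return self._positions.get(agent_id)

        repo = PositionRepository()
        repo.report_move("agent-17", "node-3.cluster.local")
        print(repo.locate("agent-17"))               # node-3.cluster.local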

  12. Multimedia Services in Open Distributed Telecommunications Environments

    NARCIS (Netherlands)

    Leydekkers, Peter

    1997-01-01

    The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are

  13. Distributed collaborative environments for virtual capability-based planning

    Science.gov (United States)

    McQuay, William K.

    2003-09-01

    Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  14. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  15. HLA component based environment for distributed multiscale simulations

    NARCIS (Netherlands)

    Rycerz, K.; Bubak, M.; Sloot, P.M.A.; Getov, V.

    2008-01-01

    In this paper we present the Grid environment that supports application building based on a High Level Architecture (HLA) component model. The proposed model is particularly suitable for distributed multiscale simulations. Original HLA partly supports interoperability and composability of

  16. Zooplankton distribution in the polluted environment around Bombay

    Digital Repository Service at National Institute of Oceanography (India)

    Gajbhiye, S.N.; Abidi, S.A.H.

    Zooplankton distribution, abundance and composition with reference to polluted environments off Bombay were estimated. This study was carried out along three transects, viz. Versova, Mahim and Thana, covering eleven stations around Bombay during 1980...

  17. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    , but the usage of simple kriging may lead to ill-conditioned matrices when applied to highly irregularly distributed points. Adaptive Kalman filter schemes are investigated. A new parallel Kalman filter algorithm based on a windowing technique gives good results in a case study on the Igallico satellite scene...... and represents an interesting contextual classifier. Extended Kalman filtering on the other hand seems to be well suited for interpolation in gradually changing environments. Bayesian restoration is applied to a point matching problem, which consists of matching a grid to an image of (irregularly) distributed...

  18. Dynamic shared state maintenance in distributed virtual environments

    Science.gov (United States)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for
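
    The quaternion-based assessment mentioned above can be illustrated with a minimal sketch: the angular difference between the orientations of a shared virtual entity, as rendered at two remote sites, serves as a consistency metric. The formula is the standard angle between unit quaternions; the example values are assumptions.

        import math

        def quat_angle(q1, q2):
            """Angle in radians between two unit quaternions (w, x, y, z)."""
            dot = abs(sum(a * b for a, b in zip(q1, q2)))    # |cos(theta / 2)|
            return 2.0 * math.acos(min(1.0, dot))            # clamp rounding noise

        site_a = (1.0, 0.0, 0.0, 0.0)                        # reference orientation
        site_b = (0.9990482, 0.0436194, 0.0, 0.0)            # ~5 degrees about x
        print(math.degrees(quat_angle(site_a, site_b)))      # ~5.0 degrees of drift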

  19. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  20. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  1. Distributed data analysis in ATLAS

    Science.gov (United States)

    Nilsson, Paul; Atlas Collaboration

    2012-12-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2, a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the burden of support from the developers.

  2. Virtual environment for behavioral analysis of malware

    OpenAIRE

    KOVÁŘ, Jaroslav

    2013-01-01

    The goal of this bachelor thesis was to propose and implement a virtual environment for the behavioral analysis of malware, specifically with simulated network services. Subsequently, this environment was applied to the analysis of several selected malware samples.

  3. Innovative Training Concepts for Use in Distributed Interactive Simulation Environments

    Science.gov (United States)

    1993-03-22

    simulation environments for training, particularly distributed interactive simulation (DIS) environments, innovative training concepts are needed to capitalize fully...the powerful role that simulation environments can play in training derived from the FBC research program. These concepts were formulated to capitalize ...field training, the General Accounting Office (GAO, 1991) summarized the common training shortfalls identified by CALL based on their analysis of lessons

  4. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    Full Text Available The paper describes the practical implementation of a system protecting heterogeneous-environment distributed computing from malicious code in assigned tasks. The choice of technologies, the development of data structures, and a performance evaluation of the implemented security system are presented.

  5. Comparison of aerosol size distribution in coastal and oceanic environments

    NARCIS (Netherlands)

    Kusmierczyk-Michulec, J.T.; Eijk, A.M.J. van

    2006-01-01

    The results of applying the empirical orthogonal functions (EOF) method to decomposition and approximation of aerosol size distributions are presented. A comparison was made for two aerosol data sets, representing coastal and oceanic environments. The first data set includes measurements collected

  6. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  7. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  8. Investment decisions in environment of uncertainties integrated to the viability analysis of the distribution and transmission projects; Decisao de investimentos em ambiente de incertezas integrada a analise de viabilidade de projetos de transmissao e distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Gazzi, L.M.P. [Universidade de Sao Paulo (USP), SP (Brazil). Curso de Pos-graduacao em Engenharia Eletrica; Ramos, D.S. [Universidade de Sao Paulo (USP), SP (Brazil)

    2009-07-01

    This work intends to support the investment decisions necessary for the expansion of the 138/88 kV network belonging to the electric energy distribution companies. The economic and financial analysis considers the requirements of the Regulator for the tariff recognition of an investment, using economic engineering techniques to evaluate the return on the invested capital, but within an environment of uncertainty, where the best decision depends on variables exogenous to the traditional planning process. In order to make viable the inclusion of the main random-behavior variables that condition the economic and financial performance of the alternatives analyzed for decision making, a methodology based on 'Real Options' was chosen, a technique that has been used in the financial market for some time.

  9. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web systems modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
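
    Queueing Petri Nets capture far more structure than any closed-form queue, but as a baseline illustration of response-time analysis the classic M/M/1 formula is instructive; the arrival and service rates below are made-up numbers, not measurements from the paper.

        def mm1_response_time(arrival_rate, service_rate):
            """Mean response time R = 1 / (mu - lambda), valid for lambda < mu."""
            if arrival_rate >= service_rate:
                raise ValueError("unstable system: utilization >= 1")
            return 1.0 / (service_rate - arrival_rate)

        # A front-end tier serving 80 req/s with 100 req/s capacity:
        print(mm1_response_time(80.0, 100.0))        # 0.05 s mean response time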

  10. Business models for distributed generation in a liberalized market environment

    International Nuclear Information System (INIS)

    Gordijn, Jaap; Akkermans, Hans

    2007-01-01

    The analysis of the potential of emerging innovative technologies calls for a systems-theoretic approach that takes into account technical as well as socio-economic factors. This paper reports the main findings of several business case studies of different future applications in various countries of distributed power generation technologies, all based on a common methodology for networked business modeling and analysis. (author)

  11. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  12. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P

    2009-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  13. Accelerating Computation of DNA Sequence Alignment in Distributed Environment

    Science.gov (United States)

    Guo, Tao; Li, Guiyang; Deaton, Russel

    Sequence similarity and alignment are among the most important operations in computational biology. However, analyzing large sets of DNA sequences is impractical on a regular PC. Using multiple threads with the JavaParty mechanism, this project successfully extended the capabilities of regular Java to a distributed environment for the simulation of DNA computation. With the aid of JavaParty and a multi-threaded design, the results of this study demonstrated that the modified regular Java program could perform parallel computing without using RMI or socket communication. In this paper, an efficient method for modeling and comparing DNA sequences with dynamic programming and JavaParty is first proposed. Additionally, results of this method in a distributed environment are discussed.
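
    The paper's implementation relies on JavaParty; as a language-neutral sketch of the same two ingredients (a dynamic-programming alignment score, evaluated in parallel across sequence pairs), consider the following, where the scoring constants are assumptions:

        from concurrent.futures import ThreadPoolExecutor

        def nw_score(a, b, match=1, mismatch=-1, gap=-2):
            """Needleman-Wunsch global alignment score via dynamic programming."""
            prev = [j * gap for j in range(len(b) + 1)]
            for i in range(1, len(a) + 1):
                curr = [i * gap]
                for j in range(1, len(b) + 1):
                    diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    curr.append(max(diag, prev[j] + gap, curr[j - 1] + gap))
                prev = curr
            return prev[-1]

        pairs = [("GATTACA", "GCATGCU"), ("ACGT", "ACGT")]
        with ThreadPoolExecutor() as pool:           # one pair per worker thread
            print(list(pool.map(lambda p: nw_score(*p), pairs)))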

  14. A Virtual Hosting Environment for Distributed Online Gaming

    Science.gov (United States)

    Brossard, David; Prieto Martinez, Juan Luis

    With enterprise boundaries becoming fuzzier, it’s become clear that businesses need to share resources, expose services, and interact in many different ways. In order to achieve such a distribution in a dynamic, flexible, and secure way, we have designed and implemented a virtual hosting environment (VHE) which aims at integrating business services across enterprise boundaries and virtualising the ICT environment within which these services operate in order to exploit economies of scale for the businesses as well as achieve shorter concept-to-market time scales. To illustrate the relevance of the VHE, we have applied it to the online gaming world. Online gaming is an early adopter of distributed computing and more than 30% of gaming developer companies, being aware of the shift, are focusing on developing high performance platforms for the new online trend.

  15. Distributed collaborative environments for 21st century modeling and simulation

    Science.gov (United States)

    McQuay, William K.

    2001-09-01

    Distributed collaboration is an emerging technology that will significantly change how modeling and simulation is employed in 21st century organizations. Modeling and simulation (M&S) is already an integral part of how many organizations conduct business and, in the future, will continue to spread throughout government and industry enterprises and across many domains from research and development to logistics to training to operations. This paper reviews research that is focusing on the open standards agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. A distributed collaborative environment is the underlying infrastructure that makes communication between diverse simulations and other assets possible and manages the overall flow of a simulation based experiment. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities employ M&S.

  16. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  17. A distributed execution environment for large data visualization

    International Nuclear Information System (INIS)

    Huang Jian; Liu Huadong; Beck, Micah; Gao Jinzhu; Moore, Terry

    2006-01-01

    Over the years, homogeneous computer clusters have been the most popular and, in some sense, the only viable platform for use in parallel visualization. In this work, we designed an execution environment for data-intensive visualization that is suitable for handling SciDAC-scale datasets. This environment is based solely on computers distributed across the Internet that are owned and operated by independent institutions, while being openly shared for free. Those Internet computers are inherently of heterogeneous hardware configuration and run a variety of operating systems. Using 100 such processors, we have been able to obtain the same level of performance offered by a 64-node cluster of 2.2 GHz P4 processors, while processing a 75 GB subset of TSI simulation data. Due to its inherently shared nature, this execution environment for data-intensive visualization could provide a viable means of collaboration among geographically separated SciDAC scientists

  18. Study of cache performance in distributed environment for data processing

    International Nuclear Information System (INIS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-01-01

    Processing data in a distributed environment has found its application in many fields of science (Nuclear and Particle Physics (NPP), astronomy and biology, to name only those). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing the knowledge of data provenance as well as data placed in the transfer cache to further expand the availability of sources for files and data-sets. Although a great variety of caching algorithms is known, a study is needed to evaluate which one can deliver the best performance in data access under realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were done in order to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations were done for caches of different sizes, within the interval 0.001-90% of the complete data-set, and with a low-watermark within 0-90%. Records of data access were taken from several experiments and within different time intervals in order to validate the results. In this paper, we discuss the different data caching strategies, from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, and identify the choice of the best algorithm in the context of physics data analysis in NPP. While the results of those studies have been implemented in RIFT, they can also be used when setting up a cache in any other computational work-flow (Cloud processing, for example) or managing data storages with partial replicas of the entire data-set
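
    As an illustration of the kind of simulation described above, the sketch below replays an access trace against a bounded LRU cache, one of the canonical algorithms, and reports the hit ratio. The trace and capacity are toy assumptions; this is not the RIFT implementation.

        from collections import OrderedDict

        def lru_hit_ratio(trace, capacity):
            """Fraction of accesses served from an LRU cache of `capacity` items."""
            cache, hits = OrderedDict(), 0
            for item in trace:
                if item in cache:
                    hits += 1
                    cache.move_to_end(item)          # mark as most recently used
                else:
                    cache[item] = True
                    if len(cache) > capacity:
                        cache.popitem(last=False)    # evict least recently used
            return hits / len(trace)

        trace = ["f1", "f2", "f1", "f3", "f1", "f2", "f4", "f1"]
        print(lru_hit_ratio(trace, capacity=2))      # 0.25 for this toy trace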

  19. Distributed Motor Controller (DMC) for Operation in Extreme Environments

    Science.gov (United States)

    McKinney, Colin M.; Yager, Jeremy A.; Mojarradi, Mohammad M.; Some, Rafi; Sirota, Allen; Kopf, Ted; Stern, Ryan; Hunter, Don

    2012-01-01

    This paper presents an extreme environment capable Distributed Motor Controller (DMC) module suitable for operation with a distributed architecture of future spacecraft systems. This motor controller is designed to be a bus-based electronics module capable of operating a single Brushless DC motor in extreme space environments: temperature (-120 C to +85 C required, -180 C to +100 C stretch goal); radiation (>20 kRad required, >100 kRad stretch goal); >360 cycles of operation. Achieving this objective will result in a scalable modular configuration for motor control with enhanced reliability that will greatly lower cost during the design, fabrication and ATLO phases of future missions. Within the heart of the DMC lies a pair of cold-capable Application Specific Integrated Circuits (ASICs) and a Field Programmable Gate Array (FPGA) that enable its miniaturization and operation in extreme environments. The ASICs are fabricated in the IBM 0.5 micron Silicon Germanium (SiGe) BiCMOS process and are comprised of analog circuitry to provide telemetry information, sensor interface, and health and status of the DMC. The FPGA contains logic to provide motor control, status monitoring and spacecraft interface. The testing and characterization of these ASICs have yielded excellent functionality in cold temperatures (-135 C). The DMC module has demonstrated successful operation of a motor at temperature.

  20. Security Aspects in Resource Management Systems in Distributed Computing Environments

    Directory of Open Access Journals (Sweden)

    Adamski Marcin

    2017-12-01

    Full Text Available In many distributed computing systems, aspects related to security are getting more and more relevant. Security is ubiquitous and cannot be treated as a separate problem or challenge. In our opinion it should be considered in the context of resource management in distributed computing environments like Grids and Clouds: e.g., scheduled computations can be greatly delayed by cyber-attacks or inefficient infrastructure, and users' valuable and sensitive data can be stolen even in the course of a correct computation. To prevent such cases there is a need to introduce new evaluation metrics for resource management that represent the level of security of computing resources and, more broadly, of distributed computing infrastructures. In our approach, we have introduced a new metric called reputation, which simply determines the level of reliability of computing resources from the security perspective and can be taken into account during scheduling procedures. The new reputation metric is based on various relevant parameters regarding cyber-attacks (including energy attacks) and administrative activities such as security updates, bug fixes and security patches. Moreover, we have conducted various computational experiments within the Grid Scheduling Simulator environment (GSSIM), inspired by real application scenarios. Finally, our experimental studies of new resource management approaches taking critical security aspects into account are also discussed in this paper.
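
    A minimal sketch of a reputation-style metric of the kind described above: a score over security-relevant observations, used to rank resources during scheduling. The weights and fields are illustrative assumptions, not GSSIM's actual parameters.

        def reputation(resource):
            """Toy reliability score in [0, 1] from security observations."""
            score = 1.0
            score -= 0.2 * resource["attacks_observed"]    # penalize incidents
            score -= 0.1 * resource["pending_patches"]     # penalize stale software
            score += 0.05 * resource["recent_updates"]     # reward maintenance
            return max(0.0, min(1.0, score))

        resources = [
            {"name": "grid-a", "attacks_observed": 0, "pending_patches": 1, "recent_updates": 4},
            {"name": "grid-b", "attacks_observed": 2, "pending_patches": 5, "recent_updates": 0},
        ]
        # Schedule onto the most reputable resource first:
        for r in sorted(resources, key=reputation, reverse=True):
            print(r["name"], round(reputation(r), 2))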

  1. Community problem-solving framed as a distributed information use environment: bridging research and practice

    Directory of Open Access Journals (Sweden)

    Joan C. Durrance

    2006-01-01

    Full Text Available Introduction. This article results from a qualitative study of (1) information behaviour in community problem-solving framed as a distributed information use environment and (2) approaches used by a best-practice library to anticipate information needs associated with community problem-solving. Method. Several approaches to data collection were used - focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation - reports, articles and library communication - was also coded. Results. The study showed (1) how information use environment components (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem-solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.

  2. Environment effects on the flattening distribution of galaxies

    International Nuclear Information System (INIS)

    Freitas Pacheco, J.A. de; Souza, R.E. de; Arakaki, L.

    1983-01-01

    The results of a study of environment effects on the flattening distribution of galaxies are presented. Samples of different morphological types in regions of high and low projected galaxian density were prepared using the Uppsala General Catalogue of Galaxies. No significant differences were detected for the E and Spiral galaxies. However, this study suggests that there is an excess of flat S0 systems in regions of high galaxian density with respect to the field systems. These 'flat' S0's are interpreted as being gas-stripped Sa galaxies, and the consequences of such a hypothesis are also discussed. (Author) [pt

  3. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We demonstrate the flexibility of the hazard modeling distribution, which can approximate different distributions.

  4. Platform for distributed multimedia environments supporting arbitrarily nested team structures

    Science.gov (United States)

    Nazari Shirehjini, Ali A.; Guddat, Hannes; Noll, Stefan; Schiffner, Norbert

    2003-11-01

    In this paper a novel platform, HOUCOM™, for the development of team-based distributed collaborative applications is presented. Its implementation within a scenario for distributed cooperative virtual product development, ProViT, is shown, describing the features and advantages of the presented platform. The specified platform consists of a decentrally organized, dynamic and user-configurable architecture. The main entities within the given platform are Conferences (working groups), Sessions (sub-groups), Users, Components, and Shared Resources. The system provides support for hierarchical Session Management, allowing for arbitrarily nested groups and multi-conferencing. Within the given platform, Users can be individuals as well as technical devices, e.g. a streaming framework. The ProViT scenario builds a collaborative environment for interactive distributed VR design reviews for the mechanical engineering industry. Here several distributed clusters form a working group, allowing individual partners to collaborate immersively on 3D models and supplementary documents and communicate via A/V streaming. This paper is divided into three parts, first describing the ProViT scenario and deriving its requirements. The subsequent part examines the novel concept in general and the features that helped meet the given project requirements in particular. In the conclusion the authors give an outlook on future extensions and applications of the developed platform.

  5. A Distributed Feature-based Environment for Collaborative Design

    Directory of Open Access Journals (Sweden)

    Wei-Dong Li

    2003-02-01

    Full Text Available This paper presents a client/server design environment based on 3D feature-based modelling and Java technologies to enable design information to be shared efficiently among members within a design team. In this environment, design tasks and clients are organised through working sessions generated and maintained by a collaborative server. The information from an individual design client during a design process is updated and broadcast to other clients in the same session through an event-driven and call-back mechanism. The downstream manufacturing analysis modules can be wrapped as agents and plugged into the open environment to support the design activities. At the server side, a feature-feature relationship is established and maintained to filter the varied information of a working part, so as to facilitate efficient information update during the design process.

  6. Sources and distribution of anthropogenic radionuclides in different marine environments

    International Nuclear Information System (INIS)

    Holm, E.

    1997-01-01

    Knowledge of the distribution in time and space of radiologically important radionuclides from different sources in different marine environments is important for assessing the dose commitment following controlled or accidental releases and for detecting possible new sources. Present sources from nuclear explosion tests, releases from nuclear facilities and the Chernobyl accident provide a tool for such studies. The different sources can be distinguished by their different isotopic and radionuclide compositions. Results show that radiocaesium behaves rather conservatively in the south and north Atlantic, while plutonium has a residence time of about 8 years. On the other hand, enhanced concentrations of plutonium are found in surface waters in arctic regions, where vertical mixing is small and ice formation plays an important role. Significantly increased concentrations of plutonium are also found below the oxic layer in anoxic basins due to geochemical concentration. (author)

  7. Taxing energy to improve the environment. Efficiency and distributional effects

    International Nuclear Information System (INIS)

    Heijdra, B.J.; Van der Horst, A.

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs

  8. Detecting Distributed SQL Injection Attacks in a Eucalyptus Cloud Environment

    Science.gov (United States)

    Kebert, Alan; Barnejee, Bikramjit; Solano, Juan; Solano, Wanda

    2013-01-01

    The cloud computing environment offers malicious users the ability to spawn multiple instances of cloud nodes that are similar to virtual machines, except that they can have separate external IP addresses. In this paper we demonstrate how this ability can be exploited by an attacker to distribute his/her attack, in particular SQL injection attacks, in such a way that an intrusion detection system (IDS) could fail to identify this attack. To demonstrate this, we set up a small private cloud, established a vulnerable website in one instance, and placed an IDS within the cloud to monitor the network traffic. We found that an attacker could quite easily defeat the IDS by periodically altering its IP address. To detect such an attacker, we propose to use multi-agent plan recognition, where the multiple source IPs are considered as different agents who are mounting a collaborative attack. We show that such a formulation of this problem yields a more sophisticated approach to detecting SQL injection attacks within a cloud computing environment.
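
    The core of the idea can be sketched in a few lines: score requests against injection signatures per source IP, but also in aggregate, so fragments that stay below a per-IP alarm threshold still trip a collective one. The signature list and thresholds are illustrative assumptions, not the paper's multi-agent plan-recognition method.

        import re
        from collections import defaultdict

        SQLI = re.compile(r"('|--|\bUNION\b|\bOR\b\s+1=1)", re.IGNORECASE)
        PER_IP_THRESHOLD, COLLECTIVE_THRESHOLD = 5, 5

        def detect(requests):
            """requests: iterable of (source_ip, query_string) pairs."""
            per_ip = defaultdict(int)
            for ip, query in requests:
                if SQLI.search(query):
                    per_ip[ip] += 1
            per_ip_alarm = any(n >= PER_IP_THRESHOLD for n in per_ip.values())
            collective_alarm = sum(per_ip.values()) >= COLLECTIVE_THRESHOLD
            return per_ip_alarm, collective_alarm

        reqs = [(f"10.0.0.{i}", "id=1 OR 1=1") for i in range(5)]  # 5 IPs, 1 hit each
        print(detect(reqs))    # (False, True): only the aggregate view raises it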

  9. DESIGN COORDINATION IN DISTRIBUTED ENVIRONMENTS USING VIRTUAL REALITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rusdi HA

    2003-01-01

    Full Text Available This paper presents a research project which investigates the use of virtual reality and computer communication technology to facilitate building design coordination in distributed environments. The emphasis of the system, called VR-based DEsign COordination (VRDECO), is on providing a communication tool that can be used by remote designers for settling ideas before they fully engage in concurrent engineering environments. VRDECO provides the necessary design tools, a library of building elements and communication procedures for designers in remote places to perform and coordinate their initial tasks. It has been implemented using available commercial software packages, and has been used in designing a simple house. VRDECO facilitates the creation of a preliminary design and simple communication with the client. There are, however, some difficulties in the development of the full version of VRDECO, i.e.: creating an adequate number of building elements, building a specification database with a sufficient number of choices, and establishing a systematic rule to determine the parts of a building that are updateable.

  10. Prediction of oil droplet size distribution in agitated aquatic environments

    International Nuclear Information System (INIS)

    Khelifa, A.; Lee, K.; Hill, P.S.

    2004-01-01

    Oil spilled at sea undergoes many transformations based on physical, biological and chemical processes. Vertical dispersion is the hydrodynamic mechanism controlled by turbulent mixing due to breaking waves, vertical velocity, density gradients and other environmental factors. Spilled oil is dispersed in the water column as small oil droplets. In order to estimate the mass of an oil slick in the water column, it is necessary to know how the droplets are formed. Also, the vertical dispersion and fate of oil spilled in aquatic environments can be modelled if the droplet-size distribution of the oil is known; an oil spill remediation strategy can then be implemented. This paper presents a newly developed Monte Carlo model to predict the droplet-size distribution due to Brownian motion, turbulence and differential settling at equilibrium. A kinematic model was integrated into the proposed model to simulate droplet breakage. The key physical input of the model is the maximum droplet size permissible in the simulation. Laboratory studies were found to be in good agreement with field studies. 26 refs., 1 tab., 5 figs
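
    Not the authors' model, but the Monte Carlo sampling step can be illustrated as follows: draw droplet diameters from an assumed lognormal, reject anything above the maximum permissible droplet size (the model's key physical input), and histogram the result. All parameter values are placeholders.

        import math, random

        def sample_droplets(n, median_um=50.0, sigma=0.6, d_max_um=200.0):
            """Draw n droplet diameters (microns), truncated at d_max_um."""
            sizes = []
            while len(sizes) < n:
                d = random.lognormvariate(math.log(median_um), sigma)
                if d <= d_max_um:                    # enforce maximum droplet size
                    sizes.append(d)
            return sizes

        sizes = sample_droplets(10_000)
        bins = [0] * 8                               # crude 25-micron bins
        for d in sizes:
            bins[min(7, int(d // 25))] += 1
        print([round(b / len(sizes), 3) for b in bins])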

  11. ESC Track Fusion Demonstration Tool for Distributed Environments

    Science.gov (United States)

    Cox, C.; Degraaf, E.; Perry, R.; Diaz, R.

    A key requirement of future net-centric Space Situational Awareness systems development and operations will be decentralized operations, including multi-level distributed data fusion. Raytheon has developed a demonstration for ESC 850 ELSG/NS that fuses sensor-supplied tracks in a dense resident space object (RSO) environment. The demonstration uses state vector and covariance input data from single-pass orbit solutions and applies track-to-track correlation algorithms to fuse the individual tracks into composite orbits. Platform-independent Java technology and an agent-based software design using asynchronous inter-process communications were used in the demonstration tool development. The tool has been tested against a simulated scenario corresponding to the future 100,000+ object catalog environment. Ten days of simulated data from Fylingdales, Shemya, Eglin, and a future Space Fence sensor were generated for a co-orbiting family of 122 sun-synchronous objects between 700 and 800 km altitude from the NASA simulated small debris for 2015. The selected set exceeds the average object densities for the 100,000+ RSO environment, and provides a scenario similar to an evolved breakup where the debris has had time to disperse. The demo produced very good results using fast and simple astrodynamic models. A total of 16678 input tracks were fused, with less than 1.6% being misassociated. Pure tracks were generated for 65% of the 122 truth objects, and 97% of the objects had a misassociation rate ... tool that can be used to assess current breakups such as the Chinese ASAT event.
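
    The abstract does not disclose the correlation algorithms themselves, but one standard ingredient of track-to-track fusion is a chi-square gate on the Mahalanobis distance between two state estimates under their combined covariance, sketched below with made-up numbers.

        import numpy as np

        def gate(x1, P1, x2, P2, threshold=16.27):   # chi-square, 3 dof, ~99.9%
            """True if two track states are statistically close enough to fuse."""
            d = np.asarray(x1, float) - np.asarray(x2, float)
            S = np.asarray(P1, float) + np.asarray(P2, float)  # combined covariance
            m2 = d @ np.linalg.solve(S, d)           # squared Mahalanobis distance
            return m2 <= threshold

        P = np.diag([1.0, 1.0, 1.0])                 # km^2 position covariance
        print(gate([7000, 0, 0], P, [7001, 1, 0], P))  # True: inside the gate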

  12. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which, combined with automation, underpin the emerging concept of the "smart grid". The book supports its theoretical concepts with real-world applications and MATLAB exercises.

  13. Distributed temperature and distributed acoustic sensing for remote and harsh environments

    Science.gov (United States)

    Mondanos, Michael; Parker, Tom; Milne, Craig H.; Yeo, Jackson; Coleman, Thomas; Farhadiroushan, Mahmoud

    2015-05-01

    Advances in opto-electronics and associated signal processing have enabled the development of distributed acoustic and temperature sensors. Unlike systems relying on discrete optical sensors, a distributed system does not rely upon manufactured sensors but utilises passive custom optical fibre cables resistant to harsh environments, including high-temperature applications (600°C). The principle of distributed sensing is well known from the distributed temperature sensor (DTS), which uses the interaction of the source light with thermal vibrations (Raman scattering) to determine the temperature at all points along the fibre. Distributed Acoustic Sensing (DAS) uses a novel digital optical detection technique to precisely capture the true full acoustic field (amplitude, frequency and phase) over a wide dynamic range at every point simultaneously. A number of signal processing techniques have been developed to process a large array of acoustic signals and to quantify the coherent temporal and spatial characteristics of the acoustic waves. Predominantly these systems have been developed for the oil and gas industry, to assist reservoir engineers in optimising well lifetime. Nowadays these systems find a wide variety of applications as integrity monitoring tools in process vessels, storage tanks and piping systems, offering the operator tools to schedule maintenance programs and maximise service life.
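
    The Raman principle mentioned above admits a compact illustration: temperature is recovered from the anti-Stokes/Stokes intensity ratio, which varies as exp(-dE/kT). The calibration constant and the Raman shift energy below are rough assumptions, not values from any particular instrument.

        import math

        K_B = 8.617e-5        # Boltzmann constant, eV/K
        DELTA_E = 0.05        # approximate Raman shift energy in silica, eV

        def temperature_K(anti_stokes, stokes, C=1.0):
            """Invert R = C * exp(-dE / kT) for T, with R the intensity ratio."""
            ratio = anti_stokes / stokes
            return DELTA_E / (K_B * math.log(C / ratio))

        print(round(temperature_K(0.141, 1.0), 1))   # ~296 K for these intensities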

  14. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Van der Sluis, L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network being modeled in Matlab/Simulink takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  15. Pesticide distribution in an agricultural environment in Argentina.

    Science.gov (United States)

    Loewy, Ruth M; Monza, Liliana B; Kirs, Veronica E; Savini, Monica C

    2011-01-01

    An assessment of the off-site migration of pesticides from agricultural activity into the environment in the Neuquen River Valley was performed. The aim of this study was to evaluate the distribution of pesticides in several compartments of a small agricultural sub-catchment. Soil, surface water, shallow groundwater and drift deposition were analyzed for pesticide residues. Results showed the presence of some pesticide residues in soil, surface water and shallow groundwater compartments. The highest detection frequencies in water (surface and subsurface) were found for azinphos-methyl and chlorpyrifos (>70%). In terms of concentration, the highest levels were observed in shallow groundwater for azinphos methyl (22.5 μg/L) and carbaryl (45.7 μg/L). In the soil, even before the application period had started, accumulation of residues was present. These residues increased during the period studied. Spray drift during pesticide application was found to be a significant pathway for the migration of pesticide residues in surface water, while leaching and preferential flows were the main transport routes contributing to subsurface contamination.

  16. Visualization for Information Retrieval in Regional Distributed Environment

    Directory of Open Access Journals (Sweden)

    Amany Salama

    2013-09-01

    Full Text Available Information retrieval (IR) is the task of representing, storing, organizing, and offering access to information items. The problem for search engines is not only to find topic-relevant results, but results consistent with the user's information need. How to retrieve desired information from the Internet with high efficiency and good effectiveness has become the main concern of Internet users. The interface of current systems does not help users to perceive the precision of the results, and speed, resource consumption, and the search and retrieval processes are also suboptimal. The search engine's aim is to develop and improve the performance of the information retrieval system and to serve users regardless of their level of expertise. The proposed system uses information visualization to address interface problems and, to improve the other aspects of web IR systems, it uses a regional crawler in a distributed search environment with conceptual query processing and an enhanced vector space information retrieval model (VSM). It is an effective attempt to match the renewed needs of users and achieves better performance than ordinary systems.
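
    The textbook vector space model at the base of such a system can be sketched briefly: documents and query become TF-IDF vectors, ranked by cosine similarity. This is the plain VSM, not the paper's enhanced variant, and the toy corpus is invented.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["distributed search engines crawl regional web pages",
                "vector space model ranks documents for a user query",
                "visualization helps users judge retrieval precision"]
        query = ["how does a vector space model rank query results"]

        vec = TfidfVectorizer()
        doc_matrix = vec.fit_transform(docs)                 # document vectors
        scores = cosine_similarity(vec.transform(query), doc_matrix)[0]
        for rank, i in enumerate(scores.argsort()[::-1], 1): # best match first
            print(rank, round(scores[i], 3), docs[i])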

  17. Guest Editor's introduction: Special issue on distributed virtual environments

    Science.gov (United States)

    Lea, Rodger

    1998-09-01

    Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene 'see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including: Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio. Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration. Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications. e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers. The technology

  18. Distribution and migration of long lived radionuclides in the environment around the Chernobyl Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Amano, Hikaru; Matsunaga, Takeshi; Ueno, Takashi; Nagao, Seiya; Yanase, Nobuyuki; Watanabe, Miki; Hanzawa, Yukiko [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-03-01

    Characteristics of the distribution and migration of long-lived radionuclides in the environment around the Chernobyl Nuclear Power Plant (30 km exclusion zone) have been investigated. The research items are: (i) distribution of long-lived radionuclides in the surface environment, (ii) speciation of long-lived radionuclides in the surface environment, (iii) characteristics of the migration in the surface environment, (iv) characteristics of the uptake into vegetables, and (v) prediction of the future radioecological situation in the environment. (author)

  19. Scene analysis in the natural environment

    DEFF Research Database (Denmark)

    Lewicki, Michael S; Olshausen, Bruno A; Surlykke, Annemarie

    2014-01-01

    The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to important insights into problems of scene analysis, but not all of these insights are widely appreciated, and there remain critical shortcomings in current approaches......, all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: (1) the ability to solve...

  20. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determination of distribution facility locations is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk, but this raises new problems, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta, which ranks among the top five fast-food restaurant chains by retail sales. The study comprised three stages: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center, and the network analysis results show a more efficient process, i.e. a shorter distance in the distribution process.

  1. IRST infrared background analysis of bay environments

    CSIR Research Space (South Africa)

    Schwering, PBW

    2008-04-01

    Full Text Available threats can be present in environments with cluttered backgrounds as well as rapidly varying atmospheric conditions. During trials executed in False Bay a large amount of target, background and atmosphere data was gathered that is of use in analysis...

  2. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study.

  3. Radioactivity Distribution in Malaysian Marine Environment. Chapter 4

    International Nuclear Information System (INIS)

    Zal U'yun Wan Mahmood; Abdul Kadir Ishak; Norfaizal Mohamad; Wo, Y.M.; Kamarudin Samuding

    2015-01-01

    The marine radioactivity distribution mapping of Malaysia was developed with the aim of illustrating the pattern of distribution of both anthropogenic and natural radionuclides in seawater and sediments. The data collected will form the baseline for anthropogenic radioactivity.

  4. Clinical learning environment inventory: factor analysis.

    Science.gov (United States)

    Newton, Jennifer M; Jolly, Brian C; Ockerby, Cherene M; Cross, Wendy M

    2010-06-01

    This paper is a report of the psychometric testing of the Clinical Learning Environment Inventory. The clinical learning environment is a complex socio-cultural entity that offers a variety of opportunities to engage or disengage in learning. The Clinical Learning Environment Inventory is a self-report instrument consisting of 42 items classified into six scales: personalization, student involvement, task orientation, innovation, satisfaction and individualization. It was developed to examine undergraduate nursing students' perceptions of the learning environment whilst on placement in clinical settings. As a component of a longitudinal project, Bachelor of Nursing students (n = 659) from two campuses of a university in Australia, completed the Clinical Learning Environment Inventory from 2006 to 2008. Principal components analysis using varimax rotation was conducted to explore the factor structure of the inventory. Data for 513 students (77%) were eligible for inclusion. Constraining data to a 6-factor solution explained 51% of the variance. The factors identified were: student-centredness, affordances and engagement, individualization, fostering workplace learning, valuing nurses' work, and innovative and adaptive workplace culture. These factors were reviewed against recent theoretical developments in the literature. The study offers an empirically based and theoretically informed extension of the original Clinical Learning Environment Inventory, which had previously relied on ad hoc clustering of items and the use of internal reliability of its sub-scales. Further research is required to establish the consistency of these new factors.
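
    As background to the analysis reported here, below is a minimal sketch of factor extraction with varimax rotation on a 42-item instrument, assuming responses arranged as a students-by-items matrix. The random responses are stand-ins for the survey data, and scikit-learn's FactorAnalysis with a varimax rotation is used as a stand-in for the authors' rotated principal components analysis.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
# Hypothetical Likert-style responses: 513 students x 42 inventory items.
responses = rng.integers(1, 6, size=(513, 42)).astype(float)

# Six-factor solution with varimax rotation, mirroring the constrained
# 6-factor structure described in the abstract.
fa = FactorAnalysis(n_components=6, rotation="varimax").fit(responses)
loadings = fa.components_.T          # 42 items x 6 factors

# Assign each item to the factor on which it loads most strongly.
print(np.abs(loadings).argmax(axis=1))
```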

  5. Bioprospecting and Functional Analysis of Neglected Environments

    DEFF Research Database (Denmark)

    Vogt, Josef Korbinian

    on Biological Diversity (explained in Chapter 3). Proteolytic enzymes – described in Chapter 4 – are the target for bioprospecting due to their high market value. Section II describes methods used for the analysis of metagenomic and RNA-seq datasets, including Manuscript I, which includes the taxonomic...... annotation of a late Pleistocene horse metagenome and the functional annotation of the donkey genome. The functional analysis and the identification of novel proteolytic enzymes in the polar marine environment and the full transcriptome analysis of the carnivorous plant Dionaea muscipula is also presented...... of arctic marine metagenomes reveals bacterial strategies for deep sea persistence (Manuscript II). Furthermore, this extreme environment is a fertile ground to mine for novel proteolytic enzymes. Manuscript III presents a bioinformatics approach to identify sequences for potential commercialization...

  6. Distributed optimization-based control of multi-agent networks in complex environments

    CERN Document Server

    Zhu, Minghui

    2015-01-01

    This book offers a concise and in-depth exposition of specific algorithmic solutions for distributed optimization based control of multi-agent networks and their performance analysis. It synthesizes and analyzes distributed strategies for three collaborative tasks: distributed cooperative optimization, mobile sensor deployment and multi-vehicle formation control. The book integrates miscellaneous ideas and tools from dynamic systems, control theory, graph theory, optimization, game theory and Markov chains to address the particular challenges introduced by such complexities in the environment as topological dynamics, environmental uncertainties, and potential cyber-attack by human adversaries. The book is written for first- or second-year graduate students in a variety of engineering disciplines, including control, robotics, decision-making, optimization and algorithms and with backgrounds in aerospace engineering, computer science, electrical engineering, mechanical engineering and operations research. Resea...

  7. Distributed Fiber Optic Gas Sensing for Harsh Environment

    Energy Technology Data Exchange (ETDEWEB)

    Juntao Wu

    2008-03-14

    This report summarizes work to develop a novel distributed fiber-optic micro-sensor that is capable of detecting common fossil fuel gases in harsh environments. During the 32-month research and development (R&D) program, GE Global Research successfully synthesized sensing materials using two techniques: sol-gel based fiber surface coating and magnetron sputtering based fiber micro-sensor integration. Palladium nanocrystalline embedded silica matrix material (nc-Pd/Silica), nanocrystalline palladium oxides (nc-PdOx) and palladium alloy (nc-PdAuN1), and nanocrystalline tungsten (nc-WOx) sensing materials were identified as having high sensitivity and selectivity to hydrogen, while the palladium doped and un-doped nanocrystalline tin oxide (nc-PdSnO2 and nc-SnO2) materials were verified to have high sensitivity and selectivity to carbon monoxide. The fiber micro-sensor comprises an apodized long-period grating in a single-mode fiber, and the fiber grating cladding surface was functionalized with the above sensing materials at a typical thickness ranging from a few tens to a few hundreds of nanometers. GE found that the morphologies of these sensing nanomaterials are either nanoparticle films or nanoporous films with a typical size distribution of 5-10 nanometers. The nc-PdOx and alloy sensing materials were found to be highly sensitive to hydrogen gas within the temperature range from ambient to 150 °C, while the nc-Pd/Silica and nc-WOx sensing materials were found to be suitable for operation from 150 °C to 500 °C for hydrogen gas detection. The palladium doped and un-doped nc-SnO2 materials also demonstrated sensitivity to carbon monoxide gas at approximately 500 °C. The prototyped fiber gas sensing system developed in this R&D program is based on wavelength-division-multiplexing technology in which each fiber sensor is identified according to its transmission spectral features within the guiding mode and cladding modes. The

  8. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
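
    The linear combination assumption underlying DSM lends itself to a compact illustration. Below is a minimal sketch, assuming the mixture is an elementwise convex combination of a seed irrelevance distribution and the unknown relevance distribution; the mixing weight lam and the toy distributions are hypothetical, and the clipping step is only a crude guard, not part of DSM itself.

```python
import numpy as np

def separate(mixture, seed, lam):
    """Recover r from m = lam * seed + (1 - lam) * r.

    mixture, seed: probability vectors over the same vocabulary.
    lam: assumed mixing weight of the seed irrelevance distribution.
    """
    r = (mixture - lam * seed) / (1.0 - lam)
    r = np.clip(r, 0.0, None)   # negative mass can appear if lam is overestimated
    return r / r.sum()          # renormalize to a proper distribution

# Toy example: rebuild a known relevance distribution from the mixture.
relevance = np.array([0.5, 0.3, 0.1, 0.1])
irrelevance = np.array([0.1, 0.1, 0.4, 0.4])
lam = 0.3
mixture = lam * irrelevance + (1 - lam) * relevance
print(separate(mixture, irrelevance, lam))   # ~ [0.5, 0.3, 0.1, 0.1]
```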

  9. Distributed Scaffolding: Synergy in Technology-Enhanced Learning Environments

    Science.gov (United States)

    Ustunel, Hale H.; Tokel, Saniye Tugba

    2018-01-01

    When technology is employed challenges increase in learning environments. Kim et al. ("Sci Educ" 91(6):1010-1030, 2007) presented a pedagogical framework that provides a valid technology-enhanced learning environment. The purpose of the present design-based study was to investigate the micro context dimension of this framework and to…

  10. Distribution and characteristics of gamma and cosmic ray dose rate in living environment

    International Nuclear Information System (INIS)

    Nagaoka, Toshi; Moriuchi, Shigeru

    1991-01-01

    A series of environmental radiation surveys was carried out with the aim of characterizing the natural radiation dose rate distribution in living environments, both natural and artificial. Through the analysis of the data obtained at a number of places, several aspects of the radiation field in living environments were clarified. The gamma ray dose rate varies due to the following three dominant causes: 1) the radionuclide concentration of surrounding materials acting as gamma ray sources, 2) the spatial distribution of surrounding materials, and 3) the geometrical and shielding conditions between the natural gamma ray sources and the measured point; whereas the cosmic ray dose rate varies with the thickness of overlying shielding materials. It was also suggested that the gamma ray dose rate generally shows an upward tendency, and the cosmic ray dose rate a downward one, in artificial environments. This kind of knowledge is expected to serve as fundamental information for the accurate and realistic evaluation of the collective dose in living environments. (author)

  11. DEVELOPMENT OF A HETEROGENIC DISTRIBUTED ENVIRONMENT FOR SPATIAL DATA PROCESSING USING CLOUD TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    A. S. Garov

    2016-06-01

    Full Text Available We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.

  12. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    Science.gov (United States)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.

  13. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
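
    The lookup flow described in the claim can be paraphrased in a few lines of code. This is a rough illustration, not the patented implementation: the BurstBuffer class, the hashing scheme and all names are hypothetical.

```python
import hashlib

class BurstBuffer:
    """Hypothetical burst buffer holding a partition of a key-value store."""
    def __init__(self, name):
        self.name = name
        self.kv = {}

    def put(self, key, metadata):
        self.kv[key] = metadata

    def get(self, key):
        return self.kv.get(key)

def owner(buffers, key):
    """Determine which burst buffer stores the metadata for a key (simplified)."""
    h = int(hashlib.sha1(key.encode()).hexdigest(), 16)
    return buffers[h % len(buffers)]

buffers = [BurstBuffer(f"bb{i}") for i in range(4)]
owner(buffers, "block-42").put("block-42", {"size": 4096, "offset": 0})
print(owner(buffers, "block-42").get("block-42"))  # {'size': 4096, 'offset': 0}
```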

  14. Models and analysis for distributed systems

    CERN Document Server

    Haddad, Serge; Pautet, Laurent; Petrucci, Laure

    2013-01-01

    Nowadays, distributed systems are increasingly present, for public software applications as well as critical systems. This title and Distributed Systems: Design and Algorithms - from the same editors - introduce the underlying concepts, the associated design techniques and the related security issues. The objective of this book is to describe the state of the art of the formal methods for the analysis of distributed systems. Numerous issues remain open and are the topics of major research projects. One current research trend consists of pro

  15. Scene analysis in the natural environment

    Directory of Open Access Journals (Sweden)

    Michael S Lewicki

    2014-04-01

    Full Text Available The problem of scene analysis has been studied in a number of different fields over the past decades. These studies have led to a number of important insights into problems of scene analysis, but not all of these insights are widely appreciated. Despite this progress, there are also critical shortcomings in current approaches that hinder further progress. Here we take the view that scene analysis is a universal problem solved by all animals, and that we can gain new insight by studying the problems that animals face in complex natural environments. In particular, the jumping spider, songbird, echolocating bat, and electric fish all exhibit behaviors that require robust solutions to scene analysis problems encountered in the natural environment. By examining the behaviors of these seemingly disparate animals, we emerge with a framework for studying scene analysis comprising four essential properties: (1) the ability to solve ill-posed problems, (2) the ability to integrate and store information across time and modality, (3) efficient recovery and representation of 3D scene structure, and (4) the use of optimal motor actions for acquiring information to progress towards behavioral goals.

  16. [Biomass structure analysis of Atractylodes lancea in different ecological environments].

    Science.gov (United States)

    Sun, Yu-Zhang; Guo, Lan-Ping; Yang, Xiao-Qiong; Huang, Lu-Qi; Zhu, Wen-Quan; Pan, Yao-Zhong

    2008-07-01

    To study the ecological environments of Atractylodes lancea by biomass structure analysis. Through a scientific investigation in Maoshan, sampling spots were set up, the relation between growth and ecological environments was researched, and the ecological environments of A. lancea were divided as follows: the vegetation layer, the shrub layer, the shrub-weed layer and the weed layer. The ramet biomass, height, leaves and coverage of A. lancea were studied. These factors (ramet biomass, height, leaves and coverage) showed regular variation: among the maximum, minimum and average values, the shrub layer was the largest, the shrub-weed layer was the second largest, and the vegetation layer and the weed layer were the smallest. A. lancea tends to be distributed in the shrub layer and the shrub-weed layer.

  17. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  18. Distributed Diagnosis in Uncertain Environments Using Dynamic Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a distributed Bayesian fault diagnosis scheme for physical systems. Our diagnoser design is based on a procedure for factoring the global system...

  19. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states which constitutes a time optimal schedule....... We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented...

  20. Analyzing coastal environments by means of functional data analysis

    Science.gov (United States)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each of the density functions by its mean, sorting, skewness and kurtosis. The second applied a centered-log-ratio (clr) transform to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
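
    The second vector-based benchmark (clr transform, then PCA, then cluster analysis on the retained scores) is straightforward to reproduce. Below is a minimal sketch, assuming each row of psd is a closed particle-size distribution over a common set of size bins; the Dirichlet-generated data and the choice of three clusters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Stand-in data: 30 samples, each a composition over 20 grain-size bins.
psd = rng.dirichlet(alpha=np.linspace(1, 5, 20), size=30)

# Centered log-ratio transform: clr(x) = ln(x / geometric_mean(x)).
logx = np.log(psd)
clr = logx - logx.mean(axis=1, keepdims=True)

scores = PCA(n_components=3).fit_transform(clr)   # retained PC scores
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(labels)                                     # cluster assignment per sample
```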

  1. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  2. Analysis on Voltage Profile of Distribution Network with Distributed Generation

    Science.gov (United States)

    Shao, Hua; Shi, Yujie; Yuan, Jianpu; An, Jiakun; Yang, Jianhua

    2018-02-01

    Penetration of distributed generation has impacts on a distribution network in terms of load flow, voltage profile, reliability, power loss and so on. After the impacts and the typical structures of grid-connected distributed generation are analyzed, the back/forward sweep method for load flow calculation of the distribution network is modelled to include distributed generation. The voltage profiles of the distribution network, as affected by the installation location and the capacity of distributed generation, are thoroughly investigated and simulated. The impacts on the voltage profiles are summarized and some suggestions on the installation location and the capacity of distributed generation are given correspondingly.
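
    For orientation, the back/forward sweep method named above iterates a backward accumulation of branch currents and a forward update of node voltages. A minimal sketch for a plain radial feeder with constant-power loads follows; the per-unit impedances and loads are made up, and a distributed generator could be modeled by subtracting its injected power from the load at its node.

```python
import numpy as np

# Illustrative radial feeder: branch k feeds node k (all values per unit).
z = np.array([0.02 + 0.04j, 0.03 + 0.06j, 0.02 + 0.05j])  # branch impedances
s = np.array([0.8 + 0.3j, 0.5 + 0.2j, 0.6 + 0.25j])       # complex loads at nodes
v_source = 1.0 + 0.0j

v = np.full(len(s), v_source, dtype=complex)   # initial voltage guess
for _ in range(100):
    i_load = np.conj(s / v)                    # load currents I = (S/V)*
    # Backward sweep: branch current = sum of all downstream load currents.
    i_branch = np.cumsum(i_load[::-1])[::-1]
    # Forward sweep: apply voltage drops starting from the source.
    v_new = np.empty_like(v)
    v_up = v_source
    for k in range(len(s)):
        v_up = v_up - z[k] * i_branch[k]
        v_new[k] = v_up
    delta = np.max(np.abs(v_new - v))
    v = v_new
    if delta < 1e-10:
        break

print(np.abs(v))   # voltage magnitudes along the feeder
```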

  3. Spatial distribution of tritium in the Rawatbhata Rajasthan site environment

    International Nuclear Information System (INIS)

    GilI, Rajpal; Tiwari, S.N.; Gocher, A.K.; Ravi, P.M.; Tripathi, R.M.

    2014-01-01

    Tritium is one of the most environmentally mobile radionuclides and hence has a high potential for migration into the different compartments of the environment. Tritium from the nuclear facilities at the RAPS site is released into the environment through 93 m and 100 m high stacks, mainly as tritiated water (HTO). The released tritium undergoes dilution and dispersion and then follows the ecological pathway of the water molecule. The Environmental Survey Laboratory of the Health Physics Division, Bhabha Atomic Research Centre (BARC), located at the Rajasthan Atomic Power Station (RAPS) site, continuously monitors the concentration of tritium in the environment to ensure public safety. Atmospheric tritium activity during the period 2009-2013 was measured regularly around RAPS. The data collected showed a large variation of H-3 concentration in air, fluctuating in the range of 0.43-5.80 Bq.m-3 at the site boundary of 1.6 km. This paper presents the results of analyses of tritium in the atmospheric environment covering an area up to a 20 km radius around the RAPS site. A large number of air moisture samples were collected around the RAPS site for estimating tritium in the atmospheric environment, to ascertain the atmospheric dispersion and to compute the radiation dose to the public.

  4. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.

  5. Distributed energy resources and benefits to the environment

    DEFF Research Database (Denmark)

    Akorede, Mudathir Funsho; Hizam, Hashim; Pouresmaeil, Edris

    2010-01-01

    generation in 2030 would be produced from fossil fuels. This global dependence on fossil fuels is dangerous to our environment in terms of their emissions unless specific policies and measures are put in place. Nevertheless, recent research reveals that a reduction in the emissions of these gases is possible...... on fossil fuels to our environment. The study finally justifies how DG technologies could substantially reduce greenhouse gas emissions when fully adopted; hence, reducing the public concerns over human health risks caused by the conventional method of electricity generation....

  6. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is related not only to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  7. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals function merely as parking areas and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the number of freight flows within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The analysis starts with the identification of the important factors that affect location selection, carried out by Likert analysis through questionnaires distributed to logistics firms. The best location is determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result shows the ranking from the best: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  8. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

    's recording. This means that ignoring this correlation will be a waste of the scarce power and bandwidth resources. In this thesis, we study both information-theoretic and audio coding aspects of the coding problem in the above-mentioned framework. We formulate rate-distortion problems which take into account...... on the performance of the audio coding system. We derive explicit formulas for the rate-distortion functions, and design coding schemes that asymptotically achieve the performance bounds. We justify the Gaussianity assumption by showing that the results will still be relevant for non-Gaussian sources including audio...... for the transmitting microphone to reduce the rate by using distributed source coding. Within this framework, we apply our coding schemes to Gaussian signals as well as audio measurements and compare the rate-distortion performances for distributed and non-distributed source coding scenarios. We also compare...
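
    For orientation (this is the textbook baseline rather than the thesis's derived bounds), the rate-distortion function of a memoryless Gaussian source with variance σ² under mean-squared-error distortion is:

```latex
R(D) =
\begin{cases}
  \dfrac{1}{2}\log_2\dfrac{\sigma^2}{D}, & 0 < D \le \sigma^2,\\[6pt]
  0, & D > \sigma^2.
\end{cases}
```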

  9. Extensible Interest Management for Scalable Persistent Distributed Virtual Environments

    Science.gov (United States)

    1999-12-01

    interest expression. Proximity Detection (Steinman and Wieland 1994; Morse, Bic et al. 2000) uses a "fuzzy"-grid-based filtering mechanism. The grid cells... Systems compared include CCTT, NPSNET, STOW-E, STOW ED-1, UFRs, sub-regions, spline, SmallView, MASSIVE-2 and HLA (RTI 1.3). Steinman, J. S. and F. Wieland (1994). Parallel Proximity Detection and the Distributed List Algorithm. The 1994 Workshop on Parallel and Distributed
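
    A grid-based interest filter of the kind named above can be sketched briefly; the cell size, entity layout and the plain (non-fuzzy) neighbourhood test are assumptions for illustration.

```python
from collections import defaultdict

CELL = 10.0   # hypothetical grid cell size in world units

def cell_of(pos):
    return (int(pos[0] // CELL), int(pos[1] // CELL))

def neighbors(cell):
    cx, cy = cell
    return {(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

def interested_pairs(entities):
    """entities: dict name -> (x, y). Return pairs in touching grid cells,
    i.e. pairs whose state updates should be exchanged."""
    grid = defaultdict(set)
    for name, pos in entities.items():
        grid[cell_of(pos)].add(name)
    pairs = set()
    for cell, members in grid.items():
        for other in neighbors(cell):
            for a in members:
                for b in grid.get(other, ()):
                    if a < b:
                        pairs.add((a, b))
    return pairs

print(interested_pairs({"avatar1": (3, 4), "avatar2": (12, 8), "avatar3": (95, 95)}))
# {('avatar1', 'avatar2')} -- avatar3 is too far away to receive their updates
```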

  10. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...

  11. Distribution of cyanotoxins in aquatic environments in the Niger Delta

    African Journals Online (AJOL)

    The presence and types of cyanotoxins in some aquatic environments in the Niger Delta were investigated. Water samples surveyed in the study were surface water of Sombreiro, Nun and New Calabar Rivers. Others were groundwater from Abonnema and Kiama and pond water from Ogboro. Sampling locations of ...

  12. Seasonal distribution of organic matter in mangrove environment of Goa

    Digital Repository Service at National Institute of Oceanography (India)

    Jagtap, T.G.

    Water and sediments were studied for the distribution of suspended matter, organic carbon and nitrogen. Suspended matter ranged from 3-373 mg.l-1, while particulate organic carbon (POC) ranged from 0.03-9.94 mg.l-1. POC values showed a significant correlation...

  13. Students' perception of the learning environment in a distributed medical programme

    OpenAIRE

    Veerapen, Kiran; McAleer, Sean

    2010-01-01

    Background: The learning environment of a medical school has a significant impact on students’ achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. Purpose: To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medi...

  14. Interactive analysis environment of unified accelerator libraries

    International Nuclear Information System (INIS)

    Fine, V.; Malitsky, N.; Talman, R.

    2006-01-01

    Unified Accelerator Libraries (UAL, http://www.ual.bnl.gov) software is an open accelerator simulation environment addressing a broad spectrum of accelerator tasks ranging from efficient online-oriented modeling to full-scale realistic beam dynamics studies. The paper introduces a new package integrating UAL simulation algorithms with the QT-based Graphical User Interface and the ROOT data analysis and visualization framework (http://root.cern.ch). The primary user application is implemented as an interactive and configurable Accelerator Physics Player. Its interface to visualization components is based on the QT layer (http://root.bnl.gov) supported by the STAR experiment.

  15. Interactive analysis environment of unified accelerator libraries

    CERN Document Server

    Fine, V; Talman, R

    2006-01-01

    Unified Accelerator Libraries (UAL, http://www.ual.bnl.gov) software is an open accelerator simulation environment addressing a broad spectrum of accelerator tasks ranging from efficient online-oriented modeling to full-scale realistic beam dynamics studies. The paper introduces a new package integrating UAL simulation algorithms with the QT-based Graphical User Interface and the ROOT data analysis and visualization framework (http://root.cern.ch). The primary user application is implemented as an interactive and configurable Accelerator Physics Player. Its interface to visualization components is based on the QT layer (http://root.bnl.gov) supported by the STAR experiment.

  16. Interactive analysis environment of unified accelerator libraries

    Energy Technology Data Exchange (ETDEWEB)

    Fine, V. [Brookhaven National Laboratory, P.O. Box 5000, Upton, NY 11973 (United States)]. E-mail: fine@bnl.gov; Malitsky, N. [Brookhaven National Laboratory, P.O. Box 5000, Upton, NY 11973 (United States); Talman, R. [Cornell University, Ithaca, NY 14853 (United States)

    2006-04-01

    Unified Accelerator Libraries (UAL, http://www.ual.bnl.gov) software is an open accelerator simulation environment addressing a broad spectrum of accelerator tasks ranging from efficient online-oriented modeling to full-scale realistic beam dynamics studies. The paper introduces a new package integrating UAL simulation algorithms with the QT-based Graphical User Interface and the ROOT data analysis and visualization framework (http://root.cern.ch). The primary user application is implemented as an interactive and configurable Accelerator Physics Player. Its interface to visualization components is based on the QT layer (http://root.bnl.gov) supported by the STAR experiment.

  17. Income distribution: Boltzmann analysis and its extension

    Science.gov (United States)

    Yuqing, He

    2007-04-01

    The paper aims at describing income distribution in moderate income regions. Starting by dividing income behaviors into two parts, random and deterministic, and by introducing an "instantaneous model" for theoretical derivations and a "cumulative model" for empirical tests, this paper applies the equilibrium approach of statistical mechanics to the study of the nonconserved individual income course. The random income follows a stationary distribution similar to the Maxwell-Boltzmann distribution in the instantaneous model. Combining this result with marginal analysis, the probability distribution of the individual income process, composed of the random and deterministic income courses, approximately obeys a distribution law mixing an exponential function with a logarithmic prefactor. The distribution law has been tested using census and income survey data from the USA, UK, Japan, and New Zealand. The results show that it agrees very well with most of the empirical data. The discussion suggests that essentially different income processes might take place in moderate and high income regions.
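
    The abstract does not spell out the law, so the following form is only an illustrative reading of "an exponential function with a logarithmic prefactor", with scale parameters x_0 and T standing in for quantities the paper fits to census data:

```latex
f(x) \;\propto\; \ln\!\left(\frac{x}{x_0}\right) e^{-x/T}, \qquad x \ge x_0 .
```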

  18. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  19. An ontological framework for requirement change management in distributed environment

    International Nuclear Information System (INIS)

    Khatoon, A.; Hafeez, Y.; Ali, T.

    2014-01-01

    Global Software Development (GSD) is gradually gaining acceptance in the software industry. However, in GSD, multiple and diverse stakeholders are involved in the development of complex software systems, and GSD introduces several challenges, i.e. physical distance, time zones, cultural differences and language barriers. Requirements play a significant role in any software development, and the greatest challenge in a GSD environment is to maintain a consistent view of the system even if the requirements change; at the same time, a single change in the requirements might affect several other modules. In GSD, different people use different terms and have different ways of expressing concepts, so people at remote sites are unable to achieve uniformity regarding the semantics of the terms. A global environment requires effective communication and coordination. To overcome inconsistencies and ambiguities among the team members and to make the team members aware of the consistent view, a shared and common understanding is required. In this paper an approach beneficial to the software industry has been proposed, focusing on changing requirements in a Global Software Development environment. A case study has been used for the evaluation of the proposed approach, and the requirements change management process has been improved by applying the approach to the case study. The proposed approach is beneficial to software development organizations where frequent changes occur, and it guides the software industry in providing a common understanding to all the development teams residing in remote locations. (author)

  20. An Embedded Software Platform for Distributed Automotive Environment Management

    Directory of Open Access Journals (Sweden)

    Seepold Ralf

    2009-01-01

    Full Text Available This paper discusses an innovative extension of current vehicle platforms that integrates intelligent environments in order to carry out e-safety tasks, improving driving security. These platforms are dedicated to automotive environments, which are characterized by sensor networks deployed along the vehicles. Since this kind of platform infrastructure is hardly extensible and forms a non-scalable process unit, an embedded OSGi-based UPnP platform extension is proposed in this article. Such an extension deploys a compatible and scalable uniform environment that makes it possible to manage the heterogeneity of vehicle components and to provide plug and play support, being compatible with all kinds of devices and sensors located in a car network. Furthermore, such an extension makes it possible to auto-register any kind of external device, wherever it is located, providing the in-vehicle system with additional services and data supplied by these devices. This extension also supports service provisioning and connections to external and remote network services using SIP technology.

  1. Dynamic analysis environment for nuclear forensic analyses

    Science.gov (United States)

    Stork, C. L.; Ummel, C. C.; Stuart, D. S.; Bodily, S.; Goldblum, B. L.

    2017-01-01

    A Dynamic Analysis Environment (DAE) software package is introduced to facilitate group inclusion/exclusion method testing, evaluation and comparison for pre-detonation nuclear forensics applications. Employing DAE, the multivariate signatures of a questioned material can be compared to the signatures for different, known groups, enabling the linking of the questioned material to its potential process, location, or fabrication facility. Advantages of using DAE for group inclusion/exclusion include built-in query tools for retrieving data of interest from a database, the recording and documentation of all analysis steps, a clear visualization of the analysis steps intelligible to a non-expert, and the ability to integrate analysis tools developed in different programming languages. Two group inclusion/exclusion methods are implemented in DAE: principal component analysis, a parametric feature extraction method, and k nearest neighbors, a nonparametric pattern recognition method. Spent Fuel Isotopic Composition (SFCOMPO), an open source international database of isotopic compositions for spent nuclear fuels (SNF) from 14 reactors, is used to construct PCA and KNN models for known reactor groups, and 20 simulated SNF samples are utilized in evaluating the performance of these group inclusion/exclusion models. For all 20 simulated samples, PCA in conjunction with the Q statistic correctly excludes a large percentage of reactor groups and correctly includes the true reactor of origination. Employing KNN, 14 of the 20 simulated samples are classified to their true reactor of origination.
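
    Below is a minimal sketch of the two group inclusion/exclusion methods named above, assuming isotopic compositions arranged as a samples-by-isotopes matrix. The random data stand in for SFCOMPO-style measurements, and the reconstruction-residual Q statistic and its interpretation are simplified relative to the paper's procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in isotopic vectors for two known reactor groups (20 samples x 5 isotopes).
X = np.vstack([rng.normal(0, 1, (10, 5)), rng.normal(3, 1, (10, 5))])
y = np.array([0] * 10 + [1] * 10)
questioned = rng.normal(3, 1, (1, 5))      # the questioned material

scaler = StandardScaler().fit(X)
Xs, qs = scaler.transform(X), scaler.transform(questioned)

# PCA exclusion: model each group, reconstruct the questioned sample, and use
# the squared residual (Q statistic) to exclude groups that fit poorly.
for g in (0, 1):
    pca = PCA(n_components=2).fit(Xs[y == g])
    resid = qs - pca.inverse_transform(pca.transform(qs))
    print(f"group {g}: Q = {float(np.sum(resid ** 2)):.2f}")   # small Q -> include

# KNN inclusion: nonparametric assignment to the nearest known group.
knn = KNeighborsClassifier(n_neighbors=3).fit(Xs, y)
print("KNN assigns group", int(knn.predict(qs)[0]))
```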

  2. Study of particle size distribution of environment certified reference material

    Directory of Open Access Journals (Sweden)

    I. E. Vasilyeva

    2015-01-01

    Full Text Available One of the most important stages of developing certified reference materials (CRMs) of solid natural samples is to describe the particle size distribution of the prepared powders. The particle size distribution affects the degree of material homogeneity and the representative mass of the analytical sample. The collection of CRMs has been produced at the Vinogradov Institute of Geochemistry SB RAS over a long time span; therefore the grain-size compositions of the CRM powders were measured by different instrumental methods and assessed at different scales. The laser diffraction analyzer HELOS/BR was employed to accurately and rapidly measure the grain-size composition of CRM natural sample powders. New measurements confirm that the particle size distributions of the CRMs of magmatic and metamorphic rocks and of the sediments of Lake Baikal, developed 45 and 25 years ago respectively, have not changed fundamentally. The multimodal distributions of particle sizes of the investigated CRMs clearly reflect the differences in mineral and chemical compositions. Aggregation of particles of different composition and origin during long-term storage of the powders is not observed. The measured particle size compositions of the CRM powders show a slight dependence on the weight put into the device, as well as on its mineral composition. The homogeneity of the substance of the studied standard samples was confirmed by the low masses of representative sub-samples (0.075-0.100 g) for a wide range of elements determined by modern instrumental analytical methods. The use of laser diffraction analyzers of the HELOS type could help to certify the particle size composition of CRM powders as a repeatable metrological characteristic.

  3. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutic treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC-based codes we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
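
    The ±2% agreement quoted above refers to percentage depth dose (PDD) curves, i.e. depth-dose curves normalized to their maximum. A small sketch with synthetic dose arrays, purely to illustrate the quantity being compared:

```python
import numpy as np

depth = np.arange(0, 150, 5.0)                       # depth in mm
measured = np.exp(-(depth - 15) ** 2 / 4000.0)       # synthetic depth-dose curve
simulated = measured * (1 + 0.01 * np.sin(depth / 20))  # synthetic MC result

def pdd(dose):
    """Percentage depth dose: normalize the curve to its maximum."""
    return 100.0 * dose / dose.max()

diff = pdd(simulated) - pdd(measured)
mask = (depth >= 5) & (depth <= 130)                 # depth range quoted above
print(f"max |PDD difference| = {np.abs(diff[mask]).max():.2f}%")
```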

  4. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn eHyman

    2013-03-01

    Full Text Available Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype by environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remains difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis and matching and regional targeting methods have advanced in parallel to data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  5. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    Full Text Available in the geospatial scientific computing environment. There is an imperative for researchers to share their results with the broader scientific community and to verify, interrogate and build upon the results of others in the community; an e-toolset is required that can support these tasks. This points to a need for a...

  6. Distributed Energy Systems in the Built Environment - European Legal Framework on Distributed Energy Systems in the Built Environment

    NARCIS (Netherlands)

    Pront-van Bommel, S.; Bregman, A.

    2013-01-01

    This paper aims to outline the stimuli provided by European law for promoting integrated planning of distributed renewable energy installations on the one hand, and the limitations thereof on the other hand, also with regard to the role of local governments.

  7. Distribution of pesticides in dust particles in urban environments.

    Science.gov (United States)

    Richards, Jaben; Reif, Ruben; Luo, Yuzhuo; Gan, Jay

    2016-07-01

    In regions with a mild climate, pesticides are often used around homes for pest control. Recent monitoring studies have linked pesticide use in residential areas to aquatic toxicity in urban surface water ecosystems, and suggested dust particles on paved surfaces as an important source of pesticides. To test the hypothesis that dust on hard surfaces is a significant source of pesticides, we evaluated spatial and temporal patterns of current-use insecticides in Southern California, and further explored their distribution as a function of particle size. Pyrethroid insecticides were detected in dust from the driveway, curb gutter and street at 53.5-94.8%, with median concentrations of 1-46 ng g(-1). Pyrethroid residues were uniformly distributed in areas adjacent to a house, suggesting significant redistribution. The total levels of pyrethroids in dust were significantly (p < 0.05) associated with fine particles, which have a higher mobility in runoff than coarse particles. Results from this study highlight the widespread occurrence of pesticides in outdoor dust around homes and the potential contribution to downstream surface water contamination via rain-induced runoff. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. R: a statistical environment for hydrological analysis

    Science.gov (United States)

    Zambrano-Bigiarini, Mauricio; Bellin, Alberto

    2010-05-01

    The free software environment for statistical computing and graphics "R" has been developed and is maintained by statistical programmers, with the support of an increasing community of users with many different backgrounds, which allows access to both well-established and experimental techniques. Hydrological modelling practitioners spend a large amount of time pre- and post-processing data and results with traditional instruments. In this work "R" and some of its packages are presented as powerful tools to explore and extract patterns from raw information, to pre-process input data of hydrological models, and to post-process their results. In particular, examples are taken from analysing 30 years of daily data for a basin of 85000 km2, saving a large amount of time that could be better spent on analysis. In doing so, vector and raster GIS files were imported for carrying out spatial and geostatistical analysis. Thousands of raw text files with time series of precipitation, temperature and streamflow were summarized and organized. Gauging stations to be used in the modelling process are selected according to the number of days with information, and missing time series data are filled in using spatial interpolation. Time series of the gauging stations are summarized through daily, monthly and annual plots. Input files in dbase format are automatically created in a batch process. Results of a hydrological model are compared with observed values through plots and numerical goodness-of-fit indexes. Two packages specifically developed to assist hydrologists in the previous tasks are briefly presented. In the end, we think the "R" environment would be a valuable tool to support undergraduate and graduate education in hydrology, because it is helpful for capturing the main features of large amounts of data; it is a flexible and fully functional programming language, able to be interfaced to existing Fortran and C code and well suited to the ever growing demands

  9. A Framework for Control and Observation in Distributed Environments

    Science.gov (United States)

    Smith, Warren

    2001-01-01

    As organizations begin to deploy large computational grids, it has become apparent that systems for observation and control of the resources, services, and applications that make up such grids are needed. Administrators must observe the operation of resources and services to ensure that they are operating correctly and they must control the resources and services to ensure that their operation meets the needs of users. Further, users need to observe the performance of their applications so that this performance can be improved and control how their applications execute in a dynamic grid environment. In this paper we describe our software framework for control and observation of resources, services, and applications that supports such uses and we provide examples of how our framework can be used.

  10. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. For example, the authors

  11. Data distribution method of workflow in the cloud environment

    Science.gov (United States)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing for workflow applications provides the required high-efficiency computation and large storage capacity, but it also brings challenges for the protection of trade secrets and other private data. Because protecting private data increases the data transmission time, this paper presents a new data allocation algorithm, based on the degree of collaborative data damage, that improves the existing allocation strategy in which the safety of the public cloud depends on the private cloud. In the initial stage, the static allocation method partitions only the non-confidential data, improving on the original scheme; in the operational phase, newly generated data are used to dynamically adjust the data distribution scheme. The experimental results show that the improved method is effective in reducing the data transmission time.
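
    The allocation idea — pin confidential data to the private cloud and place the rest wherever the expected transfer time is smallest — can be sketched as a greedy pass; all names and numbers below are invented for illustration.

```python
# Toy static-allocation pass: confidential datasets are pinned to the
# private cloud; others go to the cloud minimizing expected transfer time.
datasets = [
    {"name": "d1", "size_gb": 50, "confidential": True},
    {"name": "d2", "size_gb": 200, "confidential": False},
    {"name": "d3", "size_gb": 10, "confidential": False},
]
bandwidth_gbps = {"private": 1.0, "public": 4.0}  # links to consuming tasks

placement = {}
for d in datasets:
    if d["confidential"]:
        placement[d["name"]] = "private"              # hard constraint
    else:
        placement[d["name"]] = min(
            bandwidth_gbps,
            key=lambda cloud: d["size_gb"] / bandwidth_gbps[cloud])

print(placement)  # {'d1': 'private', 'd2': 'public', 'd3': 'public'}
```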

  12. 3D Game Content Distributed Adaptation in Heterogeneous Environments

    Directory of Open Access Journals (Sweden)

    Berretty Robert-Paul

    2007-01-01

    Most current multiplayer 3D games can only be played on a single dedicated platform (a particular computer, console, or cell phone), requiring specifically designed content and communication over a predefined network. Below we show how, by using signal processing techniques such as multiresolution representation and scalable coding for all the components of a 3D graphics object (geometry, texture, and animation), we enable online dynamic content adaptation, and thus delivery of the same content over heterogeneous networks to terminals with very different profiles, and its rendering on them. We present quantitative results demonstrating how the best trade-offs between displayed quality, computational complexity and bandwidth have been achieved, given the distributed resources available over the end-to-end content delivery chain. Additionally, we use state-of-the-art, standardised content representation and compression formats (MPEG-4 AFX, JPEG 2000, XML), enabling deployment over existing infrastructure, while keeping hooks to well-established practices in the game industry.

  13. A Study of Land Surface Temperature Retrieval and Thermal Environment Distribution Based on Landsat-8 in Jinan City

    Science.gov (United States)

    Dong, Fang; Chen, Jian; Yang, Fan

    2018-01-01

    Based on medium-resolution Landsat 8 OLI/TIRS imagery, the temperature distribution across the four seasons in the urban area of Jinan City was obtained by using the atmospheric correction method for the retrieval of land surface temperature. A quantitative analysis was carried out of the spatio-temporal distribution characteristics and development trend of the urban thermal environment, its seasonal variation, and the relationship between land surface temperature and the normalized difference vegetation index (NDVI). The results show that the distribution of high-temperature areas is concentrated in Jinan and tends to expand from east to west, revealing a negative correlation between the land surface temperature distribution and NDVI. These findings provide theoretical references and a scientific basis for improving the ecological environment of Jinan City, strengthening scientific planning and making overall plans to address climate change.
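
    Two building blocks of such a retrieval can be sketched briefly: NDVI from the OLI red and near-infrared bands, and at-sensor brightness temperature from TIRS band 10. The paper's atmospheric correction method goes further (atmospheric and emissivity correction); the rescaling and thermal constants below are the commonly published band-10 values, but in practice they should be read from the scene's MTL metadata.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-10)

def brightness_temp_b10(dn, ml=3.342e-4, al=0.1,
                        k1=774.8853, k2=1321.0789):
    """TIRS band-10 digital numbers -> at-sensor brightness temperature (K).
    ml/al are radiance rescaling factors, k1/k2 thermal constants (MTL)."""
    radiance = ml * dn.astype(float) + al
    return k2 / np.log(k1 / radiance + 1.0)

dn_b10 = np.array([[30000, 32000], [31000, 29500]])
print(brightness_temp_b10(dn_b10))  # Kelvin; land surface temperature then
                                    # needs an NDVI-based emissivity correction
```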

  14. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to manage these resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration of and interaction with the ATLAS data management system DQ2 is a key functionality of GANGA. Intelligent job brokering is set up using the job-splitting mechanism together with dataset and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. Amongst other things, GANGA supports user analysis with reconstructed data and small-scale production of Monte Carlo data.
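
    The "trivial switching" between a local test job and a Grid run shows up directly in Ganga's Python session syntax. A minimal sketch (using the generic Executable application as a stand-in for the ATLAS Athena plug-in; to be run inside a ganga session):

```python
# Minimal Ganga job: run a script locally first, then flip the backend.
j = Job()
j.name = 'my-analysis-test'
j.application = Executable(exe='./analysis.sh')  # stand-in for Athena
j.backend = Local()                              # quick local test run
j.submit()

# Switching the same job definition to the Grid is a one-line change,
# e.g. j.backend = LCG() (or the NorduGrid/PanDA backends named above).
```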

  15. Removal and distribution of iodate in mangrove environment

    International Nuclear Information System (INIS)

    Machado, E.C; Machado, W; Sanders, L.F; Bellido, L.F; Patchineelam, S.R; Bellido Jr, A.V

    2003-01-01

    Retention experiments were carried out in the laboratory in order to investigate the removal of the iodate ion from tidal water by the underlying creek sediment, as well as its distribution in the mangrove sediment of Sepetiba Bay, Rio de Janeiro. Local tidal water spiked with 131IO3- was added to several cores containing 6-7 cm of mangrove sediment. A sample solution was taken every day for 8-9 days and assayed by means of a high-purity germanium detector coupled to a multichannel analyser. It was found that, after two days, around 56% of the radioiodate was taken up by the sediment. At the end of the experiment, the sediment was sectioned into 1-cm layers and also analyzed by gamma spectrometry. The overall results, in seven cores, showed that 89% of the initial 131IO3- activity was transferred to the sediment, and that 78% of the total activity was within the first centimetre of the upper layer. The determination of the total organic matter content suggests that the organic matter present in Sepetiba's mangrove sediment has a substantial influence on the iodate removal from the water column (author)

  16. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
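
    For reference, the univariate skew-t density with a linear Student's t skewing function is usually written in the Azzalini-Capitanio form below (presumably the linear case studied here), where t_nu is the Student's t density and T_{nu+1} the t distribution function with nu+1 degrees of freedom:

```latex
f(z;\lambda,\nu) \;=\; 2\, t_{\nu}(z)\,
  T_{\nu+1}\!\Bigl(\lambda z \sqrt{\tfrac{\nu+1}{\nu+z^{2}}}\Bigr),
\qquad z\in\mathbb{R},\; \lambda\in\mathbb{R},\; \nu>0,
```

    with lambda the shape (skewness) parameter to which the Jeffreys prior applies.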

  17. A cooperative model for IS security risk management in distributed environment.

    Science.gov (United States)

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS in a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively.
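
    How a BN turns shared security evidence into an updated risk level can be shown with a deliberately tiny network; the structure and all probabilities below are invented for illustration.

```python
# Two-node fragment of a security BN: ThreatLevel -> Breach.
p_threat = {"high": 0.2, "low": 0.8}          # prior belief on threat level
p_breach_given = {"high": 0.30, "low": 0.02}  # CPT: P(breach | threat)

def breach_probability(threat_evidence=None):
    """Marginal P(breach), optionally conditioned on observed evidence."""
    if threat_evidence is not None:
        return p_breach_given[threat_evidence]
    return sum(p_threat[t] * p_breach_given[t] for t in p_threat)

print(breach_probability())        # prior risk level: 0.076
print(breach_probability("high"))  # updated after a partner reports a
                                   # high threat level: 0.30
```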

  18. Open-Source, Distributed Computational Environment for Virtual Materials Exploration

    Science.gov (United States)

    2015-01-01

    The demonstration case used implicit shape geometry with three distinct material zones; it approximated a jet engine turbine, with the material zones reflecting its microstructure. [Only an acronym list survives from the rest of the record: FEA, Finite Element Analysis; FEM, Finite Element Modeling; GE, General Electric; GTF, Geared Turbo Fan; GUI, Graphical User Interface; HPC, High Performance Computing.]

  19. Comparative analysis of distributed power control algorithms in CDMA

    OpenAIRE

    Abdulhamid, Mohanad F.

    2017-01-01

    This paper presents a comparative analysis of various distributed power control algorithms used in Code Division Multiple Access (CDMA) systems. These algorithms include the Distributed Balancing power control algorithm (DB), Modified Distributed Balancing power control algorithm (MDB), Fully Distributed Power Control algorithm (FDPC), Distributed Power Control algorithm (DPC), Distributed Constrained Power Control algorithm (DCPC), Unconstrained Second-Order Power Control algorithm (USOPC), Con...
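
    Most of the listed schemes descend from the classic distributed update in which each link rescales its transmit power by the ratio of target to measured SINR, using only local measurements (the Foschini-Miljanic iteration underlying DPC). A small simulation with invented link gains:

```python
import numpy as np

G = np.array([[1.0, 0.1, 0.2],   # G[i, j]: gain from transmitter j to rx i
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])
noise, target = 0.01, 2.0        # receiver noise and target SINR
p = np.full(3, 0.1)              # initial transmit powers

for _ in range(50):
    interference = G @ p - np.diag(G) * p + noise
    sinr = np.diag(G) * p / interference
    p = (target / sinr) * p      # purely local update on each link

print(np.round(sinr, 3), np.round(p, 4))  # SINRs converge to the target
```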

  20. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals, HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is built from the collaboration of several well-encapsulated Software Components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software ...

  1. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has simple-to-use Java, C++, C, Python and Fortran APIs, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  2. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system still has to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with 'sensors' collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system, we are aiming for low-level improvements related to data format, data management and merging of results to allow for better-performing ALICE analysis

  3. Instructional Design Issues in a Distributed Collaborative Engineering Design (CED) Instructional Environment

    Science.gov (United States)

    Koszalka, Tiffany A.; Wu, Yiyan

    2010-01-01

    Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…

  4. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in China has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current state of its self-built logistics system, identifying the problems existing in the current Jingdong Mall logistics distribution and giving appropriate recommendations.

  5. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    Science.gov (United States)

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphically oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase on different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. In particular, the utility of the platform Quascade

  6. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time consuming and require large computational resources. We developed the procedures for organization of massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).
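
    The core computation — summing the Coulomb contributions of many point charges at every node of a 3-D grid — parallelizes naturally over grid nodes. A single-machine sketch with Python's multiprocessing stands in for the multi-thousand-core environment; charges and coordinates are random stand-ins.

```python
import numpy as np
from multiprocessing import Pool

K = 332.06  # Coulomb constant in kcal*A/(mol*e^2); illustrative units

rng = np.random.default_rng(0)
charges = rng.uniform(-1, 1, size=200)         # fake partial charges
positions = rng.uniform(0, 50, size=(200, 3))  # fake atom coordinates (A)

def potential_at(point):
    """Coulomb potential of all point charges at one grid node."""
    r = np.linalg.norm(positions - point, axis=1)
    return K * np.sum(charges / np.maximum(r, 1e-6))

def grid_points(n=20, box=50.0):
    step = box / n
    for i in range(n):
        for j in range(n):
            for k in range(n):
                yield np.array([i, j, k]) * step

if __name__ == "__main__":
    with Pool() as pool:      # on a cluster, the grid would be sharded
        v = pool.map(potential_at, grid_points())
    print(len(v), min(v), max(v))
```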

  7. Measurement and distribution of nitrogen dioxide in urban environments

    International Nuclear Information System (INIS)

    Kirby, C.

    1999-06-01

    and the implications of this study for their use as ambient NO2 monitors are discussed. In the second part of the thesis, relationships between NO2, NOx and O3 are explored using concurrent hourly mean measurements from an automated monitoring site in Cambridge. A significant new model is developed using Principal Component Analysis. Principal components are extracted which have clear associations with the underlying chemistry. The model provides insight into the behaviour of the photostationary-state system, perturbed by changing pollutant concentrations. Important features of this perturbation are demonstrated: the exceptional impact of the seasons on the diurnal cycle and the outcome of this for NO2 and O3 levels. The ability of the model to describe differences in photochemical behaviour due to location and proximity to emission sources is explored for other sites in the UK (using national archive data from the Internet). The limitations of the model are discussed and future developments suggested. (author)
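
    The PCA step can be reproduced in miniature: standardize the pollutant time series, take a singular value decomposition, and read off the variance shares and loadings. The synthetic data below merely mimic NO2/NOx/O3 coupling.

```python
import numpy as np

rng = np.random.default_rng(1)
nox = rng.gamma(2.0, 20.0, 1000)               # invented hourly series
no2 = 0.6 * nox + rng.normal(0, 5, 1000)
o3 = 60.0 - 0.4 * no2 + rng.normal(0, 5, 1000)
X = np.column_stack([no2, nox, o3])

Xs = (X - X.mean(0)) / X.std(0)                # standardise each pollutant
_, s, vt = np.linalg.svd(Xs, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)

print(np.round(explained, 2))  # variance share of each principal component
print(np.round(vt, 2))         # loadings, to be matched to the chemistry
```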

  8. Natural environment suitability of China and its relationship with population distributions.

    Science.gov (United States)

    Yang, Xiaohuan; Ma, Hanqing

    2009-12-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; The best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and richer economy. The worst natural environment areas, Northwest and Tibetan Plateau, are also regions with lower population density and poorer economies.
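
    The construction of a composite index of this kind reduces to a weighted overlay of normalized factor layers, with the weights supplied by the analytic hierarchy process. A toy version on a 4 x 4 grid, with invented weights and layers:

```python
import numpy as np

weights = {"climate": 0.40, "hydrology": 0.25,   # would come from an
           "terrain": 0.20, "ecology": 0.15}     # AHP judgement matrix

rng = np.random.default_rng(2)
layers = {k: rng.random((4, 4)) for k in weights}  # stand-ins for rasters

def normalise(a):
    return (a - a.min()) / (a.max() - a.min())

nesi = sum(w * normalise(layers[k]) for k, w in weights.items())
print(np.round(nesi, 2))  # higher = more suitable natural environment
```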

  9. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-12-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; The best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and richer economy. The worst natural environment areas, Northwest and Tibetan Plateau, are also regions with lower population density and poorer economies.

  10. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Science.gov (United States)

    Yang, Xiaohuan; Ma, Hanqing

    2009-01-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; The best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and richer economy. The worst natural environment areas, Northwest and Tibetan Plateau, are also regions with lower population density and poorer economies. PMID:20049243

  11. Ecosystem classification for EU habitat distribution assessment in sandy coastal environments: an application in central Italy.

    Science.gov (United States)

    Carranza, Maria Laura; Acosta, Alicia T R; Stanisci, Angela; Pirone, Gianfranco; Ciaschetti, Giampiero

    2008-05-01

    Many recent developments in coastal science have gone against the demands of European Union legislation. Coastal dune systems, which cover small areas of the earth, can host a high level of biodiversity. However, human pressure on coastal zones around the world has increased dramatically in the last 50 years. In addition to direct habitat loss, the rapid extinction of many species that are unique to these systems can be attributed to landscape deterioration through the lack of appropriate management. In this paper, we propose the use of an ecosystem classification technique that integrates potential natural vegetation distribution as a reference framework for the analysis and assessment of the distribution of coastal dune EU Habitats (92/43). As an example, the present study analyses the EU Habitats distribution within a hierarchical ecosystem classification of the coastal dune systems of central Italy. In total, 24 land elements belonging to 8 land units, 5 land facets, 2 land systems and 2 land regions were identified for the coastal dunes of central Italy, based on diagnostic land attributes. In central Italy, coastal dune environments, including all of the beach area, the mobile dunes and all the fixed-dune land elements, contain or could potentially hold at least one EU habitat of interest. Almost all dune slack transitions present the potential for the spontaneous development of EU woodlands of interest. The precise information that this method produces concerning the distribution and ecological relationships of these ecosystems makes it very effective in the assessment of the Natura 2000 European network. This hierarchical ecosystem classification method facilitates the identification of areas to be surveyed and eventually bounded under the implementation of the EU Habitat directive (92/43), including areas with highly disturbed coastal dune ecosystems.

  12. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may be a diabetes contributor. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets with GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and the food environment, based on countywide food permit registration data. Linkage between pregnancies and the food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code was gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of the geospatial distribution of the food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Hemoglobin A1c levels in the matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251, P < .05). A significant association between the food environment and risk for gestational diabetes was identified. Copyright © 2016
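
    The kind of association reported here is summarized by an odds ratio with a confidence interval; the arithmetic on a 2 x 2 quartile table looks like this (the counts are invented, not the study's).

```python
import math

# Rows: 4th vs 1st quartile of fast-food density; columns: GDM yes/no.
a, b = 120, 1500   # exposed: cases / non-cases
c, d = 75, 1530    # unexposed: cases / non-cases

odds_ratio = (a * d) / (b * c)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
low, high = (math.exp(math.log(odds_ratio) + z * se) for z in (-1.96, 1.96))
print(f"OR = {odds_ratio:.2f}, 95% CI {low:.2f}-{high:.2f}")
```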

  13. Performance analysis and optimization of AMGA for the WISDOM environment

    CERN Document Server

    Ahn, Sunil; Lee, Seehoon; Hwang, Soonwook; Breton, Vincent; Koblitz, Birger

    2008-01-01

    AMGA is a gLite metadata catalogue service designed to offer access to metadata for files stored on the Grid. We evaluated AMGA to analyze whether it is suitable for the WISDOM environment, where thousands of jobs access it simultaneously to get metadata describing docking results and the status of jobs. In this work, we address performance issues in AMGA and propose new techniques to improve AMGA performance in the WISDOM environment. In the WISDOM environment, thousands of job agents distributed on the Grid may access an AMGA server simultaneously (1) to take docking tasks out of the AMGA server to execute on the machine where they sit, (2) to get the related ligand and target information, and (3) to store the docking results. The docking tasks take about 10 to 30 minutes to finish, depending on the machine on which they run and the docking configuration. We have carried out some performance analysis on the current AMGA implementation. Due to the overhead required to handle GSI/SSL connection on t...

  14. Students’ perception of the learning environment in a distributed medical programme

    Directory of Open Access Journals (Sweden)

    Kiran Veerapen

    2010-09-01

    Background: The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. Purpose: To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. Method: The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. Results: The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The workload and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Conclusions: Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing

  15. Students' perception of the learning environment in a distributed medical programme.

    Science.gov (United States)

    Veerapen, Kiran; McAleer, Sean

    2010-09-24

    The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing comparative evaluation of the learning environment at the distributed sites and

  16. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  17. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response as the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments
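
    The basic RTD computation implied here — normalizing a measured impulse response C(t) into the residence time distribution E(t) and extracting its moments — is short; a sketch on synthetic tracer data:

```python
import numpy as np

t = np.linspace(0, 100, 201)             # time, s
c = t * np.exp(-t / 10.0)                # synthetic detector response C(t)

e = c / np.trapz(c, t)                   # E(t): normalised to unit area
t_mean = np.trapz(t * e, t)              # mean residence time
variance = np.trapz((t - t_mean) ** 2 * e, t)

print(f"mean residence time = {t_mean:.1f} s, variance = {variance:.0f} s^2")
```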

  18. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
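
    A CRAB3 task is described to the lightweight client as a Python configuration file. The sketch below follows the shape of the public CRAB3 tutorials, with placeholder names throughout; attribute details may differ between releases.

```python
from WMCore.Configuration import Configuration

config = Configuration()
config.section_('General')
config.General.requestName = 'MyAnalysis_v1'     # placeholder task name

config.section_('JobType')
config.JobType.pluginName = 'Analysis'
config.JobType.psetName = 'analysis_cfg.py'      # CMSSW parameter set

config.section_('Data')
config.Data.inputDataset = '/SomePD/SomeEra/AOD' # placeholder dataset
config.Data.splitting = 'FileBased'
config.Data.unitsPerJob = 10

config.section_('Site')
config.Site.storageSite = 'T2_XX_Example'        # placeholder output site
# Submitted from the client with: crab submit -c crabConfig.py
```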

  19. Built Environment Analysis Tool: April 2013

    Energy Technology Data Exchange (ETDEWEB)

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

  20. Distribution tactics for success in turbulent versus stable environments: A complexity theory approach

    Directory of Open Access Journals (Sweden)

    Roger Bruce Mason

    2013-05-01

    This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, using complexity theory to understand such environments is necessary, but it has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company, in a turbulent market, sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. How marketers can be assisted, and suggestions for further research, are provided.

  1. Distribution tactics for success in turbulent versus stable environments: A complexity theory approach

    Directory of Open Access Journals (Sweden)

    Roger Bruce Mason

    2013-11-01

    This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, using complexity theory to understand such environments is necessary, but it has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company, in a turbulent market, sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. How marketers can be assisted, and suggestions for further research, are provided.

  2. The structure and pyrolysis product distribution of lignite from different sedimentary environment

    International Nuclear Information System (INIS)

    Liu, Peng; Zhang, Dexiang; Wang, Lanlan; Zhou, Yang; Pan, Tieying; Lu, Xilan

    2016-01-01

    Highlights: • The carbon structure of three lignites was measured by solid-state 13C NMR. • The effect of carbon structure on the pyrolysis product distribution was studied. • Tar yield is influenced by aliphatic carbon and oxygen functional groups. • The C1-C4 content of pyrolysis gas is related to the CH2/CH3 ratio. - Abstract: Low-temperature pyrolysis is an economically efficient method for lignite to obtain coal tar and improve its combustion calorific value. Research on the distribution of pyrolysis products (especially coal tar yield) plays an important role in energy applications and economic development now and in the future. The pyrolysis test was carried out in a tube reactor at 873 K for 15 min. The structure of the lignite was measured by solid-state 13C nuclear magnetic resonance (NMR) and Fourier transform infrared spectroscopy (FTIR). The thermal behaviour was analyzed by a thermo-gravimetric (TG) analyzer. The results show that the pyrolysis product distribution is related to the breakage of the branch structures of the aromatic rings in lignites from different sedimentary environments. The gas yield and composition are related to the decomposition of carbonyl groups and the breakage of aliphatic carbon. The tar yield derived from lignite pyrolysis follows the order: Xianfeng lignite (XF, 13.67 wt.%) > Xiaolongtan lignite (XLT, 7.97 wt.%) > Inner Mongolia lignite (IM, 6.30 wt.%), which is mainly influenced by the aliphatic carbon content, the CH2/CH3 ratio and the oxygen functional groups in the lignite. The pyrolysis water yield depends on the decomposition of oxygen functional groups. IM has the highest content of oxygen-linked carbon, so the pyrolysis water yield derived from IM is the highest (9.20 wt.%), far more than that from the other two lignites.

  3. ATLAS Distributed Data Analysis: challenges and performance

    CERN Document Server

    Fassi, Farida; The ATLAS collaboration

    2015-01-01

    In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several 10's of PetaBytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...

  4. ATLAS Distributed Data Analysis: performance and challenges

    CERN Document Server

    Fassi, Farida; The ATLAS collaboration

    2015-01-01

    In the LHC operations era the key goal is to analyse the results of the collisions of high-energy particles as a way of probing the fundamental forces of nature. The ATLAS experiment at the LHC at CERN is recording and simulating several 10's of PetaBytes of data per year. The ATLAS Computing Model was designed around the concepts of Grid Computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with this challenge a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. ATLAS accumulated more than 140 PB of data between 2009 and 2014. To analyse these data ATLAS developed, deployed and now operates a mature and stable distributed analysis (DA) service on the WLCG. The service is actively used: more than half a million user jobs run daily on DA resources, submitted by more than 1500 ATLAS physicists. A significant reliability of the...

  5. Data management in an object-oriented distributed aircraft conceptual design environment

    Science.gov (United States)

    Lu, Zhijie

    In the competitive global market place, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, we are in need of the next generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is firstly formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled. To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the

  6. Merging assistance function with task distribution model to enhance user performance in collaborative virtual environment

    International Nuclear Information System (INIS)

    Khalid, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) fall under Virtual Reality (VR), where two or more users manipulate objects collaboratively. In this paper we have carried out experiments in which an assembly is built from constituent parts scattered in a Virtual Environment (VE), based on a task distribution model using assistance functions for checking and enhancing user performance. The CVE subjects sat at distinct machines connected via a local area network. In this perspective, we consider the effects of the assistance function with oral communication on collaboration, co-presence and user performance. Twenty subjects performed an assembly task collaboratively under static and dynamic task distribution. We examine the degree of influence of assistance functions with oral communication on users' performance based on the task distribution model. The results show that assistance functions with oral communication based on the task distribution model not only increase user performance but also enhance the sense of co-presence and awareness. (author)

  7. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and its processes for the reservoir engineers

  8. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    International Nuclear Information System (INIS)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A.

    1999-01-01

    The distribution of trace elements in water, sediment, soil and cassava plants in the Muria peninsula environment was evaluated by the NAA method. A nuclear power plant (NPP) and a coal power plant (CPP) will be built on the Muria peninsula, so the Muria peninsula is an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The trace element analysis results were used as contributing data for environmental monitoring before and after the NPP is built. The trace elements in the river water, sediment, dryland soil and cassava plant samples were analyzed by the INAA method. It was found that the trace elements were not evenly distributed. The percentages of the trace element distribution in river water, sediment, dryland soil and cassava leaves were 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  9. Full immersion simulation: validation of a distributed simulation environment for technical and non-technical skills training in Urology.

    Science.gov (United States)

    Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter

    2015-07-01

    To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology, and to evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed-methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologists' performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.
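
    Construct validity of this kind is typically checked with a non-parametric comparison of expert and trainee scores; a sketch with invented metric values (the abstract does not specify which tests were used):

```python
from scipy import stats

# Invented performance-metric scores for the two groups.
experts = [78, 85, 81, 90, 76, 88, 84, 79, 86, 83]
trainees = [55, 62, 58, 70, 49, 66, 61, 59, 64, 57]

u, p = stats.mannwhitneyu(experts, trainees, alternative='greater')
print(f"U = {u}, p = {p:.4f}")  # a small p supports construct validity
```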

  10. Distribution entropy analysis of epileptic EEG signals.

    Science.gov (United States)

    Li, Peng; Yan, Chang; Karmakar, Chandan; Liu, Changchun

    2015-01-01

    It is an open-ended challenge to accurately detect epileptic seizures through electroencephalogram (EEG) signals. Recently published studies have made elaborate attempts to distinguish between normal and epileptic EEG signals using advanced nonlinear entropy methods, such as approximate entropy, sample entropy, fuzzy entropy, and permutation entropy. Most recently, a novel distribution entropy (DistEn) has been reported to have superior performance compared with the conventional entropy methods, especially for short-length data. We thus aimed, in the present study, to show the potential of DistEn in the analysis of epileptic EEG signals. The publicly accessible Bonn database, which consists of normal, interictal, and ictal EEG signals, was used in this study. Three different measurement protocols were set for better understanding the performance of DistEn: i) calculate the DistEn of a specific EEG signal using the full recording; ii) calculate the DistEn by averaging the results for all its possible non-overlapped 5-second segments; and iii) calculate it by averaging the DistEn values for all the possible non-overlapped segments of 1 second length. Results for all three protocols indicated a statistically significantly increased DistEn for the ictal class compared with both the normal and interictal classes. Moreover, the results obtained under the third protocol, which used only very short segments (1 s) of EEG recordings, showed a significantly (p < 0.05) increased DistEn for the ictal class, a discrimination not achieved by the conventional entropy algorithms on such short data. The capability of discriminating between normal and interictal EEG signals is of great clinical relevance, since it may provide helpful tools for the detection of seizure onset. Therefore, our study suggests that DistEn analysis of EEG signals is very promising for clinical and even portable EEG monitoring.
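
    DistEn itself is compact: embed the signal, take all pairwise Chebyshev distances between embedding vectors, histogram them, and compute the normalized Shannon entropy of that histogram. A sketch following the published definition (embedding dimension m and bin count are the usual tunable parameters):

```python
import numpy as np

def dist_en(x, m=2, bins=64):
    """Distribution entropy of a 1-D signal, normalised to [0, 1]."""
    x = np.asarray(x, float)
    n = len(x) - m + 1
    emb = np.array([x[i:i + m] for i in range(n)])      # embedding vectors
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    d = d[np.triu_indices(n, k=1)]                      # unique pairs only
    p, _ = np.histogram(d, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(bins)

rng = np.random.default_rng(0)
print(dist_en(rng.normal(size=500)))          # irregular signal
print(dist_en(np.sin(np.arange(500) / 5.0)))  # regular signal, for contrast
```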

  11. Sensitivity analysis of distributed volcanic source inversion

    Science.gov (United States)

    Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

    A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of fast calculation from analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressure and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole data set (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in the inversions. In particular, besides the source parameters, we focused on the ground deformation network topology and the noise in measurements. The proposed analysis can be used for a better interpretation of the algorithm results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò, F., Camacho, A. G., González, P. J., Mattia, M., Puglisi, G. & Fernández, J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises. Scientific Reports, 5 (10970), doi:10.1038/srep

  12. Exploring Distributed Leadership for the Quality Management of Online Learning Environments

    Science.gov (United States)

    Palmer, Stuart; Holt, Dale; Gosper, Maree; Sankey, Michael; Allan, Garry

    2013-01-01

    Online learning environments (OLEs) are complex information technology (IT) systems that intersect with many areas of university organisation. Distributed models of leadership have been proposed as appropriate for the good governance of OLEs. Based on theoretical and empirical research, a group of Australian universities proposed a framework for…

  13. Collaborative Virtual Environments as Means to Increase the Level of Intersubjectivity in a Distributed Cognition System

    Science.gov (United States)

    Ligorio, M. Beatrice; Cesareni, Donatella; Schwartz, Neil

    2008-01-01

    Virtual environments are able to extend the space of interaction beyond the classroom. In order to analyze how distributed cognition functions in such an extended space, we suggest focusing on the architecture of intersubjectivity. The Euroland project--a virtual land created and populated by seven classrooms supported by a team of…

  14. Framing and Enhancing Distributed Leadership in the Quality Management of Online Learning Environments in Higher Education

    Science.gov (United States)

    Holt, Dale; Palmer, Stuart; Gosper, Maree; Sankey, Michael; Allan, Garry

    2014-01-01

    This article reports on the findings of senior leadership interviews in a nationally funded project on distributed leadership in the quality management of online learning environments (OLEs) in higher education. Questions were framed around the development of an OLE quality management framework and the situation of the characteristics of…

  15. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%), and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.
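
    The record does not name specific reliability indices, but interruption statistics of this kind are commonly summarized with SAIFI/SAIDI-style measures. A toy sketch under that assumption, with hypothetical outage records:

    ```python
    # Hypothetical outage log for one feeder:
    # (customers_interrupted, duration_minutes, cause)
    outages = [
        (120, 95, "weather"), (200, 60, "weather"),
        (40, 30, "equipment"), (15, 45, "equipment"),
        (75, 180, "unknown"),
    ]
    customers_served = 1000  # total customers on the feeder (assumed)

    # SAIFI: average number of interruptions per customer served
    saifi = sum(n for n, _, _ in outages) / customers_served
    # SAIDI: average interruption duration per customer served (minutes)
    saidi = sum(n * t for n, t, _ in outages) / customers_served

    # Share of interrupted customers by cause, mirroring the kind of
    # percentages quoted in the abstract
    total = sum(n for n, _, _ in outages)
    share = {}
    for n, _, cause in outages:
        share[cause] = share.get(cause, 0) + n / total
    print(f"SAIFI={saifi:.2f}, SAIDI={saidi:.1f} min", share)
    ```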

  16. Corroded scale analysis from water distribution pipes

    Directory of Open Access Journals (Sweden)

    Rajaković-Ognjanović Vladana N.

    2011-01-01

    The subject of this study was the steel pipes that are part of Belgrade's drinking water supply network. In order to investigate the mutual effects of corrosion and water quality, the corrosion scales on the pipes were analyzed. The idea was to improve control of corrosion processes and prevent impact of corrosion on water quality degradation. The instrumental methods for corrosion scales characterization used were: scanning electron microscopy (SEM), for the investigation of corrosion scales of the analyzed samples surfaces, X-ray diffraction (XRD), for the analysis of the presence of solid forms inside scales, scanning electron microscopy (SEM), for the microstructural analysis of the corroded scales, and BET adsorption isotherm for the surface area determination. Depending on the composition of water next to the pipe surface, corrosion of iron results in the formation of different compounds and solid phases. The composition and structure of the iron scales in the drinking water distribution pipes depends on the type of the metal and the composition of the aqueous phase. Their formation is probably governed by several factors that include water quality parameters such as pH, alkalinity, buffer intensity, natural organic matter (NOM) concentration, and dissolved oxygen (DO) concentration. Factors such as water flow patterns, seasonal fluctuations in temperature, and microbiological activity as well as water treatment practices such as application of corrosion inhibitors can also influence corrosion scale formation and growth. Therefore, the corrosion scales found in iron and steel pipes are expected to have unique features for each site. Compounds that are found in iron corrosion scales often include goethite, lepidocrocite, magnetite, hematite, ferrous oxide, siderite, ferrous hydroxide, ferric hydroxide, ferrihydrite, calcium carbonate and green rusts. Iron scales have characteristic features that include: corroded floor, porous core that contains

  17. The bamA Gene for Anaerobic Ring Fission is Widely Distributed in the Environment

    Directory of Open Access Journals (Sweden)

    Abigail W. Porter

    2013-10-01

    Aromatic compounds are a major component of the global carbon pool, and include a diverse range of compounds such as humic acid, lignin, amino acids, and industrial chemicals. Due to the prevalence of aromatic compounds in the environment, microorganisms have evolved mechanisms to metabolize this available carbon. While anaerobic monoaromatic degradation can be initiated in a number of different ways, the signature central metabolite for these pathways is benzoyl-CoA: aromatic chemicals with different upstream degradation pathways all funnel into the downstream benzoyl-CoA pathway. Different genes encoding enzymes of the benzoyl-CoA pathway could be used as biomarkers; however, the ring-opening hydrolase, encoded by the bamA gene, is ideal because it is detected under a range of respiratory conditions, including denitrifying, iron-reducing, sulfate-reducing, and fermentative conditions. In this work we evaluated a number of DNA samples from diverse environments for the presence of the bamA gene, and had positive results for every sample. We further explored the bamA gene diversity in six of these sites as compared to published genome sequences and found that our clones were distributed throughout the dendrogram, although there were clone sequences from two sites that formed a unique clade. When we used a functional operational taxonomic unit based clustering analysis to compare the diversity of our sites to several locations reported in the literature, we found that there were two clusters of sites, and benzene-contaminated sites were present in both clusters. Given the number of potential upstream inputs from natural and manmade monoaromatic compounds, the benzoyl-CoA pathway and the bamA gene play an important role in the global carbon cycle that has thus far been understudied.

  18. Silicon bipolar distributed oscillator design and analysis

    African Journals Online (AJOL)

    Distributed circuits can defy some of the performance trade-offs in conventional circuits by taking advantage of multiple parallel signal paths. This multiple signal path feature of distributed systems often results in strong electromagnetic couplings between circuit components across multiple levels of abstraction. The term ...

  19. Biomimetic Coacervate Environments for Protein Analysis

    Science.gov (United States)

    Perry, Sarah; McCall, Patrick; Srivastava, Samavayan; Kovar, David; Gardel, Margaret; Tirrell, Matthew

    2015-03-01

    Living cells have evolved sophisticated intracellular organization strategies that are challenging to reproduce synthetically. Biomolecular function depends on both the structure of the molecule itself and the properties of the surrounding medium. The ability to simulate the in vivo environment and isolate biological networks for study in an artificial milieu, without sacrificing the crowding, structure, and compartmentalization of a cellular environment, represents an engineering challenge with tremendous potential to impact both biological studies and biomedical applications. Emerging experience has shown that polypeptide-based complex coacervation (electrostatically driven liquid-liquid phase separation) produces a biomimetic microenvironment capable of tuning protein biochemical activity. We have investigated the effect of polypeptide-based coacervates on the dynamic self-assembly of cytoskeletal actin filaments. Coacervate materials are able to directly affect the nucleation and assembly dynamics. We observe effects that can be attributed to the length and chemical specificity of the encapsulating polypeptides, as well as the overall crowded nature of a polymer-rich coacervate phase. Coacervate-based systems are particularly attractive for use in biochemical assays because the compartmentalization afforded by liquid-liquid phase separation does not necessarily inhibit the transport of molecules across the compartmental barrier.

  20. Evaluating data distribution and drift vulnerabilities of machine learning algorithms in secure and adversarial environments

    Science.gov (United States)

    Nelson, Kevin; Corbin, George; Blowers, Misty

    2014-05-01

    Machine learning is continuing to gain popularity due to its ability to solve problems that are difficult to model using conventional computer programming logic. Much of the current and past work has focused on algorithm development, data processing, and optimization. Lately, a subset of research has emerged which explores issues related to security. This research is gaining traction as systems employing these methods are being applied to both secure and adversarial environments. One of machine learning's biggest benefits, its data-driven versus logic-driven approach, is also a weakness if the data on which the models rely are corrupted. Adversaries could maliciously influence systems which address drift and data distribution changes using re-training and online learning. Our work is focused on exploring the resilience of various machine learning algorithms to these data-driven attacks. In this paper, we present our initial findings from using Monte Carlo simulations and statistical analysis to explore the maximal achievable shift to a classification model, as well as the required amount of control over the data.
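
    The record does not disclose the authors' simulation code; the sketch below only illustrates the general idea of a Monte Carlo estimate of decision-boundary shift under label-flipping, using a deliberately simple one-dimensional nearest-centroid classifier (all data and parameters are hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def centroid_boundary(X, y):
        """Decision threshold of a 1-D nearest-centroid classifier."""
        return (X[y == 0].mean() + X[y == 1].mean()) / 2.0

    def mean_shift(frac_poisoned, trials=200, n=500):
        """Monte Carlo estimate of the average boundary shift caused by
        relabelling a fraction of one class (a toy poisoning attack)."""
        shifts = []
        for _ in range(trials):
            X = np.concatenate([rng.normal(0, 1, n), rng.normal(3, 1, n)])
            y = np.concatenate([np.zeros(n), np.ones(n)])
            clean = centroid_boundary(X, y)
            k = int(frac_poisoned * n)
            idx = rng.choice(np.where(y == 1)[0], size=k, replace=False)
            y_p = y.copy()
            y_p[idx] = 0            # flip labels of the poisoned points
            shifts.append(centroid_boundary(X, y_p) - clean)
        return np.mean(shifts)

    for f in (0.05, 0.10, 0.20):
        print(f, round(mean_shift(f), 3))
    ```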

  1. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  2. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  3. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects captured by the gravitational field of the Earth. We have analyzed the mass distributions of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude… The observed scaling exponents are compared to exponents observed in laboratory experiments, and we discuss the possibility that one can derive insight into the original shapes of the meteoroids.
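
    Scaling in fragment mass distributions is typically quantified by a power-law exponent. One minimal way to estimate such an exponent from a cumulative distribution N(>m) ~ m^(-alpha) is a log-log least-squares fit; the masses below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical fragment masses (grams) from a single shower
    masses = np.array([412.0, 230.5, 120.3, 88.0, 61.2, 45.7, 30.1,
                       22.8, 15.4, 10.9, 7.6, 5.2, 3.8, 2.1, 1.4])

    # Cumulative count N(>m): rank fragments by descending mass
    m_sorted = np.sort(masses)[::-1]
    n_cum = np.arange(1, len(m_sorted) + 1)

    # Fit N(>m) ~ m**(-alpha) as a straight line in log-log space
    slope, intercept = np.polyfit(np.log(m_sorted), np.log(n_cum), 1)
    alpha = -slope
    print(f"scaling exponent alpha = {alpha:.2f}")
    ```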

  4. Fast variogram analysis of remotely sensed images in HPC environment

    Science.gov (United States)

    Pesquer, Lluís; Cortés, Anna; Masó, Joan; Pons, Xavier

    2013-04-01

    Exploring and describing the spatial variation of images is one of the main applications of geostatistics to remote sensing, and the variogram is a very suitable tool for carrying out this spatial pattern analysis. Variogram analysis is composed of two steps: empirical variogram generation and fitting a variogram model. Empirical variogram generation is a very quick procedure for most analyses of irregularly distributed samples, but the computation time increases quite significantly for remotely sensed images, because the number of samples (pixels) involved is usually huge (more than 30 million for a Landsat TM scene), depending basically on the extent and spatial resolution of the images. In several remote sensing applications this type of analysis is repeated for each image, sometimes for hundreds of scenes and sometimes for each radiometric band (a high number in the case of hyperspectral images), so there is a need for a fast implementation. In order to reduce this high execution time, we developed a parallel solution of the variogram analysis. The solution adopted is the master/worker programming paradigm, in which the master process distributes and coordinates the tasks executed by the worker processes. The code is written in ANSI C, including MPI (Message Passing Interface) as a message-passing library in order to communicate between the master and the workers. This solution (ANSI C + MPI) guarantees portability between different computer platforms. The High Performance Computing (HPC) environment is formed by 32 nodes, each with two Dual Core Intel(R) Xeon(R) 3.0 GHz processors with 12 GB of RAM, communicating over integrated dual gigabit Ethernet. This IBM cluster is located in the research laboratory of the Computer Architecture and Operating Systems Department of the Universitat Autònoma de Barcelona. The performance results for a 15 km x 15 km subscene of the 198-31 path-row Landsat TM image are shown in Table 1. The proximity between empirical speedup behaviour and theoretical
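
    The paper's implementation is ANSI C with MPI. As a rough illustration of the same master/worker decomposition of empirical variogram generation, gamma(h) = sum over pairs at lag h of (z_i - z_j)^2 / (2 N(h)), here is a Python analogue using a process pool (the lag bins, tolerance, and toy data are assumptions):

    ```python
    import numpy as np
    from multiprocessing import Pool

    def gamma_for_rows(args):
        """Worker: semivariance contributions of one chunk of samples."""
        z, coords, rows, lags, tol = args
        sums = np.zeros(len(lags))
        counts = np.zeros(len(lags))
        for i in rows:
            d = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
            sq = (z[i + 1:] - z[i]) ** 2
            for k, h in enumerate(lags):
                sel = np.abs(d - h) < tol
                sums[k] += sq[sel].sum()
                counts[k] += sel.sum()
        return sums, counts

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        n = 2000                                   # toy sample of pixels
        coords = rng.uniform(0, 100, size=(n, 2))
        z = np.sin(coords[:, 0] / 10) + rng.normal(0, 0.1, n)
        lags, tol = np.arange(5, 50, 5, dtype=float), 2.5
        chunks = np.array_split(np.arange(n - 1), 8)  # master splits work
        with Pool(4) as pool:                          # worker processes
            parts = pool.map(gamma_for_rows,
                             [(z, coords, c, lags, tol) for c in chunks])
        sums = sum(p[0] for p in parts)
        counts = sum(p[1] for p in parts)
        gamma = sums / (2 * np.maximum(counts, 1))     # empirical variogram
        print(np.round(gamma, 3))
    ```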

  5. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system processing Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  6. Finite element analysis in a minicomputer/mainframe environment

    Science.gov (United States)

    Storaasli, O. O.; Murphy, R. C.

    1978-01-01

    Design considerations were evaluated for general purpose finite element systems to maximize performance when installed on distributed computer hardware/software systems. It is shown how the features of current minicomputers complement those of a modular implementation of the finite element method for increasing the control, speed, and visibility (interactive graphics) in solving structural problems at reduced cost. The approach used is to implement a finite element system in a distributed computer environment to solve structural problems and to explore alternatives in distributing finite element computations.

  7. Analysis of fuel oxygenates in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, T.C.; Berg, M.; Haderlein, S.B. [Swiss Federal Inst. for Environmental Science and Technology (EAWAG) (Switzerland); Swiss Federal Inst. of Technology (ETH), Duebendorf (Switzerland); Duong, Hong-Anh [Vietnam National Univ., Hanoi (Viet Nam). Center for Environmental Chemistry

    2001-07-01

    This paper presents an overview of currently available analytical methods for fuel oxygenates such as methyl tert-butyl ether and ethanol and highlights the advantages and disadvantages of the different methods. The occurrence of fuel oxygenates in water and air is explored, and sampling and enrichment of oxygenates in water are described covering water sampling, direct aqueous injection into a chromatographic column, headspace analysis, purge and trap enrichment, and solid phase microextraction. Methods for sampling and enrichment of oxygenates in air, separation of fuel oxygenates, preparation of standards and calibration, and detection using flame ionisation and photoionisation detection, mass spectrometry, atomic emission detection, and Fourier transform infrared spectroscopy are examined. Environmentally relevant physicochemical parameters of fuel oxygenates are tabulated, and injection and enrichment techniques for water analysis are compared.

  8. IMPEx - a web-based distributed research environment for planetary plasma science

    Science.gov (United States)

    Topf, Florian; Khodachenko, Maxim; Kallio, Esa; Génot, Vincent; Al-Ubaidi, Tarek; Modolo, Ronan; Hess, Sébastien; Schmidt, Walter; Scherf, Manuel; Alexeev, Igor; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourrel, Natacha; Penou, Emmanuel; André, Nicolas; Belenkaya, Elena

    2014-05-01

    The FP7-SPACE project IMPEx (http://impex-fp7.oeaw.ac.at/) was established to provide a web-based infrastructure to facilitate the inter-comparison of spacecraft in-situ measurements and computational models in the fields of planetary plasma science. Within this project several observational (CDAWeb, AMDA, CLWeb), as well as numerical simulation (FMI, LATMOS, SINP) databases provide datasets, which can be combined for further joint analysis and scientific investigation. The major goal of this project consists in providing an environment for the connection and joint operation of the different types of numerical and observational data sources in order to validate numerical simulations with spacecraft observations and vice versa. As an important milestone of IMPEx a common metadata standard was developed for the description of the currently integrated simulation models and the archived datasets. This standard is based on the SPASE data model (DM), which originates from the Heliospheric physics community. This DM was developed for the description of observational data, and that is why it was chosen as a basis within the scope of IMPEx. A considerable part of the project effort is dedicated to the development of standardized (web service-) interfaces and protocols using the SPASE DM as an elaborated IMPEx DM for the communication between the different tools and databases of the IMPEx research infrastructure. For the visualization and analysis of the archived datasets available within IMPEx and beyond, several tools (AMDA, 3DView, ClWeb) were upgraded to be able to work with the newly developed metadata standards and protocols. A practical example will be presented to demonstrate the capabilities and potentials of the achievements of IMPEx by using these tools. Furthermore the IMPEx DM has by now also been successfully applied outside the project's core infrastructure: A prototype for UCLA MHD description can be seen at LatHyS. Besides that IRAP is currently working on a

  9. Distribution coefficients for plutonium and americium on particulates in aquatic environments

    International Nuclear Information System (INIS)

    Sanchez, A.L.; Schell, W.R.; Sibley, T.H.

    1982-01-01

    The distribution coefficients of two transuranic elements, plutonium and americium, were measured experimentally in laboratory systems of selected freshwater, estuarine, and marine environments, using the gamma-ray emitting isotopes of these radionuclides, 237Pu and 241Am. The desorption Kd values were significantly greater than the sorption Kd values, suggesting some irreversibility in the sorption of these radionuclides onto sediments. The effects of pH and of sediment concentration on the distribution coefficients were also investigated. There were significant changes in the Kd values as these parameters were varied. Experiments using sterilized and nonsterilized samples for some of the sediment/water systems indicate possible bacterial effects on Kd values. (author)
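
    For readers unfamiliar with the quantity being measured, the distribution coefficient of a batch sorption experiment is simply the ratio of sorbed to dissolved activity at equilibrium. A worked toy example (all numbers invented):

    ```python
    # Distribution coefficient from a batch sorption experiment:
    # Kd = (activity per kg of sediment) / (activity per litre of solution)
    a_solid = 4.2e4      # Bq/kg on the sediment at equilibrium (assumed)
    a_solution = 2.1e1   # Bq/L remaining in solution (assumed)
    kd = a_solid / a_solution
    print(f"Kd = {kd:.1e} L/kg")   # ~2.0e3 L/kg
    ```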

  10. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper… presents a Coloured Petri Net (CPN) model expressing the environment assumptions and real-time requirements for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by executing the model, during which traces of events and states are automatically generated and evaluated against the requirements…

  11. Distribution analysis of airborne nicotine concentrations in hospitality facilities.

    Science.gov (United States)

    Schorp, Matthias K; Leyden, Donald E

    2002-02-01

    A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector.
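
    A minimal sketch of the kind of distribution-fitting comparison the study describes, using SciPy's fitting routines and a Kolmogorov-Smirnov goodness-of-fit statistic on stand-in data (the candidate set and the simulated concentrations are assumptions):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # Stand-in for measured airborne nicotine concentrations (ug/m3)
    conc = rng.lognormal(mean=1.0, sigma=0.9, size=200)

    candidates = {
        "lognorm": stats.lognorm,
        "weibull_min": stats.weibull_min,
        "expon": stats.expon,
    }
    for name, dist in candidates.items():
        params = dist.fit(conc, floc=0)          # keep location at zero
        ks = stats.kstest(conc, name, args=params)
        print(f"{name:12s} KS={ks.statistic:.3f} p={ks.pvalue:.3f}")

    # Median of the lognormal model equals its scale parameter, exp(mu)
    s, loc, scale = stats.lognorm.fit(conc, floc=0)
    print("lognormal median:", scale)
    ```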

  12. Enabling Efficient Intelligence Analysis in Degraded Environments

    Science.gov (United States)

    2013-06-01


  13. Tritium in the environment: origins and analysis

    International Nuclear Information System (INIS)

    Baglan, N.

    2009-01-01

    The author recalls the chemical reactions at the origin of tritium formation, that tritium has been introduced in the atmosphere by nuclear tests, and that it is now produced by nuclear reactors. He also outlines that tritium is mainly released in waters (oceans, seas, rivers) or in the atmosphere. Some high concentrations may therefore occur. He discusses measurements of activity in rain waters and surface waters, and outlines the impact of the end of atmospheric nuclear tests. He discusses the technical challenges of low level tritium analysis and activity measurement

  14. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, for bunch diagnostics, and to increase luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to nonlinear contributions, which cause emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. The criteria for the selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with results of numerical simulations.

  15. Analysis Of Aspects Of Messages Hiding In Text Environments

    Directory of Open Access Journals (Sweden)

    Afanasyeva Olesya

    2015-09-01

    This work investigates problems that arise when hiding messages in text environments transmitted over electronic communication channels and the Internet. We analyze the selection of places in a text environment (TE) that can be replaced by a word from the message. Selection and replacement of words in the TE are implemented based on semantic analysis of the text fragment consisting of the inserted word and its environment in the TE. The analysis uses the concepts of semantic coordination parameters between words and of the semantic value of a separate word, with well-known methods used to determine the values of these parameters. This allows moving from a qualitative to a quantitative analysis of the semantics of text fragments as they are modified by word substitution. Invisibility of the embedded messages is ensured by keeping the deviations of the semantic coordination parameters within preset values.

  16. Distributed service-based approach for sensor data fusion in IoT environments.

    Science.gov (United States)

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in pervasive IoT environments.
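
    The paper's service composition model is not reproduced here; the sketch below only illustrates the basic pattern of composing per-sensor services whose readings are fused, using inverse-variance weighting as a stand-in fusion rule (all names and values are hypothetical):

    ```python
    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class SensorService:
        """A node service in the composition: returns (reading, variance)."""
        name: str
        read: Callable[[], Tuple[float, float]]

    def fuse(services: List[SensorService]) -> float:
        """Inverse-variance weighted fusion of distributed sensor readings."""
        num = den = 0.0
        for s in services:
            value, var = s.read()
            num += value / var
            den += 1.0 / var
        return num / den

    services = [
        SensorService("thermo-kitchen", lambda: (21.8, 0.20)),
        SensorService("thermo-hall", lambda: (22.4, 0.50)),
        SensorService("thermo-body", lambda: (22.0, 0.10)),
    ]
    print(f"fused temperature: {fuse(services):.2f} C")
    ```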

  17. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group which are used in texture analysis: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND), and the wrapped normal distribution (WND). All of these NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R^3 and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
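
    Of the four distributions, the WND is the most straightforward to sample by Monte Carlo: draw a tangent vector from a Gaussian in R^3 and wrap it onto SO(3) through the exponential map. A minimal sketch (sigma is an illustrative concentration parameter):

    ```python
    import numpy as np
    from scipy.spatial.transform import Rotation

    def sample_wrapped_normal(sigma, size, rng=None):
        """Monte Carlo sampling from a wrapped normal distribution on SO(3):
        draw tangent vectors from an isotropic N(0, sigma^2 I) in R^3 and
        wrap them onto the rotation group via the exponential map, which
        from_rotvec implements (axis = v/|v|, angle = |v|)."""
        if rng is None:
            rng = np.random.default_rng()
        v = rng.normal(0.0, sigma, size=(size, 3))  # Gaussian in R^3
        return Rotation.from_rotvec(v)              # wrapped onto SO(3)

    rots = sample_wrapped_normal(sigma=0.3, size=10_000)
    angles = rots.magnitude()   # rotation (misorientation) angles
    print("mean rotation angle:", angles.mean())
    ```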

  18. Investigation on Oracle GoldenGate Veridata for Data Consistency in WLCG Distributed Database Environment

    OpenAIRE

    Asko, Anti; Lobato Pardavila, Lorena

    2014-01-01

    In a distributed database environment, data divergence can be an important problem: if it is not discovered and correctly identified, incorrect data can lead to poor decision making, errors in the service, and operational errors. Oracle GoldenGate Veridata is a product to compare two sets of data and to identify and report on data that is out of synchronization. IT DB is providing a replication service between databases at CERN and other computer centers worldwide as a par...

  19. Calculation of Stationary, Free Molecular Flux Distributions in General 3D Environments

    Science.gov (United States)

    Labello, Jesse

    2011-10-01

    This article presents an application of the angular coefficient method for diffuse reflection to calculate stationary molecular flux distributions in general three dimensional environments. The method of angular coefficients is reviewed and the integration of the method into Blender, a free, open-source, 3D modeling software package, is described. Some example calculations are compared to analytical and Direct Simulation Monte Carlo (DSMC) results with excellent agreement.

  20. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea , Mihaela; Iliadis , Lazaros

    2011-01-01

    The paper describes an environment quality analysis system based on a combination of some artificial intelligence techniques, artificial neural networks and rule-based expert systems. Two case studies of the system use are discussed: air pollution analysis and flood forecasting with their impact on the environment and on the population health. The syste...

  1. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed environment using the Adaptive Modeling Language (AML) as the underlying framework. This report focuses on describing the work done in the area of extending the aerodynamics and aerothermodynamics module using S/HABP, CBAERO, PREMIN and LANMIN. It also details the work done integrating EXITS as the TPS sizing tool.

  2. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-05-17

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures which allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. The win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.

  3. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    International Nuclear Information System (INIS)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-01-01

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures which allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. The win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment

  4. Analysis of refrigerant mal-distribution

    DEFF Research Database (Denmark)

    Kærn, Martin Ryhl; Elmegaard, Brian

    2009-01-01

    to be two straight tubes. The refrigerant mal-distribution is then induced in the evaporator by varying the vapor quality at the inlet to each tube and the air flow across each tube. Finally, it is shown that mal-distribution can be compensated for by an intelligent distributor that ensures equal superheat...

  5. Assessing the distribution and protection status of two types of cool environment to facilitate their conservation under climate change.

    Science.gov (United States)

    Gollan, John R; Ramp, Daniel; Ashcroft, Michael B

    2014-04-01

    Strategies to mitigate climate change can protect different types of cool environments. Two are receiving much attention: protection of ephemeral refuges (i.e., places with low maximum temperatures) and of stable refugia (i.e., places that are cool, have a stable environment, and are isolated). Problematically, they are often treated as equivalents. Careful delineation of their qualities is needed to prevent misdirected conservation initiatives; yet, no one has determined whether protecting one protects the other. We mapped both types of cool environments across a large (∼3.4M ha) mixed-use landscape with a geographic information system and conducted a patch analysis to compare their spatial distributions; examine relations between land use and their size and shape; and assess their current protection status. With a modest, but arbitrary, threshold for demarcating both types of cool environments (i.e., values below the 0.025 quantile) there were 146,523 ha of ephemeral refuge (62,208 ha) and stable refugia (62,319 ha). Ephemeral refuges were generally aggregated at high elevation, and more refuge area occurred in protected areas (55,184 ha) than in unprotected areas (7,024 ha). In contrast, stable refugia were scattered across the landscape, and more stable-refugium area occurred on unprotected (40,135 ha) than on protected land (22,184 ha). Although sensitivity analysis showed that varying the thresholds that define cool environments affected outcomes, it also exposed the challenge of choosing a threshold for strategies to address climate change; there is no single value that is appropriate for all of biodiversity. The degree of overlap between ephemeral refuges and stable refugia revealed that targeting only the former for protection on currently unprotected land would capture ∼17% of stable refugia. Targeting only stable refugia would capture ∼54% of ephemeral refuges. Thus, targeting one type of cool environment did not fully protect the other. © 2014

  6. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel-framed regular building structures. The emphasis in this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria - Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case-study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment-resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response, and that it is conservative to ignore the effects of distributed plasticity in determining the peak displacement response under the notionally removed column.

  7. Integrated Design and Engineering Analysis (IDEA) Environment - Propulsion Related Module Development and Vehicle Integration

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2013-01-01

    This report documents the work performed during the period from May 2011 to October 2012 on the Integrated Design and Engineering Analysis (IDEA) environment. IDEA is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML). This report focuses on describing the work done in the areas of: (1) Integrating propulsion data (turbines, rockets, and scramjets) into the system, and using the data to perform trajectory analysis; (2) Developing a parametric packaging strategy for hypersonic air-breathing vehicles, allowing for tank resizing when multiple fuels and/or oxidizers are part of the configuration; and (3) Vehicle scaling and closure strategies.

  8. The clinical learning environment in nursing education: a concept analysis.

    Science.gov (United States)

    Flott, Elizabeth A; Linden, Lois

    2016-03-01

    The aim of this study was to report an analysis of the clinical learning environment concept. Nursing students are evaluated in clinical learning environments where skills and knowledge are applied to patient care. These environments affect achievement of learning outcomes, and have an impact on preparation for practice and student satisfaction with the nursing profession. Providing clarity of this concept for nursing education will assist in identifying antecedents, attributes and consequences affecting student transition to practice. The clinical learning environment was investigated using Walker and Avant's concept analysis method. A literature search was conducted using WorldCat, MEDLINE and CINAHL databases using the keywords clinical learning environment, clinical environment and clinical education. Articles reviewed were written in English and published in peer-reviewed journals between 1995-2014. All data were analysed for recurring themes and terms to determine possible antecedents, attributes and consequences of this concept. The clinical learning environment contains four attribute characteristics affecting student learning experiences. These include: (1) the physical space; (2) psychosocial and interaction factors; (3) the organizational culture and (4) teaching and learning components. These attributes often determine achievement of learning outcomes and student self-confidence. With better understanding of attributes comprising the clinical learning environment, nursing education programmes and healthcare agencies can collaborate to create meaningful clinical experiences and enhance student preparation for the professional nurse role. © 2015 John Wiley & Sons Ltd.

  9. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Modern power systems consist of many interconnected synchronous generators having different inertia constants, connected to a large transmission network, with an ever increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. A flexible, loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services, which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.

  10. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    operation will be altered. In this paper, quantitative results are presented in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles. The analysis is based on the conditions for both a radial and a meshed distribution network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden.

  11. PYTHON-based Physics Analysis Environment for LHCb

    CERN Document Server

    Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E; Tsaregorodtsev, A Yu; De Oliveira, E

    2004-01-01

    BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.

  12. The physics analysis environment of the ZEUS experiment

    International Nuclear Information System (INIS)

    Bauerdick, L.A.T.; Derugin, O.; Gilkinson, D.; Kasemann, M.; Manczak, O.

    1995-12-01

    The ZEUS Experiment has over the last three years developed its own model of the central computing environment for physics analysis. This model has been designed to provide ZEUS physicists with powerful and user friendly tools for data analysis as well as to be truly scalable and open. (orig.)

  13. Comparison and suitability of genotype by environment analysis ...

    African Journals Online (AJOL)

    ACSS

    … notable of which are analysis of variance (ANOVA), environmental … estimation of variance components used to calculate trait heritability. … [Flattened table: genotype by environment analysis for grain yield (kg ha-1), giving cultivar rankings by REML yield estimate, ASVi, GGE biplot, yield stability, Finlay and Wilkinson regression, Wricke's ecovalence, and static stability.]

  14. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The Geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expendability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  15. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The Geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These types of methodology are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expendability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
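
    IQLib's own interfaces are not shown in the record; the following sketch only illustrates the divide-and-conquer tiling pattern the two records describe: split a raster into tiles, process the tiles independently across workers, then stitch the results back together (the tile size and the per-tile operation are placeholders):

    ```python
    import numpy as np
    from multiprocessing import Pool

    TILE = 256  # tile edge in pixels (illustrative)

    def tiles(shape, tile=TILE):
        """Yield (row, col) slices covering a raster of the given shape."""
        rows, cols = shape
        for r in range(0, rows, tile):
            for c in range(0, cols, tile):
                yield (slice(r, min(r + tile, rows)),
                       slice(c, min(c + tile, cols)))

    def process_tile(tile_data):
        """Per-tile step, e.g. a simple threshold (stands in for any script)."""
        return (tile_data > tile_data.mean()).astype(np.uint8)

    if __name__ == "__main__":
        raster = np.random.default_rng(3).random((1024, 1024))
        windows = list(tiles(raster.shape))
        with Pool(4) as pool:                         # distribute the tiles
            results = pool.map(process_tile, [raster[w] for w in windows])
        out = np.empty(raster.shape, dtype=np.uint8)  # stitch results back
        for w, res in zip(windows, results):
            out[w] = res
        print(out.sum())
    ```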

  16. DISTRIBUTIONS OF LONG-LIVED RADIOACTIVE NUCLEI PROVIDED BY STAR-FORMING ENVIRONMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Fatuzzo, Marco [Department of Physics, Xavier University, Cincinnati, OH 45207 (United States); Adams, Fred C. [Physics Department, University of Michigan, Ann Arbor, MI 48109 (United States)

    2015-11-01

    Radioactive nuclei play an important role in planetary evolution by providing an internal heat source, which affects planetary structure and helps facilitate plate tectonics. A minimum level of nuclear activity is thought to be necessary—but not sufficient—for planets to be habitable. Extending previous work that focused on short-lived nuclei, this paper considers the delivery of long-lived radioactive nuclei to circumstellar disks in star forming regions. Although the long-lived nuclear species are always present, their abundances can be enhanced through multiple mechanisms. Most stars form in embedded cluster environments, so that disks can be enriched directly by intercepting ejecta from supernovae within the birth clusters. In addition, molecular clouds often provide multiple episodes of star formation, so that nuclear abundances can accumulate within the cloud; subsequent generations of stars can thus receive elevated levels of radioactive nuclei through this distributed enrichment scenario. This paper calculates the distribution of additional enrichment for 40K, the most abundant of the long-lived radioactive nuclei. We find that distributed enrichment is more effective than direct enrichment. For the latter mechanism, ideal conditions lead to about 1 in 200 solar systems being directly enriched in 40K at the level inferred for the early solar nebula (thereby doubling the abundance). For distributed enrichment from adjacent clusters, about 1 in 80 solar systems are enriched at the same level. Distributed enrichment over the entire molecular cloud is more uncertain, but can be even more effective.

  17. Distributed open environment for data retrieval based on pattern recognition techniques

    International Nuclear Information System (INIS)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A.

    2010-01-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application for providing HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java based solution. This encapsulation allows hiding of unnecessary details about the installation, distribution, and configuration of all these components, but with the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a single embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphic user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
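
    The record does not specify the similarity measure used; a toy stand-in for this kind of waveform pattern retrieval is a sliding-window Euclidean distance search, sketched below on synthetic data:

    ```python
    import numpy as np

    def most_similar(signal, pattern, top=3):
        """Slide a query pattern along a signal and rank the best-matching
        windows by Euclidean distance (a toy stand-in for the similarity
        search such a retrieval environment performs on waveform data)."""
        n, m = len(signal), len(pattern)
        d = np.array([np.linalg.norm(signal[i:i + m] - pattern)
                      for i in range(n - m + 1)])
        idx = np.argsort(d)[:top]
        return list(zip(idx.tolist(), d[idx].round(3).tolist()))

    t = np.linspace(0, 10, 1000)
    rng = np.random.default_rng(4)
    signal = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=t.size)
    pattern = np.sin(2 * np.pi * t[:100])   # query: one period of a sine
    print(most_similar(signal, pattern))
    ```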

  18. Statistical distribution analysis of rubber fatigue data

    Science.gov (United States)

    DeRudder, J. L.

    1981-10-01

    Average rubber fatigue resistance has previously been related to such factors as elastomer type, cure system, cure temperature, and stress history. This paper extends this treatment to a full statistical analysis of rubber fatigue data. Analyses of laboratory fatigue data are used to predict service life. Particular emphasis is given to the prediction of early tire splice failures, and to adaptations of statistical fatigue analysis for the particular service conditions of the rubber industry.

  19. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    Science.gov (United States)

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing of a simulated patient with a closed head injury in VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully

  20. Non-uniform distribution pattern for differentially expressed genes of transgenic rice Huahui 1 at different developmental stages and environments.

    Directory of Open Access Journals (Sweden)

    Zhi Liu

    Full Text Available DNA microarray analysis is an effective method to detect unintended effects in the safety assessment of genetically modified (GM) crops by identifying differentially expressed genes (DEGs). With the aim of revealing the distribution of DEGs in GM crops under different conditions, we performed DNA microarray analysis on transgenic rice Huahui 1 (HH1) and its non-transgenic parent Minghui 63 (MH63) at different developmental stages and under different environmental conditions. A considerable number of DEGs were identified in each group of HH1 under the different conditions. The number of DEGs differed between groups of HH1; however, many common DEGs were shared between groups. These findings suggest that both the DEGs and the common DEGs are suitable for investigating unintended effects. Furthermore, a number of significantly changed pathways were found in all groups of HH1, indicating that genetic modification caused lasting changes to the plants. To our knowledge, our study is the first to report the non-uniform distribution pattern of DEGs in a GM crop across developmental stages and environments. Our results also suggest that DEGs selected in GM plants at a specific developmental stage and environment can act as useful clues for further evaluation of unintended effects of GM plants.

  1. Determination of hydraulic conductivity from grain-size distribution for different depositional environments

    KAUST Repository

    Rosas, Jorge

    2013-06-06

    Over 400 unlithified sediment samples were collected from four different depositional environments in global locations, and the grain-size distribution, porosity, and hydraulic conductivity were measured using standard methods. The measured hydraulic conductivity values were then compared to values calculated using 20 different empirical equations (e.g., Hazen, Carman-Kozeny) commonly used to estimate hydraulic conductivity from grain-size distribution. It was found that most of the hydraulic conductivity values estimated from the empirical equations correlated very poorly with the measured hydraulic conductivity values, with errors ranging to over 500%. To improve the empirical estimation methodology, the samples were grouped by depositional environment and subdivided into subgroups based on lithology and mud percentage. The empirical methods were then analyzed to assess which methods best estimated the measured values. Modifications of the empirical equations, including changes to special coefficients and addition of offsets, were made to produce modified equations that considerably improve the hydraulic conductivity estimates from grain-size data for beach, dune, offshore marine, and river sediments. Estimated hydraulic conductivity errors were reduced to 6 to 7.1 m/day for the beach subgroups, 3.4 to 7.1 m/day for dune subgroups, and 2.2 to 11 m/day for offshore sediment subgroups. Improvements were made for river environments, but errors remained high, between 13 and 23 m/day. © 2013, National Ground Water Association.
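
    For reference, one of the cited empirical estimators, the Hazen equation, is easy to state: K is roughly C times d10 squared, with d10 the 10th-percentile grain diameter. The sketch below uses the common textbook coefficient (C near 100 for K in cm/s and d10 in cm), not the modified coefficients derived in the paper:

        # A minimal sketch of the Hazen estimate of hydraulic conductivity.
        # Coefficient and unit conversion are standard textbook values.

        def hazen_hydraulic_conductivity(d10_mm, c=100.0):
            """Estimate K in m/day from d10 in millimetres."""
            d10_cm = d10_mm / 10.0
            k_cm_per_s = c * d10_cm ** 2
            return k_cm_per_s * 864.0  # 1 cm/s = 864 m/day

        print(hazen_hydraulic_conductivity(0.2))  # medium sand, d10 = 0.2 mm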

  2. Telefacturing Based Distributed Manufacturing Environment for Optimal Manufacturing Service by Enhancing the Interoperability in the Hubs

    Directory of Open Access Journals (Sweden)

    V. K. Manupati

    2017-01-01

    Full Text Available Recent developments in the manufacturing sector are driving intense progress towards effective distributed collaborative manufacturing environments. This evolving collaborative manufacturing not only focuses on digitalisation of the environment but also requires a service-dependent manufacturing system that offers uninterrupted access to a number of diverse, complicated and dynamic manufacturing operations management systems at a common workplace (hub). This research presents a novel telefacturing-based distributed manufacturing environment for recommending manufacturing services based on user preferences. The first step in this direction is to deploy advanced tools and techniques, namely the ontology-based Protégé 5.0 software for transforming the large store of knowledge/information into Web Ontology Language (OWL) documents, and Integration of Process Planning and Scheduling (IPPS) for multiple jobs in a collaborative manufacturing system. Thereafter, we investigate the allocation of skilled workers to the best feasible operation sequences. In this context, a mathematical model is formulated for the considered objectives, that is, minimization of the makespan and of the total training cost of the workers. With an evolutionary algorithm and a purpose-built heuristic algorithm, the performance of the proposed manufacturing system is improved. Finally, to demonstrate the capability of the proposed approach, an illustrative example from a real-time manufacturing industry is validated for optimal service recommendation.

  3. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm that runs on multiple interconnected nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications in which a large number of computing nodes are running. We show that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
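
    The baseline mechanism that distributed snapshot protocols build on is the classic marker-based (Chandy-Lamport) algorithm. The toy Python sketch below shows that mechanism only, not the paper's optimized protocol: on first receipt of a marker, a node records its local state and starts recording in-flight messages on its other channels:

        # Toy marker-based snapshot logic for a single node; assumptions:
        # markers are forwarded on all outgoing channels, channels are FIFO.

        class Node:
            def __init__(self, name, channels):
                self.name = name
                self.channels = channels          # incoming channel names
                self.state = 0                    # application state (e.g. iteration count)
                self.recorded_state = None
                self.recording = {}               # channel -> captured in-flight messages

            def receive(self, channel, message):
                if message == "MARKER":
                    if self.recorded_state is None:
                        self.recorded_state = self.state  # snapshot local state
                        self.recording = {c: [] for c in self.channels if c != channel}
                        return ["MARKER"]         # forward marker on outgoing channels
                    self.recording.pop(channel, None)     # stop recording this channel
                    return []
                if channel in self.recording:
                    self.recording[channel].append(message)  # in-flight message
                self.state += 1                   # normal processing
                return []

        # Example: a node with two incoming channels receives a marker on "A".
        n = Node("p1", channels=["A", "B"])
        n.receive("A", "MARKER")                  # snapshot taken; start recording "B"
        n.receive("B", "m7")                      # m7 captured as in flight
        print(n.recorded_state, n.recording)      # 0 {'B': ['m7']}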

  4. Analysis of urea distribution volume in hemodialysis.

    Science.gov (United States)

    Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F

    1994-01-01

    According to the urea kinetic model, it is considered that the urea distribution volume (V) is that of body water, and that it is distributed in only one compartment. Since the V value is difficult to measure, it is common to use 58% of body weight, in spite of the fact that it may range from 35 to 75%. In this study, we have calculated the value of V using an accurate method based on the total elimination of urea in the dialysate. We have studied V, and also whether different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, on a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentrations of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, changing the blood flow (250 or 350 ml/min), the ultrafiltration (0.5 or 1.5 l/h), the membrane (cuprophane or polyacrylonitrile) and/or the buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant for each patient. The V value gradually increased throughout the dialysis session: 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the dialysis session. Changing the blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men than in women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
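
    The mass-balance idea behind the direct method can be shown with a small numerical sketch; urea generation and ultrafiltration are ignored for simplicity, and all values below are invented for illustration:

        # Hedged sketch: V from urea recovered in dialysate and the fall in
        # plasma urea, ignoring generation and ultrafiltration. Numbers invented.

        dialysate_volume_l = 120.0      # total dialysate collected in the tank
        dialysate_urea_mmol_l = 4.5     # mean urea concentration in dialysate
        plasma_urea_start = 25.0        # mmol/L at session start
        plasma_urea_end = 10.0          # mmol/L at session end

        urea_removed = dialysate_volume_l * dialysate_urea_mmol_l   # mmol
        v_litres = urea_removed / (plasma_urea_start - plasma_urea_end)

        body_weight_kg = 70.0
        print(f"V = {v_litres:.1f} L = {100 * v_litres / body_weight_kg:.0f}% of body weight")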

  5. Pigeon interaction mode switch-based UAV distributed flocking control under obstacle environments.

    Science.gov (United States)

    Qiu, Huaxin; Duan, Haibin

    2017-11-01

    Unmanned aerial vehicle (UAV) flocking control is a serious and challenging problem due to local interactions and changing environments. In this paper, a pigeon flocking model and a pigeon coordinated obstacle-avoiding model are proposed, based on the behavior that pigeon flocks switch between hierarchical and egalitarian interaction modes at different flight phases. Owing to the essential similarity between bird flocks and UAV swarms, a distributed flocking control algorithm based on the proposed pigeon flocking and coordinated obstacle-avoiding models is designed to coordinate a heterogeneous UAV swarm to fly through obstacle environments with few informed individuals. The comparative simulation results are elaborated to show the feasibility, validity and superiority of our proposed algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Capacity of Cognitive Radio with Partial Channel Distribution Information in Rayleigh Fading Environments

    Science.gov (United States)

    Xu, D.; Li, Q.

    2015-11-01

    This paper investigates the capacity of the secondary user (SU) in a cognitive radio (CR) network in Rayleigh fading environments. Different from existing works where perfect channel state information (CSI) or channel distribution information (CDI) of the interference link from the SU to the primary user (PU) is assumed to be available, this paper assumes that only partial CDI is available. Specifically, we assume the distribution parameter is unknown and estimated from a set of channel gain samples. With such partial CDI, closed-form expressions for the ergodic and outage capacities of the SU are obtained under the transmit power and the interference outage constraints. It is shown that the capacity with partial CDI is not degraded compared to that with perfect CDI if the interference outage constraint is loose. It is also shown that the capacity can be significantly improved by increasing the number of channel gain samples.
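
    The setting can be mimicked numerically: under Rayleigh fading the channel power gain is exponentially distributed, and partial CDI corresponds to estimating the distribution parameter from a finite set of channel gain samples. A Monte Carlo sketch with illustrative values only (the paper derives closed forms instead):

        # Monte Carlo sketch of ergodic capacity under Rayleigh fading with an
        # estimated (partial-CDI) distribution parameter. Values illustrative.
        import numpy as np

        rng = np.random.default_rng(0)
        true_mean_gain = 1.0
        n_samples = 20                               # few samples -> imperfect CDI
        samples = rng.exponential(true_mean_gain, n_samples)
        estimated_mean = samples.mean()              # estimated distribution parameter

        snr = 10.0                                   # transmit SNR (linear)
        g = rng.exponential(estimated_mean, 1_000_000)
        ergodic_capacity = np.mean(np.log2(1.0 + snr * g))
        print(f"estimated mean gain = {estimated_mean:.3f}")
        print(f"ergodic capacity ~ {ergodic_capacity:.3f} bit/s/Hz")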

  7. Robust blind identification of room acoustic channels in symmetric alpha-stable distributed noise environments.

    Science.gov (United States)

    He, Hongsen; Lu, Jing; Chen, Jingdong; Qiu, Xiaojun; Benesty, Jacob

    2014-08-01

    Blind multichannel identification is generally sensitive to background noise. Although there have been some efforts in the literature devoted to improving the robustness of blind multichannel identification with respect to noise, most of those works assume that the noise is Gaussian distributed, which is often not valid in real room acoustic environments. This paper deals with the more practical scenario where the noise is not Gaussian. To improve the robustness of blind multichannel identification to non-Gaussian noise, a robust normalized multichannel frequency-domain least-mean M-estimate algorithm is developed. Unlike the traditional approaches that use the squared error as the cost function, the proposed algorithm uses an M-estimator to form the cost function, which is shown to be immune to non-Gaussian noise with a symmetric α-stable distribution. Experiments based on the identification of a single-input/multiple-output acoustic system demonstrate the robustness of the proposed algorithm.
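
    The contrast between a squared-error cost and an M-estimator cost under symmetric alpha-stable noise can be seen directly. The sketch below uses the Huber function as one common M-estimator choice; the paper embeds such a cost in a normalized multichannel frequency-domain update, which is not reproduced here:

        # Squared-error vs. Huber cost on symmetric alpha-stable noise, which
        # has no finite variance for alpha < 2. Parameters are illustrative.
        import numpy as np
        from scipy.stats import levy_stable

        rng = np.random.default_rng(1)
        noise = levy_stable.rvs(alpha=1.5, beta=0.0, size=10_000, random_state=rng)

        def huber(e, delta=1.0):
            """Quadratic near zero, linear in the tails: bounds outlier influence."""
            a = np.abs(e)
            return np.where(a <= delta, 0.5 * e**2, delta * (a - 0.5 * delta))

        print("mean squared error cost:", np.mean(noise**2))      # inflated by outliers
        print("mean Huber cost:        ", np.mean(huber(noise)))  # stays bounded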

  8. Integrating autonomous distributed control into a human-centric C4ISR environment

    Science.gov (United States)

    Straub, Jeremy

    2017-05-01

    This paper considers incorporating autonomy into human-centric Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) environments. Specifically, it focuses on identifying ways that current autonomy technologies can augment human control and on the challenges presented by additive autonomy. Three approaches to this challenge are considered, stemming from prior work in two converging areas. In the first, the problem is approached as augmenting what humans currently do with automation. In the second, humans are treated as actors within a cyber-physical system-of-systems (stemming from robotic distributed computing). A third approach combines elements of both.

  9. Analysis and Synthesis of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    … requirements. As real-time systems become more complex, they are often implemented using distributed heterogeneous architectures. Analysis and Synthesis of Distributed Real-Time Embedded Systems addresses the design of real-time applications implemented using distributed heterogeneous architectures … of the communication infrastructure, which has a significant impact on the overall system performance and cost. Analysis and Synthesis of Distributed Real-Time Embedded Systems considers the mapping and scheduling tasks within an incremental design process. To reduce the time-to-market of products, the design of real… in important reductions of design costs. Analysis and Synthesis of Distributed Real-Time Embedded Systems will be of interest to advanced undergraduates, graduate students, researchers and designers involved in the field of embedded systems.

  10. Analysis and Optimization of Distributed Real-Time Embedded Systems

    DEFF Research Database (Denmark)

    Pop, Paul; Eles, Petru; Peng, Zebo

    2006-01-01

    An increasing number of real-time applications are today implemented using distributed heterogeneous architectures composed of interconnected networks of processors. The systems are heterogeneous not only in terms of hardware and software components, but also in terms of communication protocols … for such heterogeneous distributed real-time embedded systems. More specifically, we discuss the schedulability analysis of hard real-time systems, highlighting particular aspects related to the heterogeneous and distributed nature of the applications. We also introduce several design optimization problems …

  11. Multi-objective optimization with estimation of distribution algorithm in a noisy environment.

    Science.gov (United States)

    Shim, Vui Ann; Tan, Kay Chen; Chia, Jun Yong; Al Mamun, Abdullah

    2013-01-01

    Many real-world optimization problems are subject to uncertainties that may be characterized by the presence of noise in the objective functions. The estimation of distribution algorithm (EDA), which models the global distribution of the population for searching tasks, is one of the evolutionary computation techniques that deals with noisy information. This paper studies the potential of EDAs, particularly an EDA based on restricted Boltzmann machines, in handling multi-objective optimization problems in a noisy environment. Noise is introduced to the objective functions in the form of a Gaussian distribution. In order to reduce the detrimental effect of noise, a likelihood correction feature is proposed to tune the marginal probability distribution of each decision variable. The EDA is subsequently hybridized with a particle swarm optimization algorithm in a discrete domain to improve its search ability. The effectiveness of the proposed algorithm is examined via eight benchmark instances with different characteristics and shapes of the Pareto optimal front. The scalability, hybridization, and computational time are rigorously studied. Comparative studies show that the proposed approach outperforms other state-of-the-art algorithms.
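
    The core EDA loop, stripped of the restricted Boltzmann machine and the likelihood correction, reduces to fitting a distribution to selected samples and resampling. A minimal univariate Gaussian sketch on a noisy one-dimensional objective (everything here is a simplification of the paper's method):

        # Univariate-Gaussian EDA on a noisy objective; truncation selection.
        import numpy as np

        rng = np.random.default_rng(7)

        def noisy_objective(x):
            return (x - 3.0) ** 2 + rng.normal(0.0, 0.5, size=x.shape)  # Gaussian noise

        mu, sigma = 0.0, 5.0
        for generation in range(30):
            population = rng.normal(mu, sigma, size=200)
            fitness = noisy_objective(population)
            elite = population[np.argsort(fitness)[:40]]   # keep the best 20%
            mu, sigma = elite.mean(), max(elite.std(), 1e-3)

        print(f"estimated optimum ~ {mu:.2f} (true optimum 3.0)")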

  12. Fugitive methane emissions from leak-prone natural gas distribution infrastructure in urban environments.

    Science.gov (United States)

    Hendrick, Margaret F; Ackley, Robert; Sanaie-Movahed, Bahare; Tang, Xiaojing; Phillips, Nathan G

    2016-06-01

    Fugitive emissions from natural gas systems are the largest anthropogenic source of the greenhouse gas methane (CH₄) in the U.S. and contribute to the risk of explosions in urban environments. Here, we report on a survey of CH₄ emissions from 100 natural gas leaks in cast iron distribution mains in Metro Boston, MA. Direct measures of CH₄ flux from individual leaks ranged from 4.0 to 2.3 × 10⁴ g CH₄ per day. The distribution of leak size is positively skewed, with 7% of leaks contributing 50% of total CH₄ emissions measured. We identify parallels in the skewed distribution of leak size found in downstream systems with midstream and upstream stages of the gas process chain. Fixing 'superemitter' leaks will disproportionately stem greenhouse gas emissions. Fifteen percent of leaks surveyed qualified as potentially explosive (Grade 1), and we found no difference in CH₄ flux between Grade 1 leaks and all remaining leaks surveyed (p = 0.24). All leaks must be addressed, as even small leaks cannot be disregarded as 'safely leaking.' Key methodological impediments to quantifying and addressing the impacts of leaking natural gas distribution infrastructure involve inconsistencies in the manner in which gas leaks are defined, detected, and classified. To address this need, we propose a two-part leak classification system that reflects both the safety and climatic impacts of natural gas leaks. Copyright © 2016 Elsevier Ltd. All rights reserved.
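
    The reported skewness statistic is easy to reproduce on any per-leak flux sample. The sketch below uses a synthetic lognormal sample in place of the surveyed leaks and asks what fraction of leaks accounts for half of the total emissions:

        # Share-of-emissions calculation on a synthetic skewed flux sample.
        import numpy as np

        rng = np.random.default_rng(3)
        fluxes = rng.lognormal(mean=3.0, sigma=1.5, size=100)   # g CH4/day, synthetic

        sorted_fluxes = np.sort(fluxes)[::-1]                   # largest leaks first
        cumulative_share = np.cumsum(sorted_fluxes) / sorted_fluxes.sum()
        n_half = int(np.searchsorted(cumulative_share, 0.5)) + 1
        # With 100 leaks the count equals the percentage directly:
        print(f"{n_half} of {len(fluxes)} leaks ({n_half}%) emit 50% of the total")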

  13. Analysis of the moral habitability of the nursing work environment.

    Science.gov (United States)

    Peter, Elizabeth H; Macfarlane, Amy V; O'Brien-Pallas, Linda L

    2004-08-01

    Following health reform, nurses have experienced the tremendous stress of heavy workloads, long hours and difficult professional responsibilities. In recognition of these problems, a study was conducted that examined the impact of the working environment on the health of nurses. After conducting focus groups across Canada with nurses and others well acquainted with nursing issues, it became clear that the difficult work environments described had significant ethical implications. The aim of this paper is to report the findings of research that examined the moral habitability of the nursing working environment. A secondary analysis was conducted using the theoretical work of Margaret Urban Walker. Moral practices and responsibilities from Walker's perspective cannot be extricated from other social roles, practices and divisions of labour. Moral-social orders, such as work environments in this research, must be made transparent to examine their moral habitability. Morally habitable environments are those in which differently situated people experience their responsibilities as intelligible and coherent. They also foster recognition, cooperation and shared benefits. Four overarching categories were developed through the analysis of the data: (1) oppressive work environments; (2) incoherent moral understandings; (3) moral suffering and (4) moral influence and resistance. The findings clearly indicate that participants perceived the work environment to be morally uninhabitable. The social and spatial positioning of nurses left them vulnerable to being overburdened by and unsure of their responsibilities. Nevertheless, nurses found meaningful ways to resist and to influence the moral environment. We recommend that nurses develop strong moral identities, make visible the inseparability of their proximity to patients and moral accountability, and further identify what forms of collective action are most effective in improving the moral habitability of their work environments.

  14. Multiple defect distributions on Weibull statistical analysis of fatigue ...

    African Journals Online (AJOL)

    By relaxing the assumptions of a single cast defect distribution, of uniformity throughout the material and of uniformity from specimen to specimen, Weibull statistical analysis for multiple defect distributions has been applied to correctly describe the fatigue life data of aluminium alloy castings having multiple cast defects ...

  15. Analysis of color environment in nuclear power plants

    International Nuclear Information System (INIS)

    Natori, Kazuyuki; Akagi, Ichiro; Souma, Ichiro; Hiraki, Tadao; Sakurai, Yukihiro.

    1996-01-01

    This article reports the results of color and psychological analyses of the exterior appearance of nuclear power plants and the visual environments inside the plants. Study one comprised color measurements of the plant exteriors and of the visual environments inside the plants. Study two was a survey of impressions of the visual environments of nuclear plants obtained from observers, together with interviews of the workers. Through these analyses, we identified the present state and the problems of the color environments of the nuclear plants. In the next step, we designed recommended color environments for the inside and outside of the nuclear plants (the inside designs covered the fuel handling room, the operation floor of the turbine building, observers' pathways, the central control room and the rest room for the operators). Study three was a survey of impressions of our designs for the inside and outside of the nuclear plants. Nuclear plant observers, residents of Osaka city, residents near the nuclear plants, operators, employees of subsidiary companies and PR center guides rated their impressions of the designs. Study four was a survey about the design of the rest room for the operators controlling the plants. From the results of the four studies, we propose guidelines and identify problems for future planning of the visual environments of nuclear power plants. (author)

  16. Evolution of A Distributed Live, Virtual, Constructive Environment for Human in the Loop Unmanned Aircraft Testing

    Science.gov (United States)

    Murphy, James R.; Otto, Neil M.

    2017-01-01

    NASA's Unmanned Aircraft Systems Integration in the National Airspace System Project is conducting human-in-the-loop simulations and flight testing intended to reduce barriers to routine airspace access for unmanned aircraft. The primary focus of these tests is the interaction of the unmanned aircraft pilot with the display of detect-and-avoid alerting and guidance information. The project's integrated test and evaluation team was charged with developing the test infrastructure. As with any development effort, compromises in the underlying system architecture and design were made to allow for the rapid prototyping and open-ended nature of the research. To accommodate these design choices, a distributed test environment was developed incorporating Live, Virtual, Constructive (LVC) concepts. The LVC components form the core infrastructure supporting simulation of UAS operations by integrating live and virtual aircraft in a realistic air traffic environment. This LVC infrastructure enables efficient testing by leveraging existing assets distributed across multiple NASA Centers. Using standard LVC concepts enables future integration with existing simulation infrastructure.

  17. Gamma-ray Burst Formation Environment: Comparison of Redshift Distributions of GRB Afterglows

    Directory of Open Access Journals (Sweden)

    Sung-Eun Kim

    2005-12-01

    Full Text Available Since gamma-ray bursts (GRBs) first became known to the scientific community in 1973, many scientists have been involved in their study. Observations of GRB afterglows provide us with much information on the environments in which the observed GRBs are born. The study of GRB afterglows deals with longer-timescale emission in lower energy bands (e.g., months or even up to years) than the prompt emission in gamma-rays. Not all bursts are accompanied by afterglows over the whole range of wavelengths. It has been suggested, for instance, that radio and/or X-ray afterglows go unrecorded mainly due to the lower sensitivity of detectors, and optical afterglows due to extinction in intergalactic media or self-extinction within the host galaxy itself. Based on the idea that these facts may also provide information on the GRB environment, we analyze the statistical properties of GRB afterglows. We first select samples of redshift-known GRBs according to the wavelength of the afterglow they accompanied. We then compare their distributions as a function of redshift using statistical methods. As a result, we find that the distribution of GRBs with X-ray afterglows is consistent with that of GRBs with optical afterglows. We therefore conclude that the lower detection rate of optical afterglows is not due to extinction in intergalactic media.
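
    The distribution comparison described is typically done with a two-sample Kolmogorov-Smirnov test; a minimal sketch with invented redshift values:

        # Two-sample KS test between redshift samples of GRBs with X-ray
        # afterglows and those with optical afterglows. Redshifts invented.
        import numpy as np
        from scipy import stats

        z_xray = np.array([0.4, 0.7, 1.0, 1.3, 1.6, 2.1, 2.5, 3.0])
        z_optical = np.array([0.3, 0.8, 1.1, 1.4, 1.9, 2.2, 2.8, 3.4])

        statistic, p_value = stats.ks_2samp(z_xray, z_optical)
        print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}")
        # A large p-value means we cannot reject that both samples share one
        # parent redshift distribution, matching the paper's conclusion.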

  18. [Distribution and molecular type of Salmonella from the external environment in Henan province].

    Science.gov (United States)

    Mu, Y J; Zhang, B F; Zhao, J Y; Wang, Z Q; Xia, S L; Huang, X Y; Xu, B L

    2016-10-10

    Objective: To understand the distribution of Salmonella in the external environment in Henan province, and to explore the distribution of different serotypes of Salmonella and their homology. Methods: A total of 4 488 samples were collected from animal dung, meat products and kitchen utensils, identified by biochemical tests and serotyped by serum agglutination reactions. The predominant serotypes were further typed by PFGE. Results: A total of 324 Salmonella strains were detected in these samples; the detection rate was 7.21%. The 324 Salmonella isolates belonged to 39 serotypes, with S. enteritidis (24.07%, 78/324) and S. derby (20.37%, 66/324) predominant. Forty-six strains of S. enteritidis and 30 strains of S. derby were divided into 12 and 17 molecular patterns, respectively, by digestion with XbaI, while chicken and swine were the predominant animal hosts. Conclusions: Salmonella serotypes from the external environment were phenotypically diverse, and the serotypes of Salmonella from different sources differed. The same clone was prevalent in the same area. It is necessary to strengthen supervision and surveillance to ensure food safety.

  19. [Analysis in distribution of Atractylodes lancea at Maoshan regions].

    Science.gov (United States)

    Sun, Yu-Zhang; Guo, Lan-Ping; Huang, Lu-Qi; Zhu, Wen-Quan; Pan, Yao-Zhong; Gao, Wei

    2008-05-01

    To investigate the distribution of Atractylodes lancea in the Maoshan region, plot sampling was combined with GIS technology to analyze the distribution and the factors behind it. The biomass of Atractylodes lancea was related to the growth of Quercus serrata var. brevipetiolata, slope and humidity. Atractylodes lancea was most abundant in the southern Maoshan region, less abundant in the north, and least abundant in the central region. The main factor shaping this distribution is human activity. The leading factors determining the biomass of Atractylodes lancea are Quercus serrata var. brevipetiolata and slope.

  20. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  1. POPULATION STRUCTURE AND SPATIAL DISTRIBUTION OF Ceratozamia mexicana BRONGN. (ZAMIACEAE) IN PRESERVED AND DISTURBED ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Andrés Rivera-Fernández

    2012-11-01

    Full Text Available Plant populations are affected by biotic and abiotic factors that influence regeneration processes. The aims of this study were to characterize the population structure of Ceratozamia mexicana under two contrasting conditions (a conserved site and a disturbed site), and to determine whether the sexual structure, population density and spatial distribution of C. mexicana are modified by disturbance. Eight plots of 25 m2 were sampled within each site (conserved and disturbed). The structure and spatial distribution at the sites were determined. Methods included analysis of variance and spatial distribution indexes, with climatic and edaphic factors determined by conventional methods for comparison. The conserved site showed a demographic structure of an inverted "J", while the disturbed site varied slightly, with a more discontinuous distribution. Population density was 0.78 individuals/m2 in the conserved site and 0.26 individuals/m2 in the disturbed site. The spatial distribution for all developmental stages of the plant was random, with the exception of the seedling stage, which was aggregated. The results showed that disturbance decreases plant density and removes reproductive individuals, which threatens the persistence of the population.

  2. Estimation of Speciation and Distribution of ¹³¹I in Urban and Natural Environments

    Energy Technology Data Exchange (ETDEWEB)

    Hormann, Volker; Fischer, Helmut W. [University of Bremen, Institute of Environmental Physics, Otto-Hahn-Allee 1, 28359 Bremen (Germany)

    2014-07-01

    ¹³¹I is a radionuclide that may be introduced into natural and urban environments via several pathways. As a result of nuclear accidents it may be washed out from air or settle onto the ground by dry deposition. In urban landscapes it is again washed out by rain, partly introduced into the sewer system and thus transported to the next wastewater plant, where it may accumulate in certain compartments. In rural landscapes it may penetrate the soil and be more or less available to plant uptake depending on chemical and physical conditions. On a regular basis, ¹³¹I is released into the urban sewer system in the course of therapeutic and diagnostic treatment of patients with thyroid diseases. The speciation of iodine in the environment is complex. Depending on redox state and biological activity, it may appear as I⁻, IO₃⁻, I₂ or bound to organic molecules (e.g. humic acids). Moreover, some of these species are bound to surfaces of particles suspended in water or present in soil, e.g. hydrous ferric oxides (HFO). It is to be expected that the speciation and solid-liquid distribution of iodine strongly depend on environmental conditions. In this study, the speciation and solid-liquid distribution of iodine in environmental samples such as waste water, sewage sludge and soil are estimated with the help of the geochemical code PHREEQC. The calculations are carried out using chemical equilibrium and sorption data from the literature and chemical analyses of the media. We present the results of these calculations and compare them with experimental results of medical ¹³¹I in waste water and sewage sludge. The output of this study will be used in future work where transport and distribution of iodine in wastewater treatment plants and in irrigated agricultural soils will be modeled. (authors)

  3. A distributed computing environment with support for constraint-based task scheduling and scientific experimentation

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L. [Univ. of Washington, Seattle, WA (United States). Dept. of Computer Science and Engineering

    1997-04-01

    This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.

  4. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic …

  5. PROTONEGOCIATIONS - SALES FORECAST AND COMPETITIVE ENVIRONMENT ANALYSIS METHOD

    Directory of Open Access Journals (Sweden)

    Lupu Adrian Gelu

    2009-05-01

    Full Text Available Protonegotiation management, as part of successful organizational negotiation, is an analytical issue of great importance to today's managers in the confrontations generated by changing environments during the period of transition to the market …

  6. Analysis of aeromagnetic data over Garkida and environs, North ...

    African Journals Online (AJOL)

    The study area lies between latitudes 10°00′ and 11°00′N and longitudes 12°15′ and 13°15′E in the northeastern basement complex of Nigeria. The total magnetic intensity over Garkida and its environs after digitization showed magnetic signatures ranging from 7720 nT to 7960 nT. Two-dimensional spectral analysis ...

  8. Genotype x environment interaction and stability analysis for yield ...

    African Journals Online (AJOL)

    2015-05-06

    May 6, 2015 ... Various stability indices were used to assess stability and genotype by environment performances. Combined analysis of variance (ANOVA) for yield .... Mean grain yield (kg/ha) of 17 Kabuli-type chickpea genotypes grown at five locations in Ethiopia. ... Performance trials have to be conducted in multiple.

  9. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework built on the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report focuses on the work in the area of parametric CFD grid generation, using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  10. Water consumption and soil moisture distribution in melon crop with mulching and in a protected environment

    Directory of Open Access Journals (Sweden)

    Rodrigo Otávio Câmara Monteiro

    2013-06-01

    Full Text Available Mulching has become an important land-cover technique, but some technical procedures should be adjusted to these modified conditions to establish the optimum total water depth. It is also important to observe soil-water relations such as soil water distribution and the dimensions of the wetted volume. The objective of the present study was to estimate melon evapotranspiration under mulching in a protected environment and to verify the spatial distribution of water around the melon root system in two soil classes. Mulching provided a 27 mm water saving by reducing evaporation. In terms of volume, each plant received on average 175.2 L of water over 84 days of cultivation without mulching, while with mulching the water requirement was 160.2 L per plant. The use of mulching reduced soil moisture variability throughout the crop cycle and allowed a greater distribution of soil water, which was more pronounced in the clay soil. The clayey soil retained on average 43 mm more water in the upper 0.50 m of soil than the sandy loam soil, and reduced the crop-cycle soil moisture variation by 5.6 mm compared to the sandy loam soil.

  11. Advanced and Highly Reliable Distributed Computing in a Fully Automated Environment

    Science.gov (United States)

    Ballhorn, Gerd; Bloecker, Martin; Keogan, Danny; Moberts, Peter

    2002-12-01

    Within the past decades, data file sizes and the computing power required for mask data preparation grew steadily, following Moore's law. However, within the last two years the balance between rising data complexity and computing equipment became unstable due to the massive introduction of OPC and the broad rollout of complex variable shaped beam (VSB) data formats. The disturbance of the former linear coherence led to exploding data conversion times (exceeding 100 hours for a single layer) accompanied by heavily escalating data volumes. A very promising way out of this dilemma is the recently announced introduction of distributed job processing within the mask data processing flow. This approach was initially introduced to fracture flat jobs. Building on our first promising results last year, we have now implemented a fully automated design flow with an integrated Linux-based cluster for distributed processing. The cluster solution is built in an automated environment in coexistence with our conventional SUN servers. We implemented a highly reliable DP flow on a large-scale basis which became as stable as our former Solaris SUN system. In the meantime we have reached a first-time job success rate exceeding 99%. After reaching a very stable state, we recently started to extend our flat processing conversion steps by investigating hierarchical distributed processing in CATS version 23. We also report on benchmark results comparing new promising hardware configurations to further improve cluster performance.

  12. Experimental investigation of angular distributions of exposure rate in natural environments

    International Nuclear Information System (INIS)

    Nagaoka, Toshi; Sakamoto, Ryuichi; Saito, Kimiaki; Moriuchi, Shigeru

    1981-01-01

    Angular distributions of exposure rate in the natural radiation environment were measured using a 1″ diameter × 1″ NaI(Tl) scintillation detector placed on the axis of a set of concentric circular lead shieldings. The incident polar angle of γ rays was determined by changing the relative position between the detector and the shieldings. Incident γ rays were integrated over the whole azimuthal angle (0 - 2π) at each polar angle, and the angular distribution was then obtained by subtracting the responses at adjacent polar angles. As the incident solid angle of γ rays could be set large enough, it was possible both to keep the measuring time short and to measure with high accuracy. Measurements were made under various environmental conditions: two on open fields and one in a pine grove at a height of 1 m, and one at a height of 10 m on a tower. One measurement was also made in a one-storied wooden house. The experimental results supported the angular distributions expected from calculations previously reported by other authors, and it was shown that this method can be applied well even where a calculational evaluation is difficult due to topographic complexity. (author)

  13. Modeling and analysis of solar distributed generation

    Science.gov (United States)

    Ortiz Rivera, Eduardo Ivan

    … power point tracking algorithms. Finally, several PV power applications will be presented, such as circuit analysis for a load connected to two different PV arrays, speed control for a dc motor connected to a PVM, and a novel single-phase photovoltaic inverter system using the Z-source converter.

  14. Characterizing and predicting species distributions across environments and scales: Argentine ant occurrences in the eye of the beholder

    Science.gov (United States)

    Menke, S.B.; Holway, D.A.; Fisher, R.N.; Jetz, W.

    2009-01-01

    Aim: Species distribution models (SDMs) or, more specifically, ecological niche models (ENMs) are a useful and rapidly proliferating tool in ecology and global change biology. ENMs attempt to capture associations between a species and its environment and are often used to draw biological inferences, to predict potential occurrences in unoccupied regions and to forecast future distributions under environmental change. The accuracy of ENMs, however, hinges critically on the quality of occurrence data. ENMs often use haphazardly collected data rather than data collected across the full spectrum of existing environmental conditions. Moreover, it remains unclear how processes affecting ENM predictions operate at different spatial scales. The scale (i.e. grain size) of analysis may be dictated more by the sampling regime than by biologically meaningful processes. The aim of our study is to jointly quantify how issues relating to region and scale affect ENM predictions using an economically important and ecologically damaging invasive species, the Argentine ant (Linepithema humile). Location: California, USA. Methods: We analysed the relationship between sampling sufficiency, regional differences in environmental parameter space and cell size of analysis and resampling environmental layers using two independently collected sets of presence/absence data. Differences in variable importance were determined using model averaging and logistic regression. Model accuracy was measured with area under the curve (AUC) and Cohen's kappa. Results: We first demonstrate that insufficient sampling of environmental parameter space can cause large errors in predicted distributions and biological interpretation. Models performed best when they were parametrized with data that sufficiently sampled environmental parameter space. Second, we show that altering the spatial grain of analysis changes the relative importance of different environmental variables. These changes apparently result

  15. GALAXY ENVIRONMENTS OVER COSMIC TIME: THE NON-EVOLVING RADIAL GALAXY DISTRIBUTIONS AROUND MASSIVE GALAXIES SINCE z = 1.6

    International Nuclear Information System (INIS)

    Tal, Tomer; Van Dokkum, Pieter G.; Leja, Joel; Franx, Marijn; Wake, David A.; Whitaker, Katherine E.

    2013-01-01

    We present a statistical study of the environments of massive galaxies in four redshift bins between z = 0.04 and z = 1.6, using data from the Sloan Digital Sky Survey and the NEWFIRM Medium Band Survey. We measure the projected radial distribution of galaxies in cylinders around a constant number density selected sample of massive galaxies and utilize a statistical subtraction of contaminating sources. Our analysis shows that massive primary galaxies typically live in group halos and are surrounded by 2-3 satellites with masses more than one-tenth of the primary galaxy mass. The cumulative stellar mass in these satellites roughly equals the mass of the primary galaxy itself. We further find that the radial number density profile of galaxies around massive primaries has not evolved significantly in either slope or overall normalization in the past 9.5 Gyr. A simplistic interpretation of this result can be taken as evidence for a lack of mergers in the studied groups and as support for a static evolution model of halos containing massive primaries. Alternatively, there exists a tight balance between mergers and accretion of new satellites such that the overall distribution of galaxies in and around the halo is preserved. The latter interpretation is supported by a comparison to a semi-analytic model, which shows a similar constant average satellite distribution over the same redshift range.

  16. Prototype for a generic thin-client remote analysis environment for CMS

    International Nuclear Information System (INIS)

    Steenberg, C.D.; Bunn, J.J.; Hickey, T.M.; Holtman, K.; Legrand, I.; Litvin, V.; Newman, H.B.; Samar, A.; Singh, S.; Wilkinson, R.

    2001-01-01

    The multi-tiered architecture of the highly distributed CMS computing systems necessitates a flexible data distribution and analysis environment. The authors describe a prototype analysis environment which functions efficiently over wide area networks, using a server installed at the Caltech/UCSD Tier 2 prototype to analyze CMS data stored at various locations using a thin client. The analysis environment is based on existing HEP (Anaphe) and CMS (CARF, ORCA, IGUANA) software technology on the server, accessed from a variety of clients. A Java Analysis Studio (JAS, from SLAC) plug-in is being developed as a reference client. The server is operated as a 'black box' on the proto-Tier2 system. ORCA Objectivity databases (e.g. an existing large CMS Muon sample) are hosted on the master and slave nodes, and remote clients can request processing of queries across the server nodes and get the histogram results returned and rendered in the client. The server is implemented in pure C++ and uses XML-RPC as a language-neutral transport. This has several benefits, including much better scalability, better integration with CARF-ORCA, and, importantly, making the work directly useful to other non-Java general-purpose analysis and presentation tools such as Hippodraw, Lizard, or ROOT.
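
    Because the transport is language-neutral XML-RPC, a thin client can be written in a few lines in almost any language. The Python sketch below shows the shape of such a client, with a hypothetical method name and arguments in place of the real server interface:

        # Hypothetical XML-RPC thin client for a remote analysis server.
        import xmlrpc.client

        server = xmlrpc.client.ServerProxy("http://localhost:8000")

        # Hypothetical call: run a query remotely and fetch histogram bins.
        # histogram = server.run_query("muon_sample", "pt > 10", 50)
        # print(histogram)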

  17. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and the dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems consist of small areas, the dependence between stochastic inputs should be considered. In this paper, probabilistic analysis based on Monte Carlo simulation is described and applied to a real system…
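
    A toy Monte Carlo sketch of the approach: sample stochastic generation and load, then inspect the distribution of the net feeder loading. All figures are illustrative, and no dependence structure is modelled here although the paper stresses its importance:

        # Monte Carlo sampling of a feeder with stochastic wind and demand.
        import numpy as np

        rng = np.random.default_rng(11)
        n_trials = 100_000

        wind = rng.weibull(2.0, n_trials) * 0.4          # wind power, MW
        load = rng.normal(1.0, 0.2, n_trials)            # aggregated demand, MW
        net_load = load - wind                           # power drawn from the grid

        print(f"P(reverse power flow) = {np.mean(net_load < 0):.3f}")
        print(f"95th percentile net load = {np.quantile(net_load, 0.95):.2f} MW")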

  18. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us select the best-fitting model. Some of the graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large data set is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
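
    The three cited tests are available in SciPy; the sketch below fits two candidate distributions to a synthetic data set and compares them (the data set and bin count are arbitrary):

        # Goodness-of-fit comparison: KS, Anderson-Darling, Chi-squared.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        data = rng.gamma(shape=2.0, scale=3.0, size=500)

        # Kolmogorov-Smirnov against a fitted normal and a fitted gamma:
        mu, sigma = stats.norm.fit(data)
        print("KS vs normal:", stats.kstest(data, "norm", args=(mu, sigma)))
        a, loc, scale = stats.gamma.fit(data, floc=0)
        print("KS vs gamma: ", stats.kstest(data, "gamma", args=(a, loc, scale)))

        # Anderson-Darling (more weight on the tails), normal case:
        print("AD vs normal:", stats.anderson(data, dist="norm").statistic)

        # Chi-squared on binned counts against the fitted gamma:
        counts, edges = np.histogram(data, bins=20)
        expected = len(data) * np.diff(stats.gamma.cdf(edges, a, loc, scale))
        expected *= counts.sum() / expected.sum()   # normalize to observed total
        print("Chi2 vs gamma:", stats.chisquare(counts, expected))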

  19. Ephemeral-Secret-Leakage Secure ID-Based Three-Party Authenticated Key Agreement Protocol for Mobile Distributed Computing Environments

    Directory of Open Access Journals (Sweden)

    Chao-Liang Liu

    2018-03-01

    Full Text Available A three-party Authenticated Key Agreement (AKA) protocol in a distributed computing environment involves a client that requests services from an application server through an authentication server. The authentication server is responsible for authenticating the participating entities and helping them to construct a common session key. Adopting the Key Transfer Authentication Protocol (KTAP) in such an environment, the authentication server is able to monitor the communication messages to prevent and trace network crime. However, the session key in the KTAP setting is created only by the authentication server, which raises the key control problem. On the other hand, with the rapid growth of network technologies, mobile devices are widely used by people to access servers on the Internet. Many AKA protocols for mobile devices have been proposed; however, most are vulnerable to Ephemeral Secret Leakage (ESL) attacks, in which an adversary compromises the clients' private keys and the session key from eavesdropped messages. This paper proposes a novel ESL-secure ID-based three-party AKA protocol for mobile distributed computing environments, based on an ESL-secure ID-based Authenticated Key Exchange (ID-AKE) protocol. The proposed protocol solves the key control problem in KTAP while retaining KTAP's advantages in preventing and tracing network crime, and it also resists ESL attacks. Simulation results from the AVISPA tool confirm the correctness of the protocol security analysis. Furthermore, we present a parallel version of the proposed ESL-secure ID-based three-party AKA protocol that is communication-efficient.
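
    The design principle behind ESL resistance can be illustrated in a few lines: derive the session key from both a long-term and an ephemeral secret, so that leakage of the ephemeral value alone reveals nothing useful. The toy sketch below shows only this principle, not the paper's ID-based three-party protocol:

        # Toy illustration of ESL resistance via key derivation; not the
        # paper's protocol, just the underlying design principle.
        import hashlib
        import secrets

        long_term_secret = secrets.token_bytes(32)   # survives across sessions
        ephemeral_secret = secrets.token_bytes(32)   # fresh per session, may leak

        session_key = hashlib.sha256(long_term_secret + ephemeral_secret).digest()

        # An adversary holding only ephemeral_secret still faces a 256-bit unknown:
        print("session key:", session_key.hex()[:16], "...")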

  20. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to meet market demand. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demand is increasing, and it is important that gas distributors design new distribution systems to sustain this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps in developing a flexible system that attends to those needs are described. Distribution analysis requires geographically referenced data for the models, as well as accurate connectivity and equipment attributes. GIS systems are often used as the repository that holds the majority of this information, and they are constantly updated as distribution network equipment is modified. The distribution network model derived from this system thus represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the distribution network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  1. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    … operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs. Furthermore, to validate the framework, the authors describe reference models for the different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis is made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation …

  2. Heavy-tailed distribution of the SSH Brute-force attack duration in a multi-user environment

    Science.gov (United States)

    Lee, Jae-Kook; Kim, Sung-Jun; Park, Chan Yeol; Hong, Taeyoung; Chae, Huiseung

    2016-07-01

    A large number of cyber-attacks take place against supercomputers that provide high-performance computing (HPC) services to public researchers. In particular, although the secure shell protocol (SSH) brute-force attack is one of the traditional attack methods, it is still being used. Because stealth attacks that feign regular access may occur, they are even harder to detect. In this paper, we introduce methods to detect SSH brute-force attacks by analyzing the server's unsuccessful access logs and the firewall's drop events in a multi-user environment. We then analyze the durations of the SSH brute-force attacks detected by applying these methods. The results of an analysis of about ten thousand attack source IP addresses show that the behavior of abnormal users launching SSH brute-force attacks follows human-dynamics characteristics with a typical heavy-tailed distribution.
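
    The log-analysis step can be sketched with a small parser that counts failed logins per source address. The pattern below follows common OpenSSH syslog output and the threshold is arbitrary; real deployments would also correlate firewall drop events:

        # Count failed SSH logins per source IP and flag heavy offenders.
        import re
        from collections import Counter

        FAILED = re.compile(r"Failed password for (?:invalid user )?\S+ from (\S+)")

        def brute_force_candidates(log_lines, threshold=20):
            hits = Counter()
            for line in log_lines:
                match = FAILED.search(line)
                if match:
                    hits[match.group(1)] += 1
            return {ip: n for ip, n in hits.items() if n >= threshold}

        sample = ["sshd[123]: Failed password for invalid user admin from 10.0.0.5 port 4022 ssh2"] * 25
        print(brute_force_candidates(sample))   # {'10.0.0.5': 25}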

  3. A review on distribution and monitoring of hormones in the environment and their removal in wastewater treatment systems

    OpenAIRE

    Rahele Kafaei; Sina Dobaradaran

    2014-01-01

    Steroid hormones, a class of endocrine disrupting compounds (EDCs) that cause negative effects on human health, animals and ecosystem balance, have become a major concern in modern societies. In recent years numerous studies have been performed on hormone distribution in the environment, especially in aquatic environments, and on the ways they are removed. Hormones enter the environment primarily through wastewater, municipal wastewater treatment sludge, hospital …

  4. Multiuser Diversity with Adaptive Modulation in Non-Identically Distributed Nakagami Fading Environments

    KAUST Repository

    Rao, Anlei

    2012-09-08

    In this paper, we analyze the performance of adaptive modulation with single-cell multiuser scheduling over independent but not identically distributed (i.n.i.d.) Nakagami fading channels. Closed-form expressions are derived for the average channel capacity, spectral efficiency, and bit-error-rate (BER) for both constant-power variable-rate and variable-power variable-rate uncoded/coded M-ary quadrature amplitude modulation (M-QAM) schemes. We also study the impact of time delay on the average BER of adaptive M-QAM. Selected numerical results show that multiuser diversity brings considerably better performance even over i.n.i.d. fading environments.
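
    The scheduling gain is easy to reproduce numerically: Nakagami-m fading gives a Gamma-distributed power gain, and the scheduler serves the user with the best instantaneous channel. A Monte Carlo sketch with illustrative per-user parameters (the paper derives closed-form expressions instead):

        # Multiuser selection over i.n.i.d. Nakagami-m fading channels.
        import numpy as np

        rng = np.random.default_rng(9)
        m = np.array([1.0, 2.0, 3.0])        # Nakagami-m per user (m=1 is Rayleigh)
        omega = np.array([1.0, 0.8, 1.2])    # mean channel power per user
        snr = 10.0                           # average transmit SNR (linear)

        n = 500_000
        gains = rng.gamma(shape=m, scale=omega / m, size=(n, 3))   # power gains
        best = gains.max(axis=1)                                   # pick the best user
        capacity = np.mean(np.log2(1.0 + snr * best))
        print(f"average capacity with selection ~ {capacity:.3f} bit/s/Hz")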

  5. A proposal for a user-level, message passing interface in a distributed memory environment

    Energy Technology Data Exchange (ETDEWEB)

    Dongarra, J.J. (Tennessee Univ., Knoxville, TN (United States). Dept. of Computer Science Oak Ridge National Lab., TN (United States)); Hempel, R. (Gesellschaft fuer Mathematik und Datenverarbeitung mbH Bonn, Sankt Augustin (Germany)); Hey, A.J.G. (Southampton Univ. (United Kingdom). Dept. of Electronics and Computer Science); Walker, D.W. (Oak Ridge National Lab., TN (United States))

    1993-02-01

    This paper describes Message Passing Interface 1 (MPI1), a proposed library interface standard for supporting point-to-point message passing. The intended standard will be provided with Fortran 77 and C interfaces, and will form the basis of a standard high level communication environment featuring collective communication and data distribution transformations. The standard proposed here provides blocking, nonblocking, and synchronized message passing between pairs of processes, with message selectivity by source process and message type. Provision is made for noncontiguous messages. Context control provides a convenient means of avoiding message selectivity conflicts between different phases of an application. The ability to form and manipulate process groups permits task parallelism to be exploited, and is a useful abstraction in controlling certain types of collective communication.
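
    As an illustration of the point-to-point semantics described (blocking sends and receives with selectivity by source and message type), here is a minimal sketch using mpi4py, whose API reflects the MPI standard that grew out of this proposal; the tags and payloads are invented:

```python
# Blocking point-to-point message passing with selectivity by source
# rank and message tag. Run with e.g. `mpiexec -n 2 python demo.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

TAG_DATA, TAG_CTRL = 7, 8  # "message type" selectivity via tags

if rank == 0:
    comm.send({"payload": [1, 2, 3]}, dest=1, tag=TAG_DATA)
    comm.send("shutdown", dest=1, tag=TAG_CTRL)
elif rank == 1:
    # Receive selectively: first the control message, then the data,
    # regardless of arrival order -- selection is by source and tag.
    ctrl = comm.recv(source=0, tag=TAG_CTRL)
    data = comm.recv(source=0, tag=TAG_DATA)
    print("got", ctrl, data)
```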

  6. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  7. WCET Analysis of Java Bytecode Featuring Common Execution Environments

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian

    2011-01-01

    We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET where the Java program, the JVM, and the hardware are modelled as Networks of Timed Automata (NTA) and given as input to the state-of-the-art UPPAAL model checking tool. The tool is evaluated through a case study based on the classic text-book example of a hard real-time control system in a mine pump. The system is hosted on an execution environment featuring...

  8. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation.

  9. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  10. Analysis of communication based distributed control of MMC for HVDC

    DEFF Research Database (Denmark)

    Huang, Shaojun; Teodorescu, Remus; Mathe, Laszlo

    2013-01-01

    Control of the modular multilevel converter (MMC) for high power and high voltage applications is a very challenging task. Because a distributed control architecture can maintain the modularity of the MMC, this control architecture is investigated and a distributed control system dedicated to the MMC is proposed in this paper....... The suitable communication technologies, modulation and control techniques for the proposed distributed control system are discussed and compared. Based on frequency-domain modeling and analysis of the distributed control system, the controllers of the different control loops are designed by analytical...... methods and Matlab tools. Finally, the sensitivity of the distributed control system to the modulation effect (phase-shifted PWM), communication delay, individual carrier frequency and sampling frequency is studied through simulations made in Matlab Simulink and PLECS....

  11. Nonstationary Frequency Analysis of Extreme Floods Using the Time-varying Two-component Mixture Distributions

    Science.gov (United States)

    Yan, L.; Xiong, L.; Liu, D.; Hu, T.; Xu, C. Y.

    2016-12-01

    The basic IID assumption of the traditional flood frequency analysis has been challenged by nonstationarity. The most popular practice for analyzing nonstationarity of flood series is to use a fixed single-type probability distribution incorporated with the time-varying moments. However, the type of probability distribution could be both complex because of distinct flood populations and time-varying under changing environments. To allow the investigation of this complex nature, the time-varying two-component mixture distributions (TTMD) method is proposed in this study by considering the time variations of not only the moments of its component distributions but also the weighting coefficients. Having identified the existence of mixed flood populations based on circular statistics, the proposed TTMD was applied to model the annual maximum flood series (AMFS) of two stations in the Weihe River basin (WRB), with the model parameters calibrated by the meta-heuristic maximum likelihood (MHML) method. The performance of TTMD was evaluated by different diagnostic plots and indexes and compared with stationary single-type distributions, stationary mixture distributions and time-varying single-type distributions. The results highlighted the advantages of using TTMD models and physically-based covariates in nonstationary flood frequency analysis. Besides, the optimal TTMD models were considered to be capable of settling the issue of nonstationarity and capturing the mixed flood populations satisfactorily. It is concluded that the TTMD model is a good alternative in the nonstationary frequency analysis and can be applied to other regions with mixed flood populations.
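
    For illustration, a minimal sketch of what a time-varying two-component mixture density and its likelihood look like, here with two Gumbel components, linear trends in the locations, and a logistic time-varying weight; the parameterization is a generic stand-in, not the calibrated model of the paper (which searches the likelihood with a meta-heuristic):

```python
# Time-varying two-component mixture density for annual maxima:
# f(x,t) = w(t) f1(x; mu1(t), s1) + (1 - w(t)) f2(x; mu2(t), s2)
import numpy as np
from scipy.stats import gumbel_r

def ttmd_pdf(x, t, theta):
    a0, a1, mu1_0, mu1_1, s1, mu2_0, mu2_1, s2 = theta
    w = 1.0 / (1.0 + np.exp(-(a0 + a1 * t)))      # time-varying weight
    f1 = gumbel_r.pdf(x, loc=mu1_0 + mu1_1 * t, scale=s1)
    f2 = gumbel_r.pdf(x, loc=mu2_0 + mu2_1 * t, scale=s2)
    return w * f1 + (1.0 - w) * f2

def neg_log_lik(theta, floods, years):
    # Objective for an MLE search; a meta-heuristic optimizer (e.g.
    # scipy.optimize.differential_evolution) could minimize this.
    return -np.sum(np.log(ttmd_pdf(floods, years, theta)))

theta0 = (0.0, 0.05, 800.0, 2.0, 150.0, 2000.0, -1.0, 400.0)
print(ttmd_pdf(1500.0, t=10, theta=theta0))       # density at one point
```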

  12. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.

  13. Improving Department of Defense Global Distribution Performance Through Network Analysis

    Science.gov (United States)

    2016-06-01

    Master's thesis by Justin A. Thompson, June 2016; thesis advisor: Samuel E. Buttrey. ...set the maximum time allowed by SDDB business rules, 365 days, and run the improvement algorithm again, using the budget of 20 improvement days...

  14. Temporal distribution of Pu and Am in the Marine Environment of Southern Coast of Spain

    International Nuclear Information System (INIS)

    Iranzo, E.; Gasco, C.; Romero, L.; Martinez, A.; Mingarro, E.; Rivas, P.

    1989-01-01

    The distribution of transuranides and 137Cs in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb; surface contamination: 12-120 α Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between the Cape of Palos and the Cape of Gata, as well as surface sediments, were raised. The profiles of the transuranide distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author)

  15. Temporal distribution of Pu and Am in the marine environment of southern coast of Spain

    International Nuclear Information System (INIS)

    Iranzo, E.; Gascon, C.; Romero, L.; Martinez, M.; Mingarro, E.; Rivas, P.

    1989-01-01

    The distribution of transuranides and Cs-137 in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb; surface contamination: 12-120 alpha Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between the Cape of Palos and the Cape of Gata, as well as surface sediments, were raised. The profiles of the transuranide distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author). 15 figs.; 27 refs

  16. Temporal distribution of Pu and Am in the Marine Environment of Southern Coast of Spain

    Energy Technology Data Exchange (ETDEWEB)

    Iranzo, E.; Gasco, C.; Romero, L.; Martinez, A.; Mingarro, E.; Rivas, P.

    1989-07-01

    The distribution of transuranides and 137Cs in selected sediment cores from the southern Spanish coast has been studied. The area is of special interest as an accidental release of transuranides (four nuclear bombs fell) occurred near the seaside in 1966 (Palomares). The possible transference to the coast by the river (contaminated by one bomb; surface contamination: 12-120 α Bq/m2) or by aerosol produced in the explosion had never been studied. Several sediment cores between the Cape of Palos and the Cape of Gata, as well as surface sediments, were raised. The profiles of the transuranide distribution were determined and dated in order to determine the year of the contributions. An enhanced concentration of transuranides has been observed in the southern area of the Palomares coast; the causes of this phenomenon are analyzed in this paper. The accuracy of the radiochemical analysis was checked by analyzing standard material and duplicated samples. (Author)

  17. [Relationship between Yangtze River floodplain micro ecological environment and distribution of Oncomelania hupensis snails].

    Science.gov (United States)

    Zhao, Jin-song; Wang, An-yun; Zhou, Shu-lin

    2014-04-01

    To explore the relationship between the Yangtze River floodplain micro-ecological environment (vegetation, soil, water and light intensity) and the distribution of Oncomelania hupensis snails, so as to provide evidence for ecological snail control. Three regions (the Lu-Gang Bridge, Dragon Nest Lake in the bund, and Dragon Nest Lake beach) were selected to investigate the plant characteristics (species, height, coverage, frequency and strain of clusters), soil characteristics (temperature, humidity, light intensity) and pH value. All the results were analyzed statistically with SPSS 18 software. A total of 920 boxes were investigated. The vegetation coverage was 3.7%-63.5%, and the dominant population was the Cyperus rotundus L. cluster on the marshland. The soil temperature was 19.0-24.0 degrees C, pH 5.0-5.7, and humidity 53%-75%. There were statistically significant differences in the average numbers of living and dead snails among the 3 groups (P < 0.05). The distribution of O. hupensis snails is related to micro-ecological environment factors, such as vegetation, soil, water and light intensity.

  18. Manipulation of volumetric patient data in a distributed virtual reality environment.

    Science.gov (United States)

    Dech, F; Ai, Z; Silverstein, J C

    2001-01-01

    Due to increases in network speed and bandwidth, distributed exploration of medical data in immersive Virtual Reality (VR) environments is becoming increasingly feasible. The volumetric display of radiological data in such environments presents a unique set of challenges. The sheer size and complexity of the datasets involved not only make them difficult to transmit to remote sites, but these datasets also require extensive user interaction in order to make them understandable to the investigator and manageable to the rendering hardware. A sophisticated VR user interface is required in order for the clinician to focus on the aspects of the data that will provide educational and/or diagnostic insight. We will describe a software system of data acquisition, data display, Tele-Immersion, and data manipulation that supports interactive, collaborative investigation of large radiological datasets. The hardware required in this strategy is still at the high end of the graphics workstation market. Future software ports to Linux and NT, along with the rapid development of PC graphics cards, open the possibility for later work with Linux or NT PCs and PC clusters.

  19. The Usability Analysis of An E-Learning Environment

    Directory of Open Access Journals (Sweden)

    Fulya TORUN

    2015-10-01

    Full Text Available In this research, an e-learning environment was developed for teacher candidates taking the course on Scientific Research Methods. The course contents were adapted to one of the constructivist approach models, referred to as 5E, and expert opinion was obtained on the compliance of this model. A usability analysis was also performed to determine the usability of the e-learning environment. The participants of the research comprised 42 teacher candidates. The mixed method was used in the research. Three different data collection tools were used in order to measure the three basic concepts of usability analysis, which are the dimensions of effectiveness, efficiency and satisfaction. Two of the data collection tools were scales developed by different researchers and were applied with the approval of the researchers involved. The third tool, a usability test, was prepared by the researchers who conducted this study for the purpose of determining the participants' success in handling the twelve tasks assigned to them with respect to the use of the e-learning environment, the seconds they spent in that environment and the number of clicks they performed. Considering the results of the analyses performed on the data obtained, the usability of the developed e-learning environment proved to be at a high rate.

  20. Developing a Black Carbon-Substituted Multimedia Model for Simulating the PAH Distributions in Urban Environments.

    Science.gov (United States)

    Wang, Chunhui; Zhou, Shenglu; He, Yue; Wang, Junxiao; Wang, Fei; Wu, Shaohua

    2017-11-06

    A multimedia fugacity model with spatially resolved environmental phases at an urban scale was developed. In this model, the key parameter, organic matter, was replaced with black carbon (BC), and the model was applied to simulate the distributions of phenanthrene (Phe), pyrene (Pyr) and benzo[a]pyrene (BaP) in Nanjing, China. Based on the estimated emissions and measured inflows of air and water, the Phe, Pyr and BaP concentrations in different environmental media were calculated under steady-state assumptions. The original model (OC-Model), the BC-inclusive model (dual C-Model) and the improved model (BC-Model) were validated by comparing observed and predicted Phe, Pyr and BaP concentrations. Our results suggested that lighter polycyclic aromatic hydrocarbons (PAHs) were more affected by BC substitution than their heavier counterparts. Based on a comparison of the calculated values with observed values from measured and published sources, we advocate including sorption to BC in future multimedia fate models for lighter PAHs. The spatial distributions of the Phe, Pyr and BaP concentrations in all phases were mapped based on the concentrations calculated by the BC-Model, indicating that soil was the dominant sink of PAHs in terrestrial systems, while sediment was the dominant sink in aquatic systems.
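
    The core of such a model is a steady-state fugacity mass balance. The sketch below is a minimal, generic illustration with made-up D-values, losses and emissions for four compartments; it is not the paper's parameterization, which adds BC-specific partitioning and spatial resolution:

```python
# Toy steady-state fugacity balance: four well-mixed compartments with
# hypothetical transfer (D_ij), loss (reaction + advection) and emission
# terms. Solving the linear balance gives compartment fugacities.
import numpy as np

comp = ["air", "water", "soil", "sediment"]
# D[i, j]: intermedia transfer D-value from i to j, mol/(Pa*h), made up.
D = np.array([
    [0, 50, 80, 0],
    [30, 0, 0, 40],
    [10, 20, 0, 0],
    [0, 15, 0, 0],
], dtype=float)
loss = np.array([200.0, 120.0, 60.0, 25.0])   # reaction + advection
E = np.array([10.0, 2.0, 0.5, 0.0])           # emissions, mol/h

# Steady state: E_i + sum_j D[j, i] f_j = f_i * (loss_i + sum_j D[i, j])
A = -D.T.copy()
np.fill_diagonal(A, loss + D.sum(axis=1))
f = np.linalg.solve(A, E)                     # fugacities, Pa
for name, fi in zip(comp, f):
    print(f"{name:8s} f = {fi:.4f} Pa")
```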

  1. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    Science.gov (United States)

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    The increase in complexity of neuronal network models has escalated efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets amongst multiple processors to achieve better hardware performance. On parallel machines for neuronal networks, interprocessor spike exchange consumes a large section of the overall simulation time. In NEURON, the Message Passing Interface (MPI) is used for communication between processors, and the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Increasing the number of processors achieves concurrency and better performance, but it inversely affects MPI_Allgather, which increases the communication time between processors. This necessitates improving the communication methodology to decrease the spike exchange time over distributed memory systems. This work improves the MPI_Allgather method using Remote Memory Access (RMA) by moving two-sided communication to one-sided communication; the use of a recursive doubling mechanism facilitates efficient communication between the processors in precise steps. This approach enhanced communication concurrency and improved overall runtime, making NEURON more efficient for simulation of large neuronal network models.
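
    A minimal sketch of the recursive-doubling exchange pattern, written here with two-sided sendrecv for readability (the paper's contribution additionally moves this exchange to one-sided RMA); it assumes a power-of-two number of ranks and pickled Python objects rather than NEURON's spike buffers:

```python
# Recursive-doubling allgather: in log2(P) steps, each rank exchanges
# its accumulated blocks with an XOR partner, doubling the data held.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

blocks = {rank: f"spikes-from-{rank}"}   # each rank starts with its block
step = 1
while step < size:
    partner = rank ^ step                # exchange with XOR partner
    received = comm.sendrecv(blocks, dest=partner, source=partner)
    blocks.update(received)              # accumulated data doubles
    step <<= 1

# After log2(P) steps every rank holds all P blocks.
assert len(blocks) == size
```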

  2. Inputs and distributions of synthetic musk fragrances in an estuarine and coastal environment; a case study

    International Nuclear Information System (INIS)

    Sumner, Nicola R.; Guitart, Carlos; Fuentes, Gustavo; Readman, James W.

    2010-01-01

    Synthetic musks are ubiquitous contaminants in the environment. Compartmental distributions (dissolved, suspended particle associated and sedimentary) of the compounds throughout an axial estuarine transect and in coastal waters are reported. High concentrations of Galaxolide (HHCB) and Tonalide (AHTN) (987-2098 ng/L and 55-159 ng/L, respectively) were encountered in final effluent samples from sewage treatment plants (STPs) discharging into the Tamar and Plym Estuaries (UK), with lower concentrations of Celestolide (ADBI) (4-13 ng/L), Phantolide (AHMI) (6-9 ng/L), musk xylene (MX) (4-7 ng/L) and musk ketone (MK) (18-30 ng/L). Rapid dilution from the outfalls is demonstrated with resulting concentrations of HHCB spanning from 5 to 30 ng/L and those for AHTN from 3 to 15 ng/L. The other musks were generally not detected in the estuarine and coastal waters. The suspended particulate matter (SPM) and sedimentary profiles and compositions (HHCB:AHTN ratios) generally reflect the distribution in the water column with highest concentrations adjacent to sewage outfalls. - Synthetic musks were determined in coastal environmental compartments along an estuarine transect indicating their ubiquitous occurrence in transitional waters.

  3. Meteorological and Land Surface Properties Impacting Sea Breeze Extent and Aerosol Distribution in a Dry Environment

    Science.gov (United States)

    Igel, Adele L.; van den Heever, Susan C.; Johnson, Jill S.

    2018-01-01

    The properties of sea breeze circulations are influenced by a variety of meteorological and geophysical factors that interact with one another. These circulations can redistribute aerosol particles and pollution and therefore can play an important role in local air quality, as well as impact remote sensing. In this study, we select 11 factors that have the potential to impact either the sea breeze circulation properties and/or the spatial distribution of aerosols. Simulations are run to identify which of the 11 factors have the largest influence on the sea breeze properties and aerosol concentrations and to subsequently understand the mean response of these variables to the selected factors. All simulations are designed to be representative of conditions in coastal subtropical environments and are thus relatively dry; as such, they do not support deep convection associated with the sea breeze front. For this dry sea breeze regime, we find that the background wind speed was the most influential factor for sea breeze propagation, with the soil saturation fraction also being important. For the spatial aerosol distribution, the most important factors were the soil moisture, the sea-air temperature difference, and the initial boundary layer height. The importance of these factors seems to be strongly tied to the development of the surface-based mixed layer both ahead of and behind the sea breeze front. This study highlights potential avenues for further research regarding sea breeze dynamics and the impact of sea breeze circulations on pollution dispersion and remote sensing algorithms.

  4. DLTAP: A Network-efficient Scheduling Method for Distributed Deep Learning Workload in Containerized Cluster Environment

    Directory of Open Access Journals (Sweden)

    Qiao Wei

    2017-01-01

    Full Text Available Deep neural networks (DNNs) have recently yielded strong results on a range of applications. Training these DNNs using a cluster of commodity machines is a promising approach, since training is time consuming and compute-intensive. Furthermore, putting DNN tasks into containers of clusters would enable broader and easier deployment of DNN-based algorithms. Toward this end, this paper addresses the problem of scheduling DNN tasks in a containerized cluster environment. Efficiently scheduling data-parallel computation jobs like DNNs over containerized clusters is critical for job performance, system throughput, and resource utilization, and it becomes even more challenging with complex workloads. We propose a scheduling method called Deep Learning Task Allocation Priority (DLTAP), which makes scheduling decisions in a distributed manner; each scheduling decision takes the aggregation degree of parameter server tasks and worker tasks into account, in particular to reduce cross-node network transmission traffic and, correspondingly, decrease the DNN training time. We evaluate the DLTAP scheduling method using a state-of-the-art distributed DNN training framework on 3 benchmarks. The results show that the proposed method can, on average, reduce cross-node network traffic by 12% and decrease the DNN training time even with a cluster of low-end servers.
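
    To make the aggregation idea concrete, here is a small, hypothetical priority function in the spirit of DLTAP: it scores candidate nodes for a worker task by how many same-job parameter-server and worker tasks they already host, plus a capacity tie-breaker. The field names and weights are invented for illustration, not taken from the paper:

```python
# Hypothetical placement score: prefer nodes already hosting tasks of the
# same job (its parameter server or sibling workers), so gradient traffic
# stays on-node. A sketch, not the paper's exact rule.
def placement_score(node, task, alpha=2.0, beta=1.0):
    """Higher is better; `node` tracks tasks already placed on it."""
    same_job = [t for t in node["tasks"] if t["job"] == task["job"]]
    ps_colocated = sum(1 for t in same_job if t["role"] == "ps")
    workers_colocated = sum(1 for t in same_job if t["role"] == "worker")
    free_cpu = node["cpu_total"] - sum(t["cpu"] for t in node["tasks"])
    if free_cpu < task["cpu"]:
        return float("-inf")             # node cannot host the task
    return alpha * ps_colocated + beta * workers_colocated + 0.01 * free_cpu

nodes = [
    {"name": "n1", "cpu_total": 16,
     "tasks": [{"job": "j1", "role": "ps", "cpu": 4}]},
    {"name": "n2", "cpu_total": 16, "tasks": []},
]
task = {"job": "j1", "role": "worker", "cpu": 4}
best = max(nodes, key=lambda n: placement_score(n, task))
print("place on", best["name"])          # -> n1, next to its PS
```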

  5. Balancing Uplink and Downlink under Asymmetric Traffic Environments Using Distributed Receive Antennas

    Science.gov (United States)

    Sohn, Illsoo; Lee, Byong Ok; Lee, Kwang Bok

    Recently, multimedia services are increasing with the widespread use of various wireless applications such as web browsers, real-time video, and interactive games, which results in traffic asymmetry between the uplink and downlink. Hence, time division duplex (TDD) systems, which provide advantages in efficient bandwidth utilization under asymmetric traffic environments, have become one of the most important issues in future mobile cellular systems. It is known that two types of intercell interference, referred to as crossed-slot interference, additionally arise in TDD systems; the performances of the uplink and downlink transmissions are degraded by BS-to-BS crossed-slot interference and MS-to-MS crossed-slot interference, respectively. The resulting performance unbalance between the uplink and downlink makes network deployment severely inefficient. Previous works have proposed intelligent time slot allocation algorithms to mitigate the crossed-slot interference problem. However, they require centralized control, which causes large signaling overhead in the network. In this paper, we propose to change the shape of the cellular structure itself. The conventional cellular structure is easily transformed into the proposed cellular structure with distributed receive antennas (DRAs). We set up a statistical Markov chain traffic model and analyze the bit error performances of the conventional cellular structure and the proposed cellular structure under asymmetric traffic environments. Numerical results show that the uplink and downlink performances of the proposed cellular structure become balanced with the proper number of DRAs, and thus the proposed cellular structure is notably cost-effective in network deployment compared to the conventional cellular structure. As a result, extending the conventional cellular structure into the proposed cellular structure with DRAs is a remarkably cost-effective solution to support asymmetric traffic environments in future mobile cellular

  6. A Distributed Control System Prototyping Environment to Support Control Room Modernization

    Energy Technology Data Exchange (ETDEWEB)

    Lew, Roger Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas Anthony [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs; however, the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft's Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glasstop digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design. It provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS) specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently have less emphasis on providing functionality to support novel interaction paradigms. Because of WPF's large user base, Microsoft can provide an extremely mature tool. Within process control applications, WPF is

  7. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schema that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built in a way that it is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system along with the technical details that allow synchronization to be performed at a production level.

  8. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    International Nuclear Information System (INIS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-01-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schema that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built in a way that it is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system along with the technical details that allow synchronization to be performed at a production level.

  9. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments, which has more recently been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analysis is that the organizational aspects of the courses are found to be related to the way in which the input elements for the teaching and learning process are constructed.

  10. Implementing GIS regression trees for generating the spatial distribution of copper in Mediterranean environments

    DEFF Research Database (Denmark)

    Bou Kheir, Rania; Greve, Mogens Humlekrog; Deroin, Jean-Paul

    2013-01-01

    coastal area situated in northern Lebanon using a geographic information system (GIS) and regression-tree analysis. The chosen area represents a typical case study of Mediterranean coastal landscape with deteriorating environment. Fifteen environmental parameters (parent material, soil type, p...... number of terminal nodes) combined soil pH and surroundings to waste areas, and explained 77% of the variability in Cu laboratory measurements. The overall accuracy of the predictive quantitative copper map produced using this model (at 1 : 50,000 cartographic scale) was estimated to be ca. 80%. Applying...
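
    A hedged sketch of the regression-tree step, with invented stand-ins for the GIS-derived covariates (the study used fifteen parameters and reported soil pH and proximity to waste areas as the key splits):

```python
# CART-style regression tree predicting topsoil Cu from environmental
# covariates. Feature names and the synthetic response are illustrative
# stand-ins echoing the paper's finding (Cu higher near waste, low pH).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
n = 300
X = np.column_stack([
    rng.uniform(4.5, 8.5, n),      # soil pH
    rng.uniform(0, 5000, n),       # distance to nearest waste area (m)
    rng.integers(0, 4, n),         # parent-material class (coded)
])
y = 60 - 8 * X[:, 0] - 0.006 * X[:, 1] + rng.normal(0, 3, n)

tree = DecisionTreeRegressor(max_leaf_nodes=10, random_state=0).fit(X, y)
print("R^2 on training data:", round(tree.score(X, y), 2))
```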

  11. A cloud based infrastructure to support business process analytics on highly distributed environments

    OpenAIRE

    Vera Baquero, Alejandro

    2015-01-01

    International Mention in the doctoral degree. Business process improvement can drastically influence the profit of corporations and helps them to remain viable in a slowdown economy such as the present one. For companies to remain viable in the face of intense global competition, they must be able to continuously improve their processes in order to respond rapidly to changing business environments. In this regard, the analysis of data plays an important role for improving and optimizing ...

  12. Virtual learning object and environment: a concept analysis

    OpenAIRE

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    ABSTRACT Objective: To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Method: Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive sta...

  13. Decision analysis for cleanup strategies in an urban environment

    International Nuclear Information System (INIS)

    Sinkko, K.; Ikaeheimonen, T.K.

    2000-01-01

    The values entering decisions on protective actions, as they concern society, are multidimensional. People have strong feelings and beliefs about these values, some of which are not numerically quantified and do not exist in monetary form. Decision analysis is applied in planning the recovery operations to clean up an urban environment in the event of a hypothetical nuclear power plant accident, helping to render explicit and apparent all factors involved and to evaluate their relative importance. (author)

  14. Collaborative spatial analysis and modelling in a research environment

    CSIR Research Space (South Africa)

    Naudé, A

    2006-02-01

    Full Text Available of an open-source geoportal and geospatial content management framework (adapted for low-bandwidth environments), customisable spatial analysis workbenches (providing guidance and tools for geoprocesses such as spatial disaggregation) and the formulation... resources and processes. In these two sections, the concept of a knowledge geoportal is introduced. A knowledge geoportal includes the notion of customisable workbenches, aimed at addressing the other key problems seen in Figure 1. Further...

  15. Time-cost analysis of a quantum key distribution system clocked at 100 MHz.

    Science.gov (United States)

    Mo, X F; Lucio-Martinez, I; Chan, P; Healey, C; Hosier, S; Tittel, W

    2011-08-29

    We describe the realization of a quantum key distribution (QKD) system clocked at 100 MHz. The system includes classical postprocessing implemented via software, and is operated over a 12 km standard telecommunication dark fiber in a real-world environment. A time-cost analysis of the sifted, error-corrected, and secret key rates relative to the raw key rate is presented, and the scalability of our implementation with respect to higher secret key rates is discussed.
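
    As a rough illustration of the key-rate bookkeeping involved in such a time-cost analysis, the sketch below walks from a raw rate through sifting to an asymptotic BB84-style secret-key estimate; all numbers are illustrative and the formula is the common asymptotic bound, not the exact accounting of this system:

```python
# Key-rate bookkeeping sketch: raw -> sifted -> secret, with error
# correction and privacy amplification folded into the BB84-style bound
# r = 1 - f_ec*h2(e) - h2(e).
import math

def h2(x):
    """Binary entropy."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

raw_rate = 40_000.0   # raw key bits per second (illustrative)
sift = 0.5            # basis-sifting factor
qber = 0.03           # quantum bit error rate
f_ec = 1.16           # error-correction inefficiency vs Shannon limit

sifted = raw_rate * sift
secret = sifted * max(0.0, 1.0 - f_ec * h2(qber) - h2(qber))
print(f"sifted: {sifted:.0f} b/s, secret: {secret:.0f} b/s "
      f"({secret / raw_rate:.1%} of raw)")
```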

  16. Thermal-hydraulic analysis of Ignalina NPP compartments response to group distribution header rupture using RALOC4 code

    International Nuclear Information System (INIS)

    Urbonavicius, E.

    2000-01-01

    The Accident Localisation System (ALS) of the Ignalina NPP is a containment of the pressure suppression type designed to protect the environment from the dangerous impact of radioactivity. A failure of the ALS could lead to contamination of the environment, and the prescribed public radiation doses could be exceeded. The purpose of the presented analysis is to perform a long-term thermal-hydraulic analysis of the compartments' response to a Group Distribution Header rupture and to verify that the design pressure values are not exceeded. (authors)

  17. EXPLORATORY ANALYSIS OF ORGANIZATIONAL CULTURE IN GALATI COUNTY BUSINESS ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Gabriela GHEORGHE

    2014-06-01

    Full Text Available Based on data collected as part of the COMOR Project, developed by The Scientific Society of Management from Romania for the analysis of organizational culture in the Romanian business environment, we have initiated an exploration using Business Intelligence tools. The purpose of this analysis is to find relevant information about the organizational culture of Galati County, starting from the results obtained in this project, and, using data mining techniques, to investigate relationships and links between responses to different survey questions, in order to "mine" the data gathered by the project team's effort.

  18. Evaluation of Occupational Cold Environments: Field Measurements and Subjective Analysis

    Science.gov (United States)

    OLIVEIRA, A. Virgílio M.; GASPAR, Adélio R.; RAIMUNDO, António M.; QUINTELA, Divo A.

    2014-01-01

    The present work is dedicated to the study of occupational cold environments in food distribution industrial units. Field measurements and a subjective assessment based on an individual questionnaire were considered. The survey was carried out in 5 Portuguese companies. The field measurements covered 26 workplaces, while a sample of 160 responses was considered for the subjective assessment. In order to characterize the level of cold exposure, the Required Clothing Insulation Index (IREQ) was adopted. The IREQ index highlights that in the majority of the workplaces the clothing ensembles worn are inadequate, namely in the freezing chambers, where the protection provided by clothing is always insufficient. The questionnaire results show that the food distribution sector is characterized by a female population (70.6%), by a young workforce (60.7% are less than 35 yr old) and by a population with a medium-length professional career (80.1% in this occupation for less than 10 yr). The incidence of health effects, which is higher among women, the distribution of protective clothing (50.0% of the workers indicate one garment) and the significant percentage of workers (>75%) that have more difficulty performing the activity during the winter represent other important results of the present study. PMID:24583510

  19. Respiratory quinones in Archaea: phylogenetic distribution and application as biomarkers in the marine environment.

    Science.gov (United States)

    Elling, Felix J; Becker, Kevin W; Könneke, Martin; Schröder, Jan M; Kellermann, Matthias Y; Thomm, Michael; Hinrichs, Kai-Uwe

    2016-02-01

    The distribution of respiratory quinone electron carriers among cultivated organisms provides clues on both the taxonomy of their producers and the redox processes these are mediating. Our study of the quinone inventories of 25 archaeal species belonging to the phyla Eury-, Cren- and Thaumarchaeota facilitates their use as chemotaxonomic markers for ecologically important archaeal clades. Saturated and monounsaturated menaquinones with six isoprenoid units forming the alkyl chain may serve as chemotaxonomic markers for Thaumarchaeota. Other diagnostic biomarkers are thiophene-bearing quinones for Sulfolobales and methanophenazines as functional quinone analogues of the Methanosarcinales. The ubiquity of saturated menaquinones in the Archaea in comparison to Bacteria suggests that these compounds may represent an ancestral and diagnostic feature of the Archaea. Overlap between quinone compositions of distinct thermophilic and halophilic archaea and bacteria may indicate lateral gene transfer. The biomarker potential of thaumarchaeal quinones was exemplarily demonstrated on a water column profile of the Black Sea. Both thaumarchaeal quinones and membrane lipids showed similar distributions with maxima at the chemocline. Quinone distributions indicate that Thaumarchaeota dominate respiratory activity at a narrow interval in the chemocline, while they contribute only 9% to the microbial biomass at this depth, as determined by membrane lipid analysis. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.

  20. Geospatial analysis and distribution patterns of home nursing care in a metropolitan area - a large-scale analysis.

    Science.gov (United States)

    Bauer, Jan; Reinhard, Julia; Boll, Michael; Groneberg, David

    2017-01-01

    This study focuses on the distribution of home nursing care in an urban setting in Germany, where a shortage of the nursing care workforce is present. A geospatial analysis was performed to examine distribution patterns at the district level in Frankfurt, Germany (n = 46 districts), and factors influencing the location choice of home nursing care providers (n = 151) were analysed. Furthermore, within the analysis we focused on the population aged over 65 years to model the demand for nursing care. The analysis revealed a tendency of home nursing care providers to be located near the city centre (a centripetal distribution pattern). The demand for care showed more inconsistent patterns, though a centripetal distribution pattern of demand could still be stated. Compared with the control groups (e.g. acute hospitals and pharmacies), similar geographical distribution patterns were present; however, the location of home nursing care providers was less influenced by demand than that of the control groups. The supply of nursing care was unevenly distributed in this metropolitan setting, but still matched the demand for nursing care. Due to the rapidly changing health care environment, policy regulations must be (re-)evaluated critically to improve the management and delivery of nursing care provision. © 2016 John Wiley & Sons Ltd.

  1. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is the programming model for designing auto-scalable distributed computing applications. It provides developers an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface to users for fairly executing requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share the jobs among machines in the MapReduce-based private cloud. Then, we prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that our proposed framework enormously improved the time performance compared with the original package.

  2. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    Directory of Open Access Journals (Sweden)

    Qi Liu

    2016-08-01

    Full Text Available Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collecting and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have been analyzed, and a detailed analysis report is made. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
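
    A small sketch of a two-phase (piecewise-linear) regression of task progress against time, scanning candidate breakpoints and fitting least squares on each side; the data are synthetic, and the paper's exact TPR formulation may differ:

```python
# Two-phase regression: pick the breakpoint that minimizes the total
# squared error of two independently fitted linear segments.
import numpy as np

def two_phase_fit(t, y):
    best = None
    for k in range(2, len(t) - 2):            # candidate breakpoints
        fits, sse = [], 0.0
        for tt, yy in ((t[:k], y[:k]), (t[k:], y[k:])):
            coef = np.polyfit(tt, yy, 1)      # slope, intercept
            sse += float(np.sum((np.polyval(coef, tt) - yy) ** 2))
            fits.append(coef)
        if best is None or sse < best[0]:
            best = (sse, t[k], fits)
    return best  # (total SSE, breakpoint, [phase-1 coef, phase-2 coef])

t = np.arange(20.0)
y = (np.where(t < 12, 3.0 * t, 36.0 + 1.0 * (t - 12))
     + np.random.default_rng(3).normal(0, 0.5, 20))
sse, brk, (c1, c2) = two_phase_fit(t, y)
print(f"breakpoint at t={brk}, phase slopes {c1[0]:.2f} -> {c2[0]:.2f}")
```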

  3. Establishing Functional Relationships between Abiotic Environment, Macrophyte Coverage, Resource Gradients and the Distribution of Mytilus trossulus in a Brackish Non-Tidal Environment.

    Directory of Open Access Journals (Sweden)

    Jonne Kotta

    Full Text Available Benthic suspension-feeding mussels are an important functional guild in coastal and estuarine ecosystems. To date we lack information on how various environmental gradients and biotic interactions separately and interactively shape the distribution patterns of mussels in non-tidal environments. In contrast to tidal environments, mussels inhabit solely the subtidal zone in non-tidal waterbodies, and thereby the driving factors for mussel populations are expected to differ from those in tidal areas. In the present study, we used boosted regression tree modelling (BRT), an ensemble method for statistical techniques and machine learning, in order to explain the distribution and biomass of the suspension-feeding mussel Mytilus trossulus in the non-tidal Baltic Sea. The BRT models suggested that (1) the distribution patterns of M. trossulus are largely driven by separate effects of direct environmental gradients and partly by interactive effects of resource gradients with direct environmental gradients; (2) within its suitable habitat range, however, resource gradients had an important role in shaping the biomass distribution of M. trossulus; (3) contrary to tidal areas, mussels were not competitively superior over macrophytes, with patterns indicating either facilitative interactions between mussels and macrophytes or covariance due to a common stressor. To conclude, direct environmental gradients seem to define the distribution pattern of M. trossulus, and within the favourable distribution range, resource gradients in interaction with direct environmental gradients are expected to set the biomass level of mussels.
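
    A minimal boosted-regression-tree sketch in the spirit of the modelling step, assuming synthetic stand-ins for the environmental and resource gradients (the real study used many more predictors and tuned tree complexity, learning rate and bag fraction):

```python
# Gradient-boosted trees predicting mussel biomass from a few synthetic
# gradients, then inspecting relative influence of each predictor.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
n = 500
salinity = rng.uniform(3, 8, n)
depth = rng.uniform(0, 30, n)
chlorophyll = rng.uniform(1, 12, n)          # resource proxy
X = np.column_stack([salinity, depth, chlorophyll])
biomass = (20 * (salinity > 4.5) * np.exp(-depth / 15) * chlorophyll
           + rng.normal(0, 4, n))

brt = GradientBoostingRegressor(n_estimators=500, learning_rate=0.01,
                                max_depth=3, subsample=0.5).fit(X, biomass)
for name, imp in zip(["salinity", "depth", "chlorophyll"],
                     brt.feature_importances_):
    print(f"{name:12s} relative influence ~ {imp:.2f}")
```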

  4. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  5. Energy loss analysis of an integrated space power distribution system

    Science.gov (United States)

    Kankam, M. David; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of, mainly, distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and the incompletely defined characteristics of a space power system, make the results applicable only to the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, to the extent included here, can constitute part of a planning tool for a space power distribution system.

  6. Stability Analysis for Seed Yield over Environments in Coriander

    Directory of Open Access Journals (Sweden)

    Sangeeta Yadav

    2016-11-01

    Full Text Available Thirty-five genotypes of coriander (Coriandrum sativum L.) were tested in four artificially created environments to judge the stability of their seed yield performance. The differences among genotypes and environments were significant for seed yield. Stability parameters varied considerably among the tested genotypes in all the methods used. The variation in results between methods was due to non-fulfilment of the assumptions of the different models. However, AMMI analysis provides information on main effects as well as interaction effects, and the depiction of PCA scores gives a better understanding of the pattern of genotype-environment interaction. The sums of squares due to the PCAs were also used for the computation of AMMI stability values for a better understanding of the adaptability behaviour of genotypes; hence, the additive main effects and multiplicative interaction (AMMI) model was most appropriate for the analysis of G x E interactions for seed yield in coriander. Genotypes RVC 15, RVC 19, RVC 22, RVC 25 and Panipat local showed wider adaptability, while Simpo S 33 exhibited specific adaptability to favourable conditions of high fertility. These genotypes could be utilized in breeding programmes to transfer the adaptability genes into the high-yielding genetic background of coriander.

  7. Gerald: a general environment for radiation analysis and design

    International Nuclear Information System (INIS)

    Boyle, Ch.; Oliveira, P.I.E. de; Oliveira, C.R.E. de; Adams, M.L.; Galan, J.M.

    2005-01-01

    Full text of publication follows: This paper describes the status of the GERALD interactive workbench for the analysis of radiation transport problems. GERALD basically guides the user through the various steps that are necessary to solve a radiation transport problem, and is aimed at education, research and industry. The advantages of such workbench are many: quality assurance of problem setup, interaction of the user with problem solution, preservation of theory and legacy research codes, and rapid proto-typing and testing of new methods. The environment is of general applicability catering for analytical, deterministic and stochastic analysis of the radiation problem and is not tied to one specific solution method or code. However, GERALD is being developed as a portable, modular, open source framework which renders itself quite naturally to the coupling of existing computational tools through specifically developed plug-ins. By offering a common route for setting up, solving and analyzing radiation transport problems GERALD offers the possibility of methods intercomparison and validation. Such flexible radiation transport environment will also facilitate the coupling of radiation physics methods to other physical phenomena and their application to other areas of application such as medical physics and the environment. (authors)

  8. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  9. Distribution and linkage disequilibrium analysis of polymorphisms of ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 95; Issue 1. Distribution and linkage disequilibrium analysis of polymorphisms of GH1 gene in different populations of pigs associated with body size. Yunyun Cheng Songcai Liu Dan Su Chao Lu Xin Zhang Qingyan Wu Siming Li Haoyu Fu Hao Yu Linlin Hao. Research Article ...

  10. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
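
    To make the THD step above concrete, here is a small self-contained sketch (not the paper's code) that computes voltage THD from per-order harmonic magnitudes such as those a harmonic propagation solution would produce.

        import math

        def voltage_thd(v_h):
            """THD of a bus voltage as a fraction of the fundamental,
            given a dict mapping harmonic order to voltage magnitude."""
            v1 = v_h[1]
            return math.sqrt(sum(v * v for h, v in v_h.items() if h > 1)) / v1

        # Example: 1.0 pu fundamental with small 5th and 7th harmonics.
        thd = voltage_thd({1: 1.0, 5: 0.04, 7: 0.03})
        print(f"THD = {100 * thd:.2f} %")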

  11. Distribution and linkage disequilibrium analysis of polymorphisms of ...

    Indian Academy of Sciences (India)

    gene are associated with the body size of pigs providing genetic basis for pig breeding with the improved economic benefits. [Cheng Y., Liu S., Su D., Lu C., Zhang X., Wu Q., Li S., Fu H., Yu H. and Hao L. 2016 Distribution and linkage disequilibrium analysis of polymorphisms of GH1 gene in different populations of pigs ...

  12. Distribution and diversity of Verrucomicrobia methanotrophs in geothermal and acidic environments.

    Science.gov (United States)

    Sharp, Christine E; Smirnova, Angela V; Graham, Jaime M; Stott, Matthew B; Khadka, Roshan; Moore, Tim R; Grasby, Stephen E; Strack, Maria; Dunfield, Peter F

    2014-06-01

    Recently, methanotrophic members of the phylum Verrucomicrobia have been described, but little is known about their distribution in nature. We surveyed methanotrophic bacteria in geothermal springs and acidic wetlands via pyrosequencing of 16S rRNA gene amplicons. Putative methanotrophic Verrucomicrobia were found in samples covering a broad temperature range (22.5-81.6°C), but only in acidic conditions (pH 1.8-5.0) and only in geothermal environments, not in acidic bogs or fens. Phylogenetically, three 16S rRNA gene sequence clusters of putative methanotrophic Verrucomicrobia were observed. Those detected in high-temperature geothermal samples (44.1-81.6°C) grouped with known thermoacidophilic 'Methylacidiphilum' isolates. A second group dominated in moderate-temperature geothermal samples (22.5-40.1°C) and a representative mesophilic methanotroph from this group was isolated (strain LP2A). Genome sequencing verified that strain LP2A possessed particulate methane monooxygenase, but its 16S rRNA gene sequence identity to 'Methylacidiphilum infernorum' strain V4 was only 90.6%. A third group clustered distantly with known methanotrophic Verrucomicrobia. Using pmoA-gene-targeted quantitative polymerase chain reaction, two geothermal soil profiles showed a dominance of LP2A-like pmoA sequences in the cooler surface layers and 'Methylacidiphilum'-like pmoA sequences in deeper, hotter layers. Based on these results, there appears to be a thermophilic group and a mesophilic group of methanotrophic Verrucomicrobia. However, both were detected only in acidic geothermal environments. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.

  13. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050.

  14. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and further cause substantial drops in performance. At the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly or inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.

  15. Analysis on Space Environment from the Anomalies of Geosynchronous Satellites

    Directory of Open Access Journals (Sweden)

    Jaejin Lee

    2009-12-01

    Full Text Available While it is well known that the space environment can produce spacecraft anomalies, attributing specific space environment effects to each anomaly is difficult. This is because spacecraft anomalies show various symptoms and reproducing them is impossible. In this study, we try to identify the conditions under which spacecraft failures happen more frequently and give satellite operators useful information. In particular, our study focuses on geosynchronous satellites, which are costly and require high reliability. We used satellite anomaly data from Satellite News Digest, an internet newspaper covering space industry news. In our analysis, the 88 anomaly cases that occurred from 1997 to 2008 show poor correlation with the Kp index. Satellite malfunctions were more likely to happen in spring and fall, and at local times from midnight to dawn. In addition, we found that the probability of an anomaly increases when the high-energy electron flux is high. This appears more clearly in solar minimum than in solar maximum periods.

  16. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, ranging from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides information for preventing, detecting and correcting defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and the associated reliability activities. Reliability data of systems, structures and components (SSCs) in nuclear power plants is a key factor in probabilistic safety assessment (PSA), reliability-centred maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in nuclear power plants, and an example is given to present the results of the new method. The Weibull distribution fits reliability data for mechanical equipment in nuclear power plants very well, and it is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)
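
    As a hedged illustration of two- versus three-parameter Weibull fitting on reliability data, using synthetic failure times rather than any plant dataset, and not the paper's improved variant, consider the following SciPy sketch.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        # Synthetic times to failure (hours) with a 500 h failure-free period.
        ttf = 500.0 + 2000.0 * rng.weibull(1.8, size=200)

        # Two-parameter fit: location (failure-free time) pinned at zero.
        c2, loc2, scale2 = stats.weibull_min.fit(ttf, floc=0.0)
        # Three-parameter fit: shape, location and scale all estimated.
        c3, loc3, scale3 = stats.weibull_min.fit(ttf)

        for label, params in (("2-parameter", (c2, loc2, scale2)),
                              ("3-parameter", (c3, loc3, scale3))):
            ll = stats.weibull_min.logpdf(ttf, *params).sum()
            print(f"{label}: shape={params[0]:.2f}, loc={params[1]:.1f}, "
                  f"scale={params[2]:.1f}, log-likelihood={ll:.1f}")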

  17. Improved gap filling method based on singular spectrum analysis and its application in space environment

    Science.gov (United States)

    Li, Xiangzhen; Liu, Shuai; Li, Zhi; Gong, Jiancun

    2017-11-01

    Missing data are a common phenomenon in space environment measurements, and they impact or even block subsequent model building, prediction and posterior analysis. To fill these data gaps, an improved filling method based on iterative singular spectrum analysis is proposed. It first extracts a distribution array of the gaps and then fills the gaps using all known data. The distribution array is utilized to generate the test sets for cross validation. The embedding window length and principal components are determined by the discrete particle swarm optimization algorithm in a noncontinuous fashion. The effectiveness and adaptability of the filling method are demonstrated by tests on solar wind data and geomagnetic indices from different solar activity years.
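
    The record gives no implementation, so here is a minimal iterative-SSA gap-filling sketch under stated assumptions: the window length and component count are fixed by hand, whereas the paper selects them with discrete particle swarm optimization.

        import numpy as np

        def ssa_reconstruct(x, window, n_components):
            """Rank-truncated SSA reconstruction of a complete 1-D series."""
            n = len(x)
            k = n - window + 1
            # Trajectory (Hankel) matrix of lagged windows.
            traj = np.column_stack([x[i:i + window] for i in range(k)])
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
            # Diagonal averaging (Hankelization) back to a series.
            recon = np.zeros(n)
            counts = np.zeros(n)
            for j in range(k):
                recon[j:j + window] += approx[:, j]
                counts[j:j + window] += 1
            return recon / counts

        def fill_gaps(x, window=30, n_components=4, n_iter=25):
            x = np.asarray(x, dtype=float)
            gaps = np.isnan(x)
            filled = np.where(gaps, np.nanmean(x), x)   # crude initial guess
            for _ in range(n_iter):
                recon = ssa_reconstruct(filled, window, n_components)
                filled[gaps] = recon[gaps]              # update only the gaps
            return filled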

  18. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a stochastic integro-differential equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of an exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of first passage time (FPT) with an exponential delay kernel, the model has been transformed to a system of coupled stochastic differential equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the inter-spike interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure with or without damped oscillatory behavior, depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with the memory kernel having the form of a Gamma distribution. In contrast to the fast decay of the damped oscillations of the ISI distribution for the model with a weak delay kernel, the decay of the damped oscillations is found to be slower for the model with a strong delay kernel.
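
    As a sketch of the model-selection measure mentioned above, computed on hypothetical ISI samples rather than the paper's simulations, the Jensen-Shannon divergence between two binned empirical ISI distributions can be obtained as follows.

        import numpy as np
        from scipy.spatial.distance import jensenshannon

        rng = np.random.default_rng(1)
        # Hypothetical ISI samples (seconds) from two competing models.
        isi_delay = rng.gamma(shape=2.0, scale=0.05, size=5000)
        isi_lif = rng.exponential(scale=0.1, size=5000)

        # Bin both samples onto a common support.
        bins = np.linspace(0.0, 0.6, 61)
        p, _ = np.histogram(isi_delay, bins=bins, density=True)
        q, _ = np.histogram(isi_lif, bins=bins, density=True)

        # SciPy returns the JS distance, i.e. the square root of the divergence.
        js_div = jensenshannon(p, q, base=2) ** 2
        print(f"Jensen-Shannon divergence: {js_div:.4f}")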

  19. Client-server programs analysis in the EPOCA environment

    Science.gov (United States)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

    Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer has first to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.

  20. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner, with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the DG optimal size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results obtained, which are impressive and computationally efficient, were evaluated against an existing similar package cited in the literature.
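
    PFADOT's internals are not given in the record; as background, the following is a generic backward/forward-sweep load-flow sketch for a small hypothetical radial feeder, the textbook method such radial load-flow tools typically build on.

        import numpy as np

        # Hypothetical 4-bus radial feeder: branch b feeds bus b+1 from parent[b].
        parent = [0, 1, 2]                        # bus 1<-0, 2<-1, 3<-2
        z = np.array([0.02 + 0.04j] * 3)          # branch impedances (pu)
        s_load = np.array([0, 0.8 + 0.3j, 0.6 + 0.2j, 0.5 + 0.2j])  # loads (pu)

        v = np.ones(4, dtype=complex)             # flat start, slack at bus 0
        for _ in range(30):
            i_bus = np.conj(s_load / v)           # load current injections
            i_branch = np.zeros(3, dtype=complex)
            # Backward sweep: accumulate branch currents from the feeder end.
            for b in reversed(range(3)):
                i_branch[b] = i_bus[b + 1] + sum(
                    i_branch[k] for k in range(3) if parent[k] == b + 1)
            # Forward sweep: update voltages outward from the slack bus.
            for b in range(3):
                v[b + 1] = v[parent[b]] - z[b] * i_branch[b]

        print("bus voltage magnitudes (pu):", np.abs(v).round(4))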

  1. Uranium distribution and sandstone depositional environments: oligocene and upper Cretaceous sediments, Cheyenne basin, Colorado

    International Nuclear Information System (INIS)

    Nibbelink, K.A.; Ethridge, F.G.

    1984-01-01

    Wyoming-type roll-front uranium deposits occur in the Upper Cretaceous Laramie and Fox Hills sandstones in the Cheyenne basin of northeastern Colorado. The location, geometry, and trend of specific depositional environments of the Oligocene White River and the Upper Cretaceous Laramie and Fox Hills formations are important factors that control the distribution of uranium in these sandstones. The Fox Hills Sandstone consists of up to 450 ft (140 m) of nearshore marine wave-dominated delta and barrier island-tidal channel sandstones which overlie offshore deposits of the Pierre Shale and which are overlain by delta-plain and fluvial deposits of the Laramie Formation. Uranium, which probably originated from volcanic ash in the White River Formation, was transported by groundwater through the fluvial-channel deposits of the White River into the sandstones of the Laramie and Fox Hills formations where it was precipitated. Two favorable depositional settings for uranium mineralization in the Fox Hills Sandstone are: (1) the landward side of barrier-island deposits where barrier sandstones thin and interfinger with back-barrier organic mudstones, and (2) the intersection of barrier-island and tidal channel sandstones. In both settings, sandstones were probably reduced during early burial by diagenesis of contained and adjacent organic matter. The change in permeability trends between the depositional strike-oriented barrier sandstones and the dip-oriented tidal-channel sandstones provided sites for dispersed groundwater flow and, as demonstrated in similar settings in other depositional systems, sites for uranium mineralization

  2. Providing QoS for Networked Peers in Distributed Haptic Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alan Marshall

    2008-01-01

    Full Text Available Haptic information originates from a different human sense (touch), therefore the quality of service (QoS) required to support haptic traffic is significantly different from that used to support conventional real-time traffic such as voice or video. Each type of network impairment has different (and severe) impacts on the user's haptic experience. There has been no specific provision of QoS parameters for haptic interaction. Previous research into distributed haptic virtual environments (DHVEs) has concentrated on synchronization of positions (haptic device or virtual objects), and is based on client-server architectures. We present a new peer-to-peer DHVE architecture that further extends this to enable force interactions between two users, whereby force data are sent to the remote peer in addition to positional information. The work presented involves both simulation and practical experimentation where multimodal data is transmitted over a QoS-enabled IP network. Both forms of experiment produce consistent results which show that the use of specific QoS classes for haptic traffic will reduce network delay and jitter, leading to improvements in users' haptic experiences with these types of applications.

  3. Distribution of groundwater nitrate contamination in GIS environment: A case study, Sonqor plain

    Directory of Open Access Journals (Sweden)

    Parasto Setareh

    2014-06-01

    Full Text Available Background: Nitrate is a pollutant of groundwater resources which, at higher concentration limits, can result in health risks such as methemoglobinemia and the formation of nitrosamine compounds. The present study aimed to determine nitrate levels, causes of pollution and the zonation of nitrate concentration in drinking water resources in the villages of Sonqor. Methods: In this descriptive-analytical study, 73 samples from all groundwater resources of the Sonqor plain were taken in the high-water (March 2010) and low-water (September 2011) periods. Water nitrate levels were then determined by spectrophotometry. Results were compared against national standards and analyzed with SPSS and Arcview GIS 9.3 software. Finally, concentration distribution mapping was carried out in a GIS environment and the factors affecting nitrate changes were analyzed. Results: Nitrate concentrations in the water resources of the Sonqor plain ranged from 3.09 to 88.5 mg per liter. In one station, nitrate concentrations in the high-water (88.5 mg/liter) and low-water (71.4 mg/liter) seasons were higher than the maximum permissible limit. Based on the maps, relatively high nitrate concentrations were observed in the eastern and southeastern regions. Conclusion: The findings indicated an inverse correlation between nitrate concentration changes and changes in static surface depth. The low thickness of the alluvium, the location of wells downstream of farmlands, the farming conditions of the region, nitrate leaching from agricultural soils and the wide application of nitrogen fertilizers in agriculture were considered the causes of the pollution at that station.

  4. Concentration and distribution of 14C in aquatic environment around Qinshan nuclear power plant

    International Nuclear Information System (INIS)

    Wang Zhongtang; Guo Qiuju; Hu Dan; Xu Hong

    2015-01-01

    In order to study the concentration and distribution of 14C in the aquatic environment in the vicinity of the Qinshan Nuclear Power Plant (NPP) after twenty years' operation, an apparatus extracting dissolved inorganic carbon from water was set up and applied to pretreat the water samples collected around Qinshan NPP. The 14C concentration was measured by accelerator mass spectrometry (AMS). The results show that the 14C specific activities in surface seawater samples range from 196.8 to 206.5 Bq/kg ((203.4 ± 5.6) Bq/kg on average), which is close to the background. The 14C concentrations in cooling water discharged from Qinshan NPP are close to the 14C values in near-shore seawater samples outside the liquid radioactive effluent discharge period. It can be further concluded that the 14C discharged previously is diluted and diffused well, and no 14C enrichment in seawater is found. Also, no obvious increment in the 14C specific activities of surface water and underground water samples is found between the Qinshan NPP region and the reference region. (authors)

  5. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.

  6. The distribution of soiling by coarse particulate matter in the museum environment.

    Science.gov (United States)

    Yoon, Y H; Brimblecombe, P

    2001-12-01

    Soiling measurements are needed to address strategies to control dust and determine its sources. There is no widely recognized method for dust monitoring in museums, but we used sticky samplers to collect deposited coarse particulate matter, and both manual microscopic observations and image analysis for determining soiling potential in the museum environment. We adopt the fractional area covered by deposited particles as a surrogate for soiling and the covering rate (unit: s⁻¹) as a measure of the rate of soiling. It was clear that visitor flow was a major contributor to soiling, such that soiling mechanisms in different museums could be compared after measurements were normalised on a per capita basis. The proximity of visitors to objects was another important factor, with soiling declining with distance from visitor pathways (a half-distance of about 0.5 m), which suggests soiling of objects on open display could be reduced by increasing the distance from visitors.

  7. Nuclear reactor decommissioning: an analysis of the regulatory environments

    Energy Technology Data Exchange (ETDEWEB)

    Cantor, R.

    1986-08-01

    In the next several decades, the electric utility industry will be faced with the retirement of 50,000 megawatts (MW) of nuclear capacity. Responsibility for the financial and technical burdens this activity entails has been delegated to the utilities operating the reactors. However, the operators will have to perform the tasks of reactor decommissioning within the regulatory environment dictated by federal, state and local regulations. The purpose of this study was to highlight some of the current and likely trends in regulations and regulatory practices that will significantly affect the costs, technical alternatives and financing schemes encountered by the electric utilities and their customers. To identify significant trends and practices among regulatory bodies and utilities, a review of these factors was undertaken at various levels in the regulatory hierarchy. The technical policies were examined in reference to their treatment of allowed technical modes, restoration of the plant site including any specific recognition of the residual radioactivity levels, and planning requirements. The financial policies were examined for specification of acceptable financing arrangements, mechanisms which adjust for changes in the important parameters used to establish the fund, tax and rate-base treatments of the payments to and earnings on the fund, and whether or not escalation and/or discounting were considered in the estimates of decommissioning costs. The attitudes of regulators toward financial risk, the tax treatment of the decommissioning fund, and the time distribution of the technical mode were found to have the greatest effect on the discounted revenue requirements. Under plausible assumptions, the cost of a highly restricted environment is about seven times that of the minimum revenue requirement environment for the plants that must be decommissioned in the next three decades.

  8. Nuclear reactor decommissioning: an analysis of the regulatory environments

    International Nuclear Information System (INIS)

    Cantor, R.

    1986-08-01

    In the next several decades, the electric utility industry will be faced with the retirement of 50,000 megawatts (MW) of nuclear capacity. Responsibility for the financial and technical burdens this activity entails has been delegated to the utilities operating the reactors. However, the operators will have to perform the tasks of reactor decommissioning within the regulatory environment dictated by federal, state and local regulations. The purpose of this study was to highlight some of the current and likely trends in regulations and regulatory practices that will significantly affect the costs, technical alternatives and financing schemes encountered by the electric utilities and their customers. To identify significant trends and practices among regulatory bodies and utilities, a review of these factors was undertaken at various levels in the regulatory hierarchy. The technical policies were examined in reference to their treatment of allowed technical modes, restoration of the plant site including any specific recognition of the residual radioactivity levels, and planning requirements. The financial policies were examined for specification of acceptable financing arrangements, mechanisms which adjust for changes in the important parameters used to establish the fund, tax and rate-base treatments of the payments to and earnings on the fund, and whether or not escalation and/or discounting were considered in the estimates of decommissioning costs. The attitudes of regulators toward financial risk, the tax treatment of the decommissioning fund, and the time distribution of the technical mode were found to have the greatest effect on the discounted revenue requirements. Under plausible assumptions, the cost of a highly restricted environment is about seven times that of the minimum revenue requirement environment for the plants that must be decommissioned in the next three decades.

  9. RANGE AND DISTRIBUTION OF TECHNETIUM KD VALUES IN THE SRS SUBSURFACE ENVIRONMENT

    International Nuclear Information System (INIS)

    Kaplan, D.

    2008-01-01

    Performance assessments (PAs) are risk calculations used to estimate the amount of low-level radioactive waste that can be disposed of at DOE sites. Distribution coefficients (Kd values) are input parameters used in PA calculations to provide a measure of radionuclide sorption to sediment; the greater the Kd value, the greater the sorption and the slower the estimated movement of the radionuclide through sediment. Understanding and quantifying Kd value variability is important for estimating the uncertainty of PA calculations. Without this information, it is necessary to make overly conservative estimates about the possible limits of Kd values, which in turn may increase disposal costs. Finally, technetium is commonly found to be amongst the radionuclides posing potential risk at waste disposal locations because it is believed to be highly mobile in its anionic form (pertechnetate, TcO4-), it exists in relatively high concentrations in SRS waste, and it has a long half-life (213,000 years). The objectives of this laboratory study were to determine, under SRS environmental conditions: (1) whether and to what extent TcO4- sorbs to sediments, (2) the range of Tc Kd values, (3) the distribution (normal or log-normal) of Tc Kd values, and (4) how strongly Tc sorbs to SRS sediments, through desorption experiments. Objective 3, identifying the Tc Kd distribution, is important because it provides a statistical description that influences stochastic modeling of estimated risk. The approach taken was to collect 26 sediments from a non-radioactive sediment core collected from E-Area, measure Tc Kd values and then perform statistical analysis to describe the measured Tc Kd values. The mean Kd value was 3.4 ± 0.5 mL/g and ranged from -2.9 to 11.2 mL/g. The data did not have a normal distribution (as defined by the Shapiro-Wilk statistic) and had a 95-percentile range of 2.4 to 4.4 mL/g. The E-Area subsurface is subdivided into three hydrostratigraphic
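
    The distributional check described above can be illustrated with SciPy's Shapiro-Wilk test; the Kd values below are synthetic stand-ins, not the study's measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        kd = rng.lognormal(mean=1.1, sigma=0.35, size=26)   # 26 sediments (mL/g)

        w_raw, p_raw = stats.shapiro(kd)            # test normality of Kd
        w_log, p_log = stats.shapiro(np.log(kd))    # test log-normality of Kd
        print(f"normal fit:     W={w_raw:.3f}, p={p_raw:.3f}")
        print(f"log-normal fit: W={w_log:.3f}, p={p_log:.3f}")

        # Central 95% range of the sampled Kd values.
        lo, hi = np.percentile(kd, [2.5, 97.5])
        print(f"95% range: {lo:.1f} to {hi:.1f} mL/g")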

  10. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, AgnèS.

    2003-06-01

    We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, over a large range of rockfall sizes (10² to 10¹⁰ m³), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution of rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes. This way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
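
    As a hedged sketch of how such a power-law exponent can be estimated, the following applies the standard maximum-likelihood (Clauset-style) estimator to synthetic volumes above a cutoff; the catalogs analyzed in the paper are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)
        v_min = 1e2                         # cutoff volume (m^3)
        b_true = 0.5                        # cumulative-distribution exponent

        # Draw volumes with P(V > v) = (v / v_min)**(-b) by inverse transform.
        volumes = v_min * rng.uniform(size=2000) ** (-1.0 / b_true)

        # MLE for the density exponent alpha in p(v) ~ v**(-(1 + b)).
        n = len(volumes)
        alpha_hat = 1.0 + n / np.sum(np.log(volumes / v_min))
        b_hat = alpha_hat - 1.0
        se = (alpha_hat - 1.0) / np.sqrt(n)   # standard error of alpha
        print(f"b estimate: {b_hat:.3f} +/- {se:.3f} (true {b_true})")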

  11. Harmonic Analysis of Radial Distribution Systems Embedded Shunt Capacitors

    Directory of Open Access Journals (Sweden)

    Abdallah Elsherif

    2017-03-01

    Full Text Available Harmonic analysis is an important application for the analysis and design of distribution systems. It is used to quantify the distortion in voltage and current waveforms at various buses of a distribution system. Such analysis has become more and more important as the presence of harmonic-producing equipment increases. As harmonics propagate through a system, they result in increased power losses and possible equipment loss-of-life, and equipment might be damaged by overloads resulting from resonant amplifications. There are a large number of harmonic analysis methods in widespread use. The most popular of these are frequency scans, harmonic penetration and harmonic power flow. Current source (or current injection) methods are the most popular forms of such harmonic analyses. These methods make use of the admittance matrix inverse, which is computationally demanding and may be singular in some cases for radial distributors. Therefore, in this paper, a new fast harmonic load flow method is introduced. The introduced method is designed to save the computational time required for the admittance matrix formation used in current injection methods, and it can overcome the singularity problems that appear in the conventional methods. Applying the introduced harmonic load flow method to harmonic-polluted distribution systems with embedded shunt capacitors, which are commonly used for loss minimization and voltage enhancement, it is found that a shunt capacitor can maximize or minimize the system total harmonic distortion (THD) depending on its size and connection point. Therefore, in this paper, a new multi-objective particle swarm optimization (MOPSO) method for optimal capacitor placement on harmonic-polluted distribution systems is also introduced. The obtained results verify the effectiveness of the introduced MOPSO algorithm for voltage THD minimization, power losses minimization and voltage enhancement of radial

  12. Plutonium distribution in the environs of the Nevada Test Site: status report

    International Nuclear Information System (INIS)

    Bliss, W.A.; Jakubowski, F.M.

    1977-01-01

    Over 6,000 soil samples from Areas 1, 4, and 5 of the Nevada Test Site were Ge(Li) scanned for Am-241 and other gamma-emitting radionuclides. The results indicate only about 5% contained Am-241 in quantities above the detection limit (1 pCi/g of Am-241). In addition, approximately 10% of the samples were subjected to wet chemistry analysis for Pu-239. The results indicate all contained Pu-239 in quantities above the detection limit (0.01 pCi/g of Pu-239). The largest concentrations of Am-241 and Pu-239 appear in the Hamilton and Small Boy sites of Area 5. The Hamilton soil data may be sufficient to estimate both an inventory and a distribution of plutonium over most of the site, but the data from the majority of the other sites are sufficient only to set upper limits of plutonium inventories

  13. How is an analysis of the enterprise economic environment done?

    Directory of Open Access Journals (Sweden)

    Carlos Parodi Trece

    2015-09-01

    Full Text Available The proper interpretation of the evolution of the economy is a useful tool for improving corporate decision-making. The aim of this paper is to explain, with examples applied to the current economic reality, how an analysis of the economic environment is made and how it serves corporate strategic planning. To do this, after an explanation of the overall conceptual framework, gross domestic product, inflation and the external deficit are defined as key indicators in the "language" used by analysts. These are related economic indicators that depend on domestic policy and on exogenous shocks, defined as events that are out of the hands of economic policy designers but that influence the three variables, such as the international financial crisis. Next, these indicators are formalized through macroeconomic identities, in order to finally explain "how the economy is doing" through them, and the relationship between the internal and the external economic environment. Bringing the economy closer to business should be a priority in an increasingly integrated context, characterized by the fact that both the positive and the negative sides of what happens in the world economy are transmitted, through various channels, to companies located in different countries. Hence, companies should broaden their vision, as the economic environment does not only include what happens within the country, but also the future of the world economy.

  14. Modular Environment for Graph Research and Analysis with a Persistent

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-18

    The MEGRAPHS software package provides a front-end to graphs and vectors residing on special-purpose computing resources. It allows these data objects to be instantiated, destroyed, and manipulated. A variety of primitives needed for typical graph analyses are provided. An example program illustrating how MEGRAPHS can be used to implement a PageRank computation is included in the distribution. The MEGRAPHS software package is targeted towards developers of graph algorithms. Programmers using MEGRAPHS would write graph analysis programs in terms of high-level graph and vector operations. These computations are transparently executed on the Cray XMT compute nodes.
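
    MEGRAPHS' own API is not shown in the record, so the following is a plain power-iteration PageRank sketch of the kind its bundled example presumably implements, written against a simple adjacency list rather than the package's graph objects.

        import numpy as np

        def pagerank(adj, damping=0.85, tol=1e-10):
            """adj[i] lists the nodes that node i links to."""
            n = len(adj)
            rank = np.full(n, 1.0 / n)
            while True:
                new = np.full(n, (1.0 - damping) / n)
                for i, targets in enumerate(adj):
                    if targets:
                        new[targets] += damping * rank[i] / len(targets)
                    else:                 # dangling node: spread rank evenly
                        new += damping * rank[i] / n
                if np.abs(new - rank).sum() < tol:
                    return new
                rank = new

        # Tiny example graph: 0 -> {1, 2}, 1 -> {2}, 2 -> {0}.
        print(pagerank([[1, 2], [2], [0]]).round(4))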

  15. Number size distribution measurements of biological aerosols under contrasting environments and seasons from southern tropical India

    Science.gov (United States)

    Valsan, Aswathy; Cv, Biju; Krishna, Ravi; Huffman, Alex; Poschl, Ulrich; Gunthe, Sachin

    2016-04-01

    The locations exhibited varied peaks in the size range of 1-3 μm, indicating different sources of biological particles. In Chennai the number size distribution was observed to be bimodal, with peaks at 2.5 μm and 1.1 μm, whereas in Munnar the number size distribution was monomodal during the monsoon, with a prominent peak at 3 μm, and bimodal during winter, with peaks at 1.5 μm and 2.8 μm. The detailed analysis also revealed distinct features and trends in the diurnal patterns of FBAP at the contrasting locations during distinct seasons. Further details about the measurements will be presented.

  16. Environment

    Science.gov (United States)

    2005-01-01

    biodiversity. Consequently, the major environmental challenges facing us in the 21st century include: global climate change, energy, population and food... technological prowess, and security interests. Challenges Global Climate Change – Evidence shows that our environment and the global climate... urbanization will continue to pressure the regional environment. Although most countries have environmental protection ministries or agencies, a lack of

  17. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  18. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets, a notion sketched in the example below. It is efficient because it identifies only the necessary and sufficient failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. Its computational efficiency is justified by comparing the computational effort with the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate the system inequalities useful in the reliability estimation of capacitated networks.
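
    To make the cutset notion concrete, here is a brute-force sketch that enumerates minimal source-to-sink cutsets of a tiny example network; the paper's algorithm is far more efficient, and this only illustrates the object being computed.

        from itertools import combinations

        import networkx as nx

        g = nx.Graph([("s", "a"), ("s", "b"), ("a", "t"), ("b", "t"), ("a", "b")])

        def minimal_cutsets(graph, src, dst):
            edges = list(graph.edges())
            found = []
            for r in range(1, len(edges) + 1):
                for cut in combinations(edges, r):
                    # Skip supersets of an already-found (smaller) cutset.
                    if any(set(prev) <= set(cut) for prev in found):
                        continue
                    h = graph.copy()
                    h.remove_edges_from(cut)
                    if not nx.has_path(h, src, dst):
                        found.append(cut)
            return found

        for cut in minimal_cutsets(g, "s", "t"):
            print(sorted(cut))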

  19. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  20. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  1. Cost Analysis In A Multi-Mission Operations Environment

    Science.gov (United States)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short-duration missions; and have included long-duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and sizes and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology but will be executed using future technology. Finally, the cost analysis needed to consider how to validate the resulting cost models, taking into account the non-homogeneous nature of the available cost data and the

  2. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    OpenAIRE

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment Programme (UNEP), Nairobi, Kenya; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050. The study was carried out in support of the Agenda 21 interim evaluation, five years after 'Rio' and ten years after 'Brundtland'. The scenario analysis is based on only one scenario, Conventional...

  3. Distribution of genes related to antimicrobial resistance in different oral environments: a systematic review.

    Science.gov (United States)

    Moraes, Ludmila Coutinho; Só, Marcus Vinícius Reis; Dal Pizzol, Tatiane da Silva; Ferreira, Maria Beatriz Cardoso; Montagner, Francisco

    2015-04-01

    The oral cavity is the main source of microorganisms for odontogenic infections. It is important to perform an extensive analysis of the reports on the presence of bacteria that carry genes for resistance to antimicrobial agents. The aim of the study was to verify the reports on the distribution of genes associated with resistance to antibiotics prescribed in dentistry in different human oral sites. A systematic review was conducted in electronic databases and gray literature to analyze clinical studies that detected genes of bacterial resistance to antibiotics in saliva, supragingival biofilm, and endodontic infections. Data regarding the research group, geographic location, sample source, number of subjects, methods for sample analysis, the targeted gene groups, and the detection rates were collected. Descriptive data analysis was performed. Preliminary analysis was performed on 152 titles; 50 abstracts were reviewed, and 29 full texts were obtained. Nine articles matched the inclusion criteria (saliva = 2, supragingival biofilm = 1, and endodontic infections = 6). The presence of 33 different targeted genes was evaluated. The most frequently investigated groups of genes were those for tetracycline and β-lactam resistance (tetM, tetQ, tetW, and cfxA). There was a wide range in the detection rates of each resistance gene among studies and for each specific gene group. This systematic review highlights the presence of resistance genes to antimicrobial agents in saliva, dental biofilm, and endodontic infections, especially for tetracyclines and β-lactams. There is a lack of reports on the presence of genes and the resulting outcomes obtained through therapeutic approaches for infection control. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  4. Cross-platform validation and analysis environment for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    2017-11-01

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.

  5. R AS THE ENVIRONMENT FOR DATA ANALYSIS IN PSYCHOLOGICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    Ana María Ruiz-Ruano

    2016-01-01

    Full Text Available R is a free computing environment for statistical data analysis and graph creation. It is becoming a key tool in a wide range of knowledge domains. The interaction with the software is mainly based on a command line interface but efforts are currently being made to develop friendlier graphical user interfaces that will help novice users to become familiar with this programming language. R is a flexible and powerful system thanks to the development community that is working together to improve its capabilities. As a result, it is the chosen statistical software in many applied and basic contexts. This paper highlights the potential usefulness of R for psychological assessment and related areas. Additionally, the relevance of statistical data analysis is emphasised as an instrument that will boost the progress of psychology under the umbrella of the scientific method.

  6. TREEFINDER: a powerful graphical analysis environment for molecular phylogenetics

    Directory of Open Access Journals (Sweden)

    von Haeseler Arndt

    2004-06-01

    Full Text Available Abstract Background Most analysis programs for inferring molecular phylogenies are difficult to use, in particular for researchers with little programming experience. Results TREEFINDER is an easy-to-use, integrative, platform-independent analysis environment for molecular phylogenetics. In this paper the main features of TREEFINDER (version of April 2004) are described. TREEFINDER is written in ANSI C and Java and implements powerful statistical approaches for inferring gene trees and related analyses. In addition, it provides a user-friendly graphical interface and a phylogenetic programming language. Conclusions TREEFINDER is a versatile framework for analyzing phylogenetic data across different platforms that is suited both for exploratory as well as advanced studies.

  7. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context.

  8. Data analysis and mapping of the mountain permafrost distribution

    Science.gov (United States)

    Deluigi, Nicola; Lambiel, Christophe; Kanevski, Mikhail

    2017-04-01

In Alpine environments mountain permafrost is defined as a thermal state of the ground and corresponds to any lithosphere material that is at or below 0°C for at least two years. Its degradation potentially leads to increasing rock fall activity, rock glacier accelerations and an increase in sediment transfer rates. During the last 15 years, knowledge on this phenomenon has significantly increased thanks to many studies and monitoring projects, which revealed an extremely heterogeneous and complex spatial distribution. As a consequence, modelling the potential extent of mountain permafrost has recently become a very important task. Although existing statistical models generally offer a good overview at a regional scale, they are not always able to reproduce this strong spatial discontinuity at the micro scale. To overcome this limitation, the objective of this study is to propose an alternative modelling approach using three classification algorithms from statistics and machine learning: logistic regression (LR), Support Vector Machines (SVM) and Random Forests (RF). The former is a linear parametric classifier commonly used as a benchmark before more complex classifiers are employed. Non-linear SVM is a non-parametric learning algorithm and a member of the so-called kernel methods. RF is an ensemble learning method based on bootstrap aggregating that offers an embedded measure of variable importance. Permafrost evidences were selected in a 588 km2 area of the Western Swiss Alps and serve as training examples. They were mapped from field data (thermal and geoelectrical data) and ortho-image interpretation (rock glacier inventorying). The dataset was completed with environmental predictors such as altitude, mean annual air temperature, aspect, slope, potential incoming solar radiation, normalized difference vegetation index and planar, profile and combined terrain curvature indices. Aiming at predicting
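
    As an aside for readers who want to experiment with the three-classifier comparison described above, a minimal sketch using scikit-learn follows; the feature matrix, labels, and parameter values are synthetic placeholders, not the authors' data or settings.

```python
# Minimal sketch of the LR/SVM/RF comparison described above (not the
# authors' code). X stands in for environmental predictors such as
# altitude, slope, and solar radiation; y for binary permafrost evidence.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 9))   # placeholder: 9 environmental predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

models = {
    "LR": LogisticRegression(max_iter=1000),            # linear benchmark
    "SVM": SVC(kernel="rbf"),                           # non-linear kernel method
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")

# RF additionally provides an embedded variable-importance measure:
rf = models["RF"].fit(X, y)
print("feature importances:", np.round(rf.feature_importances_, 3))
```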

  9. Globally Distributed R&D Work in a Marketing Management Support Systems (MMSS) Environment

    NARCIS (Netherlands)

    J. Eliashberg (Jehoshua); S. Swami (Sanjeev); C.B. Weinberg (Charles); B. Wierenga (Berend)

    2008-01-01

Globalisation, liberalization and rapid technological developments have been changing business environments drastically in recent decades. These trends are increasingly exposing businesses to market competition and thus intensifying competition. In such an environment, the role of

  10. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

Using rain gauge data alone as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at unknown points as input to the rainfall-runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected area along the Kelantan river. Thus, good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, alongside the average rainfall distribution. A sensitivity analysis for the distance and elevation parameters was conducted to examine the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
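
    A minimal numpy sketch of plain IDW gridding as used above follows; the power parameter p = 2 and the toy gauge network are illustrative assumptions, and IDEW would additionally weight each gauge by an elevation-difference term.

```python
# Minimal inverse-distance weighting (IDW) sketch for gridding gauge
# rainfall. The power parameter and toy data are illustrative, not the
# paper's exact configuration.
import numpy as np

def idw(xy_gauges, values, xy_grid, p=2.0):
    """Interpolate gauge values onto grid points by inverse-distance weighting."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_gauges[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)        # avoid division by zero at gauge locations
    w = 1.0 / d**p                 # weights decay with distance
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Example: 46 gauges, one day's rainfall, interpolated to a coarse grid.
rng = np.random.default_rng(1)
gauges = rng.uniform(0, 100, size=(46, 2))   # gauge coordinates (km)
rain = rng.gamma(2.0, 5.0, size=46)          # daily rainfall (mm)
gx, gy = np.meshgrid(np.linspace(0, 100, 20), np.linspace(0, 100, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])
print(idw(gauges, rain, grid).reshape(20, 20).round(1))
```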

  11. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is under question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, potential customer outages and in the event of failure, identify uncertainty

  12. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Directory of Open Access Journals (Sweden)

    Che Ros Faizah

    2018-01-01

Full Text Available Using rain gauge data alone as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at unknown points as input to the rainfall-runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected area along the Kelantan river. Thus, good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, alongside the average rainfall distribution. A sensitivity analysis for the distance and elevation parameters was conducted to examine the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.

  13. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

The principal aims of this work were to improve existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. Sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis was found to result in low blank values and good recoveries for several elements in atmospheric fine particle size fractions below 2 μm equivalent aerodynamic diameter (EAD). Furthermore, it turned out that a substantial amount of analyte associated with insoluble material could be recovered since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most components measured. The existence of the fine particle mode suggests that a substantial fraction of the elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine particle modes and one or two coarse particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase in the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions determined, and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse particle sea salt, but reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted

  14. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  15. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

Full Text Available Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  16. Off-chip wire distribution and signal analysis

    OpenAIRE

    Shi, Rui

    2008-01-01

With the steady progress of high-performance electronic systems, the complexity of electronic systems grows continuously, and new requirements such as high speed, low power and low cost become the key issues. The interconnections in an electronic system distribute power and clock and transfer electrical signals among numerous components. As the feature size of microelectronic technology becomes smaller, the design and analysis of these interconnections become more important for the performance...

  17. Model for teaching distributed computing in a distance-based educational environment

    CSIR Research Space (South Africa)

    le Roux, P

    2010-10-01

    Full Text Available Due to the prolific growth in connectivity, the development and implementation of distributed systems receives a lot of attention. Several technologies and languages exist for the development and implementation of such distributed systems; however...

  18. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
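
    The fracture mechanics step can be summarized with the standard weight-function form of the stress intensity factor; this is the textbook formulation, not necessarily the platform's exact implementation.

```latex
% Weight-function evaluation of the stress intensity factor for a crack of
% depth a, with sigma(x) the through-wall stress profile from the FE step
% and h(x, a) the weight function; failure is excluded while K_I < K_Ic.
K_I = \int_0^a \sigma(x)\, h(x, a)\, \mathrm{d}x, \qquad K_I < K_{Ic}
```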

  19. Knowledge exchange and learning from failures in distributed environments: The role of contractor relationship management and work characteristics

    International Nuclear Information System (INIS)

    Gressgård, Leif Jarle; Hansen, Kåre

    2015-01-01

    Learning from failures is vital for improvement of safety performance, reliability, and resilience in organizations. In order for such learning to take place in distributed environments, knowledge has to be shared among organizational members at different locations and units. This paper reports on a study conducted in the context of drilling and well operations on the Norwegian Continental Shelf, which represents a high-risk distributed organizational environment. The study investigates the relationships between organizations' abilities to learn from failures, knowledge exchange within and between organizational units, quality of contractor relationship management, and work characteristics. The results show that knowledge exchange between units is the most important predictor of perceived ability to learn from failures. Contractor relationship management, leadership involvement, role clarity, and empowerment are also important factors for failure-based learning, both directly and through increased knowledge exchange. The results of the study enhance our understanding of how abilities to learn from failures can be improved in distributed environments where similar work processes take place at different locations and involve employees from several companies. Theoretical contributions and practical implications are discussed. - Highlights: • We investigate factors affecting failure-based learning in distributed environments. • Knowledge exchange between units is the most important predictor. • Contractor relationship management is positively related to knowledge exchange. • Leadership involvement, role clarity, and empowerment are significant variables. • Respondents from an operator firm and eight contractors are included in the study

  20. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

There are many ways of obtaining real data about malicious activity in a network. One of them relies on disguising monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis and also other statistical data about malicious activity are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  1. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2012-09-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
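
    As a conceptual illustration of the outage metric analyzed above, the following toy Monte Carlo estimates the outage probability of ideal transmit beamforming over Rayleigh fading; the interference-temperature constraint and the RVQ limited-feedback scheme are deliberately omitted, so this is a sketch of the quantity, not of the paper's closed-form analysis.

```python
# Toy Monte Carlo estimate of beamforming outage probability under
# Rayleigh fading. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
N, trials = 4, 200_000           # N transmit branches, Monte Carlo trials
snr_avg = 10.0                   # average per-branch SNR (linear scale)
gamma_th = 4.0                   # outage threshold (linear scale)

# Complex Gaussian channel vectors with unit average power per branch.
h = (rng.normal(size=(trials, N)) + 1j * rng.normal(size=(trials, N))) / np.sqrt(2)
snr_bf = snr_avg * np.sum(np.abs(h) ** 2, axis=1)   # matched beamforming gain
print("outage probability ~", np.mean(snr_bf < gamma_th))
```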

  2. Robust Bayesian Analysis of Generalized Half Logistic Distribution

    Directory of Open Access Journals (Sweden)

    Ajit Chaturvedi

    2017-06-01

Full Text Available In this paper, Robust Bayesian analysis of the generalized half logistic distribution (GHLD) under an $\epsilon$-contamination class of priors for the shape parameter $\lambda$ is considered. ML-II Bayes estimators of the parameters, reliability function and hazard function are derived under the squared-error loss function (SELF) and the linear exponential (LINEX) loss function, considering Type II censoring and the sampling scheme of Bartholomew (1963). Both cases, with the scale parameter known and unknown, are considered under Type II censoring and under the sampling scheme of Bartholomew. A simulation study and the analysis of a real data set are presented.
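
    For reference, the two loss functions named above in their standard forms (the paper's exact parameterization may differ); here $\Delta = \hat{\lambda} - \lambda$ and $a \neq 0$ controls the asymmetry of LINEX:

```latex
% Squared-error loss (SELF) and LINEX loss in their standard textbook forms.
L_{\mathrm{SELF}}(\hat{\lambda}, \lambda) = (\hat{\lambda} - \lambda)^2,
\qquad
L_{\mathrm{LINEX}}(\Delta) = e^{a\Delta} - a\Delta - 1, \quad a \neq 0
```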

  3. A Scheduling Algorithm for the Distributed Student Registration System in Transaction-Intensive Environment

    Science.gov (United States)

    Li, Wenhao

    2011-01-01

    Distributed workflow technology has been widely used in modern education and e-business systems. Distributed web applications have shown cross-domain and cooperative characteristics to meet the need of current distributed workflow applications. In this paper, the author proposes a dynamic and adaptive scheduling algorithm PCSA (Pre-Calculated…

  4. The long-run distribution of births across environments under environmental stochasticity and its use in the calculation of unconditional life-history parameters.

    Science.gov (United States)

    Hernandez-Suarez, Carlos; Rabinovich, Jorge; Hernandez, Karla

    2012-12-01

Matrix population models assume individuals develop in time along different stages that may include age, size or degree of maturity, to name a few. Once in a given stage, an individual's ability to survive, reproduce or move to another stage is fixed for that stage. Some demographic models consider that environmental conditions may change, so that the chances of reproducing, dying or developing to another stage depend on the current stage and the environmental conditions. That is, models have evolved from a single transition matrix to a set of several transition matrices, each accounting for the properties of a given environment. These models require information on the transitions between environments, which are in general assumed to be Markovian. Although great progress has been made in the analysis of these models, they present new challenges and some new parameters need to be calculated, mainly the ones related to how births are distributed among environments. These parameters may help in population management and in calculating unconditional life-history parameters. We derive for the first time an expression for the long-run distribution of births across environments, and show that it does not depend only on the long-run frequency of the different environments, but also on the set of all transition and fertility matrices. We also derive the long-run distribution of deaths across environments. We provide an example using a real data set on the dynamics of the Saiga antelope. Theoretical values closely match the observed values obtained in a large set of stochastic simulations. Copyright © 2012 Elsevier Inc. All rights reserved.
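
    A rough illustrative simulation of this idea follows: a stage-structured population is tracked through a Markovian environment, the per-capita birth flow is recorded per environment, and the result is compared with the stationary environment frequencies. The matrices are toy values and the recorded quantity is a simplified proxy, not the paper's estimator.

```python
# Illustrative simulation: births per environment vs. environment
# frequencies under a Markovian environment. All matrices are toy values.
import numpy as np

rng = np.random.default_rng(3)
P = np.array([[0.8, 0.2],                  # environment transition matrix
              [0.4, 0.6]])                 # (rows sum to 1)
A = [np.array([[0.0, 2.0], [0.5, 0.7]]),   # stage projection matrix, env 0
     np.array([[0.0, 0.5], [0.3, 0.5]])]   # stage projection matrix, env 1
F = [np.array([[0.0, 2.0], [0.0, 0.0]]),   # fertility part of A[0]
     np.array([[0.0, 0.5], [0.0, 0.0]])]   # fertility part of A[1]

n = np.array([0.5, 0.5])                   # normalized stage distribution
env, births = 0, np.zeros(2)
for _ in range(50_000):
    births[env] += (F[env] @ n).sum()      # per-capita births this step
    n = A[env] @ n
    n /= n.sum()                           # keep the stage distribution normalized
    env = rng.choice(2, p=P[env])

w, v = np.linalg.eig(P.T)                  # stationary environment frequencies
pi = np.real(v[:, np.argmax(np.real(w))]); pi /= pi.sum()
print("long-run environment frequencies:", pi.round(3))
print("long-run distribution of births :", (births / births.sum()).round(3))
```

    Even in this toy setting the two printed distributions differ, which is the paper's central point: birth proportions depend on the projection and fertility matrices, not on environment frequencies alone.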

  5. Air Flow Distribution Analysis for APR1400 Integrated Head Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Seonggi; Yi, Kunwoo; Jung, Byung Ryul; Yune, Seok Jeong; Park, Seong Chan [KEPCO Engineering and Construction, Daejeon (Korea, Republic of)

    2016-10-15

The Integrated Head Assembly (IHA) provides the air flow path for cooling of the control element drive mechanism (CEDM). CEDM cooling fans are located on top of the IHA and provide CEDM cooling air in the upper air plenum of the IHA. The CEDM cooling system provides sufficient air flow to remove the heat generated by the CEDM coils and the RV nozzles, and ensures that the CEDM cooling flows are within the required volumetric flow rate range. This paper presents a CFD analysis of the pressure drop and temperature distribution across each IHA air flow path and the volumetric flow rate in each CEDM cooling shroud. Proper cooling of the CEDMs is important for the integrity of CEDM operations. The analysis is aimed at checking the IHA cooling design data used for the CEDM cooling design. A comparison was performed between the CFD analysis and a simplified equation for IHA cooling in terms of temperature, flow distribution and pressure drop. The CFD results differed slightly from those calculated with the simplified equation, because the CFD analysis accounts for the temperature distribution along the IHA flow paths.

  6. An Analysis of University Students' Attitudes towards Personalized Learning Environments

    Science.gov (United States)

    Sahin, Muhittin; Kisla, Tarik

    2016-01-01

    The aim of this research is to analyze university students' attitudes towards personalized learning environments with respect to the independent variables of gender, age, university, year of study, knowledge about the environment, participation in the environment and being willing to participate in the environment. The correlative survey model is…

  7. Influence of a Municipal Waste Landfill on the Spatial Distribution of Mercury in the Environment.

    Science.gov (United States)

    Gworek, Barbara; Dmuchowski, Wojciech; Gozdowski, Dariusz; Koda, Eugeniusz; Osiecka, Renata; Borzyszkowski, Jan

    2015-01-01

The study assessed the influence of a 35-year-old municipal waste landfill on environmental mercury pollution. The total Hg content was determined in the soil profile, groundwater, and plants (Solidago virgaurea and Poaceae sp.) in the landfill area. Environmental pollution near the landfill was relatively low. The topsoil layer, groundwater and the leaves of Solidago virgaurea and Poaceae sp. contained 19-271 μg kg⁻¹, 0.36-3.01 μg l⁻¹, 19-66 μg kg⁻¹ and 8-29 μg kg⁻¹ of Hg, respectively. The total Hg content in the soil decreased with depth. The results are presented as pollution maps of the landfill area based on the total Hg content in the soil, groundwater and plants. Statistical analysis revealed no correlation between the total Hg content in soil and plants, but a relationship between the total Hg concentration in groundwater and soil was shown. The landfill is not a direct source of pollution in the area. The type of land morphology did not influence the pollution level. Construction of a bentonite cut-off wall bypassing the MSW landfill reduces the risk of mercury release into the groundwater environment.

  8. Sustainable Mobility: Longitudinal Analysis of Built Environment on Transit Ridership

    Directory of Open Access Journals (Sweden)

    Dohyung Kim

    2016-10-01

Full Text Available Given the concerns about urban mobility, traffic congestion, and greenhouse gas (GHG) emissions, extensive research has explored the relationship between the built environment and transit ridership. However, the aggregate, cross-sectional nature of this research rarely provides essential clues on the potential of a transit system as a sustainable mobility option. From the perspective of longitudinal sustainability, this paper develops regression models for rail transit stations in the Los Angeles Metro system. These models attempt to identify the socio-demographic characteristics and land use features influencing longitudinal changes in transit ridership. Step-wise ordinary least squares (OLS) regression models are used to identify factors that contribute to transit ridership changes. Those factors include the number of dwelling units, employment-oriented land uses such as office and commercial land uses, and land use balance. The models suggest a negative relationship between job-population balance and transit ridership change. They also raise a question regarding the 0.4 km radius commonly used in transit analysis, indicating that it is too small to capture the significant influence of the built environment on transit ridership.
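
    A toy example in the spirit of the models described above, using statsmodels; the station sample, variable names, and coefficients are entirely hypothetical.

```python
# Toy OLS regression of station-level ridership change on built-environment
# variables (not the authors' data; everything here is synthetic).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 80                                           # rail stations
dwellings = rng.normal(5000, 1500, n)            # dwelling units near station
office = rng.normal(0.2, 0.05, n)                # share of office land use
balance = rng.uniform(0, 1, n)                   # job-population balance index
ridership_change = (0.004 * dwellings + 800 * office
                    - 600 * balance + rng.normal(0, 300, n))

X = sm.add_constant(np.column_stack([dwellings, office, balance]))
fit = sm.OLS(ridership_change, X).fit()
print("coefficients:", fit.params.round(2))      # const, dwellings, office, balance
print("p-values    :", fit.pvalues.round(3))
```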

  9. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually.

  10. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  11. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Erdmann, M; Fischer, R; Glaser, C; Klingebiel, D; Müller, G; Rieger, M; Urban, M; Winchen, T; Komm, M; Steggemann, J

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  12. Distribution and importance of microplastics in the marine environment: A review of the sources, fate, effects, and potential solutions.

    Science.gov (United States)

    Auta, H S; Emenike, C U; Fauziah, S H

    2017-05-01

The presence of microplastics in the marine environment poses a great threat to the entire ecosystem and has received much attention lately, as it has greatly impacted oceans, lakes, seas, rivers, coastal areas and even the Polar Regions. Microplastics are found in many commonly used products (primary microplastics) or originate from the fragmentation of larger plastic debris (secondary microplastics). The material enters the marine environment through terrestrial and land-based activities, especially via runoff, and is known to have great impact on marine organisms, as studies have shown that large numbers of marine organisms have been affected by microplastics. Microplastic particles have been found distributed in large numbers in Africa, Asia, Southeast Asia, India, South Africa, North America, and Europe. This review describes the sources and global distribution of microplastics in the environment, and their fate and impact on marine biota, especially the food chain. Furthermore, the control measures discussed are those mapped out by both national and international environmental organizations for combating the impact of microplastics. Identifying the main sources of microplastic pollution in the environment and creating awareness through education in the public, private, and government sectors will go a long way towards reducing the entry of microplastics into the environment. Knowing the associated behavioral mechanisms will also enable a better understanding of the impacts on the marine environment. A more promising and environmentally safe approach could be provided by exploiting the potential of microorganisms, especially those of marine origin, that can degrade microplastics. The concentration, distribution, sources and fate of microplastics in the global marine environment are discussed, as is the impact of microplastics on a wide range of marine biota. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Distribution of Aleutian mink disease virus contamination in the environment of infected mink farms.

    Science.gov (United States)

    Prieto, A; Fernández-Antonio, R; Díaz-Cao, J M; López, G; Díaz, P; Alonso, J M; Morrondo, P; Fernández, G

    2017-05-01

    Control and eradication of Aleutian Mink Disease Virus (AMDV) are a major concern for fur-bearing animal production. Despite notably reducing disease prevalence, current control programs are unable to prevent the reinfection of farms, and environmental AMDV persistence seems to play a major role regarding this issue. In this study 114 samples from different areas and elements of seven infected mink farms were analyzed by qPCR in order to evaluate the environmental distribution of AMDV load. Samples were classified into nine categories, depending on the type of sample and degree of proximity to the animals, the main source of infection. Two different commercial DNA extraction kits were employed in parallel for all samples. qPCR analysis showed 69.3% positive samples with one kit and 81.6% with the other, and significant differences between the two DNA extraction methods were found regarding AMDV DNA recovery. Regarding sample categorization, all categories showed a high percentage of AMDV positive samples (31%-100%). Quantification of positive samples showed a decrease in AMDV load from animal barns to the periphery of the farm. In addition, those elements in direct contact with animals, the street clothes and vehicles of farm workers and personal protective equipment used for sampling showed a high viral load, and statistical analysis revealed significant differences in AMDV load between the first and last categories. These results indicate high environmental contamination of positive farms, which is helpful for future considerations about cleaning and disinfection procedures and biosecurity protocols. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

Full Text Available The Information Service is a fundamental component in a grid environment. It has to meet many requirements, such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance and fault tolerance, in order to achieve better performance, scalability, security and extensibility. Currently there are two different major approaches: one is based on a directory infrastructure, and the other on a novel approach that exploits a relational DBMS. In this paper we present a performance comparison analysis between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), providing also information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  15. Privacy-Preserving k-Means Clustering under Multiowner Setting in Distributed Cloud Environments

    Directory of Open Access Journals (Sweden)

    Hong Rong

    2017-01-01

    Full Text Available With the advent of big data era, clients who lack computational and storage resources tend to outsource data mining tasks to cloud service providers in order to improve efficiency and reduce costs. It is also increasingly common for clients to perform collaborative mining to maximize profits. However, due to the rise of privacy leakage issues, the data contributed by clients should be encrypted using their own keys. This paper focuses on privacy-preserving k-means clustering over the joint datasets encrypted under multiple keys. Unfortunately, existing outsourcing k-means protocols are impractical because not only are they restricted to a single key setting, but also they are inefficient and nonscalable for distributed cloud computing. To address these issues, we propose a set of privacy-preserving building blocks and outsourced k-means clustering protocol under Spark framework. Theoretical analysis shows that our scheme protects the confidentiality of the joint database and mining results, as well as access patterns under the standard semihonest model with relatively small computational overhead. Experimental evaluations on real datasets also demonstrate its efficiency improvements compared with existing approaches.

  16. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    Science.gov (United States)

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide a minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves the security performance. The comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.
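
    The duplicate-detection idea can be illustrated with a generic textbook Bloom filter (this is not the authors' HCSA implementation): a probabilistic membership test that never misses a stored item and only rarely reports a false positive.

```python
# Minimal Bloom filter sketch illustrating duplicate detection: a False
# answer means the item is definitely new; True means it is probably a
# duplicate. Generic textbook construction, not the paper's code.
import hashlib

class BloomFilter:
    def __init__(self, m=8192, k=4):
        self.m, self.k = m, k              # m bits, k hash functions
        self.bits = bytearray(m // 8)

    def _positions(self, item):
        for i in range(self.k):            # derive k positions from SHA-256
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

bf = BloomFilter()
bf.add("block-42-digest")
print(bf.might_contain("block-42-digest"))   # True  (probably a duplicate)
print(bf.might_contain("block-43-digest"))   # False (safe to store)
```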

  17. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    Directory of Open Access Journals (Sweden)

    Vinothkumar Muthurajan

    2016-01-01

Full Text Available Cloud computing requires the security upgrade in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure the secure data transfer between the devices. The symmetric key mechanisms (pseudorandom function) provide a minimum protection level compared to asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve based Schnorr scheme. This paper proposes a virtual machine based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove the expired content. The HCSA-based auditing improves the malicious activity prediction during the data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr based encryption schemes. This paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves the security performance. The comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in the cloud security model creation.

  18. Natural Environment Suitability of China and Its Relationship with Population Distributions

    OpenAIRE

    Yang, Xiaohuan; Ma, Hanqing

    2009-01-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in gri...

  19. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  20. Distribution and air-sea exchange of mercury (Hg) in polluted marine environments

    Science.gov (United States)

    Bagnato, E.; Sprovieri, M.; Bitetto, M.; Bonsignore, M.; Calabrese, S.; Di Stefano, V.; Oliveri, E.; Parello, F.; Mazzola, S.

    2012-04-01

Mercury (Hg) is emitted to the atmosphere by anthropogenic and natural sources, the latter accounting for one third of the total emissions. Since the pre-industrial age, the atmospheric deposition of mercury has increased notably, while ocean emissions have doubled owing to the re-emission of anthropogenic mercury. Exchange between the atmosphere and the ocean plays an important role in the cycling and transport of mercury. We present preliminary results from a study on the distribution and evasion flux of mercury at the atmosphere/sea interface in the Augusta basin (SE Sicily, southern Italy), a semi-enclosed marine area affected by a high degree of contamination (heavy metals and PAH) due to the oil refineries located inside its commercial harbor. The intense industrial activity of the past appears to have led to high Hg pollution in the bottom sediments of the basin, whose concentrations are far above the background mercury value found in most of the Sicily Strait sediments. The release of mercury into the harbor seawater and its dispersion by diffusion from the sediments to the surface make the Augusta basin a potential supplier of mercury both to the Mediterranean Sea and to the atmosphere. Based on these considerations, the mercury concentration and flux at the air-sea interface of the bay have been estimated using a real-time atomic absorption spectrometer (LUMEX RA-915+) and a home-made accumulation chamber, respectively. Estimated Total Gaseous Mercury (TGM) concentrations during the cruise on the bay were in the range of 1-3 ng m⁻³, with a mean value of about 1.4 ng m⁻³. These data fit well with the background atmospheric Hg concentration values detected on land (1-2 ng m⁻³, this work) and, more generally, with the background atmospheric TGM levels found in the Northern Hemisphere (1.5-1.7 ng m⁻³). Our measurements are also in the range of those reported for other important polluted marine areas. The mercury evasion flux at the air-sea interface

  1. The Distribution of Antarctic Subglacial Lake Environments With Implications for Their Origin and Evolution

    Science.gov (United States)

    Blankenship, D. D.; Young, D. A.; Carter, S. P.

    2006-12-01

Ice-penetrating radar records across the Antarctic Ice Sheet show regions with strong, flat, mirror-like reflections from the subglacial interface that are interpreted as subglacial lakes. The majority of subglacial lakes are found in East Antarctica, primarily in topographically low areas of basins beneath the thick ice divides. Occasionally lakes are observed "perched" at higher elevations within local depressions of rough morphological regions. In addition, a correlation between the "onset" of enhanced glacial flow and subglacial lakes has been identified. The greatest concentration of known lakes is found in the vicinity of Dome C. A second grouping of lakes lying near Ridge B includes Lake Vostok and several smaller lakes. Subglacial lakes have also been discovered near the South Pole, within eastern Wilkes Land, west of the Transantarctic Mountains, and within West Antarctica's Whitmore Mountains. Aside from Lake Vostok, typical lengths of subglacial lakes range from a few to about 20 kilometers. A recent inventory includes 145 subglacial lakes. Approximately 81% of detected lakes lie at elevations less than a few hundred meters above sea level, while the majority of the remaining lakes are "perched" at higher elevations. We locate the lakes from the subglacial lake inventory relative to local "ice divides" calculated from satellite-derived surface elevations, and find the distance of each lake from these divides. Most significantly, we found that 66% of the identified lakes lie within 50 km of a local ice divide and 88% lie within 100 km of a local divide. In particular, lakes located far from the Dome C/Ridge B cluster, and even those associated with very narrow catchments, lie either on or within a few tens of kilometers of the local divide marked by the catchment boundary. The distance correlation of subglacial lakes with local ice divides leads to a fundamental question for the evolution of subglacial lake environments: Does the

  2. An environment for operation planning of electrical energy distribution systems; Um ambiente para planejamento da operacao de sistemas de distribuicao de energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Kleber

    1997-07-01

The aim of operation planning of electrical energy distribution systems is to define a group of interventions in the distribution network so as to reach a global optimum. Both normal operation and emergency conditions are considered, including scheduled interventions in the network. Under normal conditions, aspects such as voltage drop and loss minimization are considered; under emergency conditions, the minimization of non-distributed energy and interruption time is taken into account. This work proposes the development of a computer environment for operation planning studies. The design and implementation of a specific operation planning database, based on a relational database model, was carried out so that new information modules could be easily included. A statistical approach was applied to the load model, including consumption habits and seasonal information for each consumer category. A group of operation planning functions is proposed, which includes simulation modules responsible for system analysis. These functions and the database should be designed together. The developed computer environment allows the inclusion of new functions, thanks to the standardized query language used to access the database. Two functions have been implemented for illustration. The development of the proposed environment constitutes a step towards an open operation planning system, integration with other information systems and an easy-to-use system based on a graphical interface. (author)

  3. Resonance analysis in parallel voltage-controlled Distributed Generation inverters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Blaabjerg, Frede; Chen, Zhe

    2013-01-01

Thanks to the fast responses of the inner voltage and current control loops, the dynamic behavior of parallel voltage-controlled Distributed Generation (DG) inverters depends not only on the stability of load sharing among them, but also on the interactions between the voltage control loops of the inverters and the remaining system dynamics. This paper addresses the latter interactions and the consequent resonances through a frequency-domain analysis of the inverter output impedances and the remaining equivalent network impedance. Furthermore, the impacts of the virtual output impedance loop and the voltage feedforward loop in the current controller are evaluated based on this impedance-interaction analysis. Simulation results are presented to confirm the validity of the theoretical analysis.
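
    The interaction studied above is commonly written in the impedance-based form below (a standard textbook model, not necessarily the authors' exact one): each voltage-controlled inverter acts as a closed-loop Thevenin source, and resonances are assessed from the minor-loop gain of the impedance ratio.

```latex
% Closed-loop Thevenin model of a voltage-controlled inverter and the
% minor-loop gain used in impedance-based stability analysis; stability
% requires T_m(s) to satisfy the Nyquist criterion.
V_o(s) = G_{cl}(s)\,V_{ref}(s) - Z_o(s)\,I_o(s), \qquad
T_m(s) = \frac{Z_o(s)}{Z_{net}(s)}
```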

  4. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

Full Text Available The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method and the Diffusive Scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, with the Courant number differing in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). Regarding numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making them advisable for the analysis of water hammer in water distribution systems.
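
    For reference, the MOC benchmark reduces, on a fixed grid, to the standard compatibility equations (the usual Wylie-Streeter notation; the paper's friction treatment may differ):

```latex
% Method-of-characteristics compatibility equations at interior node P,
% with B = a/(gA) and R a friction coefficient; subscripts A and B denote
% the upstream and downstream neighbours at the previous time level.
C^{+}:\; H_P = C_P - B\,Q_P, \qquad C_P = H_A + B\,Q_A - R\,Q_A|Q_A| \\
C^{-}:\; H_P = C_M + B\,Q_P, \qquad C_M = H_B - B\,Q_B + R\,Q_B|Q_B|
```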

  5. Fault Diagnosis for Electrical Distribution Systems using Structural Analysis

    DEFF Research Database (Denmark)

    Knüppel, Thyge; Blanke, Mogens; Østergaard, Jacob

    2014-01-01

Fault-tolerance in electrical distribution relies on the ability to diagnose possible faults and determine which components or units cause a problem or are close to doing so. Faults include defects in instrumentation, power generation, transformation and transmission. The focus of this paper is the design of efficient diagnostic algorithms, which is a prerequisite for fault-tolerant control of power distribution. Diagnosis in a grid depends on the available analytic redundancies, and hence on network topology. When topology changes, due to earlier fault(s) or to maintenance, analytic redundancy... analysis of power systems; it demonstrates detection and isolation of failures in a network, and shows how typical faults are diagnosed. Nonlinear fault simulations illustrate the results.

  6. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

This paper investigates the characteristics of typical optimisation models within Distribution Network Design. Fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models is developed. The dimensions covered in the categorisation include fixed vs. general networks, specialised vs. general nodes, linear vs. nonlinear costs, single vs. multi commodity, uncapacitated vs. capacitated activities, single vs. multi modal and static vs. dynamic. The models examined address both strategic and tactical planning and can serve educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as an art manual or recipe when constructing such a model.

  7. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the Temperature Alarm Circuit (TAC) data set, whereas the Exponential distribution is found to be the best fit for modeling the failure rate.
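
    A sketch of the same kind of fitting comparison follows; the paper used Relex 2009, so scipy is used here purely for illustration, with synthetic failure times.

```python
# Fit Weibull and Exponential models to failure times and compare goodness
# of fit via a Kolmogorov-Smirnov test. Data are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
failures = stats.weibull_min.rvs(1.8, scale=1000.0, size=60, random_state=rng)

shape, loc, scale = stats.weibull_min.fit(failures, floc=0)   # Weibull fit
loc_e, scale_e = stats.expon.fit(failures, floc=0)            # Exponential fit

ks_w = stats.kstest(failures, "weibull_min", args=(shape, loc, scale))
ks_e = stats.kstest(failures, "expon", args=(loc_e, scale_e))
print(f"Weibull     shape={shape:.2f} scale={scale:.0f}  KS p={ks_w.pvalue:.3f}")
print(f"Exponential scale={scale_e:.0f}  KS p={ks_e.pvalue:.3f}")
```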

  8. Distribution pattern of radon and daughters in an urban environment and determination of organ-dose frequency distributions

    International Nuclear Information System (INIS)

    Steinhaeusler, F.; Hofmann, W.; Pohl, E.; Pohl-Rueling, J.

    1980-01-01

In a normal urban environment, inhaled radon and its decay products cause an important part of the human radiation burden, not only for the respiratory tract but also for several other organs. A study was made with 729 test persons in Salzburg, Austria. Each person was questioned regarding his living activity, i.e., the mean times spent in individual sleeping, living, and working places and the corresponding physical activities, which strongly influence the respiratory minute volume and consequently the inhaled radioactivity. At all these places, the annual means of the external gamma radiation as well as of the air content of ²²²Rn and its short-lived decay products were determined. From all these measurements (more than 8000) and the information obtained, mean annual organ doses were calculated for each test person. The doses due to cosmic radiation, ⁴⁰K, and other incorporated radionuclides were added. The range of the mean annual doses in millirems is only 73 to 126 for the gonads and 70 to 336 for the kidneys, but reaches 117 to 10,700 for the basal cells of the bronchial epithelium.

  9. Time series power flow analysis for distribution connected PV generation.

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating
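
    A conceptual sketch of the QSTS idea follows: a sequence of steady-state snapshots in which controller state (here, a regulator tap) persists between time steps, so that PV variability can interact with voltage-regulation devices. The two-bus feeder, coefficients, and regulator settings are toy assumptions, not a substitute for production tools such as those referenced in the report.

```python
# Toy quasi-static time series (QSTS) loop: one steady-state "power flow"
# snapshot per minute, with the regulator tap carried between steps.
import numpy as np

rng = np.random.default_rng(6)
steps = 24 * 60                                 # one day, 1-minute resolution
load = 1.0 + 0.3 * np.sin(np.linspace(0, 2 * np.pi, steps))          # p.u.
pv = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, steps)), 0, None)
pv = np.clip(pv + 0.1 * rng.normal(size=steps), 0, None)  # cloud variability

tap, tap_ops, r = 0, 0, 0.05                    # tap state persists over time
for t in range(steps):
    net = load[t] - pv[t]                       # net feeder power draw
    v = 1.0 - r * net + 0.00625 * tap           # snapshot voltage (p.u.)
    if v < 0.975 and tap < 16:                  # simple regulator logic
        tap += 1; tap_ops += 1
    elif v > 1.025 and tap > -16:
        tap -= 1; tap_ops += 1
print("regulator tap operations over the day:", tap_ops)
```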

  10. Distribution and transfer of radiocesium and radioiodine in the environment following the Fukushima nuclear accident - Distribution and transfer of radiocesium and radioiodine in the environment of Fukushima Prefecture following the nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Yasuyuki; Ohno, Takeshi; Sugiyama, Midori [Gakushuin University, Toshima-ku, Tokyo, 171-8588 (Japan); Sato, Mamoru [Fukushima Agricultural Technology Centre, Koriyama, Fukushima 963-0531 (Japan); Matsuzaki, Hiroyuki [The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2014-07-01

    Large quantities of radioiodine and radiocesium were released from the accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP) in March 2011. We have carried out intensive studies on the distribution and behaviour of these nuclides in the environment following the accident. Two topics from our studies are presented. (1) Retrospective estimation of I-131 deposition through the analysis of I-129 in soil: It is necessary to obtain deposition data of radioiodine in Fukushima Prefecture for the assessment of thyroid doses due to the accident. However, the short half-life of I-131 (8 days) made it impossible to obtain adequate sample coverage that would permit direct determination of the regional deposition patterns of I-131 within the prefecture and surrounding areas. On the other hand, I-129 released simultaneously during the accident still remains in soil, due to its long half-life of 1.57x10^7 years. In order to reconstruct the I-131 deposition, we have determined I-129 concentrations by accelerator mass spectrometry (AMS). A good correlation was found between the measured concentrations of I-131 and I-129 in soils collected in the vicinity of FDNPP. We have analyzed I-129 in more than 500 soil samples collected systematically from Fukushima Prefecture. Using the obtained results, the I-131 deposition was calculated in different areas and the deposition map for I-131 was constructed. We also studied the vertical distribution of I-129 in soil. (2) Peculiar accumulation of radiocesium in some plants and mushrooms: The radioactivity levels in agricultural crops decreased markedly in the months following the accident and their concentrations fell below the Japanese guideline for foodstuffs (500 Bq/kg in 2011, and 100 Bq/kg after 2012). However, some agricultural products such as tea leaves and citrus fruits showed relatively higher values. Our analytical results obtained for the distribution of radiocesium in tea trees show that the root uptake
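
    The arithmetic behind the retrospective I-131 estimate is compact: because I-129 is effectively stable on the timescale of a few years, a measured I-129 atom inventory, multiplied by an accident-time I-131/I-129 atom ratio (determined empirically in the study from samples where both isotopes were measured), gives the I-131 atom inventory at deposition, which converts to activity via A = N ln2 / T_half. A minimal sketch; the measurement value and the ratio below are placeholders, not the paper's numbers.

        import math

        LN2 = math.log(2.0)
        T_HALF_I131 = 8.02 * 86400              # seconds
        T_HALF_I129 = 1.57e7 * 365.25 * 86400   # seconds (decay negligible here)

        # Placeholder inputs: an AMS measurement and an accident-time atom ratio.
        n_i129_per_kg = 2.0e12      # I-129 atoms per kg of soil (assumed)
        atom_ratio_131_129 = 16.0   # N(I-131)/N(I-129) at release (placeholder)

        # I-129 measured years later still approximates the deposited inventory,
        # so the I-131 atom count at deposition follows from the ratio alone.
        n_i131_at_deposit = n_i129_per_kg * atom_ratio_131_129
        a_i131_bq_per_kg = n_i131_at_deposit * LN2 / T_HALF_I131
        print(f"Reconstructed I-131 deposition: {a_i131_bq_per_kg:.3e} Bq/kg")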

  11. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
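
    One of the partitioning questions examined, how to turn a TAG-selected event list into grid jobs, can be sketched as a grouping step: collect the selected events by the file that holds them, then pack files into jobs under a size cap, so each job reads only whole, local files. The data shapes below (event-ID/file-GUID pairs, the cap) are invented for illustration and are not the ATLAS implementation.

        from collections import defaultdict

        # Hypothetical TAG query result: (event_id, file_guid) pairs.
        selected = [(1, "f1"), (7, "f1"), (9, "f2"), (12, "f3"), (15, "f2"), (21, "f3")]

        # Group the selection by file so each job opens as few files as possible.
        by_file = defaultdict(list)
        for event_id, guid in selected:
            by_file[guid].append(event_id)

        # Pack files into jobs with a cap on events per job (load balancing).
        MAX_EVENTS_PER_JOB = 3
        jobs, current, count = [], [], 0
        for guid, events in sorted(by_file.items(), key=lambda kv: -len(kv[1])):
            if count + len(events) > MAX_EVENTS_PER_JOB and current:
                jobs.append(current)
                current, count = [], 0
            current.append((guid, events))
            count += len(events)
        if current:
            jobs.append(current)

        for i, job in enumerate(jobs):
            print(f"job {i}: {job}")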

  12. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    Science.gov (United States)

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
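
    The survival analysis sketched in the abstract operates over distance rather than time: each mission contributes a travelled distance and a flag for whether it ended in a loss-relevant fault (an event) or completed safely (a censored observation). Below is a minimal Kaplan-Meier estimator under that reading, with invented mission data rather than the Autosub3 fault history.

        # Kaplan-Meier estimator over distance; (distance_km, event) pairs where
        # event=True marks a fault leading to loss and event=False a completed
        # (censored) mission. Distances are assumed distinct (no ties).
        missions = [(12.0, False), (35.0, True), (48.0, False),
                    (60.0, True), (75.0, False), (90.0, False)]

        missions.sort(key=lambda m: m[0])
        n_at_risk = len(missions)
        survival = 1.0
        print("distance_km  S(d)")
        for distance, event in missions:
            if event:
                survival *= (n_at_risk - 1) / n_at_risk  # KM step at each event
                print(f"{distance:10.1f}  {survival:.3f}")
            n_at_risk -= 1  # events and censored missions both leave the risk set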

  13. Integrated analysis environment for the Movement Assessment Battery for Children

    Directory of Open Access Journals (Sweden)

    Carlos Norberto Fischer

    2013-12-01

    Developmental Coordination Disorder (DCD), a chronic and usually permanent condition found in children, is characterized by motor impairment that interferes with a child's activities of daily living and with academic achievement. One of the most popular tests for the quantitative diagnosis of DCD is the Movement Assessment Battery for Children (MABC). Based on the Battery's standardized scores, it is possible to identify children with typical development, children at risk of developing DCD, and children with DCD. This article describes a computational system we developed to assist with the analysis of results obtained in the MABC test. The tool was developed for the web environment, and its database provides integration of MABC data, so researchers around the world can share data and develop collaborative work in the DCD field. To support the analysis process, our system provides services for filtering data to show more specific sets of information and presents the results in textual, tabular, and graphic formats, allowing easier and more comprehensive evaluation of the results.

  14. A distributed data base management facility for the CAD/CAM environment

    Science.gov (United States)

    Balza, R. M.; Beaudet, R. W.; Johnson, H. R.

    1984-01-01

    Current IPAD research in the area of distributed data base management considers facilities for supporting CAD/CAM data management in a heterogeneous network of computers encompassing multiple data base managers supporting a variety of data models. These facilities include coordinated execution of multiple DBMSs to provide for administration of and access to data distributed across them.

  15. Distribution and diversity of members of the bacterial phylum Fibrobacteres in environments where cellulose degradation occurs.

    Science.gov (United States)

    Ransom-Jones, Emma; Jones, David L; Edwards, Arwyn; McDonald, James E

    2014-10-01

    The Fibrobacteres phylum contains two described species, Fibrobacter succinogenes and Fibrobacter intestinalis, both of which are prolific degraders of cellulosic plant biomass in the herbivore gut. However, recent 16S rRNA gene sequencing studies have identified novel Fibrobacteres in landfill sites, freshwater lakes and the termite hindgut, suggesting that members of the Fibrobacteres occupy a broader ecological range than previously appreciated. In this study, the ecology and diversity of Fibrobacteres was evaluated in 64 samples from contrasting environments where cellulose degradation occurred. Fibrobacters were detected in 23 of the 64 samples using Fibrobacter genus-specific 16S rRNA gene PCR, which provided their first targeted detection in marine and estuarine sediments, cryoconite from Arctic glaciers, as well as a broader range of environmental samples. To determine the phylogenetic diversity of the Fibrobacteres phylum, Fibrobacter-specific 16S rRNA gene clone libraries derived from 17 samples were sequenced (384 clones) and compared with all available Fibrobacteres sequences in the Ribosomal Database Project repository. Phylogenetic analysis revealed 63 lineages of Fibrobacteres (95% OTUs), with many representing as yet unclassified species. Of these, 24 OTUs were exclusively comprised of fibrobacters derived from environmental (non-gut) samples, 17 were exclusive to the mammalian gut, 15 to the termite hindgut, and 7 comprised both environmental and mammalian strains, thus establishing Fibrobacter spp. as indigenous members of microbial communities beyond the gut ecosystem. The data highlighted significant taxonomic and ecological diversity within the Fibrobacteres, a phylum circumscribed by potent cellulolytic activity, suggesting considerable functional importance in the conversion of lignocellulosic biomass in the biosphere. Copyright © 2014 Elsevier GmbH. All rights reserved.

  16. Sampling surface particle size distributions and stability analysis of deep channel in the Pearl River Estuary

    Science.gov (United States)

    Feng, Hao-chuan; Zhang, Wei; Zhu, Yu-liang; Lei, Zhi-yi; Ji, Xiao-mei

    2017-06-01

    Particle size distributions (PSDs) of bottom sediments in a coastal zone are generally multimodal due to the complexity of the dynamic environment. In this paper, bottom sediments along the deep channel of the Pearl River Estuary (PRE) are used to understand the characteristics of multimodal PSDs and the corresponding depositional environment. The results of curve-fitting analysis indicate that the near-bottom sediments in the deep channel generally have a bimodal distribution with a fine component and a relatively coarse component. The particle size distribution of bimodal sediment samples can be expressed as the sum of two lognormal functions, and the parameters for each component can be determined. At each station of the PRE, the fine component makes up a smaller volume of the sediments and is relatively poorly sorted. The relatively coarse component, which is the major component of the sediments, is even more poorly sorted. The interrelations between the dynamics and the particle size of the bottom sediment in the deep channel of the PRE have also been investigated using field measurements and simulated data. The critical shear velocity and the shear velocity are calculated to study the stability of the deep channel. The results indicate that the critical shear velocity has a similar distribution over a large part of the deep channel due to the similar particle size distribution of the sediments. Based on a comparison between the critical shear velocities derived from sedimentary parameters and the shear velocities obtained from tidal currents, it is likely that the depositional area is mainly distributed in the northern part of the channel, while the southern part of the deep channel faces a higher erosion risk.
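
    The curve-fitting step described above, expressing each bimodal PSD as the sum of two lognormal functions, can be reproduced with an ordinary least-squares routine. A sketch on synthetic data (SciPy assumed available; all mode positions, widths and the fine-fraction weight are illustrative, not the paper's values):

        import numpy as np
        from scipy.optimize import curve_fit

        def lognormal(d, mu, sigma):
            # Lognormal PDF in grain diameter d (same form for each component).
            return np.exp(-(np.log(d) - mu) ** 2 / (2 * sigma ** 2)) / (
                d * sigma * np.sqrt(2 * np.pi))

        def bimodal(d, w, mu1, s1, mu2, s2):
            # Weighted sum of a fine and a coarse component; w = fine fraction.
            return w * lognormal(d, mu1, s1) + (1 - w) * lognormal(d, mu2, s2)

        # Synthetic "measured" PSD: 30% fine mode, 70% coarse mode, plus noise.
        d = np.logspace(-1, 3, 120)            # grain size, micrometres
        rng = np.random.default_rng(0)
        y = bimodal(d, 0.3, np.log(5.0), 0.5, np.log(120.0), 0.8)
        y += rng.normal(0, 1e-4, d.size)

        p0 = [0.5, np.log(3.0), 0.4, np.log(100.0), 0.6]   # initial guesses
        popt, _ = curve_fit(bimodal, d, y, p0=p0,
                            bounds=([0, -5, 0.01, -5, 0.01], [1, 10, 3, 10, 3]))
        w, mu1, s1, mu2, s2 = popt
        print(f"fine fraction={w:.2f}, modes at {np.exp(mu1):.1f} and {np.exp(mu2):.1f} um")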

  17. REACTOR ANALYSIS AND VIRTUAL CONTROL ENVIRONMENT (RAVEN) FY12 REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Cristian Rabiti; Andrea Alfonsi; Joshua Cogliati; Diego Mandelli; Robert Kinoshita

    2012-09-01

    RAVEN is a complex software tool with tasks spanning from being the RELAP-7 user interface, to using RELAP-7 to perform Risk-Informed Safety Margin Characterization (RISMC), to controlling RELAP-7 calculation execution. The goals of this document are to: 1. Highlight the functional requirements of the different tasks of RAVEN; 2. Identify shared functions that could be aggregated into modules so as to obtain minimal software redundancy and maximize software reuse. RAVEN is in fact a software framework that will provide the following functionalities: • Derive and actuate the control logic required to: o Simulate the plant control system o Simulate the operator (procedure-guided) actions o Perform Monte Carlo sampling of randomly distributed events o Perform event-tree-based analysis • Provide a GUI to: o Input a plant description to RELAP-7 (components, control variables, control parameters) o Concurrently monitor control parameters o Concurrently alter control parameters • Provide post-processing data mining capabilities based on: o Dimensionality reduction o Cardinality reduction. This document shows how an appropriate mathematical formulation of the control logic and probabilistic analysis leads to most of the software infrastructure being shared between the two main tasks. Further, it describes the development accomplished this year, including simulation results, and the priorities for next year's development.

  18. Estimating Species Distributions Across Space Through Time and with Features of the Environment

    Energy Technology Data Exchange (ETDEWEB)

    Kelling, S. [Cornell Lab of Ornithology; Fink, D. [Cornell Lab of Ornithology; Hochachka, W. [Cornell Lab of Ornithology; Rosenberg, K. [Cornell Lab of Ornithology; Cook, R. [Oak Ridge National Laboratory (ORNL); Damoulas, C. [Department of Computer Science, Cornell University; Silva, C. [Department of Computer Science, Polytechnic Institute of New York; Michener, W. [DataONE, University of New Mexico

    2013-01-01

    Complete guidance for mastering the tools and techniques of the digital revolution. With the digital revolution opening up tremendous opportunities in many fields, there is a growing need for skilled professionals who can develop data-intensive systems and extract information and knowledge from them. This book frames for the first time a new systematic approach for tackling the challenges of data-intensive computing, providing decision makers and technical experts alike with practical tools for dealing with our exploding data collections. Emphasizing data-intensive thinking and interdisciplinary collaboration, The Data Bonanza: Improving Knowledge Discovery in Science, Engineering, and Business examines the essential components of knowledge discovery, surveys many of the current research efforts worldwide, and points to new areas for innovation. Complete with a wealth of examples and DISPEL-based methods demonstrating how to gain more from data in real-world systems, the book: outlines the concepts and rationale for implementing data-intensive computing in organizations; covers, from the ground up, problem-solving strategies for data analysis in a data-rich world; introduces techniques for data-intensive engineering using the Data-Intensive Systems Process Engineering Language DISPEL; features in-depth case studies in customer relations, environmental hazards, seismology, and more; showcases successful applications in areas ranging from astronomy and the humanities to transport engineering; and includes sample program snippets throughout the text as well as additional materials on a companion website. The Data Bonanza is a must-have guide for information strategists, data analysts, and engineers in business, research, and government, and for anyone wishing to be on the cutting edge of data mining, machine learning, databases, distributed systems, or large-scale computing.

  19. GISMOWA: Geospatial Risk-Based Analysis Identifying Water Quality Monitoring Sites in Distribution Systems

    DEFF Research Database (Denmark)

    Larsen, Sille Lyster; Christensen, Sarah Christine Boesgaard; Albrechtsen, Hans-Jørgen

    2017-01-01

    Monitoring water quality in drinking water distribution systems is the basis for proactive approaches to prevent or manage emerging water quality issues, and such monitoring requires a strategic selection of relevant and representative monitoring sites. GISMOWA is a new GIS- and risk-based tool that combines, in a GIS environment, (1) ...; (2) potential threats, e.g., contaminated sites; and (3) sensitive consumers, e.g., hospitals. The tool used multicriteria decision analysis to evaluate multiple monitoring-site parameters and to map zones particularly suitable for water quality monitoring. GISMOWA was applied to Danish water ...

  20. Numerical analysis of decoy state quantum key distribution protocols

    Energy Technology Data Exchange (ETDEWEB)

    Harrington, Jim W [Los Alamos National Laboratory; Rice, Patrick R [Los Alamos National Laboratory

    2008-01-01

    Decoy state protocols are a useful tool for many quantum key distribution systems implemented with weak coherent pulses, allowing significantly better secret bit rates and longer maximum distances. In this paper we present a method to numerically find optimal three-level protocols, and we examine how the secret bit rate and the optimized parameters are dependent on various system properties, such as session length, transmission loss, and visibility. Additionally, we show how to modify the decoy state analysis to handle partially distinguishable decoy states as well as uncertainty in the prepared intensities.
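
    A flavour of the numerical optimisation can be given with the widely used vacuum-plus-weak-decoy bounds of Ma et al. (2005): from the observed gains and error rates of the signal and decoy states, lower-bound the single-photon yield Y1 and upper-bound its error rate e1, then scan intensities for the best key-rate bound. The sketch below is an asymptotic, fluctuation-free simplification with an assumed channel model; it is not the paper's three-level analysis.

        import math

        def h2(x):
            # Binary entropy function.
            if x <= 0 or x >= 1:
                return 0.0
            return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

        # Assumed channel model: transmittance, dark counts, misalignment, EC inefficiency.
        eta, Y0, ed, f_ec = 0.05, 1e-5, 0.01, 1.16

        def gain_and_error(mu):
            q = Y0 + 1 - math.exp(-eta * mu)           # overall gain Q_mu
            e = (0.5 * Y0 + ed * (1 - math.exp(-eta * mu))) / q
            return q, e

        def key_rate(mu, nu):
            q_mu, e_mu = gain_and_error(mu)
            q_nu, e_nu = gain_and_error(nu)
            # Vacuum + weak decoy lower bound on the single-photon yield Y1.
            y1 = (mu / (mu * nu - nu * nu)) * (
                q_nu * math.exp(nu) - q_mu * math.exp(mu) * nu**2 / mu**2
                - (mu**2 - nu**2) / mu**2 * Y0)
            if y1 <= 0:
                return 0.0
            e1 = (e_nu * q_nu * math.exp(nu) - 0.5 * Y0) / (y1 * nu)  # upper bound
            q1 = y1 * mu * math.exp(-mu)   # single-photon gain of the signal states
            # Key rate bound (basis-sifting factor q omitted for brevity).
            r = -q_mu * f_ec * h2(e_mu) + q1 * (1 - h2(min(max(e1, 0.0), 0.5)))
            return max(r, 0.0)

        # Crude grid scan over signal (mu) and decoy (nu) intensities.
        best = max(((key_rate(m / 100, n / 100), m / 100, n / 100)
                    for m in range(10, 100, 2) for n in range(1, 30) if n < m),
                   key=lambda t: t[0])
        print(f"best R={best[0]:.2e} at mu={best[1]:.2f}, nu={best[2]:.2f}")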

  1. Hygrothermal Analysis of Indoor Environment of Residential Prefabricated Buildings

    Science.gov (United States)

    Kraus, Michal

    2017-10-01

    Recent studies show that relative humidity and indoor air temperature constitute important determinants of indoor air quality. The hygrothermal microclimate has a significant impact on occupants' health and comfort. The study presents the results of experimental measurements of indoor air temperature and relative humidity in a selected apartment in a prefabricated panel house situated in Ostrava, Czechia. The contribution describes and analyses the relation between indoor air temperature [°C] and relative humidity [%] in this apartment. The experimental object was selected with respect to the housing stock in the Czech Republic, a third of which is composed of prefabricated panel houses. Regeneration and revitalization of these buildings have been a focus of interest in recent years. Building modifications, such as thermal insulation of the building envelope or window replacement, lead to a significantly higher level of airtightness in these objects. Humidity and indoor air temperature were measured in 10-minute cycles for two periods, one in the non-heating and one in the heating season, each 30 days long. The mean indoor air temperature is 22.21 °C and the average relative humidity 45.87% in the non-heating period; the corresponding values for the heating period are 22.62 °C and 35.20%. A slight increase of the average indoor temperature (+1.85%) is observed, while the relative humidity is approximately 10 percentage points lower in the heating period. A long-term decline of relative humidity below 30% brings many problems, so measures to increase the relative humidity in residential prefabricated buildings are necessary. An aquarium appears to be ineffective; the solution may be forced artificial ventilation or humidifiers.

  2. High excitation rovibrational molecular analysis in warm environments

    Science.gov (United States)

    Zhang, Ziwei; Stancil, Phillip C.; Cumbee, Renata; Ferland, Gary J.

    2017-06-01

    Inspired by advances in infrared observation (e.g., Spitzer, Herschel and ALMA), we investigate the rovibrational emission of CO and SiO in warm astrophysical environments. With recent innovations in collisional rate coefficients and rescaling methods, we are able to construct more comprehensive collisional data sets with high rovibrational states (vibration up to v=5 and rotation up to J=40) and multiple colliders (H2, H and He). These comprehensive data sets are used in spectral simulations with the radiative transfer codes RADEX and Cloudy. We obtained line-ratio diagnostic plots and line spectra for both near- and far-infrared emission lines over a broad range of density and temperature for the case of a uniform medium. Considering the importance of both molecules in probing the conditions and activities of UV-irradiated interstellar gas, we model rovibrational emission in photodissociation regions (PDRs) and AGB star envelopes (such as VY Canis Majoris, IK Tau and IRC +10216) with Cloudy. Rotational diagrams, energy distribution diagrams, and spectra are produced to examine relative state abundances, line emission intensity, and other properties. With these diverse models, we expect to gain a better understanding of PDRs and to expand our scope on the chemical architecture and evolution of AGB stars and other UV-irradiated regions. The soon-to-be-launched James Webb Space Telescope (JWST) will provide high resolution observations at near- to mid-infrared wavelengths, opening a new window on molecular vibrational emission and calling for more detailed chemical modeling and comprehensive laboratory astrophysics data on more molecules. This work was partially supported by NASA grants NNX12AF42G and NNX15AI61G. We thank Benhui Yang, Kyle Walker, Robert Forrey, and N. Balakrishnan for collaborating on the collisional data adopted in the current work.

  3. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.

  4. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine-independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programming.

  5. Wheel inspection system environment qualification and validation : final report for public distribution.

    Science.gov (United States)

    2009-03-20

    International Electronic Machines Corporation (IEM) has developed and is now marketing a state-of-the-art Wheel Inspection System Environment (WISE). WISE provides wheel profile and dimensional measurements, i.e. rim thickness, flange height, flange ...

  6. Grain Size Analysis And Depositional Environment For Beach Sediments Along Abu Dhabi Coast United Arab Emirates

    Directory of Open Access Journals (Sweden)

    Saeed Al Rashedi

    2015-08-01

    The paper analysed the grain-size spectrum and textural parameters, namely mean, sorting, skewness and kurtosis, of Abu Dhabi coast sediments in the United Arab Emirates using ASTM sieves. For this purpose fifty-seven samples were analysed. The results of the sieving analysis revealed that samples of the study area range between -2.63 phi (pebble size) and 2.39 phi (fine sand). The statistical analysis reveals that the sand is characteristically fine grained and moderately well sorted to extremely poorly sorted. The sand distribution is strongly coarse-skewed and leptokurtic in nature. Abundance of medium to fine sand shows the prevalence of comparatively moderate- to low-energy conditions in the study area. The linear discriminant function of the samples indicates an Aeolian shallow marine depositional environment and little influence of fluvial processes.

  7. Analysis of black soil environment based on Arduino

    Science.gov (United States)

    Li, Y.; Zhang, Y. F.; Wu, C. H.; Wang, J. F.

    2017-05-01

    As everyone knows, the black soil of Heilongjiang breeds rice that is famous throughout the world. How to use networking technology to monitor the growth environment of Heilongjiang rice, and to extend the approach to other planting environments in our country, is an important topic. However, the growth environment of rice is complex. In the current research, some important factors that affect plant growth, such as carbon dioxide, oxygen, temperature and humidity, pH value and microbial content in black soil, are selected, and a data acquisition and transmission system for black soil, based on the Arduino development environment and the Kingview configuration software, has been realized. The collected data were employed to establish a simulation environment for the growth of rice in Heilongjiang. The system can be applied to simulate the rice growing environment of Heilongjiang province and supports improvement of rice quality in other areas. Keywords: Arduino; Kingview; living environment

  8. Alternative Equation on Magnetic Pair Distribution Function for Quantitative Analysis

    Science.gov (United States)

    Kodama, Katsuaki; Ikeda, Kazutaka; Shamoto, Shin-ichi; Otomo, Toshiya

    2017-12-01

    We derive an alternative equation of the magnetic pair distribution function (mPDF) related to the mPDF equation given in a preceding study [B. A. Frandsen, X. Yang, and S. J. L. Billinge, Acta Crystallogr., Sect. A 70, 3 (2014), https://doi.org/10.1107/S2053273313033081] for quantitative analysis of realistic experimental data. The additional term related to spontaneous magnetization included in the equation is particularly important for the mPDF analysis of ferromagnetic materials. Quantitative estimation of mPDF from neutron diffraction data is also shown. The experimental mPDFs estimated from the neutron diffraction data of the ferromagnet MnSb and the antiferromagnet MnF2 are quantitatively consistent with the mPDFs calculated using the presented equation.

  9. Modal distribution analysis of vibrato in musical signals

    Science.gov (United States)

    Mellody, Maureen; Wakefield, Gregory H.

    1998-10-01

    Due to the nonstationary nature of vibrato notes, standard Fourier analysis techniques may not sufficiently characterize the partials of notes undergoing vibrato. Our study employs the modal distribution, a bilinear time-frequency representation, to analyze vibrato signals. Instantaneous frequency and amplitude values for each partial are extracted using Hilbert techniques applied to local neighborhoods of the time-frequency surface. We consider vibrato in violin and vocal performance. Our study confirms the presence of both amplitude modulation and frequency modulation in the partials of notes generated by each of these instruments, and provides a fine-grained analysis of these variations. In addition, we show that these instantaneous amplitude and frequency estimates can be incorporated into methods for synthesizing signals that perceptually resemble the original sampled sounds.
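
    The Hilbert step the authors apply to each partial (extracting instantaneous amplitude and frequency from a narrowband component) can be illustrated on a synthetic frequency-modulated tone. SciPy is assumed available; this shows only the Hilbert extraction, not the modal-distribution computation itself.

        import numpy as np
        from scipy.signal import hilbert

        fs = 8000.0                          # sample rate, Hz
        t = np.arange(0, 1.0, 1 / fs)
        f0, f_vib, depth = 440.0, 6.0, 5.0   # carrier, vibrato rate, depth (Hz)

        # Synthetic partial with 6 Hz vibrato and slow amplitude modulation.
        phase = 2 * np.pi * (f0 * t - depth / (2 * np.pi * f_vib)
                             * np.cos(2 * np.pi * f_vib * t))
        x = (1 + 0.2 * np.sin(2 * np.pi * f_vib * t)) * np.sin(phase)

        analytic = hilbert(x)                # analytic signal x + i*H[x]
        inst_amp = np.abs(analytic)          # instantaneous amplitude envelope
        inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

        print(f"mean f = {inst_freq.mean():.1f} Hz, "
              f"freq swing = +/-{(inst_freq.max() - inst_freq.min()) / 2:.1f} Hz, "
              f"amp swing = {inst_amp.max() - inst_amp.min():.2f}")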

  10. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
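
    The Debye scattering equation underpinning this kind of total-scattering model sums over all atom pairs, I(Q) = sum_ij f_i f_j sin(Q r_ij)/(Q r_ij), which is why it applies to finite clusters with no lattice periodicity. A toy sketch with a constant scattering factor and a 13-atom fcc-like fragment; real analyses use Q-dependent atomic form factors and clusters of thousands of atoms.

        import numpy as np

        def debye_intensity(positions, q_grid, f=1.0):
            # Debye equation: I(Q) = sum_ij f_i f_j sin(Q r_ij) / (Q r_ij).
            diffs = positions[:, None, :] - positions[None, :, :]
            r = np.linalg.norm(diffs, axis=-1)          # all pair distances
            intensity = np.empty_like(q_grid)
            for k, q in enumerate(q_grid):
                qr = q * r
                # sin(x)/x -> 1 as x -> 0 handles the self terms (i == j).
                sinc = np.where(qr > 1e-12, np.sin(qr) / np.maximum(qr, 1e-12), 1.0)
                intensity[k] = (f * f * sinc).sum()
            return intensity

        # Toy 13-atom cuboctahedral fcc fragment; lattice constant chosen so the
        # nearest-neighbour distance is ~2.88 A, as in gold. Purely illustrative.
        a = 4.08
        shell = [(0.5, 0.5, 0), (0.5, 0, 0.5), (0, 0.5, 0.5),
                 (-0.5, 0.5, 0), (-0.5, 0, 0.5), (0, -0.5, 0.5),
                 (0.5, -0.5, 0), (0.5, 0, -0.5), (0, 0.5, -0.5),
                 (-0.5, -0.5, 0), (-0.5, 0, -0.5), (0, -0.5, -0.5)]
        pos = np.array([(0.0, 0.0, 0.0)] + shell) * a

        q = np.linspace(0.5, 10.0, 50)   # momentum transfer, 1/Angstrom
        print(debye_intensity(pos, q)[:5])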

  11. Pair distribution function analysis applied to decahedral gold nanoparticles

    Science.gov (United States)

    Nakotte, H.; Silkwood, C.; Page, K.; Wang, H.-W.; Olds, D.; Kiefer, B.; Manna, S.; Karpov, D.; Fohtung, E.; Fullerton, E. E.

    2017-11-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally

  12. On the relevance of efficient, integrated computer and network monitoring in HEP distributed online environment

    CERN Document Server

    Carvalho, D F; Delgado, V; Albert, J N; Bellas, N; Javello, J; Miere, Y; Ruffinoni, D; Smith, G

    1996-01-01

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is necessary to integrate the various functions of DCCS monitoring into one general purpose multi-layer ...

  13. A Real-Time Fault Management Software System for Distributed Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — DyMA-FM (Dynamic Multivariate Assessment for Fault Management) is a software architecture for real-time fault management. Designed to run in a distributed...

  14. A Real-Time Fault Management Software System for Distributed Environments Project

    Data.gov (United States)

    National Aeronautics and Space Administration — DyMA-FM (Dynamic Multivariate Assessment for Fault Management) is a software architecture for real-time fault management. Designed to run in a distributed...

  15. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    CERN Document Server

    Elmsheuser, J; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users submitted jobs in 2011 and more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  16. Mercury speciation and distribution in a glacierized mountain environment and their relevance to environmental risks in the inland Tibetan Plateau.

    Science.gov (United States)

    Sun, Xuejun; Zhang, Qianggong; Kang, Shichang; Guo, Junming; Li, Xiaofei; Yu, Zhengliang; Zhang, Guoshuai; Qu, Dongmei; Huang, Jie; Cong, Zhiyuan; Wu, Guangjian

    2018-08-01

    Glacierized mountain environments can preserve and release mercury (Hg) and play an important role in regional Hg biogeochemical cycling. However, the behavior of Hg in glacierized mountain environments and its environmental risks remain poorly constrained. In this research, glacier meltwater, runoff and wetland water were sampled in the Zhadang-Qugaqie basin (ZQB), a typical glacierized mountain environment in the inland Tibetan Plateau, to investigate Hg distribution and its relevance to environmental risks. The total mercury (THg) concentrations ranged from 0.82 to 6.98 ng·L-1, and non-parametric pairwise multiple comparisons of the THg concentrations among the three different water samples showed that the THg concentrations were comparable. The total methylmercury (TMeHg) concentrations ranged from 0.041 to 0.115 ng·L-1, and non-parametric pairwise multiple comparisons of the TMeHg concentrations showed a significant difference. Both the THg and MeHg concentrations of water samples from the ZQB were comparable to those of other remote areas, indicating that Hg concentrations in the ZQB watershed are equivalent to the global background level. Particulate Hg was the predominant form of Hg in all runoff samples, and was significantly correlated with the total suspended particle (TSP) concentration but not with the dissolved organic carbon (DOC) concentration. The distribution of mercury in the wetland water differed from that of the other water samples. THg exhibited a significant correlation with DOC as well as TMeHg, whereas neither THg nor TMeHg was associated with TSP. Based on the above findings and the results from previous work, we propose a conceptual model illustrating the four Hg distribution zones in glacierized environments. We highlight that wetlands may enhance the potential hazards of Hg released from melting glaciers, making them a vital zone for investigating the environmental effects of Hg in glacierized environments and beyond. Copyright © 2018

  17. A review on distribution and monitoring of hormones in the environment and their removal in wastewater treatment systems

    Directory of Open Access Journals (Sweden)

    Rahele Kafaei

    2014-11-01

    Steroid hormones, a class of endocrine disrupting compounds (EDCs) that cause negative effects on human health, animals and ecosystem balance, have become a major concern in modern societies. In recent years numerous studies have been performed on hormone distribution in the environment, especially in aquatic environments, and on the ways they are removed. Hormones enter the environment primarily through wastewater, municipal wastewater treatment sludge, hospital wastewater and livestock activity. Values measured in wastewater treatment influents, livestock lagoons, surface water and groundwater showed hormone concentrations in the ng/L range. It is important to note, however, that even at trace ng/L concentrations hormones can have adverse effects on the environment. Through biodegradation, biosorption and biotransformation, hormones are degraded and their activities decreased. Wastewater treatment processes, including preliminary, primary, secondary and advanced treatment, are the most important means of preventing the entry of hormonal compounds into the environment. Sludge should be treated by available technology before entering the environment. Wastewater processes, in both the liquid and sludge phases and under various operating conditions, show different ranges of hormone removal. In this paper the authors discuss the problem and different environmental aspects of hormones.

  18. Control of dispatch dynamics for lowering the cost of distributed generation in the built environment

    Science.gov (United States)

    Flores, Robert Joseph

    Distributed generation can provide many benefits over traditional central generation, such as increased reliability and efficiency, while reducing emissions. Despite these potential benefits, distributed generation is generally not purchased unless it reduces energy costs. Economic dispatch strategies can be designed such that distributed generation technologies reduce overall facility energy costs. In this thesis, a microturbine generator is dispatched using different economic control strategies, reducing the cost of energy to the facility. Several industrial and commercial facilities are simulated using acquired electrical, heating, and cooling load data. Industrial and commercial utility rate structures are modeled after Southern California Edison and Southern California Gas Company tariffs and used to find energy costs for the simulated buildings and corresponding microturbine dispatch. Using these control strategies, building models, and utility rate models, a parametric study examining various generator characteristics is performed. An economic assessment of the distributed generation is then performed for both the microturbine generator and parametric study. Without the ability to export electricity to the grid, the economic value of distributed generation is limited to reducing the individual costs that make up the cost of energy for a building. Any economic dispatch strategy must be built to reduce these individual costs. While the ability of distributed generation to reduce cost depends on factors such as electrical efficiency and operations and maintenance cost, the building energy demand being serviced has a strong effect on cost reduction. Buildings with low load factors can accept distributed generation with higher operating costs (low electrical efficiency and/or high operations and maintenance cost) due to the value of demand reduction. As load factor increases, lower operating cost generators are desired due to a larger portion of the building load
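
    The interplay between energy and demand charges that drives these conclusions can be caricatured in a few lines: a generator whose marginal cost sits below the energy price also shaves the monthly peak, and with a low load factor the demand-charge saving dominates. All tariff numbers and loads below are invented, not actual SCE or SoCalGas rates.

        # Toy economic dispatch of one generator against a two-part tariff.
        ENERGY_PRICE = 0.15    # $/kWh bought from the utility (assumed)
        DEMAND_CHARGE = 15.0   # $/kW on the monthly peak (assumed)
        GEN_COST = 0.11        # $/kWh to run the generator (fuel + O&M)
        GEN_CAP = 300.0        # generator capacity, kW

        load = [250, 260, 300, 420, 480, 450, 380, 300]   # hourly demand, kW

        def bill(net_load):
            # Utility bill = energy charge + demand charge on the peak.
            return sum(net_load) * ENERGY_PRICE + max(net_load) * DEMAND_CHARGE

        # Strategy: run the generator whenever its marginal cost is below the
        # energy price; here that is always true, so it also shaves the peak.
        gen = [min(GEN_CAP, l) if GEN_COST < ENERGY_PRICE else 0.0 for l in load]
        net = [l - g for l, g in zip(load, gen)]

        base = bill(load)
        new = bill(net) + sum(gen) * GEN_COST
        print(f"no DG: ${base:.2f}  with DG: ${new:.2f}  savings: ${base - new:.2f}")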

  19. Chemometric analysis of ecological toxicants in petrochemical and industrial environments.

    Science.gov (United States)

    Olawoyin, Richard; Heidrich, Brenden; Oyewole, Samuel; Okareh, Oladapo T; McGlothlin, Charles W

    2014-10-01

    The application of chemometrics in the assessment of toxicants, such as heavy metals (HMs) and polycyclic aromatic hydrocarbons (PAHs) potentially derived from petrochemical activities in the microenvironment, is vital in providing safeguards for the health of children and adults residing around petrochemical industrial regions. Several multivariate statistical methods are used in geosciences and environmental protection studies to classify, identify and group prevalent pollutants with regard to exhibited trends. Chemometrics can be applied for toxicant source identification, estimation of contaminant contributions to the toxicity of sites of interest, assessment of the integral risk index of an area, and provision of mitigating measures that limit or eliminate the contaminants identified. In this study, principal component analysis (PCA) was used for dimensionality reduction of data on both organic and inorganic substances in the environment which are potentially hazardous. The high molecular weight (HMW) PAHs correlated positively, with a stronger impact on the model than the lower molecular weight (LMW) PAHs; the total petroleum hydrocarbons (TPHs), PAHs and BTEX correlate positively in the F1 vs F2 plot, indicating similar source contributions of these pollutants in the environmental material. Cu, Cr, Cd, Fe, Zn and Pb all show positive correlation in the same space, indicating a similar source of contamination. Analytical processes involving environmental assessment data obtained in the Niger Delta area of Nigeria confirmed the usefulness of chemometrics for comprehensive ecological evaluation. Copyright © 2014 Elsevier Ltd. All rights reserved.
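
    The PCA step operates on a samples-by-contaminants concentration matrix: standardize each variable, extract components, and read variables that load together on the F1/F2 plane as sharing a source. A generic sketch (scikit-learn assumed available; the matrix is a random stand-in, not the Niger Delta measurements):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        variables = ["TPH", "HMW_PAH", "LMW_PAH", "BTEX", "Cd", "Cr", "Cu", "Pb"]
        rng = np.random.default_rng(1)
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(40, len(variables)))  # stand-in

        # Standardize, then project onto the first two principal components.
        Z = StandardScaler().fit_transform(X)
        pca = PCA(n_components=2).fit(Z)

        print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
        for name, l1, l2 in zip(variables, *pca.components_):
            # Variables with same-sign, similar-magnitude loadings plot together
            # on F1/F2, suggesting a common contamination source.
            print(f"{name:8s} F1={l1:+.2f} F2={l2:+.2f}")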

  20. Integrated Coastal Zone Planning Based on Environment Carrying Capacity Analysis

    Science.gov (United States)

    Miharja, M.; Arsallia, S.

    2017-07-01

    The coastal zone is a crucial area in development planning. It holds high economic value, which attracts an increasing number of inhabitants to the area. As a result, this pressure contributes to environmental degradation. Thus, every attempt at coastal zone development must refer to the environment's carrying capacity. Carrying capacity is the limit of a coastal zone's capability to support all human-created activities while all ecological performances are maintained at a sustainable level. Failure to establish a strong and clear method and regulation for carrying capacity analysis will lead to very risky coastal zone development, which in turn threatens the area's sustainability. This paper discusses a method for analysing the carrying capacity of a coastal zone as an important input to the area's development plan. Two key parameters, i.e., land and clean water carrying capacities, are discussed to form the carrying capacity analytical method. Furthermore, empirical data from Ambon Bay, Moluccas Province, are used to illustrate the operationalization of the method.

  1. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  2. Determination of the elemental distribution in cigarette components and smoke by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Wu, D.; Landsberger, S.; Larson, S.M.

    1997-01-01

    Cigarette smoking is a major source of particles released in indoor environments. A comprehensive study of the elemental distribution in cigarettes and cigarette smoke has been completed. Specifically, concentrations of thirty elements have been determined for the components of 15 types of cigarettes. Components include tobacco, ash, butts, filters, and cigarette paper. In addition, particulate matter from mainstream smoke (MS) and sidestream smoke (SS) was analyzed. The technique of elemental determination used in the study is instrumental neutron activation analysis. The results show that certain elements, such as As, Cd, K, Sb and Zn, are released into the MS and SS. These elements may then be part of the health risk of exposure to smoke. Other elements are retained, for the most part, in cigarette ash and butts. The elemental distribution among the cigarette components and smoke changes under different smoking conditions. (author)

  3. Spatial and temporal distribution of free-living protozoa in aquatic environments of a Brazilian semi-arid region

    Directory of Open Access Journals (Sweden)

    Maria Luisa Quinino de Medeiros

    2013-08-01

    Free-living protozoa are distributed across aquatic environments and vary widely in both qualitative and quantitative terms. The unique ecological functions they exhibit in their habitats help to maintain the dynamic balance of these environments. Despite their wide range and abundance, studies on their geographical distribution and ecology are still scarce compared to other groups. This study aimed to identify and record the occurrence of free-living protozoa at three points in the Piancó-Piranhas-Açu basin, in a semi-arid area of Rio Grande do Norte (RN) state, and to relate the occurrence of taxa to variations in chlorophyll a, pH and temperature in these environments. Samples were collected in the Armando Ribeiro Gonçalves Dam, from two lentic environments upstream and a lotic ecosystem downstream. Sixty-five taxa of free-living protozoa were found. Student's t-test showed significant inter-variable differences (p < 0.05). Similar protozoan species were recorded under different degrees of trophic status according to chlorophyll a concentrations, suggesting the organisms identified are not ideal indicators of trophic level. We hypothesize that food availability, the influence of lentic and lotic systems and the presence of aquatic macrophytes influenced protozoan dynamics during the study period.

  4. Development of a web service for analysis in a distributed network.

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
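
    The statistical idea that makes GLORE possible is that the logistic log-likelihood, and hence its gradient and Hessian, decompose as sums over sites, so a coordinating server can take exact Newton steps from aggregated contributions without ever seeing patient-level rows. A compact sketch of that aggregation on simulated data; it mirrors the published idea but is not the authors' code.

        import numpy as np

        def local_contrib(X, y, beta):
            # One site's gradient and Hessian of the logistic log-likelihood.
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            grad = X.T @ (y - p)
            hess = -(X * (p * (1 - p))[:, None]).T @ X
            return grad, hess

        rng = np.random.default_rng(7)
        true_beta = np.array([0.5, -1.0, 2.0])
        sites = []
        for _ in range(3):   # three hospitals, say, each keeping its own data
            X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
            y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
            sites.append((X, y))

        beta = np.zeros(3)
        for _ in range(25):  # Newton-Raphson on the summed contributions
            grads, hesss = zip(*(local_contrib(X, y, beta) for X, y in sites))
            beta = beta - np.linalg.solve(sum(hesss), sum(grads))

        print("estimate:", np.round(beta, 2), " truth:", true_beta)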

  5. Scalable Scientific Data Mining in Distributed, Peer-to-Peer Environments

    Science.gov (United States)

    Borne, K. D.; Kargupta, H.; Das, K.; Griffin, W.; Giannella, C.

    2008-12-01

    Data-intensive science and knowledge discovery involving very large sky surveys are playing increasingly important roles in today's astronomy research. This Discovery Informatics scientific approach is evolving as a core research paradigm in all science disciplines. In particular, Astroinformatics is developing as the formalization of data-intensive astronomy for research and education. Nearly completed projects (such as the Sloan Digital Sky Survey SDSS, the 2-Micron All-Sky Survey 2MASS, and the GALEX All-Sky Survey) and future projects (such as the WISE All-Sky Survey, Pan-STARRS, and the Large Synoptic Survey Telescope LSST) are destined to produce enormous catalogs of astronomical sources. These collections are naturally distributed and heterogeneous, in addition to being terascale, petascale, and beyond. It is this virtual collection of terabyte and (eventually) petabyte catalogs that will enable remarkable new scientific discoveries through the integration and cross-correlation of data across multiple survey dimensions (time, wavelength, and sky coverage). However, this will be difficult to achieve without a computational backbone that includes support for queries and data mining across distributed virtual tables of de-centralized, joined, and integrated sky survey catalogs. Moreover, use of local data management systems such as MyDB, MySpace in AstroGrid, and Grid Bricks for storing and managing user's local data is becoming increasingly popular. This is opening up the possibility of constructing Peer-to-Peer (P2P) networks for data sharing and mining. We will report on our research in these areas. We are exploring the possibility of using distributed and P2P data mining technology for exploratory astronomy from data integrated and cross-correlated across these multiple sky surveys. We will report on new scientific results, including new explorations of the classical fundamental plane problem, in which multiple dimensions of galaxy parameter space can be

  6. The distribution of micro zooplankton in the lagoon environments; La distribuzione del microzooplancton negli ambienti lagunari

    Energy Technology Data Exchange (ETDEWEB)

    Grenni, P.; Creo, C. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Ambiente

    1998-07-01

    The aim of this work is to verify the possible use of micro zooplankton as a biological indicator in aquatic environments. In particular, studies carried out in lagoon environments are reported, with reference to the Venice lagoon and the Pontine coastal lakes (Italy). New methodologies to assess the micro plankton component are developed and tested, particularly the concentration and counting steps. The use of the same methodologies to assess the nano plankton component as a biological indicator is also reported. [Translated from Italian] The present work analyses the possibility of using micro zooplankton as a biological indicator in aquatic environments (marine, freshwater, brackish). In particular, the studies carried out by ENEA (National Agency for New Technology, Energy and the Environment) on this component in lagoon environments are reported, with reference to the Venice lagoon and the Pontine lagoons.

  7. The Effect of Audio and Visual Aids on Task Performance in Distributed Collaborative Virtual Environments

    Science.gov (United States)

    Ullah, Sehat; Richard, Paul; Otman, Samir; Mallem, Malik

    2009-03-01

    Collaborative virtual environments (CVEs) have recently gained the attention of many researchers due to their numerous potential application domains. Cooperative virtual environments, where users simultaneously manipulate objects, are one of the subfields of CVEs. In this paper we present a framework that enables two users to cooperatively manipulate objects in a virtual environment while sitting at two separate machines connected through a local network. In addition, the article presents the use of sensory feedback (audio and visual) and investigates its effects on cooperation and user performance. Six volunteer subjects had to cooperatively perform a peg-in-hole task. Results revealed that visual and auditory aids increase users' performance; however, the majority of users preferred visual feedback to audio. We hope this framework will greatly help in the development of CAD systems that allow designers to collaborate while being distant. Other application domains may include cooperative assembly, surgical training and rehabilitation systems.

  8. Distribution of quinolones, sulfonamides, tetracyclines in aquatic environment and antibiotic resistance in Indochina

    Directory of Open Access Journals (Sweden)

    Satoru eSuzuki

    2012-02-01

    Southeast Asia has become the center of rapid industrial development and economic growth. However, this growth has far outpaced investment in public infrastructure, leading to the unregulated release of many pollutants, including wastewater-related contaminants such as antibiotics. Antibiotics are of major concern because they can easily be released into the environment from numerous sources, and can subsequently induce development of antibiotic-resistant bacteria. Recent studies have shown that for some categories of drugs this source-to-environment antibiotic resistance relationship is more complex. This review summarizes current understanding regarding the presence of quinolones, sulfonamides, and tetracyclines in aquatic environments of Indochina and the prevalence of bacteria resistant to them. Several noteworthy findings are discussed: (1) quinolone contamination and the occurrence of quinolone resistance are not correlated; (2) occurrence of the sul sulfonamide resistance gene varies geographically; and (3) microbial diversity might be related to the rate of oxytetracycline resistance.

  9. Pure random search for ambient sensor distribution optimisation in a smart home environment.

    Science.gov (United States)

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2011-01-01

    Smart homes are living spaces facilitated with technology to allow individuals to remain in their own homes for longer, rather than be institutionalised. Sensors are the fundamental physical layer within any smart home, as the data they generate are used to inform decision support systems, facilitating appropriate actuator actions. Positioning of sensors is therefore a fundamental characteristic of a smart home. Contemporary smart home sensor distribution is aligned to either (a) a total coverage approach or (b) a human assessment approach. These methods of sensor arrangement are not data-driven strategies; they are unempirical and frequently irrational. This study hypothesised that sensor deployment directed by an optimisation method that uses inhabitants' spatial frequency data as the search space would produce better sensor distributions than the current method of deployment by human engineers. Seven human engineers were tasked to create sensor distributions based on perceived utility for 9 deployment scenarios. A Pure Random Search (PRS) algorithm was then tasked to create matched sensor distributions. The PRS method produced superior distributions in 98.4% of test cases (n=64) against human-engineered deployments when the engineers had no access to the spatial frequency data, and in 92.0% of test cases (n=64) when the engineers had full access to these data. These results thus confirmed the hypothesis.
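
    The record does not reproduce the algorithm, but pure random search over a spatial-frequency grid is simple enough to sketch. The following Python fragment is a minimal illustration under assumed names and a simple coverage-style fitness; the study's actual search-space encoding, fitness function and iteration budget are not given in the record.

      import random

      def pure_random_search(spatial_freq, n_sensors, n_iters=10000, seed=42):
          # Minimal Pure Random Search: draw uniformly random sensor placements
          # and keep the one covering the most observed inhabitant activity.
          # spatial_freq: dict mapping grid cell -> activity count (the search space)
          rng = random.Random(seed)
          cells = list(spatial_freq)
          best, best_score = None, float("-inf")
          for _ in range(n_iters):
              candidate = rng.sample(cells, n_sensors)         # uniform random placement
              score = sum(spatial_freq[c] for c in candidate)  # activity "covered"
              if score > best_score:
                  best, best_score = candidate, score
          return best, best_score

      # Toy usage: a 3x3 room grid with synthetic activity counts.
      grid = {(x, y): random.randint(0, 100) for x in range(3) for y in range(3)}
      print(pure_random_search(grid, n_sensors=2))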

  10. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  11. Distribution and Potential Mobility of Selected Heavy Metals in a Fluvial Environment Under the Influence of Tanneries

    Directory of Open Access Journals (Sweden)

    Rodrigues M. L. K.

    2013-04-01

    In this study we evaluated the occurrence of heavy metals in a fluvial environment under the influence of tanneries (the Cadeia and Feitoria rivers basin, RS, south Brazil), highlighting the distribution and potential mobility of the selected elements. Every three months over a one-year period, selected heavy metals and ancillary parameters were analyzed in water and sediment samples taken at ten sites along the rivers. Water analyses followed APHA recommendations, and sediment analyses were based on methods from USEPA (SW-846) and the European Community (BCR sequential extraction). The determinations were performed by ICP/OES, except for Hg (CV/ETA). Statistical factor analysis was applied to the water and sediment data sets in order to obtain a synthesis of the environmental diagnosis. The results revealed that water quality decreased along the rivers, mainly in the dry period (January), showing the influence of nearby tannery plants and of flow variations. Except for Fe, Al, and occasionally Mn, heavy metal contents in water were in agreement with Brazilian standards. Concerning sediments, Al, Cu, Fe, Ni, Mn, Ti, and Zn concentrations appeared to reflect the base levels, while Cr and Hg were enriched in the deposits from the lower part of the basin. The partition of heavy metals among the sediment geochemical phases showed higher mobility of Mn along the sampling sites, followed by Cr in the lower reach of the basin, most affected by tanneries. Since Cr was predominantly associated with the oxidizable fraction, its potential mobilization from contaminated sediments would depend on redox conditions. The detection of Hg in the tissue of a bottom-fish species indicated that the environmental conditions are apparently favoring the remobilization of this metal from contaminated sediments.

  12. A Network-Aware Distributed Storage Cache for Data Intensive Environments

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, B.L.; Lee, J.R.; Johnston, W.E.; Crowley, B.; Holding, M.

    1999-12-23

    Modern scientific computing involves organizing, moving, visualizing, and analyzing massive amounts of data at multiple sites around the world. The technologies, the middleware services, and the architectures that are used to build useful high-speed, wide area distributed systems, constitute the field of data intensive computing. In this paper the authors describe an architecture for data intensive applications where they use a high-speed distributed data cache as a common element for all of the sources and sinks of data. This cache-based approach provides standard interfaces to a large, application-oriented, distributed, on-line, transient storage system. They describe their implementation of this cache, how they have made it network aware, and how they do dynamic load balancing based on the current network conditions. They also show large increases in application throughput by access to knowledge of the network conditions.

  13. Experimental Testing for Stability Analysis of Distributed Energy Resources Components with Storage Devices and Loads

    DEFF Research Database (Denmark)

    Mihet-Popa, Lucian; Groza, Voicu; Isleifsson, Fridrik Rafn

    2012-01-01

    Experimental Testing for Stability Analysis of Distributed Energy Resources Components with Storage Devices and Loads.

  14. CROSS DRIVE: A Collaborative and Distributed Virtual Environment for Exploitation of Atmospherical and Geological Datasets of Mars

    Science.gov (United States)

    Cencetti, Michele

    2016-07-01

    European space exploration missions have produced huge data sets of potentially immense value for research as well as for planning and operating future missions. For instance, the Mars Exploration programs comprise a series of missions with launches ranging from the past to beyond the present, which are anticipated to produce exceptional volumes of data offering prospects for research breakthroughs and for advancing further activities in space. The collected data include a variety of information, such as imagery, topography, atmospheric and geochemical datasets and more, which has resulted in, and still demands, databases, versatile visualisation tools and data reduction methods. Such a rate of valuable data acquisition requires scientists, researchers and computer scientists to coordinate storage, processing and the relevant tools to enable efficient data analysis. However, the current position is that the expert teams from various disciplines, the databases and the tools are fragmented, leaving little scope for unlocking the data's value through collaborative activities. The benefits of collaborative virtual environments have been demonstrated in various industrial fields, allowing real-time multi-user collaborative work among people from different disciplines. Exploiting the benefits of advanced immersive virtual environments (IVE) has been recognized as an important interaction paradigm to facilitate future space exploration. The current work presents the preliminary results of the CROSS DRIVE project. This research received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 607177 and is mainly aimed at the implementation of a distributed virtual workspace for collaborative scientific discovery, mission planning and operations. The purpose of the CROSS DRIVE project is to lay the foundations of collaborative European workspaces for space science. It will demonstrate the feasibility and

  15. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported.

  16. Optimal sampling theory and population modelling - Application to determination of the influence of the microgravity environment on drug distribution and elimination

    Science.gov (United States)

    Drusano, George L.

    1991-01-01

    Optimal sampling theory is evaluated in application to studies of the distribution and elimination of several drugs (including ceftazidime, piperacillin, and ciprofloxacin), using the SAMPLE module of the ADAPT II package of programs developed by D'Argenio and Schumitzky (1979, 1988) and comparing the pharmacokinetic parameter values with results obtained by a traditional ten-sample design. The impact of optimal sampling was demonstrated in conjunction with the NONMEM approach (Sheiner et al., 1977), in which the population is taken as the unit of analysis, allowing even fragmentary patient data sets to contribute to population parameter estimates. It is shown that this technique is applicable in both single-dose and multiple-dose environments. The ability to study real patients made it possible to show that there is a bimodal distribution in ciprofloxacin nonrenal clearance.

  17. Evaluating a Data Distribution Service System for Dynamic Manufacturing Environments: A Case Study

    NARCIS (Netherlands)

    Essers, M.S.; Vaneker, Thomas H.J.

    2014-01-01

    Small and Medium sized Enterprises (SMEs) in Europe struggle to incorporate industrial robots in their production environments, while large enterprises use these robots for large batch production only. The paradigm shift from mass production to mass personalization decreases batch sizes and changes

  18. Creating an Organic Knowledge-Building Environment within an Asynchronous Distributed Learning Context.

    Science.gov (United States)

    Moller, Leslie; Prestera, Gustavo E.; Harvey, Douglas; Downs-Keller, Margaret; McCausland, Jo-Ann

    2002-01-01

    Discusses organic architecture and suggests that learning environments should be designed and constructed using an organic approach, so that learning is not viewed as a distinct human activity but incorporated into everyday performance. Highlights include an organic knowledge-building model; information objects; scaffolding; discourse action…

  19. Distribution of Invasive Plants in Urban Environment Is Strongly Spatially Structured

    Czech Academy of Sciences Publication Activity Database

    Štajerová, Kateřina; Šmilauer, P.; Brůna, Josef; Pyšek, Petr

    2017-01-01

    Vol. 32, No. 3 (2017), pp. 681-692, ISSN 0921-2973. R&D Projects: GA ČR GB14-36079G. Institutional support: RVO:67985939. Keywords: invasive plants * urban environment * species richness. Subject RIV: EH - Ecology, Behaviour. OBOR OECD: Ecology. Impact factor: 3.615, year: 2016

  20. Analysis of the work environment of extension personnel in Benue ...

    African Journals Online (AJOL)

    This investigation was undertaken to ascertain the environment in which extension workers transact their legitimate professional duties in Benue and Plateau States. The analyses of the results obtained indicate that extension workers operate in a less satisfying environment to the extent that about 67.3%, 36.9%, 88.2%, ...

  1. Analysis of acidic properties of distribution transformer oil insulation ...

    African Journals Online (AJOL)

    This paper examined the acidic properties of distribution transformer oil insulation in service in the Jericho distribution network, Ibadan, Nigeria. Five oil samples each from six distribution transformers (DT1, DT2, DT3, DT4 and DT5), making a total of thirty samples, were taken from different installed distribution transformers all ...

  2. The East Africa Oligocene intertrappean beds: Regional distribution, depositional environments and Afro/Arabian mammal dispersals

    Science.gov (United States)

    Abbate, Ernesto; Bruni, Piero; Ferretti, Marco Peter; Delmer, Cyrille; Laurenzi, Marinella Ada; Hagos, Miruts; Bedri, Omar; Rook, Lorenzo; Sagri, Mario; Libsekal, Yosief

    2014-11-01

    … inspection of the Agere Selam (Mekele) intertrappean beds revealed the occurrence of lacustrine limestones and diatomites, which were, by contrast, quite subordinate to the fine clastic sediments found in the nearby Amba Alaji area. Further south, the intertrappean section in the Jema valley (100 km north of Addis Ababa and close to the Blue Nile gorge) is 120 m thick, with predominant clastic sediments and a few diatomites at the top. Literature information from 35 additional sites, including sections in northern Kenya, Yemen, Sudan and Saudi Arabia, confirms the fluvial and lacustrine depositional environment of the intertrappean beds, underlines the interest in their mammal fauna (Chilga, Losodok), and reports exploitable coal seams for some of them. As for the vegetal landscape in which the intertrappean beds were deposited, pollen and plant analysis results are indicative of a tropical wet forest, similar to that of present-day western Africa. Another common feature of the intertrappean beds is their relatively limited thickness, averaging a few tens of metres but reaching a few hundred metres in graben-related basins, such as Delbi Moye in southern Ethiopia. In most cases only thin, lens-shaped successions were deposited above the hummocky topography of their volcanic substratum, commonly unaffected by significant faulting. The average duration of the intertrappean beds is from one to three million years. This time interval is commonly matched by a few tens (or, more rarely, hundreds) of metres of sediments left over after erosive episodes or depositional starvation. As to the lateral continuity of the intertrappean beds, the present-day outcrops show large differences: from several tens of kilometres in the Mendefera area, to a few tens of kilometres in the Jema valley, and to a few hundred metres in the Agere Selam (Mekele) area. Even if it is difficult to quantify the original size of the sedimentation areas, this nevertheless proves that the intertrappean basins

  3. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from high penetration rates of EV charging stations. A technical study of the impact of EV charging on the distribution system is therefore required. This paper uses PSCAD software to analyze the Total Harmonic Distortion (THD) introduced by EV charging stations in power systems. The paper starts by choosing the IEEE 34-node test feeder as the distribution system and building an EV level-two charging battery model, together with four different test scenarios: overhead transmission line versus underground cable, an industrial area, a transformer, and a photovoltaic (PV) system. Statistical methods are then used to analyze the THD characteristics of the plug-in transient, plug-out transient and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing the THD in the different scenarios are identified. The results lead to constructive suggestions for both EV charging station construction and customers' charging habits.
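
    The record never defines THD. In its standard form (assumed here, not quoted from the paper), the current THD compares the RMS harmonic content against the fundamental:

      % I_1: RMS fundamental current; I_h: RMS current of harmonic h;
      % H: highest harmonic order retained by the measurement.
      \mathrm{THD}_I = \frac{\sqrt{\sum_{h=2}^{H} I_h^{2}}}{I_1} \times 100\%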

  4. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day, and there is a lack of comprehensive tools that can analyze these streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to produce accurate results. It is very difficult to cluster a dynamically changing text information stream on an individual computer. Our earlier research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized nature of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.
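
    The flocking idea is only described at a high level in the record. Below is a minimal, single-process Python sketch of similarity-driven flocking; the update rule, thresholds and document vectors are illustrative assumptions rather than the authors' implementation, and the distributed multi-agent layer is omitted.

      import math, random

      def cosine(a, b):
          dot = sum(x * y for x, y in zip(a, b))
          na = math.sqrt(sum(x * x for x in a))
          nb = math.sqrt(sum(y * y for y in b))
          return dot / (na * nb) if na and nb else 0.0

      class DocAgent:
          # One boid per document: a 2-D position plus the document's feature vector.
          def __init__(self, vec, width=100.0, height=100.0):
              self.vec = vec
              self.x = random.uniform(0, width)
              self.y = random.uniform(0, height)

      def flock_step(agents, radius=15.0, sim_threshold=0.5, step=1.0):
          # Similar neighbours attract, dissimilar ones repel; after repeated
          # steps, clusters emerge as spatially co-located groups of agents.
          for a in agents:
              dx = dy = 0.0
              for b in agents:
                  if a is b:
                      continue
                  dist = math.hypot(b.x - a.x, b.y - a.y)
                  if dist == 0 or dist > radius:
                      continue
                  sign = 1.0 if cosine(a.vec, b.vec) >= sim_threshold else -1.0
                  dx += sign * (b.x - a.x) / dist
                  dy += sign * (b.y - a.y) / dist
              a.x += step * dx
              a.y += step * dy

      # Toy usage: ten documents drawn from two latent topics.
      docs = [[1, 0, random.random()] for _ in range(5)] + \
             [[0, 1, random.random()] for _ in range(5)]
      agents = [DocAgent(v) for v in docs]
      for _ in range(200):
          flock_step(agents)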

  5. Comparison of Environment Impact between Conventional and Cold Chain Management System in Paprika Distribution Process

    Directory of Open Access Journals (Sweden)

    Eidelweijs A Putri

    2012-09-01

    Pasir Langu village in Cisarua, West Java, is the largest central production area of paprika in Indonesia. On average, for every 200 kilograms of paprika produced, 3 kilograms are rejected. This results in financial losses for wholesalers and in waste; in one year the loss can be approximately 11.7 million Indonesian rupiahs. Paprika wholesalers in Pasir Langu village are currently developing a cold chain management system to maintain the quality of paprika so that the number of rejections can be reduced. The objective of this study is to compare the environmental impacts of the conventional and cold chain management systems in the paprika distribution process using the Life Cycle Assessment (LCA) methodology, and to propose a photovoltaic (PV) system for the paprika distribution process. The results imply that the cold chain system produces more CO2 emissions than the conventional system; however, with the introduction of the PV system, the emissions would be reduced. For future research, it is necessary to reduce CO2 emissions from the transportation process, since this process is the biggest contributor of CO2 emissions across the whole distribution process. Keywords: LCA, environmentally friendly distribution, paprika, cold chain, PV system

  6. A Market-Oriented Programming Environment and Its Application to Distributed Multicommodity Flow Problems

    Science.gov (United States)

    1992-10-01

    WALRAS is a general "market-oriented programming" environment, a studied framework for coordinating decentralized decision processes with minimal communication. (The remainder of the record is OCR residue; the recoverable table-of-contents entries are: 1. The WALRAS Project; 2. Distributed Resource …; 4. WALRAS; 4.1 Market Configuration and Equilibrium.)

  7. Quantifying the influence of geography and environment on the northeast Atlantic mackerel spawning distribution

    DEFF Research Database (Denmark)

    Brunel, Thomas; van Damme, Cindy J. G.; Samson, Melvin

    2018-01-01

    Mackerel (Scomber scombrus) in the northeast Atlantic have shown changes in distribution at certain times of the year, which have impacted their exploitation and management. In this study, mackerel spawning habitat over 21 recent years was characterised using generalised additive modelling, based...

  8. Distribution and migration of plutonium in soils of an accidentally contaminated environment

    International Nuclear Information System (INIS)

    Iranzo, E.; Rivas, P.; Mingarro, E.; Marin, C.; Espinoas, A.; Iranzo, C.E.

    1991-01-01

    In an area contaminated accidentally by plutonium, studies are made to determine the geochemical state of plutonium in the soil of farmed and uncultivated areas. Plutonium concentrations have been measured in relation to depth, particle size distribution and mineralogical composition of the soil. Sequential leaching experiments have been made to estimate the proportion of plutonium bound to the main compounds. (orig.)

  9. On the relevancy of efficient, integrated computer and network monitoring in HEP distributed online environment

    International Nuclear Information System (INIS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Javello, J.; Miere, Y.; Ruffinoni, D.; Albert, J.N.; Bellas, N.; Smith, G.

    1996-01-01

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Such systems are generically called Large-Scale Distributed Data Intensive Information Systems or, for those dealing more with real-time control, Distributed Computer Control Systems (DCCS). Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework, the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system. (author)

  10. Regional power distribution in the competitive market environment; Die regionale Stromverteilung im liberalisierten Markt

    Energy Technology Data Exchange (ETDEWEB)

    Marquis, G.; Birkner, P. [Lech-Elektrizitaetswerke AG, Augsburg (Germany). Abt. Netzbetrieb

    2000-05-01

    Regional or local electricity suppliers, as the owners of the local or regional distribution networks, are responsible for reliable power supply to the end-use customers in their area and have to guarantee the required voltage quality. Thus the regional suppliers' major business is power procurement and distribution, as well as load and voltage management in the distribution systems. The article discusses the new challenges and obligations of distribution providers arising from the conditions of power trading in deregulated markets, referring to major aspects such as efficiency enhancements in network design and operation, and customized supply quality. [Translated from German:] The customer receives electrical energy at the voltage level and at the location he requires. Regional suppliers are thus mainly active in energy procurement and sales, as well as in the refinement of electrical energy through transformation and distribution. The article describes the changed requirements on an electrical distribution network that result from the liberalisation of the electricity market. Besides necessary efficiency gains in the design and operation of electrical networks, customer-specific supply quality is an essential consequence of competition. (orig.)

  11. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    The benefits of fuel cells (FCs) depend on the energy system in which they are used; consequently, coherent energy system analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and to compare alternative solutions. In relation to distributed generation, such analyses must be thorough and careful in identifying any imbalances between electricity demand and production from CHP plants (Combined Heat and Power) and fluctuating renewable energy sources. This chapter introduces the energy system analysis model EnergyPLAN, one example of a freeware tool which can be used for such analyses. Moreover, the chapter presents the results of evaluating the overall system fuel savings achieved by introducing different FC applications into different energy systems: natural gas-based and hydrogen-based micro FC-CHP, and natural gas local FC-CHP plants for district heating.

  12. The Grid-distributed data analysis in CMS

    International Nuclear Information System (INIS)

    Fanzago, F.; Farina, F.; Cinquilli, M.; Codispoti, G.; Fanfani, F.; Lacaprara, S.; Miccio, E.; Spiga, S.; Vaandering, E.

    2009-01-01

    The CMS experiment will soon produce a huge amount of data (a few PBytes per year) that will be distributed and stored in many computing centres spread across the countries participating in the collaboration. The data will be available to all CMS physicists thanks to the services provided by the supported Grids. CRAB is the tool developed by the CMS collaboration to allow physicists to access and analyze data stored at sites world-wide. It aims to simplify the data discovery process and the job creation, execution and monitoring tasks, hiding the details related both to the Grid infrastructures and to the CMS analysis framework. We discuss the recent evolution of this tool from its standalone version up to the client-server architecture adopted for particularly challenging workload volumes, and we report the usage statistics collected from the CRAB community, involving so far almost 600 distinct users.

  13. Environment

    International Nuclear Information System (INIS)

    McIntyre, A.D.; Turnbull, R.G.H.

    1992-01-01

    The development of the hydrocarbon resources of the North Sea has resulted in both offshore and onshore environmental repercussions, involving the existing physical attributes of the sea and seabed, the coastline and the adjoining land. The social and economic repercussions of the industry were equally widespread. The dramatic and speedy impact of the exploration and exploitation of the northern North Sea resources in the early 1970s on the physical resources of Scotland was quickly realised, together with the concern that any environmental and social damage to the physical and social fabric should be kept to a minimum. To this end, a wide range of research and other activities by central and local government and other interested agencies was undertaken to extend existing knowledge of the marine and terrestrial environments that might be affected by the oil and gas industry. The outcome of these activities is summarized in this paper. The topics covered include a survey of the marine ecosystems of the North Sea, the fishing industry, the impact of oil pollution on seabirds and fish stocks, the ecology of the Scottish coastline, and the impact of the petroleum industry on a selection of particular sites. (author)

  14. Statistical Distribution Analysis of Lineated Bands on Europa

    Science.gov (United States)

    Chen, T.; Phillips, C. B.; Pappalardo, R. T.

    2016-12-01

    Europa's surface is covered with intriguing linear and disrupted features, including lineated bands that range in scale and size. Previous studies have shown the possibility of an icy shell at the surface that may be concealing a liquid ocean with the potential to harbor life (Pappalardo et al., 1999). Utilizing high-resolution imaging data from the Galileo spacecraft, we examined bands through a morphometric and morphologic approach. Greeley et al. (2000) and Prockter et al. (2002) defined bands as wide, hummocky to lineated features that have distinctive surface texture and albedo compared to their surrounding terrain. We took morphometric measurements of lineated bands to find correlations in properties such as size, location, and orientation, and to shed light on formation models. We will present our measurements of over 100 bands on Europa that were mapped on the USGS Europa Global Mosaic Base Map (2002). We also conducted a statistical analysis to understand the global distribution of lineated bands and whether the widths of the bands differ by location. Our preliminary statistical analysis, combined with the morphometric measurements, supports a uniform ice shell thickness for Europa rather than one that varies geographically. References: Greeley, Ronald, et al. "Geologic mapping of Europa." Journal of Geophysical Research: Planets 105.E9 (2000): 22559-22578; Pappalardo, R. T., et al. "Does Europa have a subsurface ocean? Evaluation of the geological evidence." Journal of Geophysical Research: Planets 104.E10 (1999): 24015-24055; Prockter, Louise M., et al. "Morphology of Europan bands at high resolution: A mid-ocean ridge-type rift mechanism." Journal of Geophysical Research: Planets 107.E5 (2002); U.S. Geological Survey, 2002, Controlled photomosaic map of Europa, Je 15M CMN: U.S. Geological Survey Geologic Investigations Series I-2757, available at http

  15. The institutional structure and political economy of food distribution systems: A comparative analysis of six Eastern European countries

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Skytte, Hans

    This paper discusses the food distribution systems of six Eastern European countries. It considers the macro and task environments of distribution systems, discussing the constraints and opportunities the environments present to companies. The institutional structure of retailing and wholesaling...

  16. Distributional patterns of cecropia (Cecropiaceae: a panbiogeographic analysis

    Directory of Open Access Journals (Sweden)

    Franco Rosselli Pilar

    1997-06-01

    A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia species, whereas the distributional patterns of the remaining 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the central Andes in Perú; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus; one in Roraima; one in Serra do Mar in the Atlantic forest of Brazil; and one in Central America. Speciation in Cecropia may be related to the first uplift of the Andes.

  17. Transformation and distribution processes governing the fate and behaviour of nanomaterials in the environment: an overview

    DEFF Research Database (Denmark)

    Hansen, Steffen Foss; Hartmann, Nanna B.; Baun, Anders

    2015-01-01

    … -based approaches relates to the interplay between the nanomaterial physico-chemical properties, the surrounding media/matrix composition and the underlying processes that determine particle behaviour. Here we identify and summarize key processes governing the fate and behaviour of nanomaterials in the environment. This is done through a critical review of the present state of knowledge. We describe the (photo)chemical, physical or biologically mediated transformation of manufactured nanomaterials due to degradation, aggregation or agglomeration, or through association with dissolved, colloidal or particulate matter present in the environment. Specific nanomaterials are used as case studies to illustrate these processes. Key environmental processes are identified and ranked, and key knowledge gaps are identified, feeding into the longer-term goal of improving the existing models for predicted environmental …

  18. The distribution, fate, and effects of propylene glycol substances in the environment.

    Science.gov (United States)

    West, Robert; Banton, Marcy; Hu, Jing; Klapacz, Joanna

    2014-01-01

    The propylene glycol substances comprise a homologous family of synthetic organic molecules that have widespread use and very high production volumes across the globe. The information presented and summarized here is intended to provide an overview of the most current and reliable information available for assessing the potential environmental exposures and impacts of these substances across the manufacture, use, and disposal phases of their product life cycles. The PG substances are characterized as being miscible in water, having very low octanol-water partition coefficients (log Pow) and exhibiting low potential to volatilize from water or soil in both pure and dissolved forms. The combination of these properties dictates that, almost regardless of the mode of their initial emission, they will ultimately associate with the surface water, soil, and related groundwater compartments of the environment. These substances have low affinity for soil and sediment particles, and thus will remain mobile and bioavailable within these media. In the atmosphere, the PG substances are demonstrated to have short lifetimes (1.7-11 h), due to rapid reaction with photochemically generated hydroxyl radicals. This reactivity, combined with efficient wet deposition of their vapor and aerosol forms, gives them very low potential for long-range transport via the atmosphere. In the aquatic and terrestrial compartments of the environment, the PG substances are rapidly and ultimately biodegraded under both aerobic and anaerobic conditions by a wide variety of microorganisms, regardless of prior adaptation to the substances. Except for the TePG substance, the propylene glycol substances meet the OECD definition of "readily biodegradable", and according to this definition are not expected to persist in either aquatic or terrestrial environments. TePG exhibits inherent biodegradability, is not regarded as persistent, and is expected to ultimately biodegrade in the environment, albeit

  19. Teaching Millennials to Engage THE Environment Instead of Their Environment: A Pedagogical Analysis

    Science.gov (United States)

    Stevens, J. Richard; Crow, Deserai Anderson

    2016-01-01

    This article examines the difficulty in teaching contemporary students of journalism (those in the much-discussed Millennial Generation) to report on complex topics like science and the environment. After examining contemporary literature, the authors subjected 120 undergraduate students to a strategy that combined visual representations of…

  20. Presence of thallium in the environment: sources of contaminations, distribution and monitoring methods

    OpenAIRE

    Karbowska, Bozena

    2016-01-01

    Thallium is released into the biosphere from both natural and anthropogenic sources. It is generally present in the environment at low levels; however, human activity has greatly increased its content. Atmospheric emission and deposition from industrial sources have resulted in increased concentrations of thallium in the vicinity of mineral smelters and coal-burning facilities. Increased levels of thallium are found in vegetables, fruit and farm animals. Thallium is toxic even at very low concentrations ...

  1. DLTAP: A Network-efficient Scheduling Method for Distributed Deep Learning Workload in Containerized Cluster Environment

    OpenAIRE

    Qiao Wei; Li Ying; Wu Zhong-Hai

    2017-01-01

    Deep neural networks (DNNs) have recently yielded strong results on a range of applications. Training these DNNs using a cluster of commodity machines is a promising approach since training is time consuming and compute-intensive. Furthermore, putting DNN tasks into containers of clusters would enable broader and easier deployment of DNN-based algorithms. Toward this end, this paper addresses the problem of scheduling DNN tasks in the containerized cluster environment. Efficiently scheduling ...

  2. Presence of thallium in the environment: sources of contaminations, distribution and monitoring methods.

    Science.gov (United States)

    Karbowska, Bozena

    2016-11-01

    Thallium is released into the biosphere from both natural and anthropogenic sources. It is generally present in the environment at low levels; however, human activity has greatly increased its content. Atmospheric emission and deposition from industrial sources have resulted in increased concentrations of thallium in the vicinity of mineral smelters and coal-burning facilities. Increased levels of thallium are found in vegetables, fruit and farm animals. Thallium is toxic even at very low concentrations and tends to accumulate in the environment once it enters the food chain. Thallium and thallium-based compounds exhibit higher water solubility compared to other heavy metals. They are therefore also more mobile (e.g. in soil), generally more bioavailable and tend to bioaccumulate in living organisms. The main aim of this review was to summarize the recent data regarding the actual level of thallium content in environmental niches and to elucidate the most significant sources of thallium in the environment. The review also includes an overview of analytical methods, which are commonly applied for determination of thallium in fly ash originating from industrial combustion of coal, in surface and underground waters, in soils and sediments (including soil derived from different parent materials), in plant and animal tissues as well as in human organisms.

  3. Phylogenetic diversity and environment-specific distributions of glycosyl hydrolase family 10 xylanases in geographically distant soils.

    Directory of Open Access Journals (Sweden)

    Guozeng Wang

    BACKGROUND: Xylan is one of the most abundant biopolymers on Earth. Its degradation is mediated primarily by microbial xylanases in nature. To explore the diversity and distribution patterns of xylanase genes in soils, samples of five soil types with different physicochemical characters were analyzed. METHODOLOGY/PRINCIPAL FINDINGS: Partial xylanase genes of glycoside hydrolase (GH) family 10 were recovered following direct DNA extraction from soil, PCR amplification and cloning. Combined with our previous study, a total of 1084 gene fragments were obtained, representing 366 OTUs. More than half of the OTUs were novel (identities of <65% with known xylanases) and had no close relatives based on phylogenetic analyses. Xylanase genes from all the soil environments were mainly distributed in Bacteroidetes, Proteobacteria, Acidobacteria, Firmicutes, Actinobacteria, Dictyoglomi and some fungi. Although identical sequences were found at several sites, habitat-specific patterns appeared to be important, and geochemical factors such as pH and oxygen content significantly influenced the compositions of xylan-degrading microbial communities. CONCLUSION/SIGNIFICANCE: These results provide insight into the GH 10 xylanases in various soil environments and reveal that xylan-degrading microbial communities are environment specific, with diverse and abundant populations.

  4. Short paper: Distributed vehicular traffic congestion detection algorithm for urban environments

    OpenAIRE

    Milojevic, M.; Rakocevic, V.

    2013-01-01

    Vehicular traffic congestion is a well-known economic and social problem, generating significant costs and safety challenges and increasing pollution in cities. Current intelligent transport systems and vehicular networking technologies rely heavily on supporting network infrastructure which is still not widely available. This paper contributes towards the development of distributed and cooperative vehicular traffic congestion detection by proposing a new vehicle-to-vehicle (V2V) congestion detection algorithm ...

  5. Diversity of flux distribution in central carbon metabolism of S. cerevisiae strains from diverse environments.

    Science.gov (United States)

    Nidelet, Thibault; Brial, Pascale; Camarasa, Carole; Dequin, Sylvie

    2016-04-05

    S. cerevisiae has attracted considerable interest in recent years as a model for ecology and evolutionary biology, revealing substantial genetic and phenotypic diversity. However, there is a lack of knowledge on the diversity of metabolic networks within this species. To identify the metabolic and evolutionary constraints that shape metabolic fluxes in S. cerevisiae, we used a dedicated constraint-based model to predict the central carbon metabolism flux distribution of 43 strains from different ecological origins, grown in wine fermentation conditions. In analyzing these distributions, we observed a highly contrasted situation in flux variability, with quasi-constancy of the glycolysis and ethanol synthesis yields yet high flexibility of other fluxes, such as the pentose phosphate pathway and acetaldehyde production. Furthermore, the fluxes with large variability showed multimodal distributions that could be linked to strain origin, indicating a convergence between genetic origin and flux phenotype. Flux variability is thus pathway-dependent and, for some fluxes, a strain-origin effect can be found. These data highlight the constraints shaping the yeast operative central carbon network and provide clues for the design of strain improvement strategies.
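
    The record does not state the model equations, but constraint-based flux prediction is conventionally posed as a linear program of the following standard form (an assumption here, not a quotation from the paper), with S the stoichiometric matrix, v the flux vector and c the objective weights (e.g. biomass or ethanol yield):

      \begin{aligned}
      \max_{v}\ & c^{\mathsf{T}} v \\
      \text{s.t.}\ & S\,v = 0 \quad \text{(steady state)} \\
      & v_{\min} \le v \le v_{\max} \quad \text{(capacity and medium constraints)}
      \end{aligned}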

  6. [Environment spatial distribution of mercury pollution in Songhua River upstream gold mining areas].

    Science.gov (United States)

    Zou, Ting-Ting; Wang, Ning; Zhang, Gang; Zhao, Dan-Dan

    2010-09-01

    Total gaseous mercury concentrations were monitored with a Zeeman mercury spectrometer (RA-915+) in a gold mining area in Huadian, in the upper reaches of the Songhua River, during the summer and autumn of 2008; samples of air, water, sediment and soil were collected at the same time. The research focuses on the spatial and temporal distribution characteristics of atmospheric mercury pollution and its correlation with other environmental factors. The results show that the concentration of atmospheric mercury is higher in summer than in autumn and higher in the evening than at noon, and that it decays gradually with increasing distance from the gold mining area at the centre. Mercury pollution of the environmental media in the gold mining area decreases in the order sediment > soil > plant > water; within plants the order is root > stem and leaf, and the mercury content of plants in autumn is commonly higher than in summer. This is attributed to the accumulation of the pollutant from the soil during plant growth. Atmospheric mercury shows a significant correlation with mercury in plant roots: 0.83 in summer and 0.97 in autumn.

  7. Biological soil crusts from arctic environments: characterization of the prokaryotic community and exopolysaccharidic matrix analysis.

    Science.gov (United States)

    Mugnai, Gianmarco; Ventura, Stefano; Mascalchi, Cristina; Rossi, Federico; Adessi, Alessandra; De Philippis, Roberto

    2015-04-01

    Biological soil crusts (BSCs) are highly specialized topsoil microbial communities widespread in many ecosystems, from deserts to polar regions. BSCs play an active role in promoting soil fertility and plant growth. In Arctic environments, BSCs are involved in promoting primary succession after deglaciation, increasing moisture availability and nutrient input at the topsoil. The organisms residing in BSCs produce extracellular polymeric substances (EPS) in response to the environmental characteristics, thus contributing to increased tolerance of environmental constraints. The aim of this study was to investigate the taxonomic diversity of the microbial communities, together with the chemical features of the EPS, in BSC samples collected at several sites near Ny-Ålesund, Norway. The phylogenetic composition of the prokaryotic community was assessed through a metagenomic approach. Exopolysaccharidic fractions were quantified using ion-exchange chromatography to determine the monosaccharidic composition, and size exclusion chromatography was used to determine the distribution of the EPS fractions. The abundance of phototrophic microorganisms, which are known to contribute to EPS excretion, was also evaluated. The results underlined the complexity of the microbial communities, showing a high level of diversity within the BSC samples analyzed. The analysis of the polysaccharide composition revealed a high number of constituent sugars; the matrix was found to consist of two main fractions, a higher molecular weight fraction (about 2 × 10^6 Da) and a lower molecular weight fraction (< 100 × 10^3 Da). This study presents novel data concerning the EPS matrix of BSCs in relation to the microbial communities in cold environments.

  8. The Safety Analysis of Shipborne Ammunition in Fire Environment

    Science.gov (United States)

    Ren, Junpeng; Wang, Xudong; Yue, Pengfei

    2017-12-01

    Ammunition safety has always been a focus of military science and technology, and fire is one of the major threats to the shipboard ammunition storage environment. In this paper the Mk-82 shipborne aviation bomb is taken as the study object, and the whole course of a fire is simulated using the FDS (Fire Dynamics Simulator) software. Based on the FDS simulation results, ANSYS software was used to simulate the temperature field of the Mk-82 carrier-based aviation bomb in a fire environment, and the safety of the aviation bomb in the fire environment was analyzed. The results show that the aviation bombs can combust or explode after 70 s of constant cook-off in a fire environment, posing a huge threat to the safety of the ship.

  9. The concept 'environment' in exergy analysis Some special cases

    International Nuclear Information System (INIS)

    Serova, E.N.; Brodianski, V.M.

    2004-01-01

    The concept 'environment' is of considerable importance in present-day engineering thermodynamics. The introduction of this concept brings not only a simplification of the methods for solving classical thermodynamic problems, but also underlies the exergy method, which forms a major new part of thermodynamics and extends into some parts of biology, economics and other fields of science. Practice shows, however, that in some cases the concept 'environment' needs to be defined more precisely.

  10. Analysis of anaerobic product properties in fluid and aggressive environments

    OpenAIRE

    Goncharov, A.; Tulinov, A.

    2008-01-01

    The article presents experimental results on the properties of some domestic and foreign-made anaerobic materials used in components and units operating in fluid and aggressive environments. The experiments determined the strength and swelling of anaerobic products in sea water, fuel and oil, and confirmed their anticorrosion properties. They demonstrated the high resistance of anaerobic products to various fluids and aggressive environments, which make...

  11. Investment Analysis Of Environment Pollution In Educational Institutions

    OpenAIRE

    Mahbub Ullah Miyan; Abdus Salam; Md. Nuruzzaman; Sanjida Naznin

    2015-01-01

    Environmental pollution has become one of the biggest concerns for educational institutions in Bangladesh, yet it is not widely recognized that addressing environmental pollution in educational institutions requires investment. Educational institutions are paying huge amounts of money to keep the academic atmosphere clean. Due to lack of awareness, campus environments are continuously being polluted in many ways. This paper provides an outline of how diff...

  12. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a data-driven method, based on the K-means clustering algorithm, for generating planning scenarios for the siting and sizing of distributed photovoltaic (PV) units in the network. Taking the network power losses, the installation and maintenance costs of distributed PV, the profit of distributed PV, and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, a Pareto-optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, after detailed analysis, the planning schemes at the top of the ranking list are selected according to different planning emphases. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
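
    As a rough illustration of the scenario-generation step, the sketch below clusters historical daily PV-output profiles with K-means and uses the centroids as representative planning scenarios weighted by cluster size. The data shapes, the number of clusters and the use of scikit-learn are assumptions for illustration, not details taken from the paper.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      profiles = rng.random((365, 24))   # stand-in for one year of daily PV profiles

      k = 4                              # number of representative scenarios (assumed)
      km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(profiles)

      scenarios = km.cluster_centers_                      # representative profiles
      weights = np.bincount(km.labels_) / len(profiles)    # scenario probabilities
      print(scenarios.shape, weights)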

  13. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2013-05-01

    In this paper, we consider a distributed beamforming (DBF) scheme in a spectrum sharing system where multiple secondary users share the spectrum with some licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with a user selection scheme in terms of the outage probability and bit error rate performance metrics. Since perfect feedback is difficult to obtain, we then investigate a limited-feedback DBF scheme and develop an analysis for a random vector quantization design algorithm. Specifically, the approximate statistics of the squared inner product between the optimal and quantized vectors are derived. With these statistics, we analyze the outage performance. Furthermore, the effects of channel estimation error and the number of primary users on the system performance are investigated. Finally, optimal power adaptation and cochannel interference are considered and analyzed. Numerical and simulation results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.
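
    The quantization step of the limited-feedback scheme can be sketched as follows: a random codebook of 2^B unit-norm vectors is generated, and the index of the codeword maximizing the squared inner product with the channel is fed back. This is a generic random-vector-quantization sketch under assumed dimensions, not the paper's derivation.

      import numpy as np

      def rvq_codebook(n_antennas, bits, rng):
          # 2**bits random complex codewords, normalized to unit norm
          c = rng.standard_normal((2**bits, n_antennas)) \
              + 1j * rng.standard_normal((2**bits, n_antennas))
          return c / np.linalg.norm(c, axis=1, keepdims=True)

      rng = np.random.default_rng(1)
      h = rng.standard_normal(4) + 1j * rng.standard_normal(4)  # channel (4 antennas)
      codebook = rvq_codebook(n_antennas=4, bits=6, rng=rng)

      gains = np.abs(codebook.conj() @ h) ** 2      # squared inner products
      idx = int(np.argmax(gains))                   # index fed back to transmitters
      loss = 1 - gains[idx] / np.linalg.norm(h)**2  # quantization loss in [0, 1]
      print(idx, loss)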

  14. Synchrotron Radiation Pair Distribution Function Analysis of Gels in Cements

    Directory of Open Access Journals (Sweden)

    Ana Cuesta

    2017-10-01

    The analysis of atomic ordering in a nanocrystalline phase with small particle sizes, below 5 nm, is intrinsically complicated because of the lack of long-range order. Furthermore, the presence of additional crystalline phase(s) may exacerbate the problem, as is the case in cement pastes. Here, we use the synchrotron pair distribution function (PDF) chiefly to characterize the local atomic order of the nanocrystalline phases, gels, in cement pastes. We have used a multi-r-range analysis approach, where the ~4–7 nm r-range allows determining the crystalline phase contents, the ~1–2.5 nm r-range is used to characterize the atomic ordering in the nanocrystalline component, and the ~0.2–1.0 nm r-range gives insights into additional amorphous components. Specifically, we prepared four alite pastes with variable water contents, and the analyses showed that a defective tobermorite, Ca11Si9O28(OH)2·8.5H2O, gave the best fit. Furthermore, the PDF analyses suggest that the calcium silicate hydrate gel is composed of this tobermorite and amorphous calcium hydroxide. Finally, this approach has been used to study alternative cements. The hydration of monocalcium aluminate and ye'elimite pastes yields aluminum hydroxide gels. PDF analyses show that these gels are constituted of nanocrystalline gibbsite, and the particle size can be as small as 2.5 nm.
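
    For reference, the pair distribution function used in this kind of analysis is, in its standard form (assumed here, not quoted from the paper), the sine Fourier transform of the total scattering structure function:

      % rho(r): atomic pair density; rho_0: average number density;
      % S(Q): total scattering structure function.
      G(r) = 4\pi r\,[\rho(r) - \rho_{0}]
           = \frac{2}{\pi}\int_{0}^{Q_{\max}} Q\,[S(Q) - 1]\,\sin(Qr)\,\mathrm{d}Q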

  15. Dictionaries and distributions: Combining expert knowledge and large-scale textual data content analysis: Distributed dictionary representation.

    Science.gov (United States)

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
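
    The core of DDR is straightforward: average the embedding vectors of the dictionary words and score spans of text by cosine similarity against that average. A minimal sketch follows, with randomly generated stand-in word vectors; a real application would load pretrained embeddings (word2vec- or GloVe-style) instead.

```python
import numpy as np

rng = np.random.default_rng(1)

def unit(v):
    return v / np.linalg.norm(v)

# Stand-in word vectors keyed by word; random here purely for illustration.
vocab = ["calm", "serene", "peaceful", "angry", "furious", "table"]
vectors = {w: unit(rng.standard_normal(50)) for w in vocab}

def ddr(words):
    """Distributed dictionary representation: renormalized mean of the
    vectors of the in-vocabulary words."""
    vs = [vectors[w] for w in words if w in vectors]
    return unit(np.mean(vs, axis=0))

def score(text_words, dictionary_words):
    """Cosine similarity between a span of text and a concept dictionary."""
    return float(ddr(text_words) @ ddr(dictionary_words))

calm_dict = ["calm", "serene", "peaceful"]      # tiny concept dictionary
print(score(["peaceful", "table"], calm_dict))  # loads higher on the concept
print(score(["angry", "furious"], calm_dict))   # near zero for random vectors
```

    Because the score is a similarity rather than a count, even a short, high-validity dictionary can be applied to text whose vocabulary never overlaps the dictionary words.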

  16. Distribution of coagulase-negative Staphylococcus species from milk and environment of dairy cows differs between herds.

    Science.gov (United States)

    Piessens, V; Van Coillie, E; Verbist, B; Supré, K; Braem, G; Van Nuffel, A; De Vuyst, L; Heyndrickx, M; De Vliegher, S

    2011-06-01

    In many parts of the world, coagulase-negative staphylococci (CNS) are the predominant pathogens causing intramammary infections (IMI) in dairy cows. The cows' environment is thought to be a possible source of CNS mastitis, and this was investigated in the present paper. A longitudinal field study was carried out in 6 well-managed dairy herds to determine the distribution and epidemiology of the various CNS species isolated from milk (causing IMI) and those living freely in the cows' environment. In each herd, quarter milk samples from a cohort of 10 lactating cows and environmental samples from stall air, slatted floor, sawdust from cubicles, and sawdust stock were collected monthly (n=13). Isolates from quarter milk samples (n=134) and the environment (n=637) were identified to species level using amplified fragment length polymorphism (AFLP) genotyping. Staphylococcus chromogenes, S. haemolyticus, S. epidermidis, and S. simulans accounted for 81.3% of all CNS milk isolates. Quarters were considered infected with CNS (positive IMI status) only when 2 out of 3 consecutive milk samples yielded the same CNS AFLP type. The species causing IMI were S. chromogenes (n=35 samples with positive IMI status), S. haemolyticus (n=29), S. simulans (n=14), and S. epidermidis (n=6). The observed persistent IMI cases (n=17) had a mean duration of 149.4 d (range 63.0 to 329.8 d). The CNS species predominating in the environment were S. equorum, S. sciuri, S. haemolyticus, and S. fleurettii. Herd-to-herd differences in the distribution of CNS species were observed in both milk and the environment, suggesting that herd-level factors are involved in the establishment of particular species in a dairy herd. Primary reservoirs of the species causing IMI varied. Staphylococcus chromogenes and S. epidermidis were rarely found in the environment, indicating that other reservoirs were more important in their epidemiology. For S. haemolyticus and S. simulans, the environment was found as a
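
    The study's IMI case definition is a simple decision rule that is easy to state in code; here is a minimal sketch (the sample histories are hypothetical, with None marking a culture-negative or missing month):

```python
def imi_positive(monthly_types):
    """Positive IMI status per the study's rule: the same CNS AFLP type
    appears in at least 2 of 3 consecutive monthly milk samples."""
    for i in range(len(monthly_types) - 2):
        window = [t for t in monthly_types[i:i + 3] if t is not None]
        if any(window.count(t) >= 2 for t in set(window)):
            return True
    return False

# Hypothetical quarter histories (AFLP types as labels):
print(imi_positive(["chromogenes-A", None, "chromogenes-A", None]))    # True
print(imi_positive(["haemolyticus-B", None, None, "haemolyticus-B"]))  # False
```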

  17. Distribution and speciation of radionuclides in the environment: their implication in radioecology

    International Nuclear Information System (INIS)

    Cigna, A.A.

    2000-01-01

    Following the discovery of X-rays and radioactivity, radioecological research was initiated all over the world, but only after the Second World War did knowledge of the effects of ionizing radiation on organisms, and of the processes by which radionuclides diffuse through the environment, reach an advanced level. On account of the great sensitivity of radioactivity measurements, negligible amounts of radionuclides could easily be identified and measured in different environmental compartments without the slightest interference with the metabolism of living organisms, so many processes and phenomena could be detected and studied. Ecology benefited from such studies, and its growth in a few years was probably greater than in the whole of the previous century. As a result, interest in the determination of concentration factors in organisms spread widely among laboratories, and a large number of values became available within a few years. It further appeared that the transfer of radionuclides from the environment to man could be better evaluated and monitored through the definition of certain 'critical' quantities: a critical group, a critical radionuclide, a critical pathway, etc. The fallout dispersed by the experimental detonation of nuclear weapons and, more recently, the contamination due to the Chernobyl accident were the most important sources of radionuclides in most environmental compartments. Undoubtedly, in the post-Chernobyl situation radioecology is in a better position, because the description of the environment is now much closer to reality and its conclusions are much more reliable. But, as is usual in the development of science, new problems appeared and new questions were asked. Speciation of radionuclides and other pollutants is considered, and some of the effects on their diffusion and its consequences are discussed. Finally, the application of the great amount of knowledge obtained by radioecological research to a better

  18. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid-eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the western part, comprising Jutland and Funen, the penetration is high compared to the load demand; in some periods the wind power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections...
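
    To give a feel for why high DG penetration changes system operation, here is a deliberately simple toy feeder (constant-current loads and hypothetical resistances; not the thesis's models or its loss-allocation method): modest local generation reduces I²R losses, while large reverse flows raise them again.

```python
import numpy as np

# Toy two-line radial feeder: slack -> bus1 -> bus2, resistances in ohm.
r = np.array([0.5, 0.3])

def feeder_losses(load1, load2, dg2=0.0):
    """I^2 R losses (W) with constant-current loads (A) and an optional
    distributed generator injecting current at bus 2."""
    i2 = load2 - dg2       # net current drawn at bus 2 (negative = reverse flow)
    i1 = load1 + i2        # the first line carries everything downstream
    return r[0] * i1**2 + r[1] * i2**2

print("no DG:    ", feeder_losses(8.0, 5.0))            # 92.0
print("modest DG:", feeder_losses(8.0, 5.0, dg2=4.0))   # 40.8 -- losses drop
print("large DG: ", feeder_losses(8.0, 5.0, dg2=25.0))  # 192.0 -- reverse flow
```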

  19. Procurement-distribution model for perishable items with quantity discounts incorporating freight policies under fuzzy environment

    Directory of Open Access Journals (Sweden)

    Makkar Sandhya

    2013-01-01

    Full Text Available A significant issue in supply chain management is how to integrate the different entities of the chain. Managing a supply chain is a difficult task because of the complex integrations involved, especially when the products are perishable in nature. Little attention has been paid to jointly ordering specific perishable products in an uncertain environment with multiple sources and multiple destinations. In this article, we propose a supply chain coordination model based on a quantity and freight discount policy for perishable products under uncertain cost and demand information. A case is provided to validate the procedure.
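
    The cost structure the model coordinates (quantity discounts interacting with freight breaks and perishability) can be sketched with toy numbers; every breakpoint and rate below is hypothetical, and the paper's fuzzy treatment of cost and demand is omitted.

```python
def unit_price(q):
    """All-units quantity discount schedule (hypothetical breakpoints)."""
    if q >= 500:
        return 8.0
    if q >= 200:
        return 9.0
    return 10.0

def freight(q):
    """Truckload freight: a flat fee per 300-unit truck (hypothetical)."""
    return 250.0 * (-(-q // 300))      # ceiling division for the truck count

def perishable_cost(q, holding_spoilage=0.07):
    """Purchase + freight + a simple holding/spoilage penalty on order value."""
    purchase = q * unit_price(q)
    return purchase * (1 + holding_spoilage) + freight(q)

# Per-unit cost is non-monotone in q because price breaks and freight
# breaks pull in different directions -- the trade-off the model optimizes.
for q in (150, 200, 300, 500, 600):
    print(q, "per-unit cost:", round(perishable_cost(q) / q, 2))
```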

  20. Performance Evaluation of Multicast Video Distribution using LTE-A in Vehicular Environments

    OpenAIRE

    Thota, Jayashree; Bulut, Berna; Doufexi, Angela; Armour, Simon; Nix, Andrew

    2017-01-01

    Application Layer Forward Error Correction (AL-FEC) based on Raptor codes has been employed in Multimedia Broadcast/Multicast Services (MBMS) to improve reliability. This paper considers a cross-layer system based on the latest RaptorQ codes for transmitting high-data-rate video. Multiple Input Multiple Output (MIMO) channels in a realistic outdoor environment for a user moving at 50 km/h in an LTE-A system are considered. A link adaptation model with optimized cross-layer parameters is propos...
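
    RaptorQ itself is elaborate, but the erasure-recovery idea behind AL-FEC can be shown with a toy random linear fountain over GF(2) (a stand-in, not RaptorQ): coded packets are random XORs of the source packets, and any sufficiently large set of linearly independent received packets recovers the source despite losses.

```python
import numpy as np

rng = np.random.default_rng(2)

def fountain_encode(source, n_coded):
    """Toy random linear fountain: each coded packet XORs a random
    subset of the source packets (stand-in for Raptor-style AL-FEC)."""
    g = rng.integers(0, 2, size=(n_coded, len(source)))
    return g, (g @ source) % 2

def gf2_solve(g, coded, k):
    """Recover the source by Gaussian elimination over GF(2); returns
    None if the received packets are not innovative enough."""
    a = np.column_stack([g, coded]) % 2
    row = 0
    for col in range(k):
        piv = next((x for x in range(row, len(a)) if a[x, col]), None)
        if piv is None:
            return None
        a[[row, piv]] = a[[piv, row]]          # move pivot row into place
        for x in range(len(a)):
            if x != row and a[x, col]:
                a[x] ^= a[row]                 # eliminate the column elsewhere
        row += 1
    return a[:k, -1]

k = 8
source = rng.integers(0, 2, size=k)            # k source "packets" (bits here)
g, coded = fountain_encode(source, n_coded=14) # ~75% FEC overhead
received = rng.permutation(14)[:11]            # three packets lost in transit
decoded = gf2_solve(g[received], coded[received], k)
print("recovered:", decoded is not None and np.array_equal(decoded, source))
```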