WorldWideScience

Sample records for distributed analysis environment

  1. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modul...

  2. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes

  3. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  5. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with emerging GRID technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.
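The master-worker model described in this abstract can be sketched in a few lines. The sketch below is illustrative only (plain Python, not the DIANE framework or its API; the function names are invented): a master splits the input into tasks, a pool of workers processes them in parallel, and the master merges the partial results.

```python
from concurrent.futures import ProcessPoolExecutor

def analyze_chunk(chunk):
    """Hypothetical per-chunk user analysis: here, just a sum of squares."""
    return sum(x * x for x in chunk)

def master(data, n_workers=4, chunk_size=250):
    """Split data into tasks, farm them out to workers, merge the results."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partial = pool.map(analyze_chunk, chunks)
    return sum(partial)  # the merge step
```

In a real framework the pluggable services the abstract lists (code distribution, load balancing, authentication) would sit between `master` and the workers; here the process pool stands in for all of them.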

  6. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations
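Long-running Monte Carlo calculations parallelize naturally because workers draw independent random samples that are merged only at the end. A toy illustration of the pattern (estimating π, not reactor physics; all names invented), with each worker's share simply looped over in place of real remote execution:

```python
import random

def worker(seed, n_samples):
    """One worker's share: count random points landing inside the unit circle."""
    rng = random.Random(seed)  # independent, reproducible stream per worker
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(n_workers=4, n_per_worker=100_000):
    # In a real distributed run, each worker() call would execute on a
    # different machine; only the per-worker hit counts travel back.
    total_hits = sum(worker(seed, n_per_worker) for seed in range(n_workers))
    return 4.0 * total_hits / (n_workers * n_per_worker)
```

Because the workers never communicate during sampling, the speedup is close to linear in the number of machines, which is why this workload suits workstation clusters so well.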

  7. A spatial pattern analysis of the halophytic species distribution in an arid coastal environment.

    Science.gov (United States)

    Badreldin, Nasem; Uria-Diez, J; Mateu, J; Youssef, Ali; Stal, Cornelis; El-Bana, Magdy; Magdy, Ahmed; Goossens, Rudi

    2015-05-01

    Obtaining information about the spatial distribution of desert plants is considered a serious challenge for ecologists and environmental modeling, due to the intensive field work and infrastructure required in harsh and remote arid environments. A new method was applied for assessing the spatial distribution of the halophytic species (HS) in an arid coastal environment. This method was based on object-based image analysis of a high-resolution Google Earth satellite image. The integration of image processing techniques and field work provided accurate information about the spatial distribution of HS. The extracted objects were based on assumptions that explained the plant-pixel relationship. Three different types of digital image processing techniques were implemented and validated to obtain an accurate HS spatial distribution. A total of 2703 individuals of the HS community were found in the case study, and approximately 82% were located above an elevation of 2 m. The micro-topography exhibited a significant negative relationship with pH and EC (r = -0.79 and -0.81, respectively, p < 0.001). The spatial structure was modeled using stochastic point processes, in particular a hybrid family of Gibbs processes. A new model is proposed that uses a hard-core structure at very short distances, together with a cluster structure at short-to-medium distances and a Poisson structure at larger distances. This model was found to fit the data perfectly well.
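As a minimal complement to the point-process modelling described above, the Clark-Evans nearest-neighbour ratio gives a quick first diagnostic of clustering versus regularity. This is only a classical summary statistic, not the Gibbs-model fitting used in the paper, and it ignores edge corrections:

```python
import numpy as np

def clark_evans_ratio(points, area):
    """Clark-Evans ratio: observed mean nearest-neighbour distance divided by
    its expectation under complete spatial randomness, 0.5 / sqrt(density).
    R < 1 suggests clustering, R > 1 regularity; edge effects are ignored."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-distances
    mean_nn = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)
    return mean_nn / expected
```

A hard-core pattern (regular spacing, like the short-distance structure in the paper's hybrid model) pushes the ratio above 1, while clustering pulls it below 1.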

  8. Distributed Scheduling in Time Dependent Environments: Algorithms and Analysis

    OpenAIRE

    Shmuel, Ori; Cohen, Asaf; Gurewitz, Omer

    2017-01-01

    Consider the problem of a multiple access channel in a time dependent environment with a large number of users. In such a system, mostly due to practical constraints (e.g., decoding complexity), not all users can be scheduled together, and usually only one user may transmit at any given time. Assuming a distributed, opportunistic scheduling algorithm, we analyse the system's properties, such as delay, QoS and capacity scaling laws. Specifically, we start with analyzing the performance while ...

  9. Wigner Ville Distribution in Signal Processing, using Scilab Environment

    Directory of Open Access Journals (Sweden)

    Petru Chioncel

    2011-01-01

    Full Text Available The Wigner Ville distribution offers a visual display of quantitative information about the way a signal's energy is distributed in both time and frequency. Through that, this distribution embodies the fundamental concepts of Fourier and time-domain analysis. The energy of the signal is distributed so that specific frequencies are localized in time by the group delay time, and at specific instants in time the frequency is given by the instantaneous frequency. The net positive volume of the Wigner distribution is numerically equal to the signal's total energy. The paper shows the application of the Wigner Ville distribution in the field of signal processing, using the Scilab environment.
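The energy property mentioned in the abstract is easy to check numerically. Below is a plain-NumPy sketch of the discrete Wigner-Ville distribution (illustrative only, not the Scilab code from the paper); with this convention, frequency bin k corresponds to k/(2N) cycles per sample, and the distribution sums to N times the signal energy.

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of a (complex, e.g. analytic) signal.
    Returns an (N, N) array: rows are time instants, columns frequency bins."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        # largest lag keeping both n+tau and n-tau inside the signal
        taumax = min(n, N - 1 - n)
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])
        W[n, :] = np.fft.fft(kernel).real
    return W
```

For a pure tone at normalized frequency 0.25, the distribution concentrates at bin k = 2·0.25·N, and W.sum()/N recovers the signal's total energy, matching the "net positive volume" statement above.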

  10. Sensors in Distributed Mixed Reality Environments

    Directory of Open Access Journals (Sweden)

    Felix Hamza-Lup

    2005-04-01

    Full Text Available A distributed mixed-reality (MR) or virtual-reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With the advances in sensors and computer networks we have seen an increase in the number of potential MR/VR applications that require large amounts of information from the real world collected through sensors (e.g., position and orientation tracking sensors). These sensors collect data from the real environment in real time at different locations, and a distributed environment connecting them must assure data distribution among collaborative sites at interactive speeds. With the advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes. This paper proposes a new architecture for sensor-based interactive distributed MR/VR environments that falls in between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants who manipulate the sensors attached to these nodes.

  11. Learning Networks Distributed Environment

    NARCIS (Netherlands)

    Martens, Harrie; Vogten, Hubert; Koper, Rob; Tattersall, Colin; Van Rosmalen, Peter; Sloep, Peter; Van Bruggen, Jan; Spoelstra, Howard

    2005-01-01

    Learning Networks Distributed Environment is a prototype of an architecture that allows the sharing and modification of learning materials through a number of transport protocols. The prototype implements a p2p protocol using JXTA.

  12. Distributed Research Center for Analysis of Regional Climatic Changes and Their Impacts on Environment

    Science.gov (United States)

    Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.

    2016-12-01

    Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world.
Additionally, exporting of data

  13. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

    Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt to the ATLAS distributed computing and data environment, especially its data management system RUCIO, to help improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used with the HTTP protocol. We are optimizing Xcache for HPC environments, and adapting the HL-LHC Data Lakes design as its component for data delivery. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize the deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,

  14. Authorization Administration in a Distributed Multi-application Environment

    Institute of Scientific and Technical Information of China (English)

    DUAN Sujuan; HONG Fan; LI Xinhua

    2004-01-01

    To meet the authorization administration requirements in a distributed computer network environment, this paper extends the role-based access control model with multiple application dimensions and establishes a new access control model, ED-RBAC (Extended Role-Based Access Control), for the distributed environment. We propose an extendable hierarchical authorization assignment framework and design effective role-registering, role-applying and role-assigning protocols with symmetric and asymmetric cryptographic systems. The model can be used to simplify authorization administration in a distributed environment with multiple applications.
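The core idea of adding an application dimension to RBAC can be illustrated with a toy access check in which roles and permissions are both scoped by application. This is a hedged sketch of the general concept only, not the ED-RBAC protocols themselves (which also cover cryptographic role registration and assignment); all names are invented.

```python
class AccessControl:
    """Toy RBAC with an application dimension: a role granted in one
    application confers no permissions in another."""

    def __init__(self):
        self.user_roles = {}   # (user, app) -> set of roles
        self.role_perms = {}   # (app, role) -> set of permissions

    def assign_role(self, user, app, role):
        self.user_roles.setdefault((user, app), set()).add(role)

    def grant(self, app, role, perm):
        self.role_perms.setdefault((app, role), set()).add(perm)

    def check(self, user, app, perm):
        roles = self.user_roles.get((user, app), set())
        return any(perm in self.role_perms.get((app, r), set()) for r in roles)
```

Scoping both mappings by application is what keeps authorization administration local to each application while still allowing one framework to manage them all.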

  15. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which allows users to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  16. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which allows users to exploit the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  17. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    Science.gov (United States)

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid. It is, thus, beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information theory methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and the eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, the MDI-QKD protocol is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results is 16% when eavesdropping exists. Moreover, the probability that the relay outputs successful results is higher when eavesdropping exists than when it does not. The paper validates that the MDI-QKD protocol is robust.

  18. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  19. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  20. Dynamic shared state maintenance in distributed virtual environments

    Science.gov (United States)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. 
In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for
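The quaternion-based assessment mentioned above can be illustrated with a standard orientation-distance metric: the angle between two unit quaternions, usable for comparing the tracked orientation of a shared virtual entity at two remote sites. This is the textbook metric, not the dissertation's own algorithm.

```python
import numpy as np

def quaternion_angle(q1, q2):
    """Angular distance (radians) between two rotations given as quaternions
    (w, x, y, z). The abs() accounts for q and -q encoding the same rotation."""
    q1 = np.asarray(q1, dtype=float)
    q2 = np.asarray(q2, dtype=float)
    q1 = q1 / np.linalg.norm(q1)
    q2 = q2 / np.linalg.norm(q2)
    dot = abs(float(np.dot(q1, q2)))
    return 2.0 * np.arccos(min(dot, 1.0))  # clamp guards against rounding
```

Tracking this angle between the local and remote copies of an entity over time gives an objective, latency-sensitive measure of how consistent the dynamic shared state stays.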

  1. The comprehensive intermodal planning system in distributed environment

    Directory of Open Access Journals (Sweden)

    Olgierd Dziamski

    2013-06-01

    Full Text Available Background: The distribution of goods in containers has opened new opportunities in the global supply chain network. Standardization of transportation units simplifies transport logistics operations and enables the integration of different types of transport. Currently, container transport is the most popular means of transport by sea, but recent studies show an increase in its popularity in land transport. The paper presents the concept of a comprehensive intermodal planning system in a distributed environment which enables more efficient use of containers in global transport. The aim of this paper was to formulate and develop a new concept of an Internet Intermodal Planning System in a distributed environment, supporting the common interests of consignors and logistic service providers by integrating transportation modes, routing, logistics operations and scheduling in order to create intermodal transport plans. Methods: The paper presents and discusses a new approach to the management of logistics resources used in international intermodal transport. A detailed analysis of the proposed classification of various transportation means is carried out, along with their specific properties, which may affect the time and cost of transportation. A modular web-based distribution planning system supporting container transportation planning is presented. The main planning process is analysed, and the effectiveness of the new web-based container transport planning system is discussed. Results and conclusions: This paper presents a new concept of a distributed container transport planning system integrating the logistics service providers available on the market with freight exchange. New transport categories specific to container planning have been identified, which simplify and facilitate the management of the intermodal planning process. The paper presents a modular structure of Integrated

  2. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is done to determine the system response time.
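For intuition about how response time grows with load, a far simpler model than Queueing Petri Nets already shows the qualitative behaviour: the M/M/1 queue, whose mean response time is R = 1/(μ − λ). This toy formula is only a stand-in for the QPN analysis in the paper.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: R = 1 / (mu - lambda).
    Rates are in requests per second; the queue is stable only for lambda < mu."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable system: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Example: a server handling 100 req/s under a 50 req/s load responds in
# 0.02 s on average; at 90 req/s the mean response time climbs to 0.1 s.
```

The steep blow-up as λ approaches μ is exactly why load-test measurements at several external load levels, as in the paper, are needed to validate any response-time model.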

  3. Community problem-solving framed as a distributed information use environment: bridging research and practice

    Directory of Open Access Journals (Sweden)

    Joan C. Durrance

    2006-01-01

    Full Text Available Introduction. This article results from a qualitative study of (1) information behaviour in community problem-solving framed as a distributed information use environment and (2) approaches used by a best-practice library to anticipate information needs associated with community problem-solving. Method. Several approaches to data collection were used: focus groups, interviews, observation of community and library meetings, and analysis of supporting documents. We focused first on the information behaviour of community groups. Finding that the library supported these activities, we sought to understand its approach. Analysis. Data were coded thematically for both information behaviour concepts and themes germane to problem-solving activity. A grounded theory approach was taken to capture aspects of the library staff's practice. Themes evolved from the data; supporting documentation (reports, articles and library communication) was also coded. Results. The study showed (1) how information use environment components (people, setting, problems, problem resolutions) combine in this distributed information use environment to determine specific information needs and uses; and (2) how the library contributed to the viability of this distributed information use environment. Conclusion. Community problem-solving, here explicated as a distributed IUE, is likely to be seen in multiple communities. The library model presented demonstrates that by reshaping its information practice within the framework of an information use environment, a library can anticipate community information needs as they are generated and where they are most relevant.

  4. PIXE analysis of work environment aerosols

    International Nuclear Information System (INIS)

    Akabayashi, Hideo; Fujimoto, Fuminori; Komaki, Kenichiro; Ootuka, Akio; Kobayashi, Koichi; Yamashita, Hiroshi

    1988-01-01

    In the labor environment, the air contains greater quantities and more diverse kinds of chemical substances than in the general home environment. It has been well known that some substances contained in the aerosol in the labor environment (floating dust in the atmosphere), such as asbestos and hexavalent chromium, have the possibility of causing serious injuries such as cancer of the respiratory organs. In order to identify the harmful substances to which laborers are exposed and to take measures to remove them, it is necessary to investigate in detail the many factors related to the effect of aerosols on human bodies, such as the composition of elements, chemical condition, concentration, the particle size of dust, and temporal and spatial distributions. For this purpose, sampling and analysis must be carried out so that as much information as possible can be extracted from a minute amount of sample. Particle induced X-ray emission (PIXE) analysis is very effective for this application. In this paper, the development of a PIXE analysis system and the knowledge obtained by the sampling and measurement of aerosol in an indoor labor environment are reported. The labor environment selected is that of the workshop of the Department of Liberal Arts, University of Tokyo. Sampling, the experimental apparatus, the method of data analysis and the results of analysis are described. (Kako, I.)

  5. Problem solving environment for distributed interactive applications

    NARCIS (Netherlands)

    Rycerz, K.; Bubak, M.; Sloot, P.; Getov, V.; Gorlatch, S.; Bubak, M.; Priol, T.

    2008-01-01

    Interactive Problem Solving Environments (PSEs) offer an integrated approach for constructing and running complex systems, such as distributed simulation systems. To achieve efficient execution of High Level Architecture (HLA)-based distributed interactive simulations on the Grid, we introduce a PSE

  6. Plutonium in the ocean environment. Its distributions and behavior

    International Nuclear Information System (INIS)

    Hirose, Katsumi

    2009-01-01

    Marine environments have been extensively contaminated by plutonium as a result of global fallout from atmospheric nuclear-weapons testing. Knowledge of the levels and behavior of plutonium in marine environments is necessary to assess its radiological and ecological effects. Analytical techniques such as radiochemical analysis, α-spectrometry, and mass spectrometry have been developed over the past five decades to analyze plutonium in seawater. Because of its complex chemical properties (e.g., high reactivity to particles), plutonium in the ocean exhibits more complicated behavior than other long-lived anthropogenic radionuclides, such as 137Cs. The present study reviews the research history of plutonium in the ocean, including spatial and temporal changes in plutonium levels and distributions, and its oceanographic behavior. (author)

  7. Depositional environments and porosity distribution in regressive limestone reservoirs of the Mishrif Formation, Southern Iraq

    International Nuclear Information System (INIS)

    AlDabbas, Moutaz; AlJassim Jassim; AlJumaily Saad

    2010-01-01

    Eight subsurface sections and a large number of thin sections of the Mishrif Limestone were studied to unravel the depositional facies and environments. The allochems in the Mishrif Formation are dominated by bioclasts, whereas peloids, ooids, and intraclasts are less abundant. The sedimentary microfacies of the Mishrif Formation include mudstone, wackestone, packstone, grainstone, floatstone, and rudstone, deposited in basinal, outer shelf, and slope settings followed by shoal reef and lagoonal environments. The formation displays various extents of dolomitization and is cemented by calcite and dolomite. It has a gradational contact with the underlying Rumaila Formation but is unconformably overlain by the Khasib Formation. The unconformity is recognized because the skeletal grains are dominated by Charophyta (algae), which denotes a change from a fully marine to a lacustrine environment. Thus, vertical bioclast analysis indicates that the Mishrif Formation is characterized by two regressive cycles, which control the distribution of reservoir quality as well as the patterns of calcite and dolomite cement distribution. The Mishrif Formation gradationally overlies the Rumaila Formation, as indicated by the presence of the green parts of Charophyta (algae) as the main skeletal grains at the uppermost part of well Zb-47, which points to a lacustrine or fresh-water environment. Petrographical study shows that fossils, peloids, ooids, and intraclasts represent the main allochems. Calcite and dolomite (as diagenetic products) are the predominant mineral components of the Mishrif Formation. Fossils were studied as indicators of environmental age and facies boundaries, plotted on a chart using personal computer programs according to the first appearance of species. Fifteen principal sedimentary microfacies have been identified in the Mishrif Formation, including lime mudstone, mudstone-wackestone, wackestone

  8. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  9. Distribution and characteristics of gamma and cosmic ray dose rate in living environment

    International Nuclear Information System (INIS)

    Nagaoka, Toshi; Moriuchi, Shigeru

    1991-01-01

    A series of environmental radiation surveys was carried out to characterize the natural radiation dose rate distribution in living environments, both natural and artificial. Through analysis of data obtained at a number of places, several aspects of the radiation field in living environments were clarified. That is, the gamma-ray dose rate varies due to three dominant causes: 1) the radionuclide concentration of surrounding materials acting as gamma-ray sources, 2) the spatial distribution of surrounding materials, and 3) the geometrical and shielding conditions between the natural gamma-ray sources and the measurement point; the cosmic-ray dose rate, in contrast, varies with the thickness of overlying shielding materials. It was also suggested that in artificial environments the gamma-ray dose rate generally shows an upward tendency and the cosmic-ray dose rate a downward one. This kind of knowledge is expected to serve as fundamental information for accurate and realistic evaluation of the collective dose in the living environment. (author)

  10. A cooperative model for IS security risk management in distributed environment.

    Science.gov (United States)

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively.
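    
    The abstract does not give the model's equations, but its core idea - predicting a risk level from a Bayesian network - can be sketched with a toy two-node network. All probabilities below are invented for illustration; a real deployment would elicit or learn them per organization.

    ```python
    # Toy Bayesian network: Threat and Vulnerability are independent parents
    # of Incident. Values are illustrative, not from the paper.

    def incident_probability(p_threat, p_vuln, cpt):
        """Marginal P(incident), computed by enumerating the parent states.

        cpt maps (threat, vuln) booleans to P(incident | threat, vuln).
        """
        total = 0.0
        for threat in (True, False):
            for vuln in (True, False):
                prior = ((p_threat if threat else 1.0 - p_threat) *
                         (p_vuln if vuln else 1.0 - p_vuln))
                total += prior * cpt[(threat, vuln)]
        return total

    # Assumed conditional probability table for the Incident node.
    cpt = {(True, True): 0.9, (True, False): 0.3,
           (False, True): 0.2, (False, False): 0.01}

    risk = incident_probability(p_threat=0.4, p_vuln=0.5, cpt=cpt)
    print(round(risk, 3))  # 0.303 for these illustrative inputs
    ```

    Brute-force enumeration is adequate for small security networks; larger ones would use a dedicated BN library.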

  11. CyberWalk : a web-based distributed virtual walkthrough environment.

    OpenAIRE

    Chim, J.; Lau, R. W. H.; Leong, H. V.; Si, A.

    2003-01-01

    A distributed virtual walkthrough environment allows users connected to the geometry server to walk through a specific place of interest, without having to travel physically. This place of interest may be a virtual museum, virtual library or virtual university. There are two basic approaches to distribute the virtual environment from the geometry server to the clients, complete replication and on-demand transmission. Although the on-demand transmission approach saves waiting time and optimize...

  12. Texture mapping in a distributed environment

    NARCIS (Netherlands)

    Nicolae, Goga; Racovita, Zoea; Telea, Alexandru

    2003-01-01

    This paper presents a tool for texture mapping in a distributed environment. A parallelization method based on the master-slave model is described. The purpose of this work is to reduce image generation time when synthesizing complex 3D scenes. The experimental results concerning the
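    
    The master-slave split described here can be sketched as follows; the per-pixel shading function and band size are placeholders, not the paper's renderer.

    ```python
    # Minimal master-slave sketch: the master partitions the image into rows
    # and a pool of worker processes shades each row independently.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 64, 64

    def shade_row(y):
        # Placeholder per-pixel work: a procedural checkerboard "texture".
        return [(x // 8 + y // 8) % 2 for x in range(WIDTH)]

    def render(parallel=True, workers=4):
        if parallel:
            with Pool(workers) as pool:       # master hands rows to workers
                return pool.map(shade_row, range(HEIGHT))
        return [shade_row(y) for y in range(HEIGHT)]  # sequential baseline

    if __name__ == "__main__":
        image = render()
        print(len(image), len(image[0]))      # 64 64
    ```

    Row-level partitioning keeps workers independent, so speedup is limited mainly by the master's gather step and network transfer of the finished bands.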

  13. Distributed optimization-based control of multi-agent networks in complex environments

    CERN Document Server

    Zhu, Minghui

    2015-01-01

    This book offers a concise and in-depth exposition of specific algorithmic solutions for distributed optimization based control of multi-agent networks and their performance analysis. It synthesizes and analyzes distributed strategies for three collaborative tasks: distributed cooperative optimization, mobile sensor deployment and multi-vehicle formation control. The book integrates miscellaneous ideas and tools from dynamic systems, control theory, graph theory, optimization, game theory and Markov chains to address the particular challenges introduced by such complexities in the environment as topological dynamics, environmental uncertainties, and potential cyber-attack by human adversaries. The book is written for first- or second-year graduate students in a variety of engineering disciplines, including control, robotics, decision-making, optimization and algorithms and with backgrounds in aerospace engineering, computer science, electrical engineering, mechanical engineering and operations research. Resea...

  14. Internet-centric collaborative design in a distributed environment

    International Nuclear Information System (INIS)

    Kim, Hyun; Kim, Hyoung Sun; Do, Nam Chul; Lee, Jae Yeol; Lee, Joo Haeng; Myong, Jae Hyong

    2001-01-01

    Recently, advanced information technologies, including internet-related and distributed object technologies, have opened new possibilities for collaborative design. In this paper, we discuss computer support for collaborative design in a distributed environment. The proposed system is an internet-centric system composed of an engineering framework, a collaborative virtual workspace and engineering services. It allows distributed designers to work on their engineering tasks more efficiently and collaboratively throughout the design process

  15. Exploring distributed user interfaces in ambient intelligent environments

    NARCIS (Netherlands)

    Dadlani Mahtani, P.M.; Peregrin Emparanza, J.; Markopoulos, P.; Gallud, J.A.; Tesoriero, R.; Penichet, V.M.R.

    2011-01-01

    In this paper we explore the use of Distributed User Interfaces (DUIs) in the field of Ambient Intelligence (AmI). We first introduce the emerging area of AmI, followed by describing three case studies where user interfaces or ambient displays are distributed and blending in the user’s environments.

  16. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Directory of Open Access Journals (Sweden)

    Xiaohuan Yang

    2009-12-01

    Full Text Available The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; the best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and a richer economy. The worst natural environment areas, the Northwest and the Tibetan Plateau, are also regions with lower population density and poorer economies.
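    
    The index construction - AHP-style weights applied to normalized environmental layers on a grid - can be sketched in a few lines. The layer values and weights below are invented; the actual model operates on 1 km by 1 km national rasters in ArcGIS.

    ```python
    # Sketch of an AHP-weighted suitability index over grid cells.
    # Layers are normalized to [0, 1]; weights sum to 1 (values assumed).

    def suitability(layers, weights):
        """Per-cell weighted sum of the input layers (2-D lists of equal shape)."""
        rows, cols = len(layers[0]), len(layers[0][0])
        return [[sum(w * layer[r][c] for w, layer in zip(weights, layers))
                 for c in range(cols)] for r in range(rows)]

    # Toy 2 x 2 "rasters" standing in for climate, hydrology and terrain layers.
    climate   = [[0.8, 0.2], [0.6, 0.4]]
    hydrology = [[0.9, 0.1], [0.5, 0.5]]
    terrain   = [[0.7, 0.3], [0.4, 0.6]]

    nesi = suitability([climate, hydrology, terrain], weights=[0.4, 0.35, 0.25])
    print(round(nesi[0][0], 2))  # 0.81: the most suitable cell in this toy grid
    ```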

  17. Natural Environment Suitability of China and Its Relationship with Population Distributions

    Science.gov (United States)

    Yang, Xiaohuan; Ma, Hanqing

    2009-01-01

    The natural environment factor is one of the main indexes for evaluating human habitats, sustained economic growth and ecological health status. Based on Geographic Information System (GIS) technology and an analytic hierarchy process method, this article presents the construction of the Natural Environment Suitability Index (NESI) model of China by using natural environment data including climate, hydrology, surface configuration and ecological conditions. The NESI value is calculated in grids of 1 km by 1 km through ArcGIS. The spatial regularity of NESI is analyzed according to its spatial distribution and proportional structure. The relationship of NESI with population distribution and economic growth is also discussed by analyzing NESI results with population distribution data and GDP data in 1 km by 1 km grids. The study shows that: (1) the value of NESI is higher in the East and lower in the West in China; The best natural environment area is the Yangtze River Delta region and the worst are the northwest of Tibet and southwest of Xinjiang. (2) There is a close correlation among natural environment, population distribution and economic growth; the best natural environment area, the Yangtze River Delta region, is also the region with higher population density and richer economy. The worst natural environment areas, Northwest and Tibetan Plateau, are also regions with lower population density and poorer economies. PMID:20049243

  18. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

    Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect distributed computing environments from certain types of malicious impacts.

  19. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  20. Prototype for a generic thin-client remote analysis environment for CMS

    International Nuclear Information System (INIS)

    Steenberg, C.D.; Bunn, J.J.; Hickey, T.M.; Holtman, K.; Legrand, I.; Litvin, V.; Newman, H.B.; Samar, A.; Singh, S.; Wilkinson, R.

    2001-01-01

    The multi-tiered architecture of the highly distributed CMS computing systems necessitates a flexible data distribution and analysis environment. The authors describe a prototype analysis environment which functions efficiently over wide area networks, using a server installed at the Caltech/UCSD Tier 2 prototype to analyze CMS data stored at various locations with a thin client. The analysis environment is based on existing HEP (Anaphe) and CMS (CARF, ORCA, IGUANA) software technology on the server, accessed from a variety of clients. A Java Analysis Studio (JAS, from SLAC) plug-in is being developed as a reference client. The server is operated as a 'black box' on the proto-Tier2 system. ORCA Objectivity databases (e.g. an existing large CMS muon sample) are hosted on the master and slave nodes, and remote clients can request processing of queries across the server nodes and get the histogram results returned and rendered in the client. The server is implemented in pure C++ and uses XML-RPC as a language-neutral transport. This has several benefits, including much better scalability, better integration with CARF-ORCA, and, importantly, it makes the work directly useful to other non-Java general-purpose analysis and presentation tools such as Hippodraw, Lizard, or ROOT
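    
    The language-neutral XML-RPC transport the prototype relies on is easy to demonstrate with Python's standard library; the method name and histogram payload below are placeholders, not the actual CMS server API.

    ```python
    # Sketch of a thin-client / black-box-server exchange over XML-RPC.
    # run_histogram_query stands in for server-side query processing.
    import threading
    from xmlrpc.client import ServerProxy
    from xmlrpc.server import SimpleXMLRPCServer

    def run_histogram_query(cut):
        """Return binned counts for values above `cut` (toy data)."""
        data = [1, 2, 2, 3, 3, 3, 4]
        return {str(b): sum(1 for v in data if v == b and v > cut)
                for b in set(data)}

    # "Black box" server on an ephemeral port, serving in the background.
    server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
    server.register_function(run_histogram_query)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    # Thin client: the remote call looks like a local one.
    client = ServerProxy(f"http://localhost:{port}")
    print(client.run_histogram_query(1)["3"])  # 3 entries in bin "3" pass the cut
    ```

    Because XML-RPC serializes to plain XML over HTTP, the same server could equally be queried from C++, Java or any other client, which is the point the paper makes about language neutrality.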

  1. Metagenomes Reveal Global Distribution of Bacterial Steroid Catabolism in Natural, Engineered, and Host Environments

    Directory of Open Access Journals (Sweden)

    Johannes Holert

    2018-01-01

    Full Text Available Steroids are abundant growth substrates for bacteria in natural, engineered, and host-associated environments. This study analyzed the distribution of the aerobic 9,10-seco steroid degradation pathway in 346 publicly available metagenomes from diverse environments. Our results show that steroid-degrading bacteria are globally distributed and prevalent in particular environments, such as wastewater treatment plants, soil, plant rhizospheres, and the marine environment, including marine sponges. Genomic signature-based sequence binning recovered 45 metagenome-assembled genomes containing a majority of 9,10-seco pathway genes. Only Actinobacteria and Proteobacteria were identified as steroid degraders, but we identified several alpha- and gammaproteobacterial lineages not previously known to degrade steroids. Actino- and proteobacterial steroid degraders coexisted in wastewater, while soil and rhizosphere samples contained mostly actinobacterial ones. Actinobacterial steroid degraders were found in deep ocean samples, while mostly alpha- and gammaproteobacterial ones were found in other marine samples, including sponges. Isolation of steroid-degrading bacteria from sponges confirmed their presence. Phylogenetic analysis of key steroid degradation proteins suggested their biochemical novelty in genomes from sponges and other environments. This study shows that the ecological significance as well as taxonomic and biochemical diversity of bacterial steroid degradation has so far been largely underestimated, especially in the marine environment.

  2. An Intelligent System for Document Retrieval in Distributed Office Environments.

    Science.gov (United States)

    Mukhopadhyay, Uttam; And Others

    1986-01-01

    MINDS (Multiple Intelligent Node Document Servers) is a distributed system of knowledge-based query engines for efficiently retrieving multimedia documents in an office environment of distributed workstations. By learning document distribution patterns and user interests and preferences during system usage, it customizes document retrievals for…

  3. Task distribution mechanism for effective collaboration in virtual environments

    International Nuclear Information System (INIS)

    Khalid, S.; Ullah, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) are computer-generated worlds where two or more users can simultaneously interact with synthetic objects to perform a task. User performance is one of the main issues, affected by loose coordination, low awareness, or poor communication among collaborating users. In this paper, a new model for task distribution is proposed, in which a task distribution strategy among multiple users in CVEs is defined. The model assigns tasks to collaborating users in CVEs on either a static or a dynamic basis. In static distribution there is loose dependency and less communication is required during task realization, whereas in dynamic distribution users are more dependent on each other and thus require more communication. In order to study the effect of static and dynamic task distribution strategies on user performance in CVEs, a collaborative virtual environment was developed in which twenty-four (24) teams (each consisting of two users) performed a task in collaboration under both strategies (static and dynamic). Results reveal that static distribution is more effective and increases user performance in CVEs. The outcome of this work will help the development of effective CVEs in the fields of virtual assembly, repair, education and entertainment. (author)
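    
    The contrast between the two strategies can be sketched as follows; the task counts and the one-message-per-pull cost model are illustrative assumptions, not the paper's experimental setup. Static distribution fixes each user's share up front, while dynamic distribution hands out subtasks as users become free.

    ```python
    # Static vs. dynamic task distribution between collaborating users.
    from collections import deque

    def static_split(subtasks, users):
        """Fixed, contiguous shares decided before work starts (no runtime messages)."""
        share = len(subtasks) // users
        return [subtasks[i * share:(i + 1) * share] for i in range(users)]

    def dynamic_assign(subtasks, users):
        """Users pull one subtask at a time; each pull counts as a coordination message."""
        queue = deque(subtasks)
        done = [[] for _ in range(users)]
        messages = 0
        while queue:
            for u in range(users):
                if queue:
                    done[u].append(queue.popleft())
                    messages += 1
        return done, messages

    tasks = list(range(8))
    print(static_split(tasks, 2))       # [[0, 1, 2, 3], [4, 5, 6, 7]]
    assignments, msgs = dynamic_assign(tasks, 2)
    print(msgs)                         # 8: one coordination message per subtask
    ```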

  4. Students’ perception of the learning environment in a distributed medical programme

    Directory of Open Access Journals (Sweden)

    Kiran Veerapen

    2010-09-01

    Full Text Available Background : The learning environment of a medical school has a significant impact on students’ achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. Purpose : To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. Method : The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008 of the programme. The domains of the learning environment surveyed were: students’ perceptions of learning, students’ perceptions of teachers, students’ academic self-perceptions, students’ perceptions of the atmosphere, and students’ social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. Results : The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008 of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Conclusions : Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing

  5. Students' perception of the learning environment in a distributed medical programme.

    Science.gov (United States)

    Veerapen, Kiran; McAleer, Sean

    2010-09-24

    The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing comparative evaluation of the learning environment at the distributed sites and

  6. Cooperative Control of Distributed Autonomous Vehicles in Adversarial Environments

    Science.gov (United States)

    2006-08-14

    COOPERATIVE CONTROL OF DISTRIBUTED AUTONOMOUS VEHICLES IN ADVERSARIAL ENVIRONMENTS. Final Report, Grant #F49620-01-1-0361, Jeff Shamma, Department of... single dominant language or a distribution of languages. A relation to multivehicle systems is understanding how highly autonomous vehicles on extended

  7. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study

  8. Data analysis in an object request broker environment

    International Nuclear Information System (INIS)

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  9. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to reveal the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger families.

  10. A Study of Land Surface Temperature Retrieval and Thermal Environment Distribution Based on Landsat-8 in Jinan City

    Science.gov (United States)

    Dong, Fang; Chen, Jian; Yang, Fan

    2018-01-01

    Based on medium-resolution Landsat 8 OLI/TIRS imagery, the temperature distribution across the four seasons in the urban area of Jinan City was obtained using an atmospheric correction method for the retrieval of land surface temperature. The spatio-temporal distribution characteristics and development trend of the urban thermal environment, its seasonal variation, and the relationship between land surface temperature and the normalized difference vegetation index (NDVI) were analyzed quantitatively. The results show that high-temperature areas are concentrated in Jinan with a tendency to expand from east to west, and reveal a negative correlation between the land surface temperature distribution and NDVI. These results provide theoretical references and a scientific basis for improving the ecological environment of Jinan City, strengthening scientific planning and making overall plans to address climate change.
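    
    The reported negative LST-NDVI relationship amounts to a correlation computed over grid cells, which can be checked with a plain Pearson coefficient; the sample values below are invented stand-ins, not Jinan retrievals.

    ```python
    # Pearson correlation between land surface temperature and NDVI samples.

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    lst  = [38.0, 35.5, 31.0, 28.5, 26.0]   # surface temperature (deg C), assumed
    ndvi = [0.05, 0.15, 0.35, 0.55, 0.70]   # vegetation index, assumed

    print(round(pearson(lst, ndvi), 3))     # strongly negative: greener is cooler
    ```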

  11. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end-specific parameters from a single interface and to run the same custom user analysis, unchanged, in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  12. A Virtual Hosting Environment for Distributed Online Gaming

    Science.gov (United States)

    Brossard, David; Prieto Martinez, Juan Luis

    With enterprise boundaries becoming fuzzier, it has become clear that businesses need to share resources, expose services, and interact in many different ways. In order to achieve such distribution in a dynamic, flexible, and secure way, we have designed and implemented a virtual hosting environment (VHE) which aims at integrating business services across enterprise boundaries and at virtualising the ICT environment within which these services operate, in order to exploit economies of scale for the businesses as well as to achieve shorter concept-to-market time scales. To illustrate the relevance of the VHE, we have applied it to the online gaming world. Online gaming is an early adopter of distributed computing, and more than 30% of game developer companies, aware of this shift, are focusing on developing high-performance platforms for the new online trend.

  13. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware "in software". A complete infrastructure package has been developed for the easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM File System (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, the number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud computing on private and public clouds, for both legacy and contemporary experiments.

  14. Analysis of steady state temperature distribution in rod arrays exposed to stagnant gaseous environment

    International Nuclear Information System (INIS)

    Pal, G.; MaarKandeya, S.G.; Raj, V.V.

    1991-01-01

    This paper deals with the calculation of radiative heat exchange in a rod array exposed to a stagnant gaseous environment. A computer code has been developed for this purpose and used for predicting the steady-state temperature distribution in a nuclear fuel sub-assembly. Nuclear fuels continue to generate heat even after their removal from the core. During the transfer of nuclear fuel sub-assemblies from the core to the storage bay, they pass through a stagnant gaseous environment and may remain there for extended periods under abnormal conditions. Radiative heat exchange is the dominant mode of heat transfer within the sub-assembly, but axial heat conduction through the fuel pins also needs to be accounted for. The computer code RHEINA-3D (Radiative Heat Exchange In Nuclear Assemblies - 3D) has been developed based on a simplified numerical model which considers both of the above-mentioned phenomena. The analytical model and the results obtained are briefly discussed in this paper.
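The RHEINA-3D model itself is not reproduced here. As a much-simplified illustration of the kind of balance such a code solves, the sketch below treats a single pin dissipating decay heat by gray-body radiation to its surroundings plus a lumped axial-conduction term, solved by Newton iteration. All parameter values are invented for illustration:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def pin_temperature(q_decay, t_env, emissivity, area_rad, g_axial, t0=600.0):
    """Solve q = eps*sigma*A*(T^4 - Tenv^4) + G*(T - Tenv) for T by Newton iteration.

    q_decay : decay heat per pin [W]
    t_env   : enclosure/gas temperature [K]
    g_axial : lumped axial-conduction conductance [W/K]
    """
    t = t0
    for _ in range(100):
        f = (emissivity * SIGMA * area_rad * (t**4 - t_env**4)
             + g_axial * (t - t_env) - q_decay)
        df = 4.0 * emissivity * SIGMA * area_rad * t**3 + g_axial
        step = f / df
        t -= step
        if abs(step) < 1e-9:
            break
    return t

# Illustrative numbers (not from the paper): a 200 W pin in 400 K surroundings.
t_pin = pin_temperature(q_decay=200.0, t_env=400.0, emissivity=0.8,
                        area_rad=0.05, g_axial=0.1)
print(f"steady-state pin surface temperature: {t_pin:.0f} K")
```

A full sub-assembly model would couple many such balances through view factors between pins, which is what makes a dedicated 3D code necessary.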

  15. DEVELOPMENT OF A HETEROGENIC DISTRIBUTED ENVIRONMENT FOR SPATIAL DATA PROCESSING USING CLOUD TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    A. S. Garov

    2016-06-01

    We are developing a unified distributed communication environment for the processing of spatial data which integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities, and communication between researchers. Using this innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize research and the representation of results at a new technological level, which provides more possibilities for the immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing spatially distributed information, for which we suggest implementing a user interface as an advanced front-end, e.g., for a virtual globe system.

  16. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    Science.gov (United States)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for the processing of spatial data which integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities, and communication between researchers. Using this innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize research and the representation of results at a new technological level, which provides more possibilities for the immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing spatially distributed information, for which we suggest implementing a user interface as an advanced front-end, e.g., for a virtual globe system.

  17. SERVICES OF FULL-TEXT SEARCHING IN A DISTRIBUTED INFORMATION ENVIRONMENT (PROJECT HUMANITARIANA)

    Directory of Open Access Journals (Sweden)

    S. K. Lyapin

    2015-01-01

    Problem statement. We justify the applicability of full-text search services in both universal and specialized (in terms of resource base) digital libraries for the extraction and analysis of contextual knowledge in the humanities. The architecture and services of the virtual information and resource center for extracting knowledge from humanitarian texts created by the «Humanitariana» project are described. The resources and services are functionally integrated for full-text search in a distributed, decentralized environment, organized in an Internet/Intranet architecture under the control of the client (user) browser accessing a variety of independent servers. An algorithm for executing a distributed full-text query is described. Methods. A method combining frequency-ranked and paragraph-oriented full-text queries is used: the former for the preliminary analysis of a subject area or a body of works (explication of the "vertical" context, or macro context), the latter for the explication of the "horizontal" context, or micro context, within an author's paragraph. The results of the frequency-ranked queries are used to compile the paragraph-oriented queries. Results. The results of textual research are shown for the topics "The question of fact in Russian philosophy" and "The question of loneliness in Russian philosophy and culture". About 50 pieces of contextual knowledge on a total resource base of about 2,500 full-text resources have been explicated and briefly described for further expert investigation. Practical significance. The proposed technology (advanced full-text search services in a distributed information environment) can be used for the information support of humanitarian studies and education in the humanities, for the functional integration of resources and services of various organizations, and for carrying out interdisciplinary research.
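The two-stage query strategy can be sketched in miniature: a frequency-ranked pass surfaces terms co-occurring with the query (macro context), then a paragraph-oriented pass requires all terms within a single paragraph (micro context). The corpus and helper names below are invented for illustration:

```python
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-zA-Z]+", text.lower())

def frequency_ranked(corpus, query_terms, top=5):
    """Stage 1: rank terms co-occurring with the query ('vertical' macro context)."""
    counts = Counter()
    for doc in corpus:
        toks = tokenize(doc)
        if any(q in toks for q in query_terms):
            counts.update(t for t in toks if t not in query_terms)
    return [t for t, _ in counts.most_common(top)]

def paragraph_query(corpus, terms):
    """Stage 2: paragraphs containing all terms ('horizontal' micro context)."""
    hits = []
    for doc in corpus:
        for para in doc.split("\n\n"):
            toks = set(tokenize(para))
            if all(t in toks for t in terms):
                hits.append(para.strip())
    return hits

corpus = [
    "The question of fact troubled Russian philosophy.\n\n"
    "Loneliness, as a fact of inner life, shaped Russian culture.",
    "Philosophy of science treats fact and theory together.",
]
salient = frequency_ranked(corpus, ["fact"], top=3)       # terms for stage 2
paragraphs = paragraph_query(corpus, ["fact", "loneliness"])
print(salient, paragraphs)
```

A production system would add stop-word filtering and distribute each query across the independent servers; here the point is only the pipeline from ranked terms to paragraph-level hits.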

  18. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several PBytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the users' experience in their current analysis activities.

  19. Distributed temperature and distributed acoustic sensing for remote and harsh environments

    Science.gov (United States)

    Mondanos, Michael; Parker, Tom; Milne, Craig H.; Yeo, Jackson; Coleman, Thomas; Farhadiroushan, Mahmoud

    2015-05-01

    Advances in opto-electronics and associated signal processing have enabled the development of distributed acoustic and temperature sensors. Unlike systems relying on discrete optical sensors, a distributed system does not rely upon manufactured sensors but utilises passive custom optical fibre cables resistant to harsh environments, including high-temperature applications (600°C). The principle of distributed sensing is well known from the distributed temperature sensor (DTS), which uses the interaction of the source light with thermal vibrations (Raman scattering) to determine the temperature at all points along the fibre. Distributed Acoustic Sensing (DAS) uses a novel digital optical detection technique to precisely capture the true full acoustic field (amplitude, frequency and phase) over a wide dynamic range at every point simultaneously. A number of signal processing techniques have been developed to process a large array of acoustic signals and to quantify the coherent temporal and spatial characteristics of the acoustic waves. These systems were developed predominantly for the oil and gas industry, to assist reservoir engineers in optimising well lifetime. Nowadays they find a wide variety of applications as integrity-monitoring tools in process vessels, storage tanks and piping systems, offering the operator the means to schedule maintenance programs and maximize service life.
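A Raman DTS infers temperature from the ratio of anti-Stokes to Stokes backscatter. A minimal round-trip of the standard relation is sketched below, ignoring differential fibre attenuation; all wavelengths and the Raman shift are illustrative values, not taken from this paper:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def dts_temperature(i_stokes, i_antistokes, lam_s, lam_as, dnu):
    """Temperature [K] from the anti-Stokes/Stokes intensity ratio.

    Uses I_as/I_s = (lam_s/lam_as)**4 * exp(-h*c*dnu/(k_B*T)), the standard
    Raman DTS relation, with differential fibre attenuation neglected.
    """
    ratio = i_antistokes / i_stokes
    return (H * C * dnu / KB) / math.log((lam_s / lam_as) ** 4 / ratio)

# Round-trip check with illustrative values: Stokes/anti-Stokes wavelengths
# around a near-infrared pump, Raman shift ~440 cm^-1.
lam_s, lam_as = 1116e-9, 1016e-9
dnu = 4.4e4  # Raman shift in m^-1
t_true = 300.0
ratio = (lam_s / lam_as) ** 4 * math.exp(-H * C * dnu / (KB * t_true))
t_rec = dts_temperature(1.0, ratio, lam_s, lam_as, dnu)
print(f"recovered temperature: {t_rec:.1f} K")  # recovers 300.0 K
```

In a real instrument the ratio is measured per fibre segment from the backscatter time series, and a calibration section compensates the attenuation term dropped here.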

  20. SoS Notebook: An Interactive Multi-Language Data Analysis Environment.

    Science.gov (United States)

    Peng, Bo; Wang, Gao; Ma, Jun; Leong, Man Chong; Wakefield, Chris; Melott, James; Chiu, Yulun; Du, Di; Weinstein, John N

    2018-05-22

    Complex bioinformatic data analysis workflows involving multiple scripts in different languages can be difficult to consolidate, share, and reproduce. An environment that streamlines the entire process of data collection, analysis, visualization and reporting for such multi-language analyses is currently lacking. We developed Script of Scripts (SoS) Notebook, a web-based notebook environment that allows the use of multiple scripting languages in a single notebook, with data flowing freely within and across languages. SoS Notebook enables researchers to perform sophisticated bioinformatic analyses using the most suitable tools for different parts of the workflow, without the limitations of a particular language or the complications of cross-language communication. SoS Notebook is hosted at http://vatlab.github.io/SoS/ and is distributed under a BSD license. bpeng@mdanderson.org.

  1. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The number of smaller-scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere has grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units that will connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and the planning and decision-making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The framework is based on the best features of modern planning and decision-making methodologies and facilitates scenario-based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  2. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.

  3. Analyzing coastal environments by means of functional data analysis

    Science.gov (United States)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow-marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in the grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. An alternative approach, FCA, was then applied to identify clusters and to interpret their spatial distribution. The results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with Cluster Analysis (CA). The first method employed the probability density function (PDF), fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transform to the original data; PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, to be a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
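The clr-then-PCA pipeline described above can be sketched as follows, using synthetic particle-size distributions in place of the Gijón Bay samples; a simple median split on the first principal component stands in for a proper cluster analysis:

```python
import numpy as np

def clr(compositions):
    """Centered log-ratio transform for compositional data (rows sum to 1)."""
    logx = np.log(compositions)
    return logx - logx.mean(axis=1, keepdims=True)

def pca_scores(x, n_components=2):
    """Principal component scores via SVD of the centered data matrix."""
    xc = x - x.mean(axis=0)
    u, s, _ = np.linalg.svd(xc, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

# Synthetic PSDs over 8 size bins: two sediment groups with opposite modes.
rng = np.random.default_rng(1)
bins = 8
coarse = rng.dirichlet(np.linspace(8, 1, bins), size=20)
fine = rng.dirichlet(np.linspace(1, 8, bins), size=20)
psd = np.vstack([coarse, fine])

scores = pca_scores(clr(psd))

# One-dimensional split on PC1 as a stand-in for the clustering step.
labels = (scores[:, 0] > np.median(scores[:, 0])).astype(int)
print("group sizes:", np.bincount(labels))
```

The functional (FPCA/FCA) variant replaces the fixed-bin vectors with smooth frequency curves, but the overall reduce-then-cluster structure is the same.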

  4. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    International Nuclear Information System (INIS)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A.

    1999-01-01

    Trace element distributions in water, sediment, soil and cassava plants in the Muria peninsula environment were evaluated by the NAA method. A nuclear power plant (NPP) and a coal power plant (CPP) will be built on the Muria peninsula, making it an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The trace element results serve as baseline data for environmental monitoring before and after the NPP is built. The trace elements in the river water, sediment, dryland soil and cassava plant specimens were analyzed by the INAA method. It was found that the trace elements were not evenly distributed. The percentages of trace element distribution were 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  5. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    Energy Technology Data Exchange (ETDEWEB)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A. [Yogyakarta Nuclear Research Center, Yogyakarta (Indonesia)

    1999-10-01

    Trace element distributions in water, sediment, soil and cassava plants in the Muria peninsula environment were evaluated by the NAA method. A nuclear power plant (NPP) and a coal power plant (CPP) will be built on the Muria peninsula, making it an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The trace element results serve as baseline data for environmental monitoring before and after the NPP is built. The trace elements in the river water, sediment, dryland soil and cassava plant specimens were analyzed by the INAA method. It was found that the trace elements were not evenly distributed. The percentages of trace element distribution were 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  6. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    International Nuclear Information System (INIS)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-01-01

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures that allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings that were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. This win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.
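The wrapper idea, exposing a proprietary stand-alone system behind a published component interface, can be sketched with a minimal adapter. All class, method, and field names below are invented for illustration and are not from the system described:

```python
from abc import ABC, abstractmethod

class ClinicalRecordSink(ABC):
    """Published component interface any repository adapter must implement."""
    @abstractmethod
    def store_observation(self, patient_id: str, kind: str, value: float) -> None: ...

class LegacyTelemedicineUnit:
    """Stand-in for a proprietary stand-alone system with its own data format."""
    def export_reading(self):
        return {"pid": "p-001", "type": "bp_systolic", "mmHg": 128.0}

class RepositoryAdapter(ClinicalRecordSink):
    """Software wrapper: hides the repository behind the open interface."""
    def __init__(self):
        self.records = []
    def store_observation(self, patient_id, kind, value):
        self.records.append((patient_id, kind, value))

def forward(unit: LegacyTelemedicineUnit, sink: ClinicalRecordSink) -> None:
    """Glue code: translate the proprietary reading into the open interface."""
    r = unit.export_reading()
    sink.store_observation(r["pid"], r["type"], r["mmHg"])

sink = RepositoryAdapter()
forward(LegacyTelemedicineUnit(), sink)
print(sink.records)
```

The point of the pattern is that new back-ends (an EPR system, an audit log, an encrypted store) can implement `ClinicalRecordSink` without any change to the legacy unit.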

  7. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    The paper describes the practical implementation of a system for protecting distributed computing in a heterogeneous environment from malicious code in submitted assignments. The choice of technologies, the development of data structures, and a performance evaluation of the implemented security system are presented.

  8. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane Π to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in Π according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R² into homogeneous components. The Poincare summation process, which consists in building au

  9. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    Solid State Drives (SSDs) are a promising storage technology for High Energy Physics parallel analysis farms. Their combination of low random access time and relatively high read speed is very well suited to situations where multiple jobs concurrently access data located on the same drive. They also have lower energy consumption and higher vibration tolerance than Hard Disk Drives (HDDs), which makes them an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility (PROOF) is a distributed analysis system which makes it possible to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem's I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We discuss our experience with SSDs in a PROOF environment and compare the performance of HDDs with SSDs in I/O-intensive analysis scenarios. In particular, we discuss how PROOF system performance scales with the number of simultaneously running analysis jobs.

  10. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  11. Distribution analysis of segmented wave sea clutter in littoral environments

    CSIR Research Space (South Africa)

    Strempel, MD

    2015-10-01

    Segmented sections of the observed waves are then fitted against the K-distribution. It is shown that the approach can accurately describe specific sections of the wave, with a reduced error between the actual and estimated distributions. The improved probability density function (PDF) representation...
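A common amplitude form of the K-distribution used in sea-clutter work is f(x) = (2b/Γ(ν)) (bx/2)^ν K_{ν−1}(bx), where ν controls spikiness and b sets the scale. The sketch below (illustrative parameters, not the CSIR data) checks this form numerically against its closed-form mean √π·Γ(ν+½)/(b·Γ(ν)):

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import gamma, kv

def k_pdf(x, nu, b):
    """Amplitude K-distribution: f(x) = (2b/Gamma(nu)) * (b*x/2)**nu * K_{nu-1}(b*x)."""
    x = np.asarray(x, dtype=float)
    return (2.0 * b / gamma(nu)) * (b * x / 2.0) ** nu * kv(nu - 1.0, b * x)

# Small nu gives spiky clutter; these values are purely illustrative.
nu, b = 1.5, 2.0
x = np.linspace(1e-6, 30.0, 20000)
pdf = k_pdf(x, nu, b)

area = trapezoid(pdf, x)                 # should integrate to ~1
mean = trapezoid(x * pdf, x)             # numerical mean
mean_cf = np.sqrt(np.pi) * gamma(nu + 0.5) / (b * gamma(nu))
print(f"area={area:.4f}  mean={mean:.4f}  closed-form mean={mean_cf:.4f}")
```

Fitting real segments would then estimate ν and b per segment, e.g. by matching sample moments to these closed-form expressions, before comparing the fitted PDF with the empirical one.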

  12. Distribution tactics for success in turbulent versus stable environments: A complexity theory approach

    Directory of Open Access Journals (Sweden)

    Roger Bruce Mason

    2013-11-01

    This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, complexity theory is needed to understand such environments, but its use has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company in a turbulent market sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. Suggestions for how marketers can be assisted, and for further research, are provided.

  13. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals, HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is composed of several well-encapsulated, collaborating Software Components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software ...

  14. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn eHyman

    2013-03-01

    Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype-by-environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and the local conditions where genotypes are tested in experimental trial sites. These improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environmental change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remain difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis, site matching and regional targeting methods have advanced in parallel with data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  15. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. The summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. The Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  16. Getting reforms done in inhospitable institutional environments: untying a Gordian Knot in India's power distribution sector

    International Nuclear Information System (INIS)

    Tankha, Sunil; Misal, Annasahed B.; Fuller, Boyd W.

    2010-01-01

    When grand institutional reforms based on idealized models are stalled by poor institutional environments and the difficult politics which often surround large infrastructure systems in developing countries, partial reforms whose design and implementation take into account the different interests of the key stakeholders can provide valuable and immediate benefits while moving these systems from low- towards higher-level equilibria. Strategically negotiated, experimentally partial and purposefully hybrid, these reforms are based on careful stakeholder analysis and strategic coalition building that avoid rigid positions based on idealized models. Our findings are based on a study of power sector reforms in India, where we performed a micro-level, in-depth analysis of a partial and innovative experiment that has allowed private sector participation in electricity distribution within a hostile institutional environment.

  17. Interaction Control Protocols for Distributed Multi-user Multi-camera Environments

    Directory of Open Access Journals (Sweden)

    Gareth W Daniel

    2003-10-01

    Video-centred communication (e.g., video conferencing, multimedia online learning, traffic monitoring, and surveillance) is becoming a customary activity in our lives. The management of interactions in such an environment is a complicated HCI issue. In this paper, we present our study on a collection of interaction control protocols for distributed multi-user multi-camera environments. These protocols facilitate different approaches to managing a user's entitlement to control a particular camera. We describe a web-based system that allows multiple users to manipulate multiple cameras in varying remote locations. The system was developed using the Java framework, and all protocols discussed have been incorporated into the system. Experiments were designed and conducted to evaluate the effectiveness of these protocols and to identify various human factors in a distributed multi-user multi-camera environment. This work provides insight into the complexity associated with interaction management in video-centred communication. It can also serve as a conceptual and experimental framework for further research in this area.
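One family of entitlement protocols of the kind the paper evaluates grants exclusive control of a camera with queued hand-over. A minimal sketch of such a protocol follows; the class and method names are hypothetical, not the paper's actual Java implementation:

```python
from collections import deque

class ExclusiveCameraControl:
    """Hypothetical entitlement protocol: one controller at a time,
    with a first-come, first-served waiting queue for hand-over."""

    def __init__(self):
        self.holder = None        # user currently entitled to control
        self.queue = deque()      # users waiting for control

    def request(self, user):
        """Grant control if the camera is free; otherwise enqueue the user."""
        if self.holder is None:
            self.holder = user
            return True
        if user != self.holder and user not in self.queue:
            self.queue.append(user)
        return False

    def release(self, user):
        """Release control and hand it to the next waiting user, if any."""
        if self.holder == user:
            self.holder = self.queue.popleft() if self.queue else None

cam = ExclusiveCameraControl()
cam.request("alice")        # alice gains control immediately
cam.request("bob")          # bob is queued
cam.release("alice")        # control passes to bob
print(cam.holder)           # -> bob
```

Other protocols in the study trade this strict exclusivity for shared or priority-based access; the queueing discipline is the main design variable.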

  18. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka, but both are used only for parking and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the volume of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The analysis begins with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result ranks the sites, from best to worst: Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  19. A distributed execution environment for large data visualization

    International Nuclear Information System (INIS)

    Huang Jian; Liu Huadong; Beck, Micah; Gao Jinzhu; Moore, Terry

    2006-01-01

    Over the years, homogeneous computer clusters have been the most popular, and in some sense the only viable, platform for use in parallel visualization. In this work, we designed an execution environment for data-intensive visualization that is suitable for handling SciDAC-scale datasets. This environment is based solely on computers distributed across the Internet that are owned and operated by independent institutions but openly shared for free. These Internet computers are inherently heterogeneous in hardware configuration and run a variety of operating systems. Using 100 processors of this kind, we have been able to obtain the same level of performance offered by a 64-node cluster of 2.2 GHz P4 processors while processing a 75 GB subset of TSI simulation data. Due to its inherently shared nature, this execution environment for data-intensive visualization could provide a viable means of collaboration among geographically separated SciDAC scientists

  20. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  1. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of particle physics, collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the LHC computing model is: Tier-0 (CERN), with Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities, a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experiences gained using the DA system and GANGA in Top physics analysis will be described. (Author)
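The job splitting and merging that GANGA handles can be sketched generically as partitioning input files into subjobs and combining their outputs. The helper names below are illustrative, not Ganga's actual splitter/merger API:

```python
def split_jobs(files, files_per_job):
    """Partition input files into chunks, one chunk per Grid subjob."""
    return [files[i:i + files_per_job]
            for i in range(0, len(files), files_per_job)]

def merge_outputs(partial_results):
    """Combine per-subjob outputs, e.g. summed event counts."""
    return sum(partial_results)

files = [f"data_{i}.root" for i in range(10)]
subjobs = split_jobs(files, files_per_job=3)
print(len(subjobs))                          # -> 4 subjobs (3 + 3 + 3 + 1 files)
print(merge_outputs([120, 118, 121, 40]))    # -> 399 events in total
```

In Ganga the same split/submit/merge cycle runs unchanged whether the backend is a local batch system or the Grid, which is the "simple switching" the abstract refers to.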

  2. Preview of the Mission Assurance Analysis Protocol (MAAP): Assessing Risk and Opportunity in Complex Environments

    National Research Council Canada - National Science Library

    Alberts, Christopher; Dorofee, Audrey; Marino, Lisa

    2008-01-01

    .... A MAAP assessment provides a systematic, in-depth analysis of the potential for success in distributed, complex, and uncertain environments and can be applied across the life cycle and throughout the supply chain...

  3. Instructional Design Issues in a Distributed Collaborative Engineering Design (CED) Instructional Environment

    Science.gov (United States)

    Koszalka, Tiffany A.; Wu, Yiyan

    2010-01-01

    Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…

  4. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  5. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks

    International Nuclear Information System (INIS)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-01-01

    Highlights: • Thirty-eight antibiotics were systematically investigated in the marine environment. • The distribution of antibiotics was significantly correlated with COD and NO3–N. • Untreated domestic sewage was the primary source of antibiotics. • Fluoroquinolones showed a strong sorption capacity onto sediments. • Oxytetracycline, norfloxacin and erythromycin–H2O indicated high risks. - Abstract: In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected, with concentrations ranging from <0.08 (clarithromycin) to 15,163 ng/L (oxytetracycline), 2.12 (methacycline) to 1318 ng/L (erythromycin–H2O), and <1.95 (ciprofloxacin) to 184 ng/g (chlortetracycline) in the seawater, discharged effluent and sediment samples, respectively. The concentrations of antibiotics in the water phase were correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin–H2O posed high risks to aquatic organisms

  6. Energy and environment efficiency analysis based on an improved environment DEA cross-model: Case study of complex chemical processes

    International Nuclear Information System (INIS)

    Geng, ZhiQiang; Dong, JunGen; Han, YongMing; Zhu, QunXiong

    2017-01-01

    Highlights: • An improved environment DEA cross-model method is proposed. • An energy and environment efficiency analysis framework for complex chemical processes is obtained. • The proposed method is efficient for energy saving and emission reduction in complex chemical processes. -- Abstract: The complex chemical process is a high-pollution, high-energy-consumption industrial process. Therefore, it is very important to analyze and evaluate the energy and environment efficiency of the complex chemical process. Data Envelopment Analysis (DEA) is used to evaluate the relative effectiveness of decision-making units (DMUs). However, the traditional DEA method usually cannot genuinely distinguish effective from inefficient DMUs due to its extreme or unreasonable weight distribution over input and output variables. Therefore, this paper proposes an energy and environment efficiency analysis method based on an improved environment DEA cross-model (DEACM). The inputs of the complex chemical process are divided into energy and non-energy inputs, while the outputs are divided into desirable and undesirable outputs. The energy and environment performance index (EEPI), based on cross evaluation, is used to represent the overall performance of each DMU. Moreover, the improvement direction for energy saving and carbon emission reduction of each inefficient DMU is obtained quantitatively from the self-evaluation model of the improved environment DEACM. Analysis of the ethylene production process shows that the improved environment DEACM discriminates effective from inefficient DMUs better than the original DEA method, and that it can quantify the energy-saving and carbon-emission-reduction potential of ethylene plants, especially the improvement direction of inefficient DMUs.
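The cross-evaluation idea behind the EEPI can be illustrated with a toy computation: each DMU is scored not only under its own weights but under every DMU's weights, and the scores are averaged. In the actual DEACM each weight set comes from solving a linear program per DMU; here the weight sets are simply given, and all names and numbers are illustrative:

```python
def efficiency(dmu, weights):
    """Ratio of weighted outputs to weighted inputs for one DMU."""
    u, v = weights  # u: output weights, v: input weights
    out = sum(ui * yi for ui, yi in zip(u, dmu["outputs"]))
    inp = sum(vi * xi for vi, xi in zip(v, dmu["inputs"]))
    return out / inp

def cross_efficiency(dmus, weight_sets):
    """Average each DMU's efficiency under every weight set: the
    cross-evaluation behind the EEPI (real DEA optimises the weights)."""
    return [sum(efficiency(d, w) for w in weight_sets) / len(weight_sets)
            for d in dmus]

# Two plants: inputs = (energy, non-energy), outputs = (desirable, undesirable-adjusted)
dmus = [{"inputs": [10.0, 5.0], "outputs": [8.0, 2.0]},
        {"inputs": [12.0, 4.0], "outputs": [9.0, 1.0]}]
weight_sets = [([1.0, 0.5], [0.6, 0.8]),   # weight set favoured by DMU 1
               ([1.0, 0.2], [0.5, 1.0])]   # weight set favoured by DMU 2
scores = cross_efficiency(dmus, weight_sets)
print([round(s, 3) for s in scores])       # -> [0.87, 0.917]
```

Because every DMU is judged under everyone's weights, a DMU cannot look efficient merely by picking an extreme weight vector, which is the discrimination problem of self-evaluated DEA that the paper addresses.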

  7. Data management in an object-oriented distributed aircraft conceptual design environment

    Science.gov (United States)

    Lu, Zhijie

    In the competitive global market place, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, we are in need of the next generation aircraft conceptual design environment (NextADE). To build the NextADE, the framework and the data management problem are two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is firstly formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled. To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the

  8. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the distributed computing environment based on DIRAC for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources made available by their collaborations. In order to minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This makes DIRAC a service for the community of those experiments. To support multi-VO operation, the system architecture of BESDIRAC was adjusted for scalability. The VOMS and DIRAC servers are reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system is designed for users' massive job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered to ease system administration and VO-related resource usage accounting. (paper)

  9. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    Science.gov (United States)

    Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang

    2013-01-01

    The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.
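The lottery competition for space can be illustrated with a toy, single-locality version: at each step a random individual dies and the vacant site is won by species j with probability proportional to its local abundance times its fitness. This is a Moran-style sketch only; the authors' actual model additionally couples localities through dispersal:

```python
import random

def lottery_step(community, fitness, rng):
    """One death-and-replacement event: a random individual dies and
    the site is recolonised by species j with probability proportional
    to abundance_j * fitness_j (lottery competition)."""
    dead = rng.randrange(len(community))
    weights = [fitness[sp] * community.count(sp) for sp in range(len(fitness))]
    r = rng.random() * sum(weights)      # weighted draw among species
    acc = 0.0
    for sp, w in enumerate(weights):
        acc += w
        if r <= acc:
            community[dead] = sp
            break

rng = random.Random(42)
community = [0] * 50 + [1] * 50   # two species, equal initial abundance
fitness = [1.0, 1.5]              # species 1 favoured at this locality
for _ in range(5000):
    lottery_step(community, fitness, rng)
print(community.count(0), community.count(1))
```

With equal fitnesses the same loop reduces to pure ecological drift (neutral dynamics), which is how the fitness vector controls the niche-vs-neutral balance the abstract discusses.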

  10. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.; Garcia, Rudy J.; Parks, Raymond C.; Warren, Steve

    1999-05-17

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures that allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. This win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.

  11. Some Technical Implications of Distributed Cognition on the Design on Interactive Learning Environments.

    Science.gov (United States)

    Dillenbourg, Pierre

    1996-01-01

    Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…

  12. Applied research and development of neutron activation analysis - The study on human health and environment by neutron activation analysis of biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seung Yeon; Yoo, Jong Ik; Lee, Jae Kwang; Lee, Sung Jun; Lee, Sang Sun; Jeon, Ki Hong; Na, Kyung Won; Kang, Sang Hun [Yonsei University, Seoul (Korea)

    2000-04-01

    With the development of a precise quantitative analytical method for trace elements in various biological samples such as hair and food, trace elements from the various sources that can enter the human body were evaluated from the standpoint of health and environment. The distribution of trace elements in the Korean total diet and in representative foodstuffs was identified first. Within the project, the elemental distributions in supplemental health foods and in oriental medicines of Korean and Chinese origin were also identified. The amounts of trace elements ingested were estimated through hair analysis of oriental-medicine takers, and the amounts inhaled were estimated through analysis of foundry air and of the blood and hair of foundry workers. These results establish a basic method for health and environmental assessment based on neutron activation analysis of biological samples such as foods and hair. A nationwide usage system for the NAA facility in Hanaro, covering many different and important biological applications, can be initiated with these results. The output of the project can support public health, environmental, and medical research. The results can be applied to the production of micronutrient-enhanced health foods and to the enhancement of health safety and health status, given additional data expansion and the development of further evaluation techniques. 19 refs., 7 figs., 23 tabs. (Author)

  13. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  14. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.
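The job lifecycle described above (submit to a central scheduler, boot a user-specific VM on an available cloud, run the job, return the output) can be sketched as follows. All names and data structures are illustrative, not the system's real batch-scheduler or IaaS API:

```python
def run_job(job, clouds):
    """Boot the user's VM on the first cloud with free capacity,
    run the job synchronously, then release the slot and return output."""
    for cloud in clouds:
        if cloud["free_slots"] > 0:
            cloud["free_slots"] -= 1                     # claim a slot
            vm = f"{job['vm_image']}@{cloud['name']}"    # boot user-specific VM
            output = f"processed {job['dataset']} on {vm}"
            cloud["free_slots"] += 1                     # job done, free the slot
            return output
    raise RuntimeError("no capacity on any cloud")

clouds = [{"name": "science-cloud", "free_slots": 0},
          {"name": "commercial-cloud", "free_slots": 4}]
job = {"vm_image": "analysis-vm", "dataset": "run2012.dat"}
print(run_job(job, clouds))   # -> processed run2012.dat on analysis-vm@commercial-cloud
```

The real system interposes a batch queue between submission and VM boot, and streams calibration and input data from central services rather than bundling them with the job.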

  15. Thermal-hydraulic analysis of Ignalina NPP compartments response to group distribution header rupture using RALOC4 code

    International Nuclear Information System (INIS)

    Urbonavicius, E.

    2000-01-01

    The Accident Localisation System (ALS) of Ignalina NPP is a containment of the pressure-suppression type designed to protect the environment from the dangerous impact of radioactivity. A failure of the ALS could lead to contamination of the environment, and prescribed public radiation doses could be exceeded. The purpose of the presented analysis is to perform a long-term thermal-hydraulic analysis of the compartments' response to a Group Distribution Header rupture and to verify that design pressure values are not exceeded. (authors)

  16. Spatial structures of the environment and of dispersal impact species distribution in competitive metacommunities.

    Directory of Open Access Journals (Sweden)

    Dexiecuo Ai

    The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.

  17. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  18. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high but steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  19. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    Science.gov (United States)

    2017-12-01

    Thesis by Lieutenant Justin K. Davis: Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment.

  20. Population fluctuation and vertical distribution of meiofauna in the Red Sea interstitial environment.

    Science.gov (United States)

    El-Serehy, Hamed A; Al-Misned, Fahad A; Al-Rasheid, Khaled A

    2015-07-01

    The composition and distribution of the benthic meiofauna assemblages of the Egyptian coasts along the Red Sea are described in relation to abiotic variables. Sediment samples were collected seasonally from three stations chosen along the Red Sea to observe the meiofaunal community structure and its temporal and vertical distribution in relation to the environmental conditions of the Red Sea marine ecosystem. Temperature, salinity, pH, dissolved oxygen, and redox potential were measured at the time of collection. The water content of the sediments, total organic matter and chlorophyll a values were determined, and sediment samples were subjected to granulometric analysis. A total of 10 meiofauna taxa were identified, with the meiofauna primarily represented by nematodes (on annual average from 42% to 84%), harpacticoids, polychaetes and ostracods, and with meiofauna abundances ranging from 41 to 167 ind./10 cm². The meiofaunal population density fluctuated seasonally, with a peak of 192.52 ind./10 cm² during summer at station II. The vertical zonation of the meiofaunal community was significantly correlated with interstitial water, chlorophyll a and total organic matter values. The present study indicates the existence of a well-diversified meiofaunal group which can serve as food for higher trophic levels in the Red Sea interstitial environment.

  1. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    Science.gov (United States)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
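The "network striped disk array" layout can be sketched as round-robin block striping across storage servers. This is a simplified illustration of one layout an application might choose (the DPSS leaves the layout policy to the application); all names are hypothetical:

```python
def stripe(data, n_servers, block_size):
    """Split data into fixed-size blocks and deal them out
    round-robin across the storage servers."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    servers = [[] for _ in range(n_servers)]
    for i, block in enumerate(blocks):
        servers[i % n_servers].append(block)
    return servers

def reassemble(servers):
    """Read blocks back in the same round-robin order."""
    out, i = [], 0
    while True:
        idx = i // len(servers)
        srv = servers[i % len(servers)]
        if idx >= len(srv):
            break
        out.append(srv[idx])
        i += 1
    return b"".join(out)

servers = stripe(b"abcdefghij", n_servers=2, block_size=3)
print(servers)                                # -> [[b'abc', b'ghi'], [b'def', b'j']]
print(reassemble(servers) == b"abcdefghij")   # -> True
```

Striping lets sequential reads fan out across servers in parallel, which is what makes a network disk array attractive for the wide-area, data-intensive workloads described above.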

  2. Functional Analysis in Virtual Environments

    Science.gov (United States)

    Vasquez, Eleazar, III; Marino, Matthew T.; Donehower, Claire; Koch, Aaron

    2017-01-01

    Functional analysis (FA) is an assessment procedure involving the systematic manipulation of an individual's environment to determine why a target behavior is occurring. An analog FA provides practitioners the opportunity to manipulate variables in a controlled environment and formulate a hypothesis for the function of a behavior. In previous…

  3. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.
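
    The core loop of a virtual machine resource manager can be sketched as follows (the class and method names are invented for illustration and are not the actual Cloud Scheduler API):

    ```python
    # Toy scheduler: in response to queued batch jobs, boot one VM per
    # job, spilling over to the next cloud when one is at capacity.
    class Cloud:
        def __init__(self, name, capacity):
            self.name, self.capacity, self.running = name, capacity, 0

        def boot_vm(self):
            """Boot a VM if this cloud has spare capacity."""
            if self.running < self.capacity:
                self.running += 1
                return True
            return False


    def schedule(queued_jobs, clouds):
        """Boot VMs for queued jobs across clouds; return how many started."""
        booted = 0
        for _ in range(queued_jobs):
            # any() short-circuits, so exactly one VM boots per job
            if not any(c.boot_vm() for c in clouds):
                break  # all clouds full; remaining jobs stay queued
            booted += 1
        return booted
    ```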

  4. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty of calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second demonstration of the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  5. Distributed service-based approach for sensor data fusion in IoT environments.

    Science.gov (United States)

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes, decentralized communication, scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments.
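
    The service-composition view of data fusion might be sketched like this (an assumed structure, not the paper's implementation: each node exposes a `read()` service, and a fusion step composes the outputs into one value):

    ```python
    # Each distributed node is modelled as a service with a read() call;
    # fuse() composes any number of such services with a combining function.
    from statistics import mean


    class SensorService:
        def __init__(self, readings):
            self._readings = list(readings)

        def read(self):
            """Return the next reading from this node."""
            return self._readings.pop(0)


    def fuse(services, combiner=mean):
        """Compose distributed read() services into one fused value."""
        return combiner([s.read() for s in services])
    ```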

  6. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people- vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  7. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room”, and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to support activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent conflict between maintaining the privacy of medical data and showing it in a public display environment can be mitigated by the use of CLINICAL SURFACES.

  8. Source Coding for Wireless Distributed Microphones in Reverberant Environments

    DEFF Research Database (Denmark)

    Zahedi, Adel

    2016-01-01

    Modern multimedia systems are more and more shifting toward distributed and networked structures. This includes audio systems, where networks of wireless distributed microphones are replacing the traditional microphone arrays. This allows for flexibility of placement and high spatial diversity. However, it comes with the price of several challenges, including the limited power and bandwidth resources for wireless transmission of audio recordings. In such a setup, we study the problem of source coding for the compression of the audio recordings before transmission in order to reduce the power consumption and/or transmission bandwidth by reducing the transmission rates. Source coding for wireless microphones in reverberant environments has several special characteristics which make it more challenging in comparison with regular audio coding. The signals which are acquired by the microphones...

  9. Environment effects on the flattening distribution of galaxies

    International Nuclear Information System (INIS)

    Freitas Pacheco, J.A. de; Souza, R.E. de; Arakaki, L.

    1983-01-01

    The results of a study of environment effects on the flattening distribution of galaxies are presented. Samples of different morphological types in regions of high and low projected galaxian density were prepared using the Uppsala General Catalogue of Galaxies. No significant differences were detected for the E and spiral galaxies. However, this study suggests that there is an excess of flat S0 systems in regions of high galaxian density with respect to the field systems. These 'flat' S0's are interpreted as gas-stripped Sa galaxies, and the consequences of such a hypothesis are also discussed. (Author) [pt

  10. Problem-Based Learning and Problem-Solving Tools: Synthesis and Direction for Distributed Education Environments.

    Science.gov (United States)

    Friedman, Robert S.; Deek, Fadi P.

    2002-01-01

    Discusses how the design and implementation of problem-solving tools used in programming instruction are complementary with both the theories of problem-based learning (PBL), including constructivism, and the practices of distributed education environments. Examines how combining PBL, Web-based distributed education, and a problem-solving…

  11. Performance analysis and optimization of AMGA for the WISDOM environment

    CERN Document Server

    Ahn, Sunil; Lee, Seehoon; Hwang, Soonwook; Breton, Vincent; Koblitz, Birger

    2008-01-01

    AMGA is a gLite metadata catalogue service designed to offer access to metadata for files stored on the Grid. We evaluated AMGA to analyze whether it is suitable for the WISDOM environment, where thousands of jobs access it simultaneously to get metadata describing docking results and the status of jobs. In this work, we address performance issues in AMGA and propose new techniques to improve AMGA performance in the WISDOM environment. In the WISDOM environment, thousands of job agents distributed on the Grid may access an AMGA server simultaneously (1) to take docking tasks out of the AMGA server and execute them on the machines where the agents run, (2) to get the related ligand and target information, and (3) to store the docking results. The docking tasks take about 10 to 30 minutes to finish, depending on the machine they run on and the docking configuration. We have carried out some performance analysis on the current AMGA implementation. Due to the overhead required to handle GSI/SSL connection on t...
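
    The pull model described above, with many agents taking tasks out of a shared catalogue, can be sketched as follows (the names are invented; in AMGA the claim step would be a metadata transaction over a GSI/SSL connection rather than an in-process lock):

    ```python
    # Many agents pull tasks from one catalogue; the claim must be atomic
    # so that no two agents ever execute the same docking task.
    import threading


    class TaskCatalogue:
        def __init__(self, tasks):
            self._tasks = list(tasks)
            self._lock = threading.Lock()
            self.results = {}

        def claim(self):
            """Atomically take one task, or None when the queue is empty."""
            with self._lock:
                return self._tasks.pop() if self._tasks else None

        def store(self, task, result):
            with self._lock:
                self.results[task] = result


    def agent(cat):
        """One job agent: keep claiming and 'docking' until no tasks remain."""
        while (task := cat.claim()) is not None:
            cat.store(task, f"docked:{task}")
    ```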

  12. Analysis of Ecological Distribution and Genomic Content from a Clade of Bacteroidetes Endemic to Sulfidic Environments

    Science.gov (United States)

    Zhou, K.; Sylvan, J. B.; Hallam, S. J.

    2017-12-01

    The Bacteroidetes are a ubiquitous phylum of bacteria found in a wide variety of habitats. Marine Bacteroidetes are known to utilize complex carbohydrates and, by processing these compounds, which are not digestible by many other microbes, potentially play an important role in the global carbon cycle. Some members of the phylum are known to perform denitrification and are facultative anaerobes, but Bacteroidetes are not known to participate in sulfur redox cycling. Recently, it was shown that a clade of uncultured Bacteroidetes, including the VC2.1_Bac22 group, appears to be endemic to sulfidic environments, including hydrothermal vent sulfide chimneys, sediments and marine water column oxygen minimum zones (OMZs). This clade, dubbed the Sulfiphilic Bacteroidetes, is not detected in 16S rRNA amplicon studies from non-sulfidic environments. To test the hypothesis that the Sulfiphilic Bacteroidetes are involved in sulfur redox chemistry, we updated our meta-analysis of the clade using 16S rRNA sequences from public databases and employed single-cell genomics to survey their genomic potential using 19 single amplified genomes (SAGs) isolated from Saanich Inlet, a seasonally anoxic basin in British Columbia. Initial analysis of these SAGs indicates the Sulfiphilic Bacteroidetes may perform sulfur redox reactions using a three-gene psrABC operon encoding the polysulfide reductase enzyme complex, with a thiosulfate sulfurtransferase (rhodanese), which putatively uses cyanide to convert thiosulfate to sulfite, just upstream. Interestingly, this is the same configuration as discovered recently in some Marine Group A bacteria. Further aspects of the Sulfiphilic Bacteroidetes' genomic potential will be presented in light of their presence in sulfidic environments.

  13. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale: up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's distributed analysis efforts into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  14. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high-luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  15. Comparison of indoor air distribution and thermal environment for different combinations of radiant heating systems with mechanical ventilation systems

    DEFF Research Database (Denmark)

    Wu, Xiaozhou; Fang, Lei; Olesen, Bjarne W.

    2018-01-01

    A hybrid system with a radiant heating system and a mechanical ventilation system, which is regarded as an advanced heating, ventilation and air-conditioning (HVAC) system, has been applied in many modern buildings worldwide. To date, almost no studies have focused on a comparative analysis of the indoor air distribution and the thermal environment for all combinations of radiant heating systems with mechanical ventilation systems. Therefore, in this article, the indoor air distribution and the thermal environment were comparatively analyzed in a room with floor heating (FH) or ceiling heating (CH) and mixing ventilation (MV) or displacement ventilation (DV) when the supply air temperature ranged from 15.0°C to 19.0°C. The results showed that the temperature effectiveness values were 1.05–1.16 and 0.95–1.02 for MV + FH and MV + CH, respectively, and they were 0.78–0.91 and 0.51–0.67 for DV + FH and DV...

  16. Distributed open environment for data retrieval based on pattern recognition techniques

    International Nuclear Information System (INIS)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A.

    2010-01-01

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation hides unnecessary details about the installation, distribution, and configuration of all these components, while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.
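
    The retrieval step, finding the stored waveform most similar to a query signal, might look like the following sketch (plain Euclidean distance is assumed here for illustration; the actual pattern recognition methods in the system are more elaborate):

    ```python
    # Nearest-waveform lookup: compare the query against every stored
    # signal and return the closest one.
    import math


    def distance(a, b):
        """Euclidean distance between two equal-length signals."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


    def most_similar(query, database):
        """Return the (name, waveform) entry closest to the query signal."""
        return min(database.items(), key=lambda kv: distance(query, kv[1]))
    ```

    A production system would index the signals rather than scan them all, but the interface, a query signal in, the most similar stored waveforms out, is the same.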

  17. Distributed Open Environment for Data Retrieval based on Pattern Recognition Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A.; Vega, J.; Castro, R.; Portas, A. [Association EuratomCIEMAT para Fusion, Madrid (Spain)

    2009-07-01

    Full text of publication follows: Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE, which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation conceals unnecessary details about the installation, distribution, and configuration of all these components, while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests. (authors)

  18. Distributed open environment for data retrieval based on pattern recognition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, A., E-mail: augusto.pereira@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain); Vega, J.; Castro, R.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, CIEMAT, Edificio 66, Avda. Complutense, 22, 28040 Madrid (Spain)

    2010-07-15

    Pattern recognition methods for data retrieval have been applied to fusion databases for the localization and extraction of similar waveforms within temporal evolution signals. In order to standardize the use of these methods, a distributed open environment has been designed. It is based on a client/server architecture that supports distribution, interoperability and portability between heterogeneous platforms. The server part is a single desktop application based on J2EE (Java 2 Enterprise Edition), which provides a mature standard framework and a modular architecture. It can handle transactions and concurrency of components that are deployed on JETTY, an embedded web container within the Java server application that provides HTTP services. The data management is based on Apache DERBY, a relational database engine also embedded in the same Java-based solution. This encapsulation hides unnecessary details about the installation, distribution, and configuration of all these components, while retaining the flexibility to create and allocate many databases on different servers. The DERBY network module increases the scope of the installed database engine by providing traditional Java database network connections (JDBC-TCP/IP). This avoids scattering several database engines (a unique embedded engine defines the rules for accessing the distributed data). Java thin clients (Java 5 or above is the only requirement) can be executed on the same computer as the server program (for example a desktop computer), but server and client software can also be distributed in a remote participation environment (wide area networks). The thin client provides a graphical user interface to look for patterns (entire waveforms or specific structural forms) and display the most similar ones. This is obtained with HTTP requests and by generating dynamic content (servlets) in response to these client requests.

  19. A Distributed Feature-based Environment for Collaborative Design

    Directory of Open Access Journals (Sweden)

    Wei-Dong Li

    2003-02-01

    This paper presents a client/server design environment based on 3D feature-based modelling and Java technologies that enables design information to be shared efficiently among members of a design team. In this environment, design tasks and clients are organised through working sessions generated and maintained by a collaborative server. The information from an individual design client during a design process is updated and broadcast to the other clients in the same session through an event-driven, call-back mechanism. The downstream manufacturing analysis modules can be wrapped as agents and plugged into the open environment to support the design activities. At the server side, a feature-feature relationship is established and maintained to filter the varied information of a working part, so as to facilitate efficient information updates during the design process.

  20. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and the dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues, and a new approach is needed; probabilistic analysis provides one. Moreover, as distribution systems...

  1. Distributed autonomous mapping of indoor environments

    Science.gov (United States)

    Rogers, J.; Paluri, M.; Cunningham, A.; Christensen, H. I.; Michael, N.; Kumar, V.; Ma, J.; Matthies, L.

    2011-06-01

    This paper describes the results of a Joint Experiment performed on behalf of the MAST CTA. The system developed for the Joint Experiment makes use of three robots which work together to explore and map an unknown environment. Each of the robots used in this experiment is equipped with a laser scanner for measuring walls and a camera for locating doorways. Information from both of these types of structures is concurrently incorporated into each robot's local map using a graph based SLAM technique. A Distributed-Data-Fusion algorithm is used to efficiently combine local maps from each robot into a shared global map. Each robot computes a compressed local feature map and transmits it to neighboring robots, which allows each robot to merge its map with the maps of its neighbors. Each robot caches the compressed maps from its neighbors, allowing it to maintain a coherent map with a common frame of reference. The robots utilize an exploration strategy to efficiently cover the unknown environment which allows collaboration on an unreliable communications channel. As each new branching point is discovered by a robot, it broadcasts the information about where this point is along with the robot's path from a known landmark to the other robots. When the next robot reaches a dead-end, new branching points are allocated by auction. In the event of communication interruption, the robot which observed the branching point will eventually explore it; therefore, the exploration is complete in the face of communication failure.
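
    The branching-point auction could be sketched as below (the bidding rule here is an assumption, since the abstract does not specify one: each robot bids its straight-line distance to the point, and the lowest bid wins):

    ```python
    # Auction-based allocation of an exploration branching point:
    # every candidate robot submits a bid and the cheapest bid wins.
    import math


    def auction(branch_point, robot_positions):
        """Award the branch point to the robot with the lowest bid."""
        bids = {rid: math.dist(pos, branch_point)
                for rid, pos in robot_positions.items()}
        return min(bids, key=bids.get)
    ```

    In the experiment the bids would reflect path cost through the shared map rather than straight-line distance, and the auction runs over the unreliable communication channel, but the allocation logic is of this shape.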

  2. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client, multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  3. Environmental factors determining ammonia-oxidizing organism distribution and diversity in marine environments.

    Science.gov (United States)

    Bouskill, Nicholas J; Eveillard, Damien; Chien, Diana; Jayakumar, Amal; Ward, Bess B

    2012-03-01

    Ammonia-oxidizing bacteria (AOB) and archaea (AOA) play a vital role in bridging the input of fixed nitrogen, through N-fixation and remineralization, to its loss by denitrification and anammox. Yet the major environmental factors determining AOB and AOA population dynamics are little understood, despite both groups having a wide environmental distribution. This study examined the relative abundance of both groups of ammonia-oxidizing organisms (AOO), and the diversity of AOA, across large-scale gradients in temperature, salinity, substrate concentration and dissolved oxygen. The relative abundance of AOB and AOA varied across environments, with AOB dominating in the freshwater region of the Chesapeake Bay and AOA more abundant in the water column of the coastal and open ocean. The highest abundance of the AOA amoA gene was recorded in the oxygen minimum zones (OMZs) of the Eastern Tropical South Pacific (ETSP) and the Arabian Sea (AS). The ratio of AOA : AOB varied from 0.7 in the Chesapeake Bay to 1600 in the Sargasso Sea. The relative abundance of both groups strongly correlated with ammonium concentrations. AOA diversity, as determined by phylogenetic analysis of clone library sequences and archetype analysis from a functional gene DNA microarray, showed broad phylogenetic differences across the study sites. However, phylogenetic diversity within physicochemically congruent stations was more similar than would be expected by chance. This suggests that the prevailing geochemistry, rather than localized dispersal, is the major factor determining OTU distribution. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.

  4. A Java based environment to control and monitor distributed processing systems

    International Nuclear Information System (INIS)

    Legrand, I.C.

    1997-01-01

    Distributed processing systems are considered to solve the challenging requirements of triggering and data acquisition systems for future HEP experiments. The aim of this work is to present a software environment to control and monitor large-scale parallel processing systems based on a distributed client-server approach developed in Java. One server task may control several processing nodes, switching elements or controllers for different sub-systems. Servers are designed as multi-threaded applications for efficient communication with other objects. Servers communicate among themselves by using Remote Method Invocation (RMI) in a peer-to-peer mechanism. This distributed server layer has to provide dynamic and transparent access from any client to all the resources in the system. The graphical user interface programs, which are platform independent, may be transferred to any client via the HTTP protocol. In this scheme the control and monitoring tasks are distributed among servers, and the network carries the flow of information among servers and clients, providing a flexible mechanism for monitoring and controlling large heterogeneous distributed systems. (author)
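
    The peer-to-peer server layer, where any client can reach every resource through any server, can be mimicked in a short sketch (RMI is replaced here by direct method calls purely for illustration; the class is an invented stand-in, not the paper's code):

    ```python
    # Each server knows its own nodes and forwards unknown queries to its
    # peers; a visited set prevents infinite loops in the peer graph.
    class Server:
        def __init__(self, nodes):
            self.nodes = dict(nodes)  # node name -> status
            self.peers = []

        def status(self, node, _visited=None):
            """Resolve a node's status locally or via peer forwarding."""
            seen = _visited if _visited is not None else set()
            if id(self) in seen:
                return None  # already asked this server on this query
            seen.add(id(self))
            if node in self.nodes:
                return self.nodes[node]
            for p in self.peers:
                found = p.status(node, seen)
                if found is not None:
                    return found
            return None
    ```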

  5. Distribution of {sup 129}I in terrestrial surface water environments

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xuegao [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Gong, Meng [College of Hydrology and Water Resources, Hohai University, Nanjing (China); Yi, Peng, E-mail: pengyi1915@163.com [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Aldahan, Ala [Department of Earth Sciences, Uppsala University, Uppsala (Sweden); Department of Geology, United Arab Emirates University, Al Ain (United Arab Emirates); Yu, Zhongbo [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China); Possnert, Göran [Tandem Laboratory, Uppsala University, Uppsala (Sweden); Chen, Li [State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Hohai University, Nanjing 210098 (China); College of Hydrology and Water Resources, Hohai University, Nanjing (China)

    2015-10-15

    The global distribution of the radioactive isotope iodine-129 in surface waters (lakes and rivers) is presented here and compared with the atmospheric deposition and distribution in surface marine waters. The results indicate relatively high concentrations in surface water systems in close vicinity of the anthropogenic release sources, as well as in parts of Western Europe, North America and Central Asia. The {sup 129}I level is generally higher in the terrestrial surface water of the Northern hemisphere compared to the Southern hemisphere. The highest values of {sup 129}I appear around 50°N and 40°S in the northern and southern hemispheres, respectively. Direct gaseous and marine atmospheric emissions are the most likely avenues for the transport of {sup 129}I from the sources to the terrestrial surface waters. To apply iodine-129 as a process tracer in the terrestrial surface water environment, more data are needed on {sup 129}I distribution patterns both locally and globally.

  6. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    Science.gov (United States)

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphical-oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be effectively performed in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of the neuraminidase of H5N1 isolates (micro level) and of influenza viruses (macro level). The results of this paper are hence twofold. Firstly, this paper demonstrates the usefulness of a graphical user interface system to design and execute complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and also shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems in providing a biologist-friendly approach to complex biological dataset analysis using high-performance computing.

  7. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determination of distribution facilities is highly important for survival in the high level of competition in today’s business world. Companies can operate multiple distribution centers to mitigate supply chain risk. Thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in the Greater Jakarta area. This brand is included in the category of the top 5 fast-food restaurant chains by retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center. The network analysis results show a more efficient process, referring to a shorter distance in the distribution process.
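The clustering stage of such a study is typically a spatial k-means over store coordinates, with cluster centers as candidate distribution-center locations. The sketch below is a plain Lloyd's algorithm with made-up coordinates; it is illustrative, not the study's actual pipeline.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm on 2-D points (e.g. store coordinates).
    Returns final centers and the point clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared distance)
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # move each center to the mean of its cluster
        centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centers[j]
            for j, cl in enumerate(clusters)
        ]
    return centers, clusters

# Hypothetical store coordinates forming two obvious groups.
stores = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
centers, clusters = kmeans(stores, k=2)
print(centers)  # candidate distribution-center locations near each group
```

A real study would cluster geographic coordinates (and then run network analysis over the road graph), but the assignment/update loop is the same.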

  8. Non-uniform distribution pattern for differentially expressed genes of transgenic rice Huahui 1 at different developmental stages and environments.

    Directory of Open Access Journals (Sweden)

    Zhi Liu

    Full Text Available DNA microarray analysis is an effective method to detect unintended effects in the safety assessment of genetically modified (GM) crops by detecting differentially expressed genes (DEGs). With the aim of revealing the distribution of DEGs of GM crops under different conditions, we performed DNA microarray analysis using transgenic rice Huahui 1 (HH1) and its non-transgenic parent Minghui 63 (MH63) at different developmental stages and under different environmental conditions. A considerable number of DEGs were identified in each group of HH1 under different conditions. The number of DEGs differed between groups of HH1; however, a considerable number of common DEGs were shared between different groups. These findings suggest that both DEGs and common DEGs are adequate for the investigation of unintended effects. Furthermore, a number of significantly changed pathways were found in all groups of HH1, indicating that genetic modification caused lasting changes to the plants. To our knowledge, our study provides the first evidence of a non-uniform distribution pattern for DEGs of GM crops at different developmental stages and environments. Our results also suggest that DEGs selected in GM plants at a specific developmental stage and environment could act as useful clues for further evaluation of unintended effects of GM plants.

  9. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    International Nuclear Information System (INIS)

    Scheidt, Rafael de Faria; Vilain, Patrícia; Dantas, M A R

    2014-01-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers

  10. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    Science.gov (United States)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  11. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    Full Text Available In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution, which can approximate different distributions.
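The flexibility the abstract refers to is usually illustrated with the Weibull family, whose shape parameter switches the hazard between increasing, constant (exponential), and decreasing behavior. A minimal sketch of the textbook hazard function (this is standard reliability theory, not the paper's specific models):

```python
def weibull_hazard(t, shape, scale=1.0):
    """Textbook Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k-1),
    with k = shape and lam = scale, for t > 0."""
    return (shape / scale) * (t / scale) ** (shape - 1)

# shape > 1: increasing hazard (wear-out failures)
# shape < 1: decreasing hazard (infant mortality)
# shape == 1: constant hazard, i.e. the exponential distribution
for k in (0.5, 1.0, 2.0):
    print(k, [round(weibull_hazard(t, k), 3) for t in (1, 2, 4)])
```

This is why a single hazard model can "approach" several classical distributions: varying one parameter reproduces their qualitative failure behavior.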

  12. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may be a diabetes contributor. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets with GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and food environment based on countywide food permit registration data. Linkage between pregnancies and food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code was gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of the geospatial distribution of the food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Hemoglobin A1c levels in the matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251). In this analysis, a relationship between the food environment and risk for gestational diabetes was demonstrated
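The odds ratio and confidence interval quoted above follow standard 2x2-table arithmetic. A minimal unadjusted sketch (the counts below are invented for illustration; the study's reported OR is adjusted, which requires the underlying regression model):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: GDM cases vs. non-cases by quartile of exposure.
print(odds_ratio_ci(60, 940, 40, 960))
```

An interval that excludes 1.0, as in the study's 1.21-2.19, is what "significantly associated" means for an odds ratio.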

  13. An analysis of the orbital distribution of solid rocket motor slag

    Science.gov (United States)

    Horstman, Matthew F.; Mulrooney, Mark

    2009-01-01

    The contribution by solid rocket motors (SRMs) to the orbital debris environment is potentially significant and insufficiently studied. Design and combustion processes can lead to the emission of enough by-products to warrant assessment of their contribution to orbital debris. These particles are formed during SRM tail-off, or burn termination, by the rapid solidification of molten Al2O3 slag accumulated during the burn. The propensity of SRMs to generate particles larger than 100μm raises concerns regarding the debris environment. Sizes as large as 1 cm have been witnessed in ground tests, and comparable sizes have been estimated via observations of sub-orbital tail-off events. Utilizing previous research we have developed more sophisticated size distributions and modeled the time evolution of resultant orbital populations using a historical database of SRM launches, propellant, and likely location and time of tail-off. This analysis indicates that SRM ejecta is a significant component of the debris environment.
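Debris size distributions of the kind developed in the analysis above are commonly modeled as truncated power laws and sampled by inverse-CDF transformation. The exponent and size bounds below are assumptions for the sketch, not the paper's fitted values.

```python
import random

def sample_sizes(n, d_min=1e-4, d_max=1e-2, alpha=3.5, seed=1):
    """Draw n particle diameters (metres) from a truncated power law
    with number density ~ d**-alpha, via inverse-CDF sampling.
    Bounds roughly span the 100 um to 1 cm range discussed above;
    alpha is an illustrative assumption."""
    rng = random.Random(seed)
    k = 1 - alpha                       # exponent of the integrated CDF
    lo, hi = d_min ** k, d_max ** k
    return [(lo + rng.random() * (hi - lo)) ** (1 / k) for _ in range(n)]

sizes = sample_sizes(1000)
print(min(sizes), max(sizes))           # all samples stay inside the bounds
```

As expected for a steep power law, the bulk of sampled particles sit near the lower cutoff, with rare centimetre-class slag pieces in the tail.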

  14. Merging assistance function with task distribution model to enhance user performance in collaborative virtual environment

    International Nuclear Information System (INIS)

    Khalid, S.; Alam, A.

    2016-01-01

    Collaborative Virtual Environments (CVEs) fall under Virtual Reality (VR), where two or more users manipulate objects collaboratively. In this paper we report experiments in which an assembly was built from constituent parts scattered in a Virtual Environment (VE), based on a task distribution model using assistance functions for checking and enhancing user performance. The subjects worked on distinct machines connected via a local area network. In this perspective, we consider the effects of assistance functions with oral communication on collaboration, co-presence and user performance. Twenty subjects collaboratively performed an assembly task under static and dynamic task distribution. We examine the degree of influence of assistance functions with oral communication on user performance based on the task distribution model. The results show that assistance functions with oral communication based on the task distribution model not only increase user performance but also enhance the sense of co-presence and awareness. (author)

  15. Efficiency and productivity (TFP) of the Turkish electricity distribution companies: An application of two-stage (DEA and Tobit) analysis

    International Nuclear Information System (INIS)

    Çelen, Aydın

    2013-01-01

    In this study, we analyze the efficiency performances of 21 Turkish electricity distribution companies during the period 2002–2009. For this aim, we employ a two-stage analysis in order to take into account the business environment variables which are beyond the control of the distribution companies. We determine the efficiency performances of the electricity distribution companies with the help of DEA in the first stage. Then, in the second stage, using these calculated efficiency scores as the dependent variable, we utilize a Tobit model to determine the business environment variables which may explain the efficiency scores. According to the results, the customer density of the region and private ownership affect the efficiencies positively. Thus, the best strategy to improve efficiency in the market is privatizing the public distribution companies. - Highlights: • We analyze the efficiencies of 21 Turkish electricity distribution companies. • A two-stage analysis is employed to take into account environmental variables. • We first calculate the efficiencies of companies with DEA; then a Tobit model is used to determine the effects of the variables. • Customer density and ownership type affect the efficiencies positively. • Privatization is a good strategy to improve efficiencies
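DEA in general requires linear programming, but in the degenerate single-input, single-output case the CCR (constant-returns) efficiency score reduces to each unit's output/input ratio divided by the best ratio. The sketch below shows only that degenerate case, with invented company data; it is not the study's multi-variable first stage.

```python
def ratio_efficiency(data):
    """Single-input, single-output CCR-style efficiency: each unit's
    output/input ratio, normalized by the best ratio so the frontier
    unit scores exactly 1.0. Real DEA (multiple inputs/outputs)
    requires solving one LP per unit instead."""
    ratios = {name: out / inp for name, (inp, out) in data.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Hypothetical distribution companies: (operating cost, energy delivered).
companies = {"A": (100, 50), "B": (80, 48), "C": (120, 48)}
print(ratio_efficiency(companies))  # B defines the efficient frontier here
```

The second-stage Tobit regression would then take these scores (censored at 1.0) as the dependent variable against environment variables such as customer density.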

  16. Spatial distribution and ecological environment analysis of great gerbil in Xinjiang Plague epidemic foci based on remote sensing

    International Nuclear Information System (INIS)

    Gao, Mengxu; Wang, Juanle; Li, Qun; Cao, Chunxiang

    2014-01-01

    Yersinia pestis (the plague bacterium) was isolated from great gerbils in 2005 in the Xinjiang Dzungarian Basin, which confirmed the presence of the plague epidemic foci. This study analysed the spatial distribution and suitable habitat of the great gerbil based on monitoring data from the Chinese Center for Disease Control and Prevention, as well as ecological environment elements obtained from remote sensing products. The results showed that: (1) 88.5% (277/313) of great gerbils were distributed in areas with an elevation between 200 and 600 meters. (2) All the positive points were located in areas with a slope of 0–3 degrees, and no clear preference for sunny aspects was observed. (3) All 313 positive points were distributed in areas with an average annual temperature of 5 to 11 °C, with 165 points in areas with an average annual temperature of 7 to 9 °C. (4) 72.8% (228/313) of great gerbils survived in areas with an annual precipitation of 120–200 mm. (5) The number of positive points increased correspondingly with increasing NDVI values, but there were no positive points where NDVI was higher than 0.521, indicating the suitability of the vegetation for the great gerbil. This study explored a broad and important application of remote sensing and geographic information systems for the monitoring and prevention of plague
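The thresholds reported above translate directly into a habitat-suitability filter over remote-sensing grid cells. The sketch below encodes those published ranges; the dictionary keys and the example cell are invented for illustration.

```python
def suitable_habitat(cell):
    """Screen one candidate grid cell against the ranges reported in
    the abstract: elevation 200-600 m, slope 0-3 degrees, mean annual
    temperature 5-11 degC, annual precipitation 120-200 mm, and
    NDVI at most 0.521 (no positives were observed above it)."""
    return (200 <= cell["elev"] <= 600
            and 0 <= cell["slope"] <= 3
            and 5 <= cell["temp"] <= 11
            and 120 <= cell["precip"] <= 200
            and cell["ndvi"] <= 0.521)

# Hypothetical cell derived from remote-sensing layers:
cell = {"elev": 350, "slope": 1.2, "temp": 8.0, "precip": 150, "ndvi": 0.30}
print(suitable_habitat(cell))  # True: inside every reported range
```

Applied over a whole raster, such a filter yields a first-pass suitability map of the kind used to target plague surveillance.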

  17. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run multiple interconnected nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
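The "existing snapshot protocols" referred to descend from the classic Chandy-Lamport marker algorithm: a process records its own state, sends markers on outgoing channels, and records in-flight messages until markers return. The two-process sketch below illustrates that classic algorithm only, not the paper's optimized protocol; channels are modeled as in-process FIFO queues.

```python
from collections import deque

MARKER = "MARKER"

class Process:
    """One process in a minimal two-process Chandy-Lamport sketch.
    State is just a counter; application messages are integers."""
    def __init__(self, name):
        self.name, self.state = name, 0
        self.inbox = deque()      # FIFO channel from the other process
        self.snapshot = None      # (recorded state, recorded channel msgs)
        self.recording = False

    def start_snapshot(self, other):
        self.snapshot = (self.state, [])
        self.recording = True
        other.inbox.append(MARKER)        # marker on the outgoing channel

    def receive(self, other):
        msg = self.inbox.popleft()
        if msg == MARKER:
            if self.snapshot is None:     # first marker: record state, reply
                self.snapshot = (self.state, [])
                other.inbox.append(MARKER)
            self.recording = False        # channel state now closed
        else:
            self.state += msg
            if self.recording:            # in-flight message belongs to cut
                self.snapshot[1].append(msg)

p, q = Process("p"), Process("q")
q.inbox.append(5)          # application message already in flight p -> q
p.start_snapshot(q)        # p records its state, then sends a marker
q.receive(p)               # q applies the in-flight 5
q.receive(p)               # q sees the marker, records state, replies
p.receive(q)               # p sees q's marker; snapshot complete
print(p.snapshot, q.snapshot)
```

The recorded cut (p at 0, q at 5, empty channels) sums to the true global total, which is the consistency property a snapshot protocol must preserve.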

  18. Designing Sustainable Systems for Urban Freight Distribution through techniques of Multicriteria Decision Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Muerza, V.; Larrode, E.; Moreno- Jimenez, J.M.

    2016-07-01

    This paper focuses on the analysis and selection of the parameters that have a major influence on the optimization of the urban freight distribution system using sustainable means of transport, such as electric vehicles. In addition, a procedure has been studied to identify the alternatives that may exist to establish the best system for urban freight distribution, one which suits the considered scenario using the most appropriate means of transport available. To do this, the Analytic Hierarchy Process, one of the tools of multicriteria decision analysis, has been used. In order to establish an adequate planning of an urban freight distribution system using electric vehicles, three hypotheses are necessary: (i) it is necessary to establish the strategic planning of the distribution process by defining the relative importance of the strategic objectives of the process of distribution of goods in the urban environment, in economic and technical as well as social and environmental terms; (ii) the operational planning that allows the achievement of the strategic objectives with the most optimized allocation of available resources must be established; and (iii) the optimal architecture of the vehicle, which best suits the operating conditions in which it will work and ensures optimum energy efficiency in operation, must be determined. (Author)

  19. Measurement of distribution coefficients of U series radionuclides on soils under shallow land environment (2). pH dependence of distribution coefficients

    International Nuclear Information System (INIS)

    Sakamoto, Yoshiaki; Takebe, Shinichi; Ogawa, Hiromichi; Inagawa, Satoshi; Sasaki, Tomozou

    2001-01-01

    In order to study sorption behavior of U series radionuclides (Pb, Ra, Th, Ac, Pa and U) under aerated zone environment (loam-rain water system) and aquifer environment (sand-groundwater system) for safety assessment of U bearing waste, pH dependence of distribution coefficients of each element has been obtained. The pH dependence of distribution coefficients of Pb, Ra, Th, Ac and U was analyzed by model calculation based on aqueous speciation of each element and soil surface charge characteristics, which is composed of a cation exchange capacity and surface hydroxyl groups. From the model calculation, the sorption behavior of Pb, Ra, Th, Ac and U could be described by a combination of cation exchange reaction and surface-complexation model. (author)

  20. Distribution coefficients for plutonium and americium on particulates in aquatic environments

    International Nuclear Information System (INIS)

    Sanchez, A.L.; Schell, W.R.; Sibley, T.H.

    1982-01-01

    The distribution coefficients of two transuranic elements, plutonium and americium, were measured experimentally in laboratory systems of selected freshwater, estuarine, and marine environments. Gamma-ray emitting isotopes of these radionuclides, 237 Pu and 241 Am, were used in the measurements. Desorption Ksub(d) values were significantly greater than the sorption Ksub(d) values, suggesting some irreversibility in the sorption of these radionuclides onto sediments. The effects of pH and of sediment concentration on the distribution coefficients were also investigated. There were significant changes in the Ksub(d) values as these parameters were varied. Experiments using sterilized and nonsterilized samples for some of the sediment/water systems indicate possible bacterial effects on Ksub(d) values. (author)
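In batch sorption experiments of this kind, the distribution coefficient is conventionally computed as Kd = ((C0 − Ceq)/Ceq)·(V/m), the activity taken up by the sediment relative to what remains in solution, scaled by solution volume per unit sediment mass. A minimal sketch with invented numbers (not the paper's data):

```python
def distribution_coefficient(c0, c_eq, volume_ml, mass_g):
    """Batch-sorption Kd in mL/g.
    c0       -- initial activity concentration in solution
    c_eq     -- activity concentration in solution at equilibrium
    volume_ml, mass_g -- solution volume and sediment mass."""
    return (c0 - c_eq) / c_eq * volume_ml / mass_g

# Illustrative numbers: 99% of the activity sorbed in 100 mL over 1 g.
kd = distribution_coefficient(c0=1000.0, c_eq=10.0, volume_ml=100.0, mass_g=1.0)
print(kd)  # 9900.0 mL/g -> strong sorption onto particulates
```

Transuranics such as Pu and Am typically show Kd values of this magnitude or higher, which is why most of their inventory in aquatic systems is particle-bound.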

  1. DESIGN COORDINATION IN DISTRIBUTED ENVIRONMENTS USING VIRTUAL REALITY SYSTEMS

    Directory of Open Access Journals (Sweden)

    Rusdi HA

    2003-01-01

    Full Text Available This paper presents a research project, which investigates the use of virtual reality and computer communication technology to facilitate building design coordination in distributed environments. The emphasis of the system, called VR-based DEsign COordination (VRDECO), is on providing a communication tool that can be used by remote designers for settling ideas before they fully engage in concurrent engineering environments. VRDECO provides the necessary design tools, library of building elements and communication procedures for designers in remote places to perform and coordinate their initial tasks. It has been implemented using available commercial software packages, and has been used in designing a simple house. VRDECO facilitates the creation of a preliminary design and simple communication with the client. There are, however, some difficulties in the development of the full version of VRDECO, i.e.: creating an adequate number of building elements, building a specification database with a sufficient number of choices, and establishing a systematic rule to determine the parts of a building that are updateable.

  2. Supportability Analysis in LCI Environment

    OpenAIRE

    Dragan Vasiljevic; Ana Horvat

    2013-01-01

    Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology contains a review of the modern logistics engineering literature with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties which are fully compatible with the requirements of s...

  3. 99Tc in the environment. Sources, distribution and methods

    International Nuclear Information System (INIS)

    Garcia-Leon, Manuel

    2005-01-01

    99 Tc is a β-emitter, E max =294 keV, with a very long half-life (T 1/2 =2.11 x 10 5 y). It is mainly produced in the fission of 235 U and 239 Pu, at a yield of about 6%. This yield, together with its long half-life, makes it a significant nuclide in the whole nuclear fuel cycle, from which it can be introduced into the environment at different rates depending on the cycle step. A gross estimation shows that, adding all the possible sources, at least 2000 TBq had been released into the environment up to 2000, and that up to the middle of the nineties of the last century some 64000 TBq had been produced worldwide. Nuclear explosions have liberated some 160 TBq into the environment. In this work, the environmental distribution of 99 Tc as well as the methods for its determination are discussed. Emphasis is put on the environmental relevance of 99 Tc, mainly with regard to the future committed radiation dose received by the population and to the problem of nuclear waste management. Its determination at environmental levels is a challenging task. For that reason, special mention is made of the mass spectrometric methods for its measurement. (author)
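The half-life quoted above explains why mass spectrometry beats decay counting for 99Tc: specific activity follows directly from A = λN, and a 2.11 x 10^5 y half-life makes it small. A quick check of that arithmetic:

```python
import math

SECONDS_PER_YEAR = 3.156e7
AVOGADRO = 6.022e23

def specific_activity_bq_per_g(half_life_years, molar_mass_g):
    """A = lambda * N for one gram of a pure radionuclide,
    with lambda = ln(2) / T_half."""
    lam = math.log(2) / (half_life_years * SECONDS_PER_YEAR)
    n_atoms = AVOGADRO / molar_mass_g
    return lam * n_atoms

a_tc99 = specific_activity_bq_per_g(2.11e5, 99.0)
print(f"{a_tc99:.2e} Bq/g")  # ~6.3e8 Bq/g
```

So many atoms per becquerel means atom-counting techniques (AMS, ICP-MS) reach environmental levels that decay counting cannot.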

  4. On delay adjustment for dynamic load balancing in distributed virtual environments.

    Science.gov (United States)

    Deng, Yunhua; Lau, Rynson W H

    2012-04-01

    Distributed virtual environments (DVEs) have become very popular in recent years, due to the rapid growth of applications such as massive multiplayer online games (MMOGs). As the number of concurrent users increases, scalability becomes one of the major challenges in designing an interactive DVE system. One solution to this scalability problem is to adopt a multi-server architecture. While some methods focus on the quality of partitioning the load among the servers, others focus on the efficiency of the partitioning process itself. However, all these methods neglect the effect of network delay among the servers on the accuracy of the load balancing solutions. As we show in this paper, the change in the load of the servers due to network delay affects the performance of the load balancing algorithm. In this work, we conduct a formal analysis of this problem and discuss two efficient delay adjustment schemes to address it. Our experimental results show that the proposed schemes can significantly improve the performance of the load balancing algorithm with negligible computation overhead.
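The core idea of delay adjustment can be sketched generically: a load report is already stale by the inter-server delay when the balancer sees it, so the balancer extrapolates it forward using the observed rate of change. This first-order extrapolation is a generic illustration of the problem, not the paper's two specific schemes.

```python
def delay_adjusted_load(reported_load, load_rate, delay_s):
    """First-order extrapolation of a remote server's load.
    reported_load -- load value in the (stale) report
    load_rate     -- observed change in load per second
    delay_s       -- network delay between the servers"""
    return reported_load + load_rate * delay_s

# A server reported 100 avatars, gaining ~4 avatars/s, over a 0.5 s link:
print(delay_adjusted_load(100, 4.0, 0.5))  # balance against ~102, not 100
```

Without such an adjustment, a fast-growing region looks lighter than it is, and the partitioner keeps assigning users to an already overloaded server.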

  5. Communication Analysis of Environment.

    Science.gov (United States)

    Malik, M. F.; Thwaites, H. M.

    This textbook was developed for use in a Concordia University (Quebec) course entitled "Communication Analysis of Environment." Designed as a practical application of information theory and cybernetics in the field of communication studies, the course is intended to be a self-instructional process, whereby each student chooses one…

  6. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time consuming and require large computational resources. We developed the procedures for organization of massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).

  7. Instructional Designers' Media Selection Practices for Distributed Problem-Based Learning Environments

    Science.gov (United States)

    Fells, Stephanie

    2012-01-01

    The design of online or distributed problem-based learning (dPBL) is a nascent, complex design problem. Instructional designers are challenged to effectively unite the constructivist principles of problem-based learning (PBL) with appropriate media in order to create quality dPBL environments. While computer-mediated communication (CMC) tools and…

  8. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  9. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  10. ESC Track Fusion Demonstration Tool for Distributed Environments

    Science.gov (United States)

    Cox, C.; Degraaf, E.; Perry, R.; Diaz, R.

    A key requirement of future net-centric Space Situational Awareness systems development and operations will be decentralized operations, including multi-level distributed data fusion. Raytheon has developed a demonstration for ESC 850 ELSG/NS that fuses sensor-supplied tracks in a dense resident space object (RSO) environment. The demonstration uses the state vector and covariance input data from single-pass orbit solutions and applies track-to-track correlation algorithms to fuse the individual tracks into composite orbits. Platform-independent Java technology and an agent-based software design using asynchronous inter-process communications were used in the demonstration tool development. The tool has been tested against a simulated scenario corresponding to the future 100,000+ object catalog environment. Ten days of simulated data from Fylingdales, Shemya, Eglin, and a future Space Fence sensor were generated for a co-orbiting family of 122 sun-synchronous objects between 700 and 800 km altitude from the NASA simulated small debris for 2015. The selected set exceeds the average object densities for the 100,000+ RSO environment, and provides a scenario similar to an evolved breakup where the debris has had time to disperse. The demo produced very good results using fast and simple astrodynamic models. A total of 16678 input tracks were fused, with less than 1.6% being misassociated. Pure tracks were generated for 65% of the 122 truth objects, and 97% of the objects had a low misassociation rate. The demonstration provides a tool that can be used to assess current breakups such as the Chinese ASAT event.
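Track-to-track correlation of the kind described typically gates candidate pairs with a chi-square test on the squared Mahalanobis distance between state estimates under their combined covariance. The 2-D sketch below (with a hand-written 2x2 inverse) illustrates that standard test; the demonstration's actual algorithms and numbers are not given in the abstract.

```python
def mahalanobis2_2d(x1, x2, cov):
    """Squared Mahalanobis distance between two 2-D state estimates
    under a combined covariance [[a, b], [b, c]]. Association gating
    compares this against a chi-square threshold."""
    dx, dy = x1[0] - x2[0], x1[1] - x2[1]
    a, b, c = cov[0][0], cov[0][1], cov[1][1]
    det = a * c - b * b
    # inverse of [[a, b], [b, c]] is [[c, -b], [-b, a]] / det
    return (c * dx * dx - 2 * b * dx * dy + a * dy * dy) / det

# Illustrative states and covariance (positions in km, variances in km^2):
d2 = mahalanobis2_2d((1.0, 2.0), (1.2, 2.1), [[0.04, 0.0], [0.0, 0.04]])
gate = 9.21   # ~99% chi-square gate for 2 degrees of freedom
print(d2 < gate)  # tracks close enough to associate and fuse
```

Misassociation of the kind quantified above (the <1.6% figure) happens when a dense debris family puts unrelated tracks inside each other's gates.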

  11. Full immersion simulation: validation of a distributed simulation environment for technical and non-technical skills training in Urology.

    Science.gov (United States)

    Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter

    2015-07-01

    To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology, and to evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed-methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologists' performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical, skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors. BJU International © 2014 BJU International. Published by John Wiley & Sons Ltd.
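
The abstract names only "non-parametric statistical methods"; a typical choice for comparing expert and trainee scores in such validation studies is the Mann-Whitney U test, whose statistic can be computed directly from pairwise comparisons. This is a generic illustration, not the study's actual analysis code.

```python
def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b: the count of pairs (x, y)
    with x from a and y from b where x wins, with ties counted as half.
    A small U means group a tends to score lower than group b."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u
```

For small samples like ten experts versus ten trainees the statistic is referred to exact U tables; larger samples use a normal approximation.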

  12. Marine ecology conditions at Weda Bay, North Maluku based on statistical analysis on distribution of recent foraminifera

    Directory of Open Access Journals (Sweden)

    Kurniasih Anis

    2017-01-01

    Full Text Available In geology, foraminifera analysis is usually used to determine the age of rocks/sediments and the depositional environment. In this study, recent foraminifera were used not only to determine the sedimentary environment, but also to estimate the ecological condition of the water through a statistical approach. Analysis was performed quantitatively on 10 surface seabed sediment samples from Weda Bay, North Maluku. The analysis includes dominance (Simpson index), diversity and evenness (Shannon index), and the planktonic-benthic ratio. The results were plotted in an M-R-T (Miliolid-Rotalid-Textularid) diagram to determine the depositional environment. Quantitative analysis was performed using the PAST (PAleontological STatistics) software, version 1.29. The results showed no dominance of any particular taxon, a moderate degree of evenness, stable communities, and moderate diversity. These results indicate that the research area has stable water conditions with optimum levels of carbonate content, oxygen supply, salinity, and temperature. The planktonic-benthic ratio indicates relative depth: the deeper the water, the higher the percentage of planktonic foraminifera. The M-R-T diagram shows that the sediment was deposited in an exposed carbonate (carbonate platform) environment with normal salinity.
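
The indices named in the abstract are standard and easy to reproduce from taxon counts; the formulas below are the textbook definitions (Simpson dominance, Shannon diversity, Pielou evenness), not PAST's exact implementation.

```python
import math

def simpson_dominance(counts):
    """Simpson dominance D = sum(p_i^2): 1 means one taxon dominates,
    values near 1/S mean no dominance among S taxa."""
    n = sum(counts)
    return sum((c / n) ** 2 for c in counts)

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def evenness(counts):
    """Pielou's evenness J = H' / ln(S), in [0, 1]."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)
```

Four equally abundant taxa give D = 0.25, H' = ln 4 (about 1.386) and J = 1, the "no dominance, stable community" end of the scale that the study reports.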

  13. Establishing Functional Relationships between Abiotic Environment, Macrophyte Coverage, Resource Gradients and the Distribution of Mytilus trossulus in a Brackish Non-Tidal Environment.

    Directory of Open Access Journals (Sweden)

    Jonne Kotta

    Full Text Available Benthic suspension-feeding mussels are an important functional guild in coastal and estuarine ecosystems. To date we lack information on how various environmental gradients and biotic interactions separately and interactively shape the distribution patterns of mussels in non-tidal environments. In contrast to tidal environments, mussels inhabit only the subtidal zone in non-tidal waterbodies, and the driving factors for mussel populations are therefore expected to differ from those in tidal areas. In the present study, we used boosted regression tree modelling (BRT), an ensemble method combining statistical techniques and machine learning, to explain the distribution and biomass of the suspension-feeding mussel Mytilus trossulus in the non-tidal Baltic Sea. BRT models suggested that (1) distribution patterns of M. trossulus are largely driven by separate effects of direct environmental gradients and partly by interactive effects of resource gradients with direct environmental gradients; (2) within its suitable habitat range, however, resource gradients had an important role in shaping the biomass distribution of M. trossulus; (3) contrary to tidal areas, mussels were not competitively superior over macrophytes, with patterns indicating either facilitative interactions between mussels and macrophytes or co-variance due to a common stressor. To conclude, direct environmental gradients seem to define the distribution pattern of M. trossulus, and within the favourable distribution range, resource gradients in interaction with direct environmental gradients are expected to set the biomass level of mussels.

  14. BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.

    Science.gov (United States)

    Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron

    2009-06-01

    BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.

  15. Determination of hydraulic conductivity from grain-size distribution for different depositional environments

    KAUST Repository

    Rosas, Jorge

    2013-06-06

    Over 400 unlithified sediment samples were collected from four different depositional environments in global locations, and the grain-size distribution, porosity, and hydraulic conductivity were measured using standard methods. The measured hydraulic conductivity values were then compared to values calculated using 20 different empirical equations (e.g., Hazen, Carman-Kozeny) commonly used to estimate hydraulic conductivity from grain-size distribution. It was found that most of the hydraulic conductivity values estimated from the empirical equations correlated very poorly with the measured hydraulic conductivity values, with errors ranging to over 500%. To improve the empirical estimation methodology, the samples were grouped by depositional environment and subdivided into subgroups based on lithology and mud percentage. The empirical methods were then analyzed to assess which methods best estimated the measured values. Modifications of the empirical equations, including changes to special coefficients and addition of offsets, were made to produce modified equations that considerably improve the hydraulic conductivity estimates from grain-size data for beach, dune, offshore marine, and river sediments. Estimated hydraulic conductivity errors were reduced to 6 to 7.1 m/day for the beach subgroups, 3.4 to 7.1 m/day for dune subgroups, and 2.2 to 11 m/day for offshore sediment subgroups. Improvements were made for river environments, but these still produced high errors, between 13 and 23 m/day. © 2013, National Ground Water Association.
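
Two of the empirical equations mentioned can be sketched directly. The coefficients below are the common textbook values, not the modified, environment-specific coefficients fitted in this study: Hazen uses only the effective grain size d10, while Kozeny-Carman adds porosity.

```python
def hazen_k(d10_mm, c=1.0):
    """Hazen approximation: K [cm/s] ~ C * d10^2 with d10 in mm.
    C ~ 0.8-1.2 for clean, well-sorted sand (textbook range, assumed here)."""
    return c * d10_mm ** 2

def kozeny_carman_k(d10_mm, porosity):
    """Kozeny-Carman estimate, K in m/s, for water at about 20 C:
    K = (g / nu) * (n^3 / (1 - n)^2) * d10^2 / 180."""
    g = 9.81          # gravitational acceleration, m/s^2
    nu = 1.0e-6       # kinematic viscosity of water, m^2/s
    d10 = d10_mm / 1000.0
    n = porosity
    return (g / nu) * (n ** 3 / (1.0 - n) ** 2) * d10 ** 2 / 180.0
```

For d10 = 0.2 mm and porosity 0.35, Kozeny-Carman gives about 2.2e-4 m/s (roughly 19 m/day), plausible for a well-sorted fine-to-medium sand; the paper's point is precisely that such uncalibrated estimates can be off by hundreds of percent without environment-specific modification.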

  16. Determination of hydraulic conductivity from grain-size distribution for different depositional environments

    KAUST Repository

    Rosas, Jorge; Lopez Valencia, Oliver Miguel; Missimer, Thomas M.; Coulibaly, Kapo M.; Dehwah, Abdullah; Sesler, Kathryn; Rodriguez, Luis R. Lujan; Mantilla, David

    2013-01-01

    Over 400 unlithified sediment samples were collected from four different depositional environments in global locations, and the grain-size distribution, porosity, and hydraulic conductivity were measured using standard methods. The measured hydraulic conductivity values were then compared to values calculated using 20 different empirical equations (e.g., Hazen, Carman-Kozeny) commonly used to estimate hydraulic conductivity from grain-size distribution. It was found that most of the hydraulic conductivity values estimated from the empirical equations correlated very poorly with the measured hydraulic conductivity values, with errors ranging to over 500%. To improve the empirical estimation methodology, the samples were grouped by depositional environment and subdivided into subgroups based on lithology and mud percentage. The empirical methods were then analyzed to assess which methods best estimated the measured values. Modifications of the empirical equations, including changes to special coefficients and addition of offsets, were made to produce modified equations that considerably improve the hydraulic conductivity estimates from grain-size data for beach, dune, offshore marine, and river sediments. Estimated hydraulic conductivity errors were reduced to 6 to 7.1 m/day for the beach subgroups, 3.4 to 7.1 m/day for dune subgroups, and 2.2 to 11 m/day for offshore sediment subgroups. Improvements were made for river environments, but these still produced high errors, between 13 and 23 m/day. © 2013, National Ground Water Association.

  17. Generalized Load Sharing for Homogeneous Networks of Distributed Environment

    Directory of Open Access Journals (Sweden)

    A. Satheesh

    2008-01-01

    Full Text Available We propose a method for job migration policies by considering effective usage of global memory in addition to CPU load sharing in distributed systems. When a node is identified as lacking sufficient memory space to serve jobs, one or more jobs of the node will be migrated to remote nodes with low memory allocations. If the memory space is sufficiently large, the jobs will be scheduled by a CPU-based load sharing policy. Following the principle of sharing both CPU and memory resources, we present several load sharing alternatives. Our objective is to reduce the number of page faults caused by unbalanced memory allocations for jobs among distributed nodes, so that the overall performance of a distributed system can be significantly improved. We have conducted trace-driven simulations to compare CPU-based load sharing policies with our policies. We show that our load sharing policies not only improve performance of memory-bound jobs, but also maintain the same load sharing quality as the CPU-based policies for CPU-bound jobs. Regarding remote execution and preemptive migration strategies, our experiments indicate that strategy selection in load sharing depends on the amount of memory demanded by jobs: remote execution is more effective for memory-bound jobs, and preemptive migration is more effective for CPU-bound jobs. Our CPU-memory-based policy, using either the high-performance or the high-throughput approach with the remote execution strategy, performs best for both CPU-bound and memory-bound jobs in homogeneous networks of a distributed environment.
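
The CPU-memory-based principle can be sketched as a placement rule: serve a job on a node that can hold its working set (avoiding page faults), and only among such nodes minimize CPU load. The dict layout and the fallback rule below are illustrative assumptions, not the paper's exact policy.

```python
def place_job(job_mem, nodes):
    """CPU-memory-based load sharing sketch.

    nodes: list of dicts with 'name', 'free_mem', and 'cpu_load'.
    Prefer nodes with enough free memory for the job (to avoid page
    faults); among those, pick the lowest CPU load. If no node has
    room, fall back to the node with the most free memory."""
    fits = [n for n in nodes if n["free_mem"] >= job_mem]
    if fits:
        best = min(fits, key=lambda n: n["cpu_load"])
    else:
        best = max(nodes, key=lambda n: n["free_mem"])
    best["free_mem"] -= job_mem  # reserve memory on the chosen node
    return best["name"]
```

Note the ordering: memory feasibility is a hard constraint checked first, and CPU load only breaks ties among feasible nodes, mirroring the paper's finding that memory demand should drive strategy selection.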

  18. A distributed reasoning engine ecosystem for semantic context-management in smart environments.

    Science.gov (United States)

    Almeida, Aitor; López-de-Ipiña, Diego

    2012-01-01

    To be able to react adequately, a smart environment must be aware of its context and its changes. Modeling the context allows applications to better understand it and to adapt to its changes. In order to do this, an appropriate formal representation method is needed, and ontologies have proven to be one of the best tools for it. Semantic inference provides a powerful framework to reason over the context data, but there are some problems with this approach: inference over semantic context information can be cumbersome when working with a large amount of data. This situation has become more common in modern smart environments, where many sensors and devices are available. In order to tackle this problem we have developed a mechanism to distribute the context reasoning problem into smaller parts in order to reduce the inference time. In this paper we describe a distributed peer-to-peer agent architecture of context consumers and context providers. We explain how this inference sharing process works, partitioning the context information according to the interests of the agents, location, and a certainty factor. We also discuss the system architecture, analyzing the negotiation process between the agents. Finally, we compare distributed reasoning with centralized reasoning, analyzing in which situations each approach is more suitable.

  19. Distribution coefficients for radionuclides in aquatic environments. Volume 2. Dialysis experiments in marine environments

    International Nuclear Information System (INIS)

    Sibley, T.H.; Nevissi, A.E.; Schell, W.R.

    1981-05-01

    The overall objective of this research program was to obtain new information that can be used to predict the fate of radionuclides that may enter the aquatic environment from nuclear power plants, waste storage facilities or fuel reprocessing plants. Important parameters for determining fate are the distribution of radionuclides between the soluble and particulate phases and the partitioning of radionuclides among various suspended particulates. This report presents the results of dialysis experiments that were used to study the distribution of radionuclides among suspended sediments, phytoplankton, organic detritus, and filtered sea water. Three experiments were conducted to investigate the adsorption kinetics and equilibrium distribution of (59)Fe, (60)Co, (65)Zn, (106)Ru, (137)Cs, (207)Bi, (238)Pu, and (241)Am in marine systems. Diffusion across the dialysis membranes depends upon the physico-chemical form of the radionuclides, proceeding quite rapidly for ionic species of (137)Cs and (60)Co but much more slowly for radionuclides which occur primarily as colloids and solid precipitates, such as (59)Fe, (207)Bi, and (241)Am. All the radionuclides adsorb to suspended particulates, although the amount of adsorption depends upon the specific types and concentration of particulates in the system and the selected radionuclide. The high affinity of some radionuclides - e.g., (106)Ru and (241)Am - for detritus and phytoplankton suggests that suspended organics may significantly affect the eventual fate of those radionuclides in marine ecosystems.
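
A distribution coefficient of the kind this report tabulates is simply the equilibrium ratio of sorbed to dissolved activity; the variable names and units below are the conventional ones, assumed for illustration rather than taken from the report.

```python
def kd(activity_on_solid, solid_mass_g, activity_in_water, water_volume_ml):
    """Distribution coefficient Kd (mL/g) =
    (activity per gram of particulate) / (activity per mL of water)."""
    return (activity_on_solid / solid_mass_g) / (activity_in_water / water_volume_ml)
```

A nuclide with 5000 counts sorbed on 0.5 g of sediment and 100 counts remaining in 1000 mL of water has Kd = 1e5 mL/g, the strongly particle-reactive end of the scale; a predominantly ionic species would show a far lower value.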

  20. Development of a web service for analysis in a distributed network.

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
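
The key property a service like GLORE relies on is that the logistic log-likelihood's gradient and Hessian are sums over patients, so per-site aggregates can be combined into an exact pooled Newton step without moving patient-level data. The one-feature sketch below illustrates that property; the function names are assumptions, not the published service's code.

```python
import math

def site_contribs(beta, X, y):
    """One site's gradient and information-matrix sums for the logistic
    model p = sigmoid(b0 + b1*x); only these aggregates leave the site."""
    b0, b1 = beta
    g0 = g1 = h00 = h01 = h11 = 0.0
    for x, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        g0 += yi - p
        g1 += (yi - p) * x
        w = p * (1.0 - p)
        h00 += w
        h01 += w * x
        h11 += w * x * x
    return (g0, g1), (h00, h01, h11)

def newton_step(beta, contribs):
    """Server side: sum the per-site aggregates and take one Newton step
    (solve the 2x2 system H * delta = g in closed form)."""
    g0 = sum(c[0][0] for c in contribs)
    g1 = sum(c[0][1] for c in contribs)
    h00 = sum(c[1][0] for c in contribs)
    h01 = sum(c[1][1] for c in contribs)
    h11 = sum(c[1][2] for c in contribs)
    det = h00 * h11 - h01 * h01
    return (beta[0] + (h11 * g0 - h01 * g1) / det,
            beta[1] + (h00 * g1 - h01 * g0) / det)
```

Because the aggregates add exactly, a step computed from two sites' sums equals the step computed from the pooled data, which is what makes the distributed fit equivalent to the centralized one.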

  1. Development of a Web Service for Analysis in a Distributed Network

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. 
Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.

  2. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and many numerical examples.

  3. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent power system operation will be altered. In this paper, quantitative results are presented in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand, and electric vehicles. The analysis is based on the conditions for both a radial and a meshed distribution network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden.

  4. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  5. The role of distributed generation (DG) in a restructured utility environment

    International Nuclear Information System (INIS)

    Feibus, H.

    1999-01-01

    A major consequence of the restructuring of the electric utility industry is disintegration, by which the traditional integrated utility is spinning off its generation business and becoming a power distribution company, or distco. This company will be the remaining entity of the traditional electric utility that continues to be regulated. The world in which the distco functions is becoming a very different place. The distco will be called upon to deliver not only power but a range of ancillary services defined by the Federal Energy Regulatory Commission, including spinning reserves, voltage regulation, reactive power, energy imbalance and network stability, some of which may be obtained from the independent system operator and some of which may be provided by the distco. In this environment the distco must maintain system reliability and provide service to the customer at the least cost. Meanwhile, restructuring is spawning a new generation of unregulated energy service companies that threaten to win the most attractive customers from the distco. Fortunately, there is an emerging generation of technologies, distributed resources, that provide options to help the distco retain prime customers by improving reliability and lowering costs. Specifically, distributed generation and storage systems, if dispersed into the distribution system, can provide these benefits, if generators with the right characteristics are selected and the integration into the distribution system is done skillfully. The Electric Power Research Institute has estimated that new distributed generation may account for 30% of new generation. This presentation will include the characteristics of several distributed resources and identify potential benefits that can be obtained through the proper integration of distributed generation and storage systems

  6. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework built on the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation, using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  7. The distribution of helium isotopes of natural gas and tectonic environment

    International Nuclear Information System (INIS)

    Sun Mingliang; Tao Mingxin

    1993-01-01

    Based on the 3He/4He data of the main oil-gas-bearing basins in continental China, a systematic study has been made for the first time of the relations between the spatial distribution of the helium isotopes of natural gas and the tectonic environment. The average value R-bar of 3He/4He in eastern China, bordering on the Pacific Ocean, is 2.08 x 10^-6, greater than Ra; this is dualistic mixed helium containing mantle-source helium. The R-bar of central and western China is 4.96 x 10^-8, which is mainly crust-source radioactive helium. The R-bar of the Huabei and Zhongyuan oil-gas fields is 8.00 x 10^-7, a transitional helium intermediate between the eastern region and the central-western region of China. On the whole, the 3He/4He values decrease gradually with distance from the subduction zone of the Western Pacific Ocean. The results show that the spatial distribution of the helium isotopes is controlled by the tectonic environment: in an environment of tensile rifting, especially in the neighborhood of deep megafractures advantageous to the rise of mantle-source helium, the 3He/4He value is the highest; in the most stable craton basins, the value is the lowest and the helium is typical crust-source radioactive helium. Between the active (rift) areas and the stable areas there is transitional helium, with values of order 10^-7, as is the case in the Huabei-Zhongyuan oil-gas fields
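
The crust/mantle attribution in the abstract can be made quantitative with a standard two-endmember mixing calculation. The endmember ratios used below (crustal radiogenic helium near 2e-8, MORB-like mantle helium near 1.1e-5) are typical literature values, assumed here rather than taken from this paper.

```python
RA = 1.4e-6  # atmospheric 3He/4He ratio, the usual normalization

def mantle_helium_fraction(r_sample, r_crust=2.0e-8, r_mantle=1.1e-5):
    """Fraction of mantle-derived helium under linear two-endmember mixing:
    f = (R_sample - R_crust) / (R_mantle - R_crust)."""
    return (r_sample - r_crust) / (r_mantle - r_crust)
```

With these assumed endmembers, eastern China's average ratio of 2.08e-6 (about 1.5 Ra) implies a mantle-helium fraction of roughly one fifth, while the central-western value of 4.96e-8 implies well under one percent, consistent with the crust-source interpretation.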

  8. The structure and pyrolysis product distribution of lignite from different sedimentary environment

    International Nuclear Information System (INIS)

    Liu, Peng; Zhang, Dexiang; Wang, Lanlan; Zhou, Yang; Pan, Tieying; Lu, Xilan

    2016-01-01

    Highlights: • The carbon structure of three lignites was measured by solid-state 13C NMR. • The effect of carbon structure on pyrolysis product distribution was studied. • Tar yield is influenced by aliphatic carbon and oxygen functional groups. • The C1–C4 content of pyrolysis gas is related to the CH2/CH3 ratio. - Abstract: Low-temperature pyrolysis is an economically efficient method for lignite to obtain coal tar and improve its combustion calorific value. Research on the distribution of pyrolysis products (especially coal tar yield) plays an important role in energy application and economic development, now and in the future. The pyrolysis test was carried out in a tube reactor at 873 K for 15 min. The structure of the lignite was measured by solid-state 13C nuclear magnetic resonance (NMR) and Fourier transform infrared spectroscopy (FTIR). Thermal analysis was performed with a thermo-gravimetric (TG) analyzer. The results show that the pyrolysis product distribution is related to the breakage of branch structures of the aromatic rings in lignites from different sedimentary environments. The gas yield and composition are related to the decomposition of carbonyl groups and the breakage of aliphatic carbon. The tar yield derived from lignite pyrolysis follows the order: Xianfeng lignite (XF, 13.67 wt.%) > Xiaolongtan lignite (XLT, 7.97 wt.%) > Inner Mongolia lignite (IM, 6.30 wt.%), which is mainly influenced by the aliphatic carbon content, the CH2/CH3 ratio, and the oxygen functional groups in the lignite. The pyrolysis water yield depends on the decomposition of oxygen functional groups. IM has the highest content of oxygen-linked carbon, so the pyrolysis water yield derived from IM is the highest (9.20 wt.%), far more than that from the other two lignites.

  9. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
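
The DFP idea can be shown in a few lines: give each 'environment' a weight and an environment-conditional failure probability, and dependent failure of redundant components emerges from averaging over environments rather than from any explicit coupling term. The numbers in the usage note are invented for illustration.

```python
def single_pf(weights, pfs):
    """Failure probability of one component, averaged over environments."""
    return sum(w * p for w, p in zip(weights, pfs))

def duplex_pf(weights, pfs):
    """Probability that both of two identical redundant components fail.
    Within any one environment the failures are independent (p * p);
    the dependence between the components arises only because they
    share the same environment distribution."""
    return sum(w * p * p for w, p in zip(weights, pfs))
```

With a benign environment (weight 0.9, p = 0.001) and a harsh one (weight 0.1, p = 0.1), a single component fails with probability 0.0109, but the redundant pair fails with probability 0.0010009 rather than the 0.0109^2 (about 1.2e-4) a naive independence assumption would give: the shared harsh environment dominates the common-mode failure rate.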

  10. Multi-objective optimization with estimation of distribution algorithm in a noisy environment.

    Science.gov (United States)

    Shim, Vui Ann; Tan, Kay Chen; Chia, Jun Yong; Al Mamun, Abdullah

    2013-01-01

    Many real-world optimization problems are subject to uncertainties that may be characterized by the presence of noise in the objective functions. The estimation of distribution algorithm (EDA), which models the global distribution of the population for searching tasks, is one of the evolutionary computation techniques that deal with noisy information. This paper studies the potential of EDAs, particularly an EDA based on restricted Boltzmann machines, for handling multi-objective optimization problems in a noisy environment. Noise is introduced to the objective functions in the form of a Gaussian distribution. In order to reduce the detrimental effect of noise, a likelihood correction feature is proposed to tune the marginal probability distribution of each decision variable. The EDA is subsequently hybridized with a particle swarm optimization algorithm in a discrete domain to improve its search ability. The effectiveness of the proposed algorithm is examined via eight benchmark instances with different characteristics and shapes of the Pareto optimal front. The scalability, hybridization, and computational time are rigorously studied. Comparative studies show that the proposed approach outperforms other state-of-the-art algorithms.

  11. Business models for distributed generation in a liberalized market environment

    International Nuclear Information System (INIS)

    Gordijn, Jaap; Akkermans, Hans

    2007-01-01

    The analysis of the potential of emerging innovative technologies calls for a systems-theoretic approach that takes into account technical as well as socio-economic factors. This paper reports the main findings of several business case studies of different future applications in various countries of distributed power generation technologies, all based on a common methodology for networked business modeling and analysis. (author)

  12. Acquisitions Everywhere: Modeling an Acquisitions Data Standard to Connect a Distributed Environment

    OpenAIRE

    Hanson, Eric M.; Lightcap, Paul W.; Miguez, Matthew R.

    2016-01-01

    Acquisitions functions remain operationally crucial in providing access to paid information resources, but data formats and workflows utilized within library acquisitions remain primarily within the traditional integrated library system (ILS). As libraries have evolved to use distributed systems to manage information resources, so too must acquisitions functions adapt to an environment that may include the ILS, e‐resource management systems (ERMS), institutional repositories (IR), and other d...

  13. An environment for operation planning of electrical energy distribution systems; Um ambiente para planejamento da operacao de sistemas de distribuicao de energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Kleber

    1997-07-01

    The aim of operation planning of electrical energy distribution systems is to define a group of interventions in the distribution network in a way that reaches a global optimum. Both normal operation and emergency conditions are considered, including scheduled interventions in the network. Under normal conditions, aspects such as voltage drop and loss minimization are considered; under emergency conditions, the minimization of non-distributed energy and interruption time is taken into account. This work proposes the development of a computer environment for operation planning studies. The design and implementation of a specific operation planning database, based on a relational database model, was carried out so that new information modules could be easily included. A statistical approach has been applied to the load model, including consumption habits and seasonal information for each consumer category. A group of operation planning functions is proposed, which includes simulation modules responsible for system analysis. These functions and the database should be designed together. The developed computer environment allows the inclusion of new functions, due to the standardized query language used to access the database. Two functions have been implemented for illustration. The development of the proposed environment constitutes a step towards an open operation planning system, integration with other information systems, and an easy-to-use system based on a graphical interface. (author)

  14. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  15. Analysis Of Aspects Of Messages Hiding In Text Environments

    Directory of Open Access Journals (Sweden)

    Afanasyeva Olesya

    2015-09-01

    Full Text Available This work investigates problems that arise when hiding messages in text environments transmitted over electronic communication channels and the Internet. The selection of places in the text environment (TE) that can be replaced by a word from the message is analyzed. Selection and replacement of words in the text environment are performed on the basis of semantic analysis of the text fragment consisting of the inserted word and its surroundings in the TE. This analysis relies on the concepts of semantic parameters of word coordination and the semantic value of an individual word, using well-known methods for determining the values of these parameters. This makes it possible to move from a qualitative to a quantitative analysis of the semantics of text fragments as they are modified by word substitution. Invisibility of the embedded messages is ensured by keeping deviations of the semantic coordination parameter within preset values.

  16. Occurrence and distribution of steroids, hormones and selected pharmaceuticals in South Florida coastal environments.

    Science.gov (United States)

    Singh, Simrat P; Azua, Arlette; Chaudhary, Amit; Khan, Shabana; Willett, Kristine L; Gardinali, Piero R

    2010-02-01

    The common occurrence of human-derived contaminants like pharmaceuticals, steroids and hormones in surface waters has raised awareness of the role played by the release of treated or untreated sewage in the water quality along sensitive coastal ecosystems. South Florida is home to many important protected environments, ranging from wetlands to coral reefs, which are in close proximity to large metropolitan cities. Because large portions of South Florida and most of the Florida Keys population are not served by modern sewage treatment plants and rely heavily on the use of septic systems, a comprehensive survey of selected human waste contamination markers was conducted in three areas to assess water quality with respect to non-traditional micro-constituents. This study documents the occurrence and distribution of fifteen hormones and steroids and five commonly detected pharmaceuticals in surface water samples collected from different near-shore environments along South Florida between 2004 and 2006. The compounds most frequently detected were: cholesterol, caffeine, estrone, DEET, coprostanol, bisphenol-A, beta-estradiol, and triclosan. The concentrations detected for estrone and beta-estradiol were up to 5.2 and 1.8 ng/L, respectively. Concentrations of caffeine (5.5-68 ng/L) and DEET (4.8-49 ng/L) were generally higher and more prevalent than those of the steroids. The distribution of micro-constituents was site-specific, likely reflecting a diversity of sources. In addition to chemical analysis, the yeast estrogen screen assay was used to screen the samples for estrogen equivalency. Overall, the results show that water collected from inland canals and restricted-circulation water bodies adjacent to heavily populated areas had high concentrations of multiple steroids, pharmaceuticals, and personal care products, while open bay waters were largely devoid of the target analytes.

  17. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    NARCIS (Netherlands)

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050.

  18. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by initial flaw distributions rather than by the fatigue crack growth rate, so initial flaw distributions can be derived from in-service flaw distributions
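
A stdlib-only sketch of the study's Monte Carlo idea (all material constants and distributions below are illustrative, not from the paper): grow a sample of initial flaws with a Paris-law model and check how strongly the in-service depths still reflect the initial ones:

```python
import math
import random

def grow_crack(a0, cycles, c, m=3.0, delta_sigma=100.0, step=1000):
    """Integrate the Paris law da/dN = c * (dK)^m with
    dK = delta_sigma * sqrt(pi * a), in simple forward-Euler blocks."""
    a = a0
    for _ in range(cycles // step):
        dk = delta_sigma * math.sqrt(math.pi * a)
        a += c * dk ** m * step
    return a

rng = random.Random(0)
# Hypothetical initial flaw depths (m): exponential with 1 mm mean.
initial = [rng.expovariate(1.0 / 1e-3) for _ in range(500)]
# Per-flaw scatter in the growth-rate coefficient mimics material variability.
inservice = [grow_crack(a0, cycles=200_000,
                        c=1e-11 * rng.lognormvariate(0.0, 0.2))
             for a0 in initial]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

r = pearson(initial, inservice)
print(f"correlation(initial, in-service) = {r:.3f}")
```

The strong correlation between initial and grown depths mirrors the paper's finding that the in-service distribution is governed by the initial flaw distribution rather than by growth-rate scatter.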

  19. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea , Mihaela; Iliadis , Lazaros

    2011-01-01

    Part 20: Informatics and Intelligent Systems Applications for Quality of Life information Services (ISQLIS) Workshop; International audience; The paper describes an environment quality analysis system based on a combination of some artificial intelligence techniques, artificial neural networks and rule-based expert systems. Two case studies of the system use are discussed: air pollution analysis and flood forecasting with their impact on the environment and on the population health. The syste...

  20. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family was first derived using entropy theory in this paper. The entropy theory made it possible to estimate the parameters of the distributions in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
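
The 'constraints determine parameters' idea can be illustrated in its simplest form. The sketch below fits a plain two-parameter gamma, not the paper's GG, GB2 or Halphen families, by matching the first two moment constraints; the rainfall-like data are synthetic:

```python
import random

rng = random.Random(42)
true_shape, true_scale = 2.5, 10.0
# Synthetic "extreme rainfall" sample (mm); illustrative, not the paper's data.
sample = [rng.gammavariate(true_shape, true_scale) for _ in range(20000)]

n = len(sample)
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / n

# Parameters expressed directly in terms of the constraints E[X] and E[X^2]
# (method of moments; the entropy derivation generalizes this idea to
# richer constraint sets).
shape_hat = mean ** 2 / var
scale_hat = var / mean
print(f"shape: {shape_hat:.3f} (true {true_shape}), "
      f"scale: {scale_hat:.3f} (true {true_scale})")
```

With richer constraints (e.g. logarithmic moments), the same pattern yields the generalized families discussed in the paper.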

  1. Pigeon interaction mode switch-based UAV distributed flocking control under obstacle environments.

    Science.gov (United States)

    Qiu, Huaxin; Duan, Haibin

    2017-11-01

    Unmanned aerial vehicle (UAV) flocking control is a serious and challenging problem due to local interactions and changing environments. In this paper, a pigeon flocking model and a pigeon coordinated obstacle-avoiding model are proposed, based on the observed behavior that pigeon flocks switch between hierarchical and egalitarian interaction modes at different flight phases. Owing to the essential similarity between bird flocks and UAV swarms, a distributed flocking control algorithm based on the proposed pigeon flocking and coordinated obstacle-avoiding models is designed to coordinate a heterogeneous UAV swarm to fly through obstacle environments with few informed individuals. The comparative simulation results are elaborated to show the feasibility, validity and superiority of our proposed algorithm. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
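
A heavily simplified, stdlib-only sketch of the mode-switching idea (the dynamics, gains, and schedule are invented for illustration and are not the paper's pigeon models): agents alternate between following an informed leader (hierarchical mode) and steering toward the flock centroid (egalitarian mode):

```python
import math
import random

rng = random.Random(5)
N = 20
pos = [[rng.uniform(0, 10), rng.uniform(0, 10)] for _ in range(N)]
vel = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(N)]
GOAL = (50.0, 50.0)
d0 = math.hypot(pos[0][0] - GOAL[0], pos[0][1] - GOAL[1])

def step(mode):
    for i in range(N):
        if mode == "hierarchical":
            # Everyone tracks the informed leader (agent 0); the leader
            # itself heads for the goal.
            tx, ty = GOAL if i == 0 else pos[0]
        else:
            # Egalitarian: steer toward the current flock centroid.
            tx = sum(p[0] for p in pos) / N
            ty = sum(p[1] for p in pos) / N
        dx, dy = tx - pos[i][0], ty - pos[i][1]
        d = math.hypot(dx, dy) or 1.0
        # Momentum plus a unit steering force toward the target.
        vel[i][0] = 0.8 * vel[i][0] + 0.2 * dx / d
        vel[i][1] = 0.8 * vel[i][1] + 0.2 * dy / d
    for i in range(N):
        pos[i][0] += vel[i][0]
        pos[i][1] += vel[i][1]

for t in range(200):
    step("hierarchical" if t % 20 < 10 else "egalitarian")

d_final = math.hypot(pos[0][0] - GOAL[0], pos[0][1] - GOAL[1])
print(f"leader distance to goal: {d0:.1f} -> {d_final:.1f}")
```

Even with only one informed individual, the alternating modes pull the whole flock toward the goal, which is the qualitative effect the paper exploits.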

  2. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
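
The split-apply-combine pattern described above can be sketched without the actual stack (xarray, Dask, StarCluster). The stdlib example below reduces synthetic per-cell 'temperature' series to a hot-days indicator, chunk by chunk, across parallel workers; in the real pipeline the chunks would be Dask-backed slices of a netCDF variable:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Fake "daily temperature" series for a small grid of cells (a stand-in for
# a netCDF variable that xarray would open and Dask would chunk).
rng = random.Random(7)
grid = {(i, j): [20 + 10 * rng.random() for _ in range(365)]
        for i in range(4) for j in range(4)}

def hot_days(series, threshold=28.0):
    """Per-cell climate indicator: number of days above a threshold."""
    return sum(1 for t in series if t > threshold)

# Split-apply-combine: each chunk of cells is reduced independently, so the
# work distributes cleanly (threads here, cluster nodes in the real setup).
cells = list(grid)
chunks = [cells[k::4] for k in range(4)]

def reduce_chunk(chunk):
    return {cell: hot_days(grid[cell]) for cell in chunk}

results = {}
with ThreadPoolExecutor(max_workers=4) as pool:
    for part in pool.map(reduce_chunk, chunks):
        results.update(part)

print(len(results), "cells reduced; sample:", results[(0, 0)])
```

Because the per-cell reduction is embarrassingly parallel, the same code shape scales from threads on one machine to a Dask cluster over cloud-hosted datasets.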

  3. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  4. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
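
DSM's linear-combination assumption is easy to illustrate. In the sketch below (toy numbers; the mixing weight `lam` is taken as known, whereas estimating it from the data is the core of DSM), the relevance distribution is recovered by inverting the mixture:

```python
# Toy term distributions over a tiny vocabulary (illustrative numbers).
seed_irrelevance = {"the": 0.5, "of": 0.3, "quantum": 0.1, "retrieval": 0.1}
relevance        = {"the": 0.2, "of": 0.1, "quantum": 0.4, "retrieval": 0.3}
lam = 0.6  # mixing weight of the irrelevance component

# Observed mixture, e.g. a pseudo-relevance-feedback expansion model:
# mixture = lam * irrelevance + (1 - lam) * relevance.
mixture = {w: lam * seed_irrelevance[w] + (1 - lam) * relevance[w]
           for w in seed_irrelevance}

# Linear separation: invert the combination to recover the relevance part.
recovered = {w: (mixture[w] - lam * seed_irrelevance[w]) / (1 - lam)
             for w in mixture}
print(recovered)
```

Under the linear-combination assumption the inversion is exact; in practice DSM's contribution is choosing `lam` (e.g. via correlation or divergence criteria) so that the recovered component is a valid, minimally irrelevant distribution.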

  5. Detecting Distributed SQL Injection Attacks in a Eucalyptus Cloud Environment

    Science.gov (United States)

    Kebert, Alan; Barnejee, Bikramjit; Solano, Juan; Solano, Wanda

    2013-01-01

    The cloud computing environment offers malicious users the ability to spawn multiple instances of cloud nodes that are similar to virtual machines, except that they can have separate external IP addresses. In this paper we demonstrate how this ability can be exploited by an attacker to distribute his/her attack, in particular SQL injection attacks, in such a way that an intrusion detection system (IDS) could fail to identify this attack. To demonstrate this, we set up a small private cloud, established a vulnerable website in one instance, and placed an IDS within the cloud to monitor the network traffic. We found that an attacker could quite easily defeat the IDS by periodically altering its IP address. To detect such an attacker, we propose to use multi-agent plan recognition, where the multiple source IPs are considered as different agents who are mounting a collaborative attack. We show that such a formulation of this problem yields a more sophisticated approach to detecting SQL injection attacks within a cloud computing environment.

  6. On the Moments and the Distribution of Aggregate Discounted Claims in a Markovian Environment

    Directory of Open Access Journals (Sweden)

    Shuanming Li

    2018-05-01

    Full Text Available This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is also investigated. Numerical results are also presented for a two-state Markov-modulated model case.
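
A Monte Carlo sketch of the quantity being studied (all rates and parameter values below are illustrative, not from the paper): simulate a two-state Markov-modulated environment and accumulate claims discounted by the state-dependent force of interest:

```python
import math
import random

rng = random.Random(3)
# Two-state environment with state-dependent claim rate, mean claim size and
# force of interest (invented values for the sketch).
LAM    = [1.0, 4.0]    # Poisson claim-arrival rates per state
MEAN   = [2.0, 1.0]    # exponential claim means per state
DELTA  = [0.05, 0.02]  # forces of interest per state
SWITCH = [0.5, 1.0]    # environment switching rates (2-state chain)

def simulate_adc(horizon=10.0):
    """One path of the aggregate discounted claims up to `horizon`."""
    t, state, accrued, total = 0.0, 0, 0.0, 0.0
    while True:
        rate = LAM[state] + SWITCH[state]
        dt = rng.expovariate(rate)
        if t + dt >= horizon:
            break
        t += dt
        accrued += DELTA[state] * dt  # state-dependent discount accrual
        if rng.random() < LAM[state] / rate:
            # A claim occurs; discount it back to time zero.
            total += math.exp(-accrued) * rng.expovariate(1.0 / MEAN[state])
        else:
            state = 1 - state  # environment switches
    return total

n = 5000
adc_mean = sum(simulate_adc() for _ in range(n)) / n
print(f"Monte Carlo mean of aggregate discounted claims: {adc_mean:.3f}")
```

The paper derives such moments analytically via matrix-form PDEs for the joint Laplace transforms; a simulation like this is a handy cross-check for the two-state numerical examples.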

  7. Knowledge exchange and learning from failures in distributed environments: The role of contractor relationship management and work characteristics

    International Nuclear Information System (INIS)

    Gressgård, Leif Jarle; Hansen, Kåre

    2015-01-01

    Learning from failures is vital for improvement of safety performance, reliability, and resilience in organizations. In order for such learning to take place in distributed environments, knowledge has to be shared among organizational members at different locations and units. This paper reports on a study conducted in the context of drilling and well operations on the Norwegian Continental Shelf, which represents a high-risk distributed organizational environment. The study investigates the relationships between organizations' abilities to learn from failures, knowledge exchange within and between organizational units, quality of contractor relationship management, and work characteristics. The results show that knowledge exchange between units is the most important predictor of perceived ability to learn from failures. Contractor relationship management, leadership involvement, role clarity, and empowerment are also important factors for failure-based learning, both directly and through increased knowledge exchange. The results of the study enhance our understanding of how abilities to learn from failures can be improved in distributed environments where similar work processes take place at different locations and involve employees from several companies. Theoretical contributions and practical implications are discussed. - Highlights: • We investigate factors affecting failure-based learning in distributed environments. • Knowledge exchange between units is the most important predictor. • Contractor relationship management is positively related to knowledge exchange. • Leadership involvement, role clarity, and empowerment are significant variables. • Respondents from an operator firm and eight contractors are included in the study

  8. Prediction of oil droplet size distribution in agitated aquatic environments

    International Nuclear Information System (INIS)

    Khelifa, A.; Lee, K.; Hill, P.S.

    2004-01-01

    Oil spilled at sea undergoes many transformations based on physical, biological and chemical processes. Vertical dispersion is the hydrodynamic mechanism controlled by turbulent mixing due to breaking waves, vertical velocity, density gradients and other environmental factors. Spilled oil is dispersed in the water column as small oil droplets. In order to estimate the mass of an oil slick in the water column, it is necessary to know how the droplets formed. Also, the vertical dispersion and fate of oil spilled in aquatic environments can be modelled if the droplet-size distribution of the oil droplets is known. An oil spill remediation strategy can then be implemented. This paper presented a newly developed Monte Carlo model to predict droplet-size distribution due to Brownian motion, turbulence and a differential settling at equilibrium. A kinematic model was integrated into the proposed model to simulate droplet breakage. The key physical input of the model is the maximum droplet size permissible in the simulation. Laboratory studies were found to be in good agreement with field studies. 26 refs., 1 tab., 5 figs

  9. Heavy-tailed distribution of the SSH Brute-force attack duration in a multi-user environment

    Science.gov (United States)

    Lee, Jae-Kook; Kim, Sung-Jun; Park, Chan Yeol; Hong, Taeyoung; Chae, Huiseung

    2016-07-01

    A considerable number of cyber-attacks take place against supercomputers that provide high-performance computing (HPC) services to public researchers. In particular, the secure shell protocol (SSH) brute-force attack, although one of the traditional attack methods, is still in use. Because stealth attacks that feign regular access may occur, such attacks are even harder to detect. In this paper, we introduce methods to detect SSH brute-force attacks by analyzing the server's unsuccessful access logs and the firewall's drop events in a multi-user environment. We then analyze the durations of the SSH brute-force attacks detected by applying these methods. The results of an analysis of about 10 thousand attack source IP addresses show that the behavior of abnormal users mounting SSH brute-force attacks follows the human-dynamics characteristics of a typical heavy-tailed distribution.
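
A common way to check such heavy-tailed behavior is to estimate the tail index from the largest observations. The sketch below applies the Hill estimator to synthetic Pareto-distributed 'attack durations' standing in for the measured data (the study itself analyzed real log and firewall events):

```python
import math
import random

# Synthetic attack durations drawn from a Pareto (power-law) distribution,
# a stand-in for the measured SSH brute-force attack durations.
rng = random.Random(11)
alpha = 1.5  # true tail exponent of the synthetic data
durations = [rng.paretovariate(alpha) for _ in range(10000)]

def hill(data, k=500):
    """Hill estimator of the tail index from the k largest observations."""
    top = sorted(data)[-k - 1:]   # k+1 largest values
    x_k = top[0]                  # threshold order statistic X_(n-k)
    return k / sum(math.log(x / x_k) for x in top[1:])

print(f"Hill tail-index estimate: {hill(durations):.2f} (true {alpha})")
```

A tail index below 2 indicates infinite variance, the hallmark of the heavy-tailed human-dynamics behavior the paper reports.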

  10. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Full Text Available Abstract Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  11. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over

  12. Distribution of triclosan-resistant genes in major pathogenic microorganisms revealed by metagenome and genome-wide analysis.

    Directory of Open Access Journals (Sweden)

    Raees Khan

    Full Text Available The substantial use of triclosan (TCS) has been aimed to kill pathogenic bacteria, but TCS resistance seems to be prevalent in microbial species and limited knowledge exists about TCS resistance determinants in a majority of pathogenic bacteria. We aimed to evaluate the distribution of TCS resistance determinants in major pathogenic bacteria (N = 231) and to assess the enrichment of potentially pathogenic genera in TCS contaminated environments. A TCS-resistant gene (TRG) database was constructed and experimentally validated to predict TCS resistance in major pathogenic bacteria. Genome-wide in silico analysis was performed to define the distribution of TCS-resistant determinants in major pathogens. Microbiome analysis of TCS contaminated soil samples was also performed to investigate the abundance of TCS-resistant pathogens. We experimentally confirmed that TCS resistance could be accurately predicted using genome-wide in silico analysis against the TRG database. Predicted TCS resistant phenotypes were observed in all of the tested bacterial strains (N = 17), and heterologous expression of selected TCS resistant genes from those strains conferred expected levels of TCS resistance in an alternative host Escherichia coli. Moreover, genome-wide analysis revealed that potential TCS resistance determinants were abundant among the majority of human-associated pathogens (79%) and soil-borne plant pathogenic bacteria (98%). These included a variety of enoyl-acyl carrier protein reductase (ENRs) homologues, AcrB efflux pumps, and ENR substitutions. FabI ENR, which is the only known effective target for TCS, was either co-localized with other TCS resistance determinants or had TCS resistance-associated substitutions. Furthermore, microbiome analysis revealed that pathogenic genera with intrinsic TCS-resistant determinants exist in TCS contaminated environments. We conclude that TCS may not be as effective against the majority of bacterial pathogens as previously

  13. Distribution of triclosan-resistant genes in major pathogenic microorganisms revealed by metagenome and genome-wide analysis

    Science.gov (United States)

    Khan, Raees; Roy, Nazish; Choi, Kihyuck

    2018-01-01

    The substantial use of triclosan (TCS) has been aimed to kill pathogenic bacteria, but TCS resistance seems to be prevalent in microbial species and limited knowledge exists about TCS resistance determinants in a majority of pathogenic bacteria. We aimed to evaluate the distribution of TCS resistance determinants in major pathogenic bacteria (N = 231) and to assess the enrichment of potentially pathogenic genera in TCS contaminated environments. A TCS-resistant gene (TRG) database was constructed and experimentally validated to predict TCS resistance in major pathogenic bacteria. Genome-wide in silico analysis was performed to define the distribution of TCS-resistant determinants in major pathogens. Microbiome analysis of TCS contaminated soil samples was also performed to investigate the abundance of TCS-resistant pathogens. We experimentally confirmed that TCS resistance could be accurately predicted using genome-wide in silico analysis against TRG database. Predicted TCS resistant phenotypes were observed in all of the tested bacterial strains (N = 17), and heterologous expression of selected TCS resistant genes from those strains conferred expected levels of TCS resistance in an alternative host Escherichia coli. Moreover, genome-wide analysis revealed that potential TCS resistance determinants were abundant among the majority of human-associated pathogens (79%) and soil-borne plant pathogenic bacteria (98%). These included a variety of enoyl-acyl carrier protein reductase (ENRs) homologues, AcrB efflux pumps, and ENR substitutions. FabI ENR, which is the only known effective target for TCS, was either co-localized with other TCS resistance determinants or had TCS resistance-associated substitutions. Furthermore, microbiome analysis revealed that pathogenic genera with intrinsic TCS-resistant determinants exist in TCS contaminated environments. We conclude that TCS may not be as effective against the majority of bacterial pathogens as previously presumed

  14. The Future of the Global Environment: A Model-based Analysis Supporting UNEP's First Global Environment Outlook

    OpenAIRE

    Bakkes JA; Woerden JW van; Alcamo J; Berk MM; Bol P; Born GJ van den; Brink BJE ten; Hettelingh JP; Langeweg F; Niessen LW; Swart RJ; United Nations Environment Programme (UNEP), Nairobi, Kenia; MNV

    1997-01-01

    This report documents the scenario analysis in UNEP's first Global Environment Outlook, published at the same time as the scenario analysis. This Outlook provides a pilot assessment of developments in the environment, both global and regional, between now and 2015, with a further projection to 2050. The study was carried out in support of the Agenda 21 interim evaluation, five years after 'Rio' and ten years after 'Brundtland'. The scenario analysis is based on only one scenario, Conventional...

  15. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in UPPAAL, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption, and communication overhead. Our results show that the distributed algorithms work much faster than sequential algorithms and have good speedup in general.
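    The core computation described above, finding the fastest trace to a goal state, can be sketched as a best-first search over a timed transition graph. This is a toy illustration in plain Python, not the zone-based algorithm used by UPPAAL; the state names and delays are invented:

```python
import heapq

def fastest_trace(transitions, start, goals):
    """Best-first search that explores states in order of accumulated
    time and returns the fastest trace to any goal state."""
    frontier = [(0, start, [start])]
    best = {start: 0}
    while frontier:
        t, state, trace = heapq.heappop(frontier)
        if state in goals:
            return t, trace
        for nxt, dt in transitions.get(state, []):
            if nxt not in best or t + dt < best[nxt]:
                best[nxt] = t + dt
                heapq.heappush(frontier, (t + dt, nxt, trace + [nxt]))
    return None

# Toy scheduling graph: each transition is (next state, time delay)
transitions = {
    "s0": [("s1", 2), ("s2", 5)],
    "s1": [("goal", 4)],
    "s2": [("goal", 2)],
}
t, trace = fastest_trace(transitions, "s0", {"goal"})
# t == 6 via s0 -> s1 -> goal (the s2 route costs 7)
```

    The distributed versions evaluated in the paper parallelize exactly this exploration loop across cluster nodes, which is why frontier ordering and termination detection become the interesting problems.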

  16. Assessing safety risk in electricity distribution processes using ET & BA improved technique and its ranking by VIKOR and TOPSIS models in fuzzy environment

    OpenAIRE

    S. Rahmani; M. Omidvari

    2016-01-01

    Introduction: Electrical industries are among high-risk industries. The present study aimed to assess safety risk in electricity distribution processes using the ET&BA technique and to compare the results of the VIKOR and TOPSIS methods in fuzzy environments. Material and Methods: The present research is a descriptive study in which the ET&BA worksheet is the main data collection tool. Both fuzzy TOPSIS and fuzzy VIKOR methods were used for the worksheet analysis. Result: Findi...

  17. Design for Distributed Moroccan Hospital Pharmacy Information Environment with Service Oriented Architecture

    OpenAIRE

    Omrana, Hajar; Nassiri, Safae; Belouadha, Fatima-Zahra; Roudiés, Ounsa

    2012-01-01

    In the last five years, the Moroccan e-health system has focused on improving the quality of patient care services by making use of advanced Information and Communications Technologies (ICT) solutions. In fact, achieving efficient runtime information sharing through large-scale distributed environments such as an e-health system is not a trivial task, and it presents many issues due to the heterogeneity and complex nature of data resources. This concerns, in particular, Moroccan Hos...

  18. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara, and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  19. Mosquito distribution in a saltmarsh: determinants of eggs in a variable environment.

    Science.gov (United States)

    Rowbottom, Raylea; Carver, Scott; Barmuta, Leon A; Weinstein, Philip; Allen, Geoff R

    2017-06-01

    Two saltmarsh mosquitoes dominate the transmission of Ross River virus (RRV, Togaviridae: Alphavirus), one of Australia's most prominent mosquito-borne diseases. Ecologically, saltmarshes vary in their structure, including habitat types, hydrological regimes, and diversity of aquatic fauna, all of which drive mosquito oviposition behavior. Understanding the distribution of vector mosquitoes within saltmarshes can inform early warning systems, surveillance, and management of vector populations. The aim of this study was to identify the distribution of Ae. camptorhynchus, a known vector for RRV, across a saltmarsh and investigate the influence that other invertebrate assemblages might have on Ae. camptorhynchus egg dispersal. We demonstrate that vegetation is a strong indicator of Ae. camptorhynchus egg distribution, and that this was not correlated with elevation or other invertebrates located at this saltmarsh. Also, habitats within this marsh are less frequently inundated, resulting in drier conditions. We conclude that this information can be applied in vector surveillance and monitoring of temperate saltmarsh environments, and it also provides a baseline for future investigations into understanding mosquito vector habitat requirements. © 2017 The Society for Vector Ecology.

  20. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector Collaborations of the HERA Experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include Distributed Mass Storage Management Systems in a Distributed and Heterogeneous Computing Environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology, with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA Experiments at DESY, including Symmetric Multiprocessor Machines (84 Processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focusing on the High Performance Interconnect technology, details will be provided about the HIPPI-based 'Backplane' configured around a 20 Gigabit/s Multi Media Router and the performance and efficiency of the related computer interfaces

  1. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network being modeled in Matlab/Simulink takes into account detailed dynamic models of the generators. Fault simulations at various locations are

  2. Spatio-Temporal Analysis of Urban Acoustic Environments with Binaural Psycho-Acoustical Considerations for IoT-Based Applications.

    Science.gov (United States)

    Segura-Garcia, Jaume; Navarro-Ruiz, Juan Miguel; Perez-Solano, Juan J; Montoya-Belmonte, Jose; Felici-Castell, Santiago; Cobos, Maximo; Torres-Aranda, Ana M

    2018-02-26

    Sound pleasantness or annoyance perceived in urban soundscapes is a major concern in environmental acoustics. Binaural psychoacoustic parameters are helpful to describe generic acoustic environments, as it is stated within the ISO 12913 framework. In this paper, the application of a Wireless Acoustic Sensor Network (WASN) to evaluate the spatial distribution and the evolution of urban acoustic environments is described. Two experiments are presented using an indoor and an outdoor deployment of a WASN with several nodes using an Internet of Things (IoT) environment to collect audio data and calculate meaningful parameters such as the sound pressure level, binaural loudness and binaural sharpness. A chunk of audio is recorded in each node periodically with a microphone array and the binaural rendering is conducted by exploiting the estimated directional characteristics of the incoming sound by means of DOA estimation. Each node computes the parameters in a different location and sends the values to a cloud-based broker structure that allows spatial statistical analysis through Kriging techniques. A cross-validation analysis is also performed to confirm the usefulness of the proposed system.
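    The per-node computation of an equivalent sound pressure level mentioned above can be sketched as follows. Calibration, frequency weighting, and the binaural parameters are omitted; the function name is illustrative, and the 94 dB tone is a standard calibration example rather than data from the paper:

```python
import math

def sound_pressure_level(samples, p_ref=20e-6):
    """Equivalent sound pressure level (dB) from a chunk of calibrated
    pressure samples in pascals: L = 20 * log10(p_rms / p_ref)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / p_ref)

# A 1 Pa RMS sine tone, the usual 94 dB SPL calibrator reference
n = 1000
sine = [math.sqrt(2) * math.sin(2 * math.pi * 50 * i / n) for i in range(n)]
spl = sound_pressure_level(sine)
# spl ≈ 94 dB
```

    In a WASN deployment of the kind described, each node would compute such per-chunk parameters locally and publish only the scalar values to the broker, keeping the audio itself off the network.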

  3. Gamma-ray Burst Formation Environment: Comparison of Redshift Distributions of GRB Afterglows

    Directory of Open Access Journals (Sweden)

    Sung-Eun Kim

    2005-12-01

    Full Text Available Since gamma-ray bursts (GRBs) first became known to the scientific community in 1973, many scientists have been involved in their study. Observations of GRB afterglows provide us with much information on the environment in which the observed GRBs are born. The study of GRB afterglows deals with longer-timescale emissions in lower energy bands (e.g., months or even up to years) than the prompt emission in gamma-rays. Not all bursts are accompanied by afterglows across the whole range of wavelengths. It has been suggested, for instance, that radio and/or X-ray afterglows go unrecorded mainly due to the lower sensitivity of detectors, and optical afterglows due to extinction in intergalactic media or self-extinction within the host galaxy itself. Based on the idea that these facts may also provide information on the GRB environment, we analyze statistical properties of GRB afterglows. We first select samples of redshift-known GRBs according to the wavelength of the afterglow they accompanied. We then compare their distributions as a function of redshift using statistical methods. As a result, we find that the distribution of GRBs with X-ray afterglows is consistent with that of GRBs with optical afterglows. We therefore conclude that the lower detection rate of optical afterglows is not due to extinction in intergalactic media.

  4. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis it can happen that the residual additive genetic variance bounds to zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  5. COMPARATIVE ANALYSIS OF THE BEHAVIOR OF COAXIAL AND FRONTAL COUPLINGS – WITH PERMANENT MAGNETS – IN HIGH TEMPERATURE ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Marcel Oanca

    2004-12-01

    Full Text Available This paper presents a comparative analysis of the behavior of coaxial and frontal couplings – with permanent magnets – in high temperature environments specific to the iron and steel industry. The comparative analysis is made at the level of the specific forces developed in the most difficult environments. The maximum temperature was limited for reasons of thermal stability of the Nd-Fe-B permanent magnets. In this context, with the help of the PDEase software, which uses the finite element method, we studied how the magnetic induction, the specific forces developed, and the distribution of temperature within the coaxial and frontal couplers with permanent magnets vary as the distance between the magnets (air gap) changes within the limits of 2-20 mm.

  6. Rare earth elements determination and distribution patterns in sediments of polluted marine environment by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Akyil, S.; Yusof, A.M.; Wood, A.K.H.

    2001-01-01

    Sediment core samples taken from a fairly polluted marine environment were analyzed for rare earth element (REE) content to determine the concentrations of La, Ce, Sm, Eu, Tb, Dy and Yb using instrumental neutron activation analysis. Core samples were divided into strata at intervals of 2 to 3 cm and prepared in powdered form before being irradiated in a TRIGA Mk.II reactor. Down-core concentration profiles of La, Ce, Sm, Eu, Tb, Dy and Yb were obtained for three core sediments from three sites. The shale-normalized REE pattern from each site was examined and later used to explain the history of sedimentation by natural processes, such as shoreline erosion and weathering products deposited on the seabed, furnishing some baseline data and/or the pollution trend occurring within the study area

  7. Size distributions of aerosols in an indoor environment with engineered nanoparticle synthesis reactors operating under different scenarios

    International Nuclear Information System (INIS)

    Sahu, Manoranjan; Biswas, Pratim

    2010-01-01

    Size distributions of nanoparticles in the vicinity of synthesis reactors will provide guidelines for safe operation and protection of workers. Nanoparticle concentrations and size distributions were measured in a research academic laboratory environment with two different types of gas-phase synthesis reactors under a variety of operating conditions. The variation of total particle number concentration and size distribution at different distances from the reactor, off-design state of the fume hood, powder handling during recovery, and maintenance of reactors are established. Significant increases in number concentration were observed at all the locations during off-design conditions (i.e., failure of the exhaust system). Clearance of nanoparticles from the work environment was longer under off-design conditions (20 min) compared to that under normal hood operating conditions (4-6 min). While lower particle number concentrations are observed during operation of furnace aerosol reactors in comparison to flame aerosol reactors, the handling, processing, and maintenance operations result in elevated concentrations in the work area.

  8. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonability of existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  9. Automatic determination of L/H transition times in DIII-D through a collaborative distributed environment

    International Nuclear Information System (INIS)

    Farias, G.; Vega, J.; González, S.; Pereira, A.; Lee, X.; Schissel, D.; Gohil, P.

    2012-01-01

    An automatic predictor of L/H transition times has been implemented for the DIII-D tokamak. The system predicts the transition by combining two techniques: a morphological pattern recognition algorithm, which estimates the transition based on the waveform of a Dα emission signal, and a support vector machines multi-layer model, which predicts the L/H transition using a non-parametric model. The predictor is employed within a collaborative distributed computing environment. The system is trained remotely on the Ciemat computer cluster and operated on the DIII-D site.

  10. Dynamic Distribution and Layouting of Model-Based User Interfaces in Smart Environments

    Science.gov (United States)

    Roscher, Dirk; Lehmann, Grzegorz; Schwartze, Veit; Blumendorf, Marco; Albayrak, Sahin

    Developments in computer technology over the last decade are changing the ways computers are used. Emerging smart environments make it possible to build ubiquitous applications that assist users during their everyday life, at any time, in any context. But the variety of contexts-of-use (user, platform and environment) makes the development of such ubiquitous applications for smart environments, and especially their user interfaces, a challenging and time-consuming task. We propose a model-based approach which allows adapting the user interface at runtime to numerous (also unknown) contexts-of-use. Based on a user interface modelling language defining the fundamentals and constraints of the user interface, a runtime architecture exploits the description to adapt the user interface to the current context-of-use. The architecture provides automatic distribution and layout algorithms for adapting the applications also to contexts unforeseen at design time. Designers do not specify predefined adaptations for each specific situation, but adaptation constraints and guidelines. Furthermore, users are provided with a meta user interface to influence the adaptations according to their needs. A smart home energy management system serves as a running example to illustrate the approach.

  11. Analysis of color environment in nuclear power plants

    International Nuclear Information System (INIS)

    Natori, Kazuyuki; Akagi, Ichiro; Souma, Ichiro; Hiraki, Tadao; Sakurai, Yukihiro.

    1996-01-01

    This article reports the results of color and psychological analyses of the exterior of nuclear power plants and the visual environments inside the plants. Study one consisted of color measurements of the exterior of nuclear plants and the visual environments inside the plants. Study two was a survey of observers' impressions of the visual environments of nuclear plants, together with interviews of the workers. Through these analyses, we identified the present state and the problems of the color environments of nuclear plants. In the next step, we designed recommended color environments for the inside and outside of the nuclear plants (the inside designs covered the fuel handling room, the operation floor of the turbine building, observers' pathways, the central control room, and the rest room for the operators). Study three was a survey of impressions of our designs for the inside and outside of the nuclear plants. Nuclear plant observers, residents of Osaka city, residents near the nuclear plants, operators, employees of a subsidiary company, and PR center guides rated their impressions of the designs. Study four was a survey about the design of the rest room for the operators controlling the plants. From the results of the four studies, we propose guidelines and identify problems for future planning of the visual environments of nuclear power plants. (author)

  12. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper expresses the environment assumptions and real-time requirements in a Coloured Petri Net (CPN) model for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model, during which traces of events and states are automatically generated and evaluated against the requirements.

  13. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on SO(3) group. CND is a subcase for normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). WND is motivated by CLT in R 3 and mapped to SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of crystallites orientation distribution function in texture analysis.
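    The WND construction mentioned above, a normal distribution in R^3 mapped onto the SO(3) group, can be sketched via the exponential map (Rodrigues' formula). The function name and parameterization are illustrative assumptions, not taken from the article:

```python
import numpy as np

def sample_wrapped_normal_so3(sigma, rng):
    """Draw one rotation from a wrapped normal on SO(3): sample an
    axis-angle vector v ~ N(0, sigma^2 I) in R^3 and map it to a
    rotation matrix with the exponential map."""
    v = rng.normal(0.0, sigma, 3)
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return np.eye(3)
    k = v / theta  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # cross-product (skew) matrix
    # Rodrigues' formula: exp(theta * K) = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

rng = np.random.default_rng(0)
R = sample_wrapped_normal_so3(0.2, rng)
```

    A Monte Carlo study of the kind described would draw many such rotations and compare the empirical orientation distribution against the CND.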

  14. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping 'cells' by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce 'kernels' that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
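    The per-cell map-reduce programming model described above can be illustrated with a minimal sketch. This is not LSD's actual API; the cell function, record layout, and kernels here are invented for illustration:

```python
from collections import defaultdict

def map_reduce(records, cell_of, mapper, reducer):
    """Minimal MapReduce over spatially partitioned 'cells', in the
    spirit of LSD's per-cell kernels (illustrative only)."""
    cells = defaultdict(list)
    for rec in records:
        cells[cell_of(rec)].append(rec)   # partition rows by (lon, lat) cell
    partials = {c: mapper(rows) for c, rows in cells.items()}  # map per cell
    return reducer(partials.values())     # combine partial results

# Example: count rows per 10-degree cell, then total across cells
records = [{"lon": 12.3, "lat": 45.6},
           {"lon": 12.9, "lat": 45.1},
           {"lon": 101.0, "lat": -3.0}]
cell_of = lambda r: (int(r["lon"] // 10), int(r["lat"] // 10))
total = map_reduce(records, cell_of, mapper=len, reducer=sum)
# total == 3; the first two records share cell (1, 4)
```

    In a real deployment the per-cell map step is what gets scheduled across nodes, since each cell's HDF5 data can be read and processed independently.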

  15. ANALYSIS OF VIRTUAL ENVIRONMENT BENEFIT IN E-LEARNING

    Directory of Open Access Journals (Sweden)

    NOVÁK, Martin

    2013-06-01

    Full Text Available This article analyses the benefits of a virtual environment for improving the e-learning process. The virtual environment was created as part of the project 'Virtualization' at the Faculty of Economics and Administration, University of Pardubice. The aim of this project was to eliminate the disparity in free access to licensed software between part-time and full-time students. The research was carried out within selected subjects of the study program System Engineering and Informatics; the subjects related to informatics, applied informatics, control, and decision making. Student results in the subjects, student feedback from an electronic questionnaire, and data from the log file of virtual server usage were compared and analysed. Based on an analysis of virtualization possibilities, the virtual environment was implemented through Microsoft Terminal Server.

  16. Distribution and importance of microplastics in the marine environment: A review of the sources, fate, effects, and potential solutions.

    Science.gov (United States)

    Auta, H S; Emenike, C U; Fauziah, S H

    2017-05-01

    The presence of microplastics in the marine environment poses a great threat to the entire ecosystem and has received much attention lately, as it has greatly impacted oceans, lakes, seas, rivers, coastal areas and even the Polar Regions. Microplastics are found in most commonly utilized products (primary microplastics), or may originate from the fragmentation of larger plastic debris (secondary microplastics). The material enters the marine environment through terrestrial and land-based activities, especially via runoffs, and is known to have great impact on marine organisms, as studies have shown that large numbers of marine organisms have been affected by microplastics. Microplastic particles have been found distributed in large numbers in Africa, Asia, Southeast Asia, India, South Africa, North America, and Europe. This review describes the sources and global distribution of microplastics in the environment, their fate, and their impact on marine biota, especially the food chain. Furthermore, the control measures discussed are those mapped out by both national and international environmental organizations for combating the impact of microplastics. Identifying the main sources of microplastic pollution in the environment and creating awareness through education in the public, private, and government sectors will go a long way toward reducing the entry of microplastics into the environment. Knowing the associated behavioral mechanisms will also enable better understanding of the impacts on the marine environment. However, a more promising and environmentally safe approach could be provided by exploiting the potential of microorganisms, especially those of marine origin, that can degrade microplastics. The concentration, distribution, sources and fate of microplastics in the global marine environment are discussed, as is the impact of microplastics on a wide range of marine biota. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Optimal sampling theory and population modelling - Application to determination of the influence of the microgravity environment on drug distribution and elimination

    Science.gov (United States)

    Drusano, George L.

    1991-01-01

    The optimal sampling theory is evaluated in applications to studies related to the distribution and elimination of several drugs (including ceftazidime, piperacillin, and ciprofloxacin), using the SAMPLE module of the ADAPT II package of programs developed by D'Argenio and Schumitzky (1979, 1988) and comparing the pharmacokinetic parameter values with results obtained by traditional ten-sample design. The impact of the use of optimal sampling was demonstrated in conjunction with NONMEM (Sheiner et al., 1977) approach, in which the population is taken as the unit of analysis, allowing even fragmentary patient data sets to contribute to population parameter estimates. It is shown that this technique is applicable in both the single-dose and the multiple-dose environments. The ability to study real patients made it possible to show that there was a bimodal distribution in ciprofloxacin nonrenal clearance.

  18. Phylogenetic diversity and environment-specific distributions of glycosyl hydrolase family 10 xylanases in geographically distant soils.

    Directory of Open Access Journals (Sweden)

    Guozeng Wang

    Full Text Available BACKGROUND: Xylan is one of the most abundant biopolymers on Earth. Its degradation is mediated primarily by microbial xylanase in nature. To explore the diversity and distribution patterns of xylanase genes in soils, samples of five soil types with different physicochemical characters were analyzed. METHODOLOGY/PRINCIPAL FINDINGS: Partial xylanase genes of glycoside hydrolase (GH family 10 were recovered following direct DNA extraction from soil, PCR amplification and cloning. Combined with our previous study, a total of 1084 gene fragments were obtained, representing 366 OTUs. More than half of the OTUs were novel (identities of <65% with known xylanases and had no close relatives based on phylogenetic analyses. Xylanase genes from all the soil environments were mainly distributed in Bacteroidetes, Proteobacteria, Acidobacteria, Firmicutes, Actinobacteria, Dictyoglomi and some fungi. Although identical sequences were found in several sites, habitat-specific patterns appeared to be important, and geochemical factors such as pH and oxygen content significantly influenced the compositions of xylan-degrading microbial communities. CONCLUSION/SIGNIFICANCE: These results provide insight into the GH 10 xylanases in various soil environments and reveal that xylan-degrading microbial communities are environment specific with diverse and abundant populations.

  19. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  20. Distribution of hydrocarbon-degrading bacteria in the soil environment and their contribution to bioremediation.

    Science.gov (United States)

    Fukuhara, Yuki; Horii, Sachie; Matsuno, Toshihide; Matsumiya, Yoshiki; Mukai, Masaki; Kubo, Motoki

    2013-05-01

    A real-time PCR quantification method for indigenous hydrocarbon-degrading bacteria (HDB) carrying the alkB gene in the soil environment was developed to investigate their distribution in soil. The detection limit of indigenous HDB by the method was 1 × 10^6 cells/g-soil. The indigenous HDB were widely distributed throughout the soil environment and ranged from 3.7 × 10^7 to 5.0 × 10^8 cells/g-soil, and the ratio to total bacteria was 0.1-4.3 %. The dynamics of total bacteria, indigenous HDB, and Rhodococcus erythropolis NDKK6 (carrying alkB R2) during bioremediation were analyzed. During bioremediation with an inorganic nutrient treatment, the numbers of these bacteria were slightly increased. The numbers of HDB (both indigenous bacteria and strain NDKK6) were gradually decreased from the middle stage of bioremediation. Meanwhile, the numbers of these bacteria were highly increased and were maintained during bioremediation with an organic nutrient. The organic treatment led to activation of not only the soil bacteria but also the HDB, so an efficient bioremediation was carried out.

  1. Fillers as Signs of Distributional Learning

    Science.gov (United States)

    Taelman, Helena; Durieux, Gert; Gillis, Steven

    2009-01-01

    A longitudinal analysis is presented of the fillers of a Dutch-speaking child between 1;10 and 2;7. Our analysis corroborates familiar regularities reported in the literature: most fillers resemble articles in shape and distribution, and are affected by rhythmic and positional constraints. A novel finding is the impact of the lexical environment:…

  2. Spatial and temporal distribution of free-living protozoa in aquatic environments of a Brazilian semi-arid region

    Directory of Open Access Journals (Sweden)

    Maria Luisa Quinino de Medeiros

    2013-08-01

    Free-living protozoa are distributed in aquatic environments and vary widely in both qualitative and quantitative terms. The unique ecological functions they perform in their habitats help to maintain the dynamic balance of these environments. Despite their wide range and abundance, studies on their geographical distribution and ecology are still scarce compared to other groups. This study aimed to identify and record the occurrence of free-living protozoa at three points in the Piancó-Piranhas-Açu basin, in a semi-arid area of Rio Grande do Norte (RN) state, and to relate the occurrence of taxa to variations in chlorophyll a, pH and temperature in these environments. Samples were collected in the Armando Ribeiro Gonçalves Dam, from two lentic environments upstream and a lotic ecosystem downstream. Sixty-five taxa of free-living protozoa were found. Student's t-test showed significant inter-variable differences (p < 0.05). Similar protozoan species were recorded under different degrees of trophic status according to chlorophyll a concentrations, suggesting that the organisms identified are not ideal indicators of trophic level. We hypothesize that food availability, the influence of lentic and lotic systems and the presence of aquatic macrophytes influenced protozoan dynamics during the study period.

  3. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probability distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations of various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. The simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that better planning can be achieved.
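    The contrast between worst-case and probabilistic planning in this record can be sketched numerically. The load figures, the bootstrap scenario generation and the 95th-percentile design choice below are illustrative assumptions, not values from the Støvring study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical hourly load measurements (MW) for one year; a real study would
    # use the historical feeder measurements referred to in the record.
    measurements = np.clip(rng.normal(loc=4.0, scale=0.8, size=8760), 0.0, None)

    # Empirical distribution of load demand: bootstrap-resample the measurements
    # to generate feasible planning scenarios.
    scenarios = rng.choice(measurements, size=10_000, replace=True)

    worst_case = measurements.max()      # deterministic worst-case planning value
    p95 = np.quantile(scenarios, 0.95)   # probabilistic planning value (95th pct)

    print(f"worst case: {worst_case:.2f} MW, 95th percentile: {p95:.2f} MW")
    ```

    Sizing equipment to a high quantile of the empirical distribution rather than to the absolute worst case is what makes the probabilistic analysis less conservative than the worst-case analysis.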

  4. Spatio-Temporal Analysis of Urban Acoustic Environments with Binaural Psycho-Acoustical Considerations for IoT-Based Applications

    Directory of Open Access Journals (Sweden)

    Jaume Segura-Garcia

    2018-02-01

    Sound pleasantness or annoyance perceived in urban soundscapes is a major concern in environmental acoustics. Binaural psychoacoustic parameters are helpful to describe generic acoustic environments, as stated within the ISO 12913 framework. In this paper, the application of a Wireless Acoustic Sensor Network (WASN) to evaluate the spatial distribution and the evolution of urban acoustic environments is described. Two experiments are presented, using an indoor and an outdoor deployment of a WASN with several nodes in an Internet of Things (IoT) environment to collect audio data and calculate meaningful parameters such as the sound pressure level, binaural loudness and binaural sharpness. A chunk of audio is recorded at each node periodically with a microphone array, and the binaural rendering is conducted by exploiting the estimated directional characteristics of the incoming sound by means of DOA estimation. Each node computes the parameters at a different location and sends the values to a cloud-based broker structure that allows spatial statistical analysis through Kriging techniques. A cross-validation analysis is also performed to confirm the usefulness of the proposed system.
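    The Kriging step mentioned above can be illustrated with a minimal ordinary-kriging interpolator. The Gaussian variogram model, its parameters and the four-node layout are invented for this sketch and are not taken from the deployment described in the record:

    ```python
    import numpy as np

    def gauss_variogram(h, sill=1.0, rng_=50.0, nugget=0.0):
        # Gaussian variogram model; parameter values are illustrative guesses.
        return nugget + sill * (1.0 - np.exp(-(h / rng_) ** 2))

    def ordinary_kriging(xy, z, targets):
        """Ordinary kriging: solve the variogram system with an unbiasedness
        (Lagrange) row, then predict as a weighted sum of observations."""
        n = len(xy)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        K = np.empty((n + 1, n + 1))
        K[:n, :n] = gauss_variogram(d)
        K[n, :] = 1.0
        K[:, n] = 1.0
        K[n, n] = 0.0                       # Lagrange row/column for sum(w) = 1
        preds = []
        for t in targets:
            ht = np.linalg.norm(xy - t, axis=1)
            rhs = np.append(gauss_variogram(ht), 1.0)
            w = np.linalg.solve(K, rhs)[:n]
            preds.append(w @ z)
        return np.array(preds)

    # Toy layout: sound pressure levels (dB) measured at four sensor nodes.
    nodes = np.array([[0.0, 0.0], [0.0, 10.0], [10.0, 0.0], [10.0, 10.0]])
    spl = np.array([60.0, 62.0, 64.0, 66.0])
    grid = np.array([[5.0, 5.0]])
    print(ordinary_kriging(nodes, spl, grid))   # interpolated level at centre
    ```

    At the symmetric centre point the solver assigns equal weights to the four nodes, so the interpolated level is simply their mean (63 dB); at asymmetric points the variogram controls how the weights fall off with distance.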

  5. [Relationships of bee population fluctuation and distribution with natural environment in Anhui province].

    Science.gov (United States)

    Yu, Linsheng; Zou, Yunding; Bi, Shoudong; Wu, Houzhang; Cao, Yifeng

    2006-08-01

    From 2002 to 2004, an investigation was made of bee population dynamics and their relationships with the ecological environment in four ecological regions of Anhui Province. The results indicated that in the mountainous areas of south and west Anhui, there were 46 and 37 species of nectariferous plants, and the distribution density of the Apis cerana cerana population was 2.01 and 1.95 colonies km(-2), respectively. In the Jianghuai area and the Huaibei plain, there were 17 and 12 species of nectariferous plants, which had concentrated and short flowering periods suited to the overwintering and production of Apis mellifera ligustica, and the distribution density of the Apis cerana cerana population was 0.06 and 0.02 colonies km(-2), respectively. Bee population fluctuation and distribution were also affected by wasp predation. The breeding proportion of Apis cerana cerana in the local Apis population was 41.5%, 36.8%, 3.1% and 1.1%, and that of Apis mellifera ligustica was 58.5%, 63.2%, 96.9% and 98.9%, in the mountainous areas of south Anhui and west Anhui, the Jianghuai area, and the Huaibei plain, respectively.

  6. Staghorn: An Automated Large-Scale Distributed System Analysis Platform

    Energy Technology Data Exchange (ETDEWEB)

    Gabert, Kasimir [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Burns, Ian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Steven [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kallaher, Jenna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vail, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-01

    Conducting experiments on large-scale distributed computing systems is becoming significantly easier with the assistance of emulation. Researchers can now create a model of a distributed computing environment and then generate a virtual, laboratory copy of the entire system composed of potentially thousands of virtual machines, switches, and software. The use of real software, running at clock rate in full virtual machines, allows experiments to produce meaningful results without necessitating a full understanding of all model components. However, the ability to inspect and modify elements within these models is bound by the limitation that such modifications must compete with the model, either running in or alongside it. This inhibits entire classes of analyses from being conducted upon these models. We developed a mechanism to snapshot an entire emulation-based model as it is running. This allows us to "freeze time" and subsequently fork execution, replay execution, modify arbitrary parts of the model, or deeply explore the model. This snapshot includes capturing packets in transit and other input/output state along with the running virtual machines. We were able to build this system in Linux using Open vSwitch and Kernel Virtual Machines on top of Sandia's emulation platform Firewheel. This primitive opens the door to numerous subsequent analyses on models, including state space exploration, debugging distributed systems, performance optimizations, improved training environments, and improved experiment repeatability.

  7. Distance Determination Method for Normally Distributed Obstacle Avoidance of Mobile Robots in Stochastic Environments

    Directory of Open Access Journals (Sweden)

    Jinhong Noh

    2016-04-01

    Obstacle avoidance methods require knowledge of the distance between a mobile robot and obstacles in the environment. However, in stochastic environments, distance determination is difficult because objects have position uncertainty. The purpose of this paper is to determine the distance between a robot and obstacles represented by probability distributions. Distance determination for obstacle avoidance should consider position uncertainty, computational cost and collision probability. The proposed method considers all of these conditions, unlike conventional methods. It determines the obstacle region using a collision probability density threshold. Furthermore, it defines a minimum distance function to the boundary of the obstacle region with a Lagrange multiplier method. Finally, it computes the distance numerically. Simulations were executed in order to compare the performance of the distance determination methods. Our method demonstrated faster and more accurate performance than conventional methods. It may help overcome position uncertainty issues pertaining to obstacle avoidance, such as low-accuracy sensors, environments with poor visibility or unpredictable obstacle motion.
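    The idea can be sketched for a 2-D Gaussian obstacle: the region where the pdf exceeds a collision probability density threshold is an ellipse, and the minimum distance to its boundary is found here by a dense boundary search, a simple stand-in for the paper's Lagrange multiplier solution (function name and defaults are hypothetical):

    ```python
    import numpy as np

    def obstacle_boundary_distance(robot, mu, cov, pdf_threshold, n=3600):
        """Distance from `robot` to the boundary of {x : N(mu, cov) pdf >= t},
        the elliptical obstacle region cut out by a collision probability
        density threshold."""
        det = np.linalg.det(cov)
        # pdf(x) = exp(-q/2) / (2*pi*sqrt(det)), with q the squared Mahalanobis
        # distance; pdf >= t is therefore the ellipse q <= c.
        c = -2.0 * np.log(pdf_threshold * 2.0 * np.pi * np.sqrt(det))
        if c <= 0.0:
            return float("inf")  # threshold above the pdf peak: empty region
        L = np.linalg.cholesky(cov)
        theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        circle = np.sqrt(c) * np.vstack([np.cos(theta), np.sin(theta)])
        boundary = mu[:, None] + L @ circle      # points on the level-set ellipse
        return np.linalg.norm(boundary - np.asarray(robot)[:, None], axis=0).min()

    # Unit-variance obstacle at the origin, threshold chosen so that the
    # obstacle region is exactly the unit disc; a robot at (3, 0) is then
    # 2 units from the boundary.
    thr = np.exp(-0.5) / (2.0 * np.pi)
    print(obstacle_boundary_distance([3.0, 0.0], np.zeros(2), np.eye(2), thr))
    ```

    The paper solves the same minimum-distance problem analytically with a Lagrange multiplier; the grid search above merely illustrates what is being minimized.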

  8. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  9. Distribution of coagulase-negative Staphylococcus species from milk and environment of dairy cows differs between herds.

    Science.gov (United States)

    Piessens, V; Van Coillie, E; Verbist, B; Supré, K; Braem, G; Van Nuffel, A; De Vuyst, L; Heyndrickx, M; De Vliegher, S

    2011-06-01

    In many parts of the world, coagulase-negative staphylococci (CNS) are the predominant pathogens causing intramammary infections (IMI) in dairy cows. The cows' environment is thought to be a possible source for CNS mastitis and this was investigated in the present paper. A longitudinal field study was carried out in 6 well-managed dairy herds to determine the distribution and epidemiology of various CNS species isolated from milk, causing IMI and living freely in the cows' environment, respectively. In each herd, quarter milk samples from a cohort of 10 lactating cows and environmental samples from stall air, slatted floor, sawdust from cubicles, and sawdust stock were collected monthly (n=13). Isolates from quarter milk samples (n=134) and the environment (n=637) were identified to species level using amplified fragment length polymorphism (AFLP) genotyping. Staphylococcus chromogenes, S. haemolyticus, S. epidermidis, and S. simulans accounted for 81.3% of all CNS milk isolates. Quarters were considered infected with CNS (positive IMI status) only when 2 out of 3 consecutive milk samples yielded the same CNS AFLP type. The species causing IMI were S. chromogenes (n=35 samples with positive IMI status), S. haemolyticus (n=29), S. simulans (n=14), and S. epidermidis (n=6). The observed persistent IMI cases (n=17) had a mean duration of 149.4 d (range 63.0 to 329.8 d). The CNS species predominating in the environment were S. equorum, S. sciuri, S. haemolyticus, and S. fleurettii. Herd-to-herd differences in distribution of CNS species were observed in both milk and the environment, suggesting that herd-level factors are involved in the establishment of particular species in a dairy herd. Primary reservoirs of the species causing IMI varied. Staphylococcus chromogenes and S. epidermidis were rarely found in the environment, indicating that other reservoirs were more important in their epidemiology. For S. haemolyticus and S. simulans, the environment was found as a

  10. Trends of distributed generation development in Lithuania

    International Nuclear Information System (INIS)

    Miskinis, Vaclovas; Norvaisa, Egidijus; Galinis, Arvydas; Konstantinaviciute, Inga

    2011-01-01

    The closure of the Ignalina Nuclear Power Plant, the impact of the recent global economic recession, and the changes and problems posed by global climate change require significant alterations in the development of the Lithuanian energy sector. This paper describes the current status and specific features of the Lithuanian power system, and in particular discusses the role of distributed generators. The country's energy policy during the last two decades focused on substantial modernisation of the energy systems, their reorganisation, and the creation of an appropriate institutional structure and the necessary legal basis. The most important factors stimulating the development of distributed generation in Lithuania are the following: international obligations to increase the contribution of power plants using renewable energy sources to the electricity production balance; development of small (with capacity less than 50 MW) cogeneration power plants; and implementation of an energy policy directed at the promotion of renewable energy sources and cogeneration. An analysis of the legal and economic environment, as well as of the principles of regulation of distributed generation and the barriers to its development, is presented. - Highlights: → The paper describes the current status and specific features of the Lithuanian power system. → Analysis of the legal and economic environment regarding distributed generation. → The current situation is not favourable for distributed generation development. → Problems, barriers and principles of regulation of distributed generation are presented. → New energy policy regarding distributed generation and renewables.

  11. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    Science.gov (United States)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters but also in the home environment, film archives full of movies in high-definition and above are in the scope of UHD content providers. Movies captured with the traditional film technology represent a virtually unlimited source of UHD content. The goal to maintain complete image information is also related to the choice of scanning resolution and spatial resolution for further distribution. It might seem that scanning the film material in the highest possible resolution using state-of-the-art film scanners and also its distribution in this resolution is the right choice. The information content of the digitized images is however limited, and various degradations moreover lead to its further reduction. Digital distribution of the content in the highest image resolution might be therefore unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in captured scene image and factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow determining recommendations for optimal distribution of digitized video content intended for various display devices with lower resolutions. Obtained results are illustrated on spatial downsampling use case scenario, and performance evaluation of the proposed techniques is presented.

  12. The physics analysis environment of the ZEUS experiment

    International Nuclear Information System (INIS)

    Bauerdick, L.A.T.; Derugin, O.; Gilkinson, D.; Kasemann, M.; Manczak, O.

    1995-12-01

    The ZEUS Experiment has over the last three years developed its own model of the central computing environment for physics analysis. This model has been designed to provide ZEUS physicists with powerful and user friendly tools for data analysis as well as to be truly scalable and open. (orig.)

  13. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    Science.gov (United States)

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before, creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high-performance computing and next-generation Internet2, embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to change dynamically over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and managing of a simulated patient with a closed head injury in the VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. The VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency. This was also demonstrated successfully

  14. Distributions and kinetics of inert radionuclides in environment

    International Nuclear Information System (INIS)

    Fukui, Masami; Yamasaki, Keizo; Okamoto, Ken-ichi; Yoshimoto, Taka-aki

    1988-01-01

    Recently there has been increased concern over exposures not only in the nuclear energy field but also in the natural radiation environment. This paper therefore presents a study on the distributions and kinetics of radioactive gases such as (41)Ar and radon progeny in the Kyoto University Reactor Research Institute (KUR-RI), and of radon in the earth, which is one of the main outdoor radiation sources and is continually generated. The amount of (41)Ar released from the KUR has been reduced to about one-tenth over the past seven years, and further efforts have been focused on minimizing the released activity according to the ALARA concept. The simultaneity of the diurnal variations in the radioactive dust observed in several buildings at the KUR-RI indicates that the concentration changes are not caused by the activities of man. Peak concentrations in the vertical radon profile are observed at ca. 40 cm below the ground surface, and a rise of the water table following precipitation causes a burst of radon concentration through upflow in the ground. Exposure levels of workers and the public resulting from KUR operation and natural radiation induced by radon progeny will be discussed at this meeting. (author)

  15. Three-dimensional neutron dose distribution in the environment around a 1-GeV electron synchrotron facility at INS

    International Nuclear Information System (INIS)

    Uwamino, Y.; Nakamura, T.

    1987-01-01

    The three-dimensional (surface and altitude) skyshine neutron-dose-equivalent distribution around the 1-GeV electron synchrotron (ES) of the Institute for Nuclear Study, University of Tokyo, was measured with a high-sensitivity dose-equivalent counter. The neutron spectrum in the environment was also measured with a multimoderator spectrometer incorporating a (3)He counter. The dose-equivalent distribution and the leakage neutron spectrum at the surface of the ES building were measured with a Studsvik 2202D counter and the multimoderator spectrometer, including an indium activation detector. Skyshine neutron transport calculations, beginning with the photoneutron spectrum and yielding the dose-equivalent distribution in the environment, were performed with the DOT3.5 code and two Monte Carlo codes, MMCR-2 and MMCR-3, using the DLC-87/HILO group cross sections. The calculated neutron spectra at the top surface of the concrete ceiling and at a point 111 m from the ES agreed well with the measured results, and the calculated three-dimensional dose-equivalent distribution also agreed. The dose value increased linearly with altitude, and the slope was estimated for neutron-producing facilities. (author)

  16. Host tolerance, not symbiont tolerance, determines the distribution of coral species in relation to their environment at a Central Pacific atoll

    Science.gov (United States)

    Wicks, L. C.; Gardner, J. P. A.; Davy, S. K.

    2012-06-01

    Tolerance of environmental variables differs between corals and their dinoflagellate symbionts (Symbiodinium spp.), controlling the holobiont's (host and symbiont combined) resilience to environmental stress. However, the ecological role that environmental variables play in holobiont distribution remains poorly understood. We compared the drivers of symbiont and coral species distributions at Palmyra Atoll, a location with a range of reef environments from low to high sediment concentrations (1-52 g dry weight m-2 day-1). We observed uniform holobiont partnerships across the atoll (e.g. Montipora spp. with Symbiodinium type C15 at all sites). Multivariate analysis revealed that field-based estimates of settling sediment predominantly explained the spatial variation of coral species among sites, pointing to coral rather than Symbiodinium physiology. The data highlight the importance of host tolerance to environmental stressors, which should be considered simultaneously with symbiont sensitivity when considering the impact of variations in environmental conditions on coral communities.

  17. Study on the behaviour of technetium-99 in soil environment

    International Nuclear Information System (INIS)

    Morita, S.; Katagiri, H.; Watanabe, H.; Akatsu, Y.; Ishiguro, H.

    1996-01-01

    Technetium-99 (Tc-99) is an important radionuclide for environmental assessment around nuclear fuel cycle facilities, because it has a long half-life and relatively high mobility in the environment. Methods for its determination, and its distribution and behaviour in the environment, have been studied. While a method using an anti-coincidence low-background gas-flow counter has principally been used as the conventional determination method for Tc-99, in recent years other determination methods such as neutron activation analysis and liquid scintillation counting have also been considered. However, these methods are not effective for determining Tc-99 in environmental samples, because their detection limits are not low enough and long measuring times are needed. Inductively coupled plasma mass spectrometry (ICP-MS) is a quantitative and qualitative analysis technique with superior characteristics such as an extremely low detection limit, high resolution, high precision, short measurement time, and the capability for simultaneous multi-element analysis. This study developed an analysis technique based on ICP-MS to acquire field data. As a result, information on the distribution and behaviour of Tc-99 in the surface soil environment was obtained. (author)

  18. Distribution of some natural and man-made radionuclides in soil from the city of Veles (Republic of Macedonia) and its environs.

    Science.gov (United States)

    Dimovska, Snezana; Stafilov, Trajce; Sajn, Robert; Frontasyeva, Marina

    2010-02-01

    A systematic study of soil radioactivity in the metallurgical centre of the Republic of Macedonia, the city of Veles and its environs, was carried out. The measurement of the radioactivity was performed in 55 samples from evenly distributed sampling sites. The gross alpha and gross beta radioactivity measurements were made as a screening, using a low background gas-flow proportional counter. For the analysis of (40)K, (238)U, (232)Th and (137)Cs, a P-type coaxial high purity germanium detector was used. The values for the activity concentrations of the natural radionuclides fall well within the worldwide range as reported in the literature. It is shown that the activity of man-made radionuclides, except for (137)Cs, is below the detection limit. (137)Cs originated from the atmospheric deposition and present in soil in the activity concentration range of 2-358 Bq kg(-1) is irregularly distributed over the sampled territory owing to the complicated orography of the land. The results of gamma spectrometry are compared to the K, U, and Th concentrations previously obtained by the reactor neutron activation analysis in the same soil samples.

  19. GALAXY ENVIRONMENTS OVER COSMIC TIME: THE NON-EVOLVING RADIAL GALAXY DISTRIBUTIONS AROUND MASSIVE GALAXIES SINCE z = 1.6

    International Nuclear Information System (INIS)

    Tal, Tomer; Van Dokkum, Pieter G.; Leja, Joel; Franx, Marijn; Wake, David A.; Whitaker, Katherine E.

    2013-01-01

    We present a statistical study of the environments of massive galaxies in four redshift bins between z = 0.04 and z = 1.6, using data from the Sloan Digital Sky Survey and the NEWFIRM Medium Band Survey. We measure the projected radial distribution of galaxies in cylinders around a constant number density selected sample of massive galaxies and utilize a statistical subtraction of contaminating sources. Our analysis shows that massive primary galaxies typically live in group halos and are surrounded by 2-3 satellites with masses more than one-tenth of the primary galaxy mass. The cumulative stellar mass in these satellites roughly equals the mass of the primary galaxy itself. We further find that the radial number density profile of galaxies around massive primaries has not evolved significantly in either slope or overall normalization in the past 9.5 Gyr. A simplistic interpretation of this result can be taken as evidence for a lack of mergers in the studied groups and as support for a static evolution model of halos containing massive primaries. Alternatively, there exists a tight balance between mergers and accretion of new satellites such that the overall distribution of galaxies in and around the halo is preserved. The latter interpretation is supported by a comparison to a semi-analytic model, which shows a similar constant average satellite distribution over the same redshift range.

  20. Family size, the physical environment, and socioeconomic effects across the stature distribution.

    Science.gov (United States)

    Carson, Scott Alan

    2012-04-01

    A neglected area in historical stature studies is the relationship between stature and family size. Using robust statistics and a large 19th century data set, this study documents a positive relationship between stature and family size across the stature distribution. The relationship between material inequality and health is the subject of considerable debate, and there was a positive relationship between stature and wealth and an inverse relationship between stature and material inequality. After controlling for family size and wealth variables, the paper reports a positive relationship between the physical environment and stature. Copyright © 2012 Elsevier GmbH. All rights reserved.

  1. Multiuser Diversity with Adaptive Modulation in Non-Identically Distributed Nakagami Fading Environments

    KAUST Repository

    Rao, Anlei

    2012-09-08

    In this paper, we analyze the performance of adaptive modulation with single-cell multiuser scheduling over independent but not identically distributed (i.n.i.d.) Nakagami fading channels. Closed-form expressions are derived for the average channel capacity, spectral efficiency, and bit-error rate (BER) for both constant-power variable-rate and variable-power variable-rate uncoded/coded M-ary quadrature amplitude modulation (M-QAM) schemes. We also study the impact of time delay on the average BER of adaptive M-QAM. Selected numerical results show that multiuser diversity brings considerably better performance even over i.n.i.d. fading environments.
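    A quick Monte Carlo sketch of the multiuser-diversity effect the paper quantifies analytically; the per-user fading figures and average SNRs are arbitrary stand-ins, and the closed-form expressions themselves are not reproduced here:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative i.n.i.d. parameters: per-user Nakagami fading figure m and
    # average received SNR (linear scale).
    m = np.array([0.8, 1.0, 1.5, 2.0, 3.0])
    avg_snr = np.array([2.0, 3.0, 4.0, 5.0, 6.0])

    trials = 200_000
    # Under Nakagami-m fading the received SNR is Gamma(m, avg_snr/m) distributed.
    snr = rng.gamma(shape=m, scale=avg_snr / m, size=(trials, m.size))

    best = snr.max(axis=1)                         # max-SNR multiuser scheduler
    cap_sched = np.log2(1.0 + best).mean()         # ergodic capacity, scheduled
    cap_single = np.log2(1.0 + snr[:, -1]).mean()  # strongest single user alone

    print(f"scheduled: {cap_sched:.2f} bit/s/Hz, single user: {cap_single:.2f} bit/s/Hz")
    ```

    Even though the users' channels are not identically distributed, scheduling the instantaneously best user yields a higher ergodic capacity than always serving the user with the best average SNR, which is the multiuser-diversity gain the paper evaluates in closed form.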

  2. E-Commerce Strategic Business Environment Analysis in Indonesia

    OpenAIRE

    Aribawa, Dwitya

    2016-01-01

    This research aims to identify important factors in the external business environment that tend to influence a company's ability to achieve its objectives. It was conducted on several e-commerce companies in Indonesia. These companies operate in several industries, such as grocery and household retail, fashion retail, electronics and gadget retail, and travel booking services. We conducted a thematic analysis with a quad-helix stakeholder approach. It was found that the firms face several external environment factors ...

  3. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)

    2017-01-20

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
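    A minimal sketch of fitting a three-parameter (shifted) log-normal to a sample of void radii: the synthetic data stand in for a catalog, and the profile-likelihood grid over the shift is our own simple stand-in for the authors' fitting procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic "void effective radii": shifted log-normal with shift (loc) 5.0,
    # log-mean log(10) and log-sigma 0.5 -- stand-ins, not catalog values.
    loc_true, mu_true, sigma_true = 5.0, np.log(10.0), 0.5
    radii = loc_true + rng.lognormal(mu_true, sigma_true, size=5000)

    def fit_lognorm3(x, n_grid=200):
        """Profile-likelihood fit of a three-parameter (shifted) log-normal:
        grid over the shift, closed-form MLE for mu and sigma given the shift.
        The grid stops short of min(x), where the likelihood is known to
        diverge for this family."""
        best_ll, best_params = -np.inf, None
        for loc in np.linspace(0.0, x.min() - 1e-3, n_grid):
            y = np.log(x - loc)
            mu, sigma = y.mean(), y.std()
            # log-likelihood with additive constants dropped
            ll = -y.sum() - x.size * np.log(sigma) \
                 - ((y - mu) ** 2).sum() / (2.0 * sigma ** 2)
            if ll > best_ll:
                best_ll, best_params = ll, (loc, mu, sigma)
        return best_params

    loc, mu, sigma = fit_lognorm3(radii)
    print(f"loc={loc:.2f} mu={mu:.2f} sigma={sigma:.2f}")
    ```

    Capping the grid just short of min(x) sidesteps the divergence of the shifted log-normal likelihood as the shift approaches the sample minimum; the fitted sigma is the shape parameter whose link to skewness and substructure the paper investigates.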

  4. Single Particulate SEM-EDX Analysis of Iron-Containing Coarse Particulate Matter in an Urban Environment: Sources and Distribution of Iron within Cleveland, Ohio

    Science.gov (United States)

    The physicochemical properties of coarse-mode, iron-containing particles, and their temporal and spatial distributions are poorly understood. Single particle analysis combining x-ray elemental mapping and computer-controlled scanning electron microscopy (CCSEM-EDX) of passively ...

  5. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at a steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...
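
    Behaviour (b), keeping a predefined number of jobs running at each site, amounts to a top-up loop. The sketch below is a toy simulation of that idea only; it uses none of the real HammerCloud or Ganga APIs, and the per-step completion probability is an arbitrary assumption:

```python
import random

random.seed(1)

def steady_state_submission(target, steps, finish_prob=0.3):
    """Toy model: each step some running jobs finish, and the tester
    submits just enough new jobs to return the pool to `target`."""
    running, submitted = 0, 0
    for _ in range(steps):
        # some fraction of the running jobs complete this step
        running -= sum(1 for _ in range(running) if random.random() < finish_prob)
        new_jobs = target - running        # top up to the steady-state level
        submitted += new_jobs
        running += new_jobs
    return running, submitted

running, submitted = steady_state_submission(target=100, steps=50)
```

Over the run, the total number of submissions grows well beyond the target while the instantaneous load stays pinned at it, which is exactly the load profile a stress test wants to impose.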

  6. Framing and Enhancing Distributed Leadership in the Quality Management of Online Learning Environments in Higher Education

    Science.gov (United States)

    Holt, Dale; Palmer, Stuart; Gosper, Maree; Sankey, Michael; Allan, Garry

    2014-01-01

    This article reports on the findings of senior leadership interviews in a nationally funded project on distributed leadership in the quality management of online learning environments (OLEs) in higher education. Questions were framed around the development of an OLE quality management framework and the situation of the characteristics of…

  7. Infusing considerations of trophic dependencies into species distribution modelling.

    Science.gov (United States)

    Trainor, Anne M; Schmitz, Oswald J

    2014-12-01

    Community ecology involves studying the interdependence of species with each other and their environment to predict their geographical distribution and abundance. Modern species distribution analyses characterise species-environment dependency well, but offer only crude approximations of species interdependency. Typically, the dependency between focal species and other species is characterised using other species' point occurrences as spatial covariates to constrain the focal species' predicted range. This implicitly assumes that the strength of interdependency is homogeneous across space, which is not generally supported by analyses of species interactions. This discrepancy has an important bearing on the accuracy of inferences about habitat suitability for species. We introduce a framework that integrates principles from consumer-resource analyses, resource selection theory and species distribution modelling to enhance quantitative prediction of species geographical distributions. We show how to apply the framework using a case study of lynx and snowshoe hare interactions with each other and their environment. The analysis shows how the framework offers a spatially refined understanding of species distribution that is sensitive to nuances in biophysical attributes of the environment that determine the location and strength of species interactions. © 2014 John Wiley & Sons Ltd/CNRS.
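
    A toy version of the idea, scaling habitat-only suitability by a spatially varying interaction term rather than a binary co-occurrence covariate, might look like the following sketch. The logistic habitat score, the saturating response, and the `alpha` parameter are illustrative choices, not the authors' model:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def predator_suitability(env_score, prey_density, alpha=1.0):
    """Habitat suitability modulated by local prey availability.

    `alpha` (hypothetical) controls how quickly the interaction term
    saturates with prey density (a Holling type-II-like response)."""
    habitat = logistic(env_score)                        # species-environment part
    interaction = 1.0 - math.exp(-alpha * prey_density)  # species-interaction part
    return habitat * interaction
```

With no prey, a cell scores zero no matter how good the abiotic habitat is, and the score rises smoothly with prey density, which is the spatial refinement the framework argues for.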

  8. Continuous Distributed Top-k Monitoring over High-Speed Rail Data Stream in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Hanning Wang

    2013-01-01

    Full Text Available In a cloud computing environment, real-time mass data about high-speed rail, gathered through the intense monitoring of large-scale sensing equipment, provides strong support for the safety and maintenance of high-speed rail. In this paper, we focus on continuous distributed Top-k monitoring over multisource distributed data streams for high-speed rail. Specifically, we formalized the Top-k monitoring model of high-speed rail and proposed DTMR, a Top-k monitoring algorithm for random, continuous, or strictly monotone aggregation functions. The validity of DTMR was demonstrated by extensive experiments.
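
    The abstract does not specify DTMR itself; as background, the core per-node task, continuously maintaining the k largest readings seen on a stream, is typically done with a bounded min-heap:

```python
import heapq

def top_k_stream(stream, k):
    """Maintain the k largest items of a stream in O(log k) per update."""
    heap = []  # min-heap holding the current top-k; heap[0] is the smallest
    for item in stream:
        if len(heap) < k:
            heapq.heappush(heap, item)
        elif item > heap[0]:
            heapq.heapreplace(heap, item)  # evict the smallest of the top-k
    return sorted(heap, reverse=True)

readings = [5, 1, 9, 3, 7, 2, 8]
top3 = top_k_stream(readings, 3)  # [9, 8, 7]
```

In a distributed setting each monitoring node would maintain such a structure locally and a coordinator would merge the per-node results; the communication-saving refinements over this naive baseline are what an algorithm like DTMR addresses.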

  9. Sources and distribution of anthropogenic radionuclides in different marine environments

    International Nuclear Information System (INIS)

    Holm, E.

    1997-01-01

    Knowledge of the distribution in time and space of radiologically important radionuclides from different sources in different marine environments is important for assessing the dose commitment following controlled or accidental releases and for detecting eventual new sources. Present sources, namely nuclear explosion tests, releases from nuclear facilities and the Chernobyl accident, provide a tool for such studies. The different sources can be distinguished by their different isotopic and radionuclide compositions. Results show that radiocaesium behaves rather conservatively in the south and north Atlantic, while plutonium has a residence time of about 8 years. On the other hand, enhanced concentrations of plutonium are found in surface waters in arctic regions, where vertical mixing is small and ice formation plays an important role. Significantly increased concentrations of plutonium are also found below the oxic layer in anoxic basins due to geochemical concentration. (author)

  10. A two-population sporadic meteoroid bulk density distribution and its implications for environment models

    Science.gov (United States)

    Moorhead, Althea V.; Blaauw, Rhiannon C.; Moser, Danielle E.; Campbell-Brown, Margaret D.; Brown, Peter G.; Cooke, William J.

    2017-12-01

    The bulk density of a meteoroid affects its dynamics in space, its ablation in the atmosphere, and the damage it does to spacecraft and lunar or planetary surfaces. Meteoroid bulk densities are also notoriously difficult to measure, and we are typically forced to assume a density or attempt to measure it via a proxy. In this paper, we construct a density distribution for sporadic meteoroids based on existing density measurements. We considered two possible proxies for density: the KB parameter introduced by Ceplecha and the Tisserand parameter, TJ. Although KB is frequently cited as a proxy for meteoroid material properties, we find that it is poorly correlated with ablation-model-derived densities. We therefore follow the example of Kikwaya et al. in associating density with the Tisserand parameter. We fit separate density distributions to meteoroids originating from Halley-type comets (TJ < 2) and to those with TJ > 2; the resulting two-population density distribution is the most detailed sporadic meteoroid density distribution justified by the available data. Finally, we discuss the implications for meteoroid environment models and spacecraft risk assessments. We find that correcting for density increases the fraction of meteoroid-induced spacecraft damage produced by the helion/antihelion source.
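
    The Tisserand parameter with respect to Jupiter, used here to split the two populations, is computed directly from an orbit's elements. A small sketch follows; the TJ = 2 threshold matches the abstract, while the example orbits and the label strings are illustrative:

```python
import math

A_JUPITER = 5.2044  # Jupiter's semi-major axis in au

def tisserand_parameter(a, e, i_deg):
    """T_J = a_J/a + 2*cos(i)*sqrt((a/a_J)*(1 - e^2))."""
    i = math.radians(i_deg)
    return A_JUPITER / a + 2.0 * math.cos(i) * math.sqrt((a / A_JUPITER) * (1.0 - e * e))

def density_population(a, e, i_deg):
    """Assign an orbit to one of the two density populations by T_J."""
    return "Halley-type (TJ < 2)" if tisserand_parameter(a, e, i_deg) < 2.0 else "TJ > 2"

# Halley's comet itself: a = 17.8 au, e = 0.967, i = 162.3 deg
halley = density_population(17.8, 0.967, 162.3)
```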

  11. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available The Hadoop MapReduce is a programming model for designing automatically scalable distributed computing applications. It provides developers an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of existing systems, we design a framework in this paper, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks worked with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this paper focuses on multiuser workloads, under which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share the jobs to machines in the MapReduce-based private cloud. Then, we prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
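
    For readers unfamiliar with the model, the map-shuffle-reduce pipeline that Hadoop automates can be sketched in a few lines of plain Python. Word count is the standard toy example; this illustrates the programming model only, not MC-Framework itself:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Map: emit (key, value) pairs from one input record."""
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    """Shuffle: group values by key (done by the framework in Hadoop)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: fold each key's list of values into a result."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["to be or not to be", "be brief"]
mapped = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(mapped))
```

Because map calls are independent and reduce calls touch disjoint keys, the framework can scatter both phases across machines, which is the automatic parallelization the abstract refers to.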

  12. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
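
    Once the harmonic propagation for each order is solved, the THD at a bus follows directly from the harmonic voltage magnitudes. A minimal sketch of that final step (the example magnitudes are illustrative):

```python
import math

def thd(magnitudes):
    """Total harmonic distortion: sqrt(sum of V_h^2 for h >= 2) / V_1.

    `magnitudes[0]` is the fundamental; the rest are harmonic magnitudes."""
    fundamental = magnitudes[0]
    distortion = math.sqrt(sum(v * v for v in magnitudes[1:]))
    return distortion / fundamental

# e.g. a bus voltage with 3% fifth-harmonic and 4% seventh-harmonic content
ratio = thd([1.0, 0.03, 0.04])  # 0.05, i.e. 5% THD
```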

  13. Collaborative Virtual Environments as Means to Increase the Level of Intersubjectivity in a Distributed Cognition System

    Science.gov (United States)

    Ligorio, M. Beatrice; Cesareni, Donatella; Schwartz, Neil

    2008-01-01

    Virtual environments are able to extend the space of interaction beyond the classroom. In order to analyze how distributed cognition functions in such an extended space, we suggest focusing on the architecture of intersubjectivity. The Euroland project--a virtual land created and populated by seven classrooms supported by a team of…

  14. Development of UCMS for Analysis of Designed and Measured Core Power Distribution

    International Nuclear Information System (INIS)

    Moon, Sang Rae; Hong, Sun Kwan; Yang, Sung Tae

    2009-01-01

    In this study, reactor core loading patterns were determined by calculating and verifying the factors affecting peak power and important core safety variables were reconciled with their design criteria using a newly designed unified core management system. Core loading patterns are designed for quadrant cores under the assumption that the power distribution of the reactor core is the same among symmetric fuel assemblies within the core. Actual core power distributions measured during core operation may differ slightly from their designed data. Reactor engineers monitor these differences between the designed and measured data by performing a surveillance procedure every month according to the technical specification requirements. It is difficult to monitor overall power distribution behavior throughout the assemblies using the current procedure because it requires the reactor engineer to compare the designed data with only the maximum value of the power peaking factor and the relative power density. It is necessary to enhance this procedure to check the primary variables such as core power distribution, because long cycle operation, high burnup, power up-rate, and improved fuel can change the environment in the core. To achieve this goal, a web-based Unified Core Management System (UCMS) was developed. To build the UCMS, a database system was established using reactor design data such as that in the Nuclear Design Report (NDR) and automated core analysis codes for all light water reactor power plants. The UCMS is designed to help reactor engineers to monitor important core variables and core safety margins by comparing the measured core power distribution with designed data for each fuel assembly during the cycle operation in nuclear power plants
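
    The designed-versus-measured comparison the surveillance procedure performs reduces to a per-assembly relative deviation check. A hypothetical sketch of that comparison; the assembly IDs and values are made up for illustration, and the actual acceptance criteria come from the plant's technical specifications:

```python
def power_deviations(designed, measured):
    """Relative deviation (%) of measured vs designed relative power
    density, per fuel assembly."""
    return {fa: 100.0 * (measured[fa] - designed[fa]) / designed[fa]
            for fa in designed}

designed = {"A01": 1.20, "B05": 0.95, "C03": 1.05}  # hypothetical design data
measured = {"A01": 1.26, "B05": 0.93, "C03": 1.05}  # hypothetical measurements

devs = power_deviations(designed, measured)
worst = max(devs, key=lambda fa: abs(devs[fa]))  # assembly to flag first
```

Monitoring the whole map rather than only the peak value is the enhancement the UCMS aims at.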

  15. Stock-environment recruitment analysis for Namibian Cape hake ...

    African Journals Online (AJOL)

    Stock-environment recruitment analysis for Namibian Cape hake Merluccius capensis. ... The factors modulating recruitment success of Cape hake Merluccius capensis in Namibian waters are still unresolved. ...

  16. Time evolution of distribution functions in dissipative environments

    International Nuclear Information System (INIS)

    Hu Li-Yun; Chen Fei; Wang Zi-Sheng; Fan Hong-Yi

    2011-01-01

    By introducing the thermal entangled state representation, we investigate the time evolution of distribution functions in dissipative channels by bridging the relation between the initial distribution function and the distribution function at any later time. We find that most of them are expressed as integrals over the Laguerre-Gaussian function. Furthermore, as applications, we derive the time evolution of the photon-counting distribution by bridging the relation between the initial distribution function and the photon-counting distribution at any time, and the time evolution of the R-function characteristic of the nonclassicality depth. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  17. The institutional structure and political economy of food distribution systems: A comparative analysis of six Eastern European countries

    DEFF Research Database (Denmark)

    Esbjerg, Lars; Skytte, Hans

    This paper discusses the food distribution systems of six Eastern European countries. It considers the macro and task environments of distribution systems, discussing the constraints and opportunities the environments present to companies. The institutional structure of retailing and wholesaling is analysed and important developments in the institutional structure are noted. The internal political economy of distribution channels in Eastern Europe is analysed and the modernisation of distribution systems discussed. Finally, some conclusions are offered and areas for future research suggested.

  18. Guest Editor's introduction: Special issue on distributed virtual environments

    Science.gov (United States)

    Lea, Rodger

    1998-09-01

    Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene `see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including:
    - Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio.
    - Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration.
    - Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications.
    - e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers.
The technology

  19. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  20. Ambient environment analysis by means of perception

    NARCIS (Netherlands)

    Bitterman, M.S.; Ciftcioglu, O.; Bhatt, M.; Schultz, C.

    2013-01-01

    Analysis of an ambient environment by means of perception is described. The surveillance of an object by a human, who watches a scene via a monitor showing camera-sensed information, is investigated. Although the camera sensing process is a deterministic process, human perception of a scene via

  1. The clinical learning environment in nursing education: a concept analysis.

    Science.gov (United States)

    Flott, Elizabeth A; Linden, Lois

    2016-03-01

    The aim of this study was to report an analysis of the clinical learning environment concept. Nursing students are evaluated in clinical learning environments where skills and knowledge are applied to patient care. These environments affect achievement of learning outcomes, and have an impact on preparation for practice and student satisfaction with the nursing profession. Providing clarity of this concept for nursing education will assist in identifying antecedents, attributes and consequences affecting student transition to practice. The clinical learning environment was investigated using Walker and Avant's concept analysis method. A literature search was conducted using WorldCat, MEDLINE and CINAHL databases using the keywords clinical learning environment, clinical environment and clinical education. Articles reviewed were written in English and published in peer-reviewed journals between 1995 and 2014. All data were analysed for recurring themes and terms to determine possible antecedents, attributes and consequences of this concept. The clinical learning environment contains four attribute characteristics affecting student learning experiences. These include: (1) the physical space; (2) psychosocial and interaction factors; (3) the organizational culture; and (4) teaching and learning components. These attributes often determine achievement of learning outcomes and student self-confidence. With better understanding of attributes comprising the clinical learning environment, nursing education programmes and healthcare agencies can collaborate to create meaningful clinical experiences and enhance student preparation for the professional nurse role. © 2015 John Wiley & Sons Ltd.

  2. Distribution of chlorpyrifos in rice paddy environment and its potential dietary risk.

    Science.gov (United States)

    Fu, Yan; Liu, Feifei; Zhao, Chenglin; Zhao, Ying; Liu, Yihua; Zhu, Guonian

    2015-09-01

    Chlorpyrifos is one of the most extensively used insecticides in China. The distribution and residues of chlorpyrifos in a paddy environment were characterized under field and laboratory conditions. The half-lives of chlorpyrifos in the two conditions were 0.9-3.8 days (field) and 2.8-10.3 days (laboratory), respectively. The initial distribution of chlorpyrifos followed an increasing order beginning with water. The potential dietary risk of chlorpyrifos from rice consumption was rather low compared to the acceptable daily intake (ADI = 0.01 mg/kg bw). The chronic exposure risk from chlorpyrifos in rice grain was 5.90% and 1.30% of the ADI from field and laboratory results, respectively. Concerning acute dietary exposure, the intake estimated for the highest chlorpyrifos level did not exceed the acute reference dose (ARfD = 0.1 mg/kg bw). The estimated short-term intakes (ESTIs) were 0.78% and 0.25% of the ARfD for chlorpyrifos. The results showed that the use of chlorpyrifos in rice paddies was fairly safe with respect to consumption of rice grain by consumers. Copyright © 2015. Published by Elsevier B.V.
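
    The reported half-lives imply first-order dissipation, from which the residue at a given time, and the resulting percentage of the ADI, can be computed. A sketch under that assumption; the initial deposit, rice consumption, and body weight are hypothetical, while the ADI of 0.01 mg/kg bw is from the abstract:

```python
def residue(c0, half_life_days, t_days):
    """First-order dissipation: C(t) = C0 * 2**(-t / t_half)."""
    return c0 * 2.0 ** (-t_days / half_life_days)

def pct_adi(residue_mg_per_kg, rice_kg_per_day=0.3, bw_kg=60.0, adi=0.01):
    """Estimated daily intake from rice as a percentage of the ADI."""
    edi = residue_mg_per_kg * rice_kg_per_day / bw_kg  # mg/kg bw/day
    return 100.0 * edi / adi

c = residue(0.10, 3.8, 7.6)  # two field half-lives -> 0.025 mg/kg
risk = pct_adi(c)            # 1.25% of the ADI
```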

  3. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
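
    The distance idea behind these tests is easiest to see in the Kolmogorov-Smirnov statistic: the largest gap between the empirical CDF and the candidate CDF. A self-contained sketch, with an illustrative uniform-versus-wrong-model comparison:

```python
import random

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic D_n = sup |F_n(x) - F(x)|."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # the empirical CDF jumps at x: check the gap just before and after
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d

random.seed(0)
sample = [random.random() for _ in range(1000)]

d_good = ks_statistic(sample, lambda x: x)     # true model: Uniform(0, 1)
d_bad = ks_statistic(sample, lambda x: x * x)  # mismatched candidate model
```

The smaller distance identifies the better-fitting model; comparing D_n against a critical value (roughly 1.36/sqrt(n) at the 5% level) turns the comparison into a formal test.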

  4. Assessing the Nexus of Built, Natural, and Social Environments and Public Health Outcomes

    Science.gov (United States)

    Archer, R.; Alexander, S.; Douglas, J.

    2017-12-01

    This study investigates community-related environmental justice concerns and chemical and non-chemical health stressors from built, natural, and social environments in Southeast Los Angeles (SELA) County and East Oakland, California. The geographical distribution of health outcomes is related to the built and natural environments, as well as impacts from the social environment. A holistic systems view is important in assessing healthy behaviors within a community, because they do not occur in isolation. Geospatial analysis will be performed to integrate a total environment framework and explore the spatial patterns of exposure to chemical and non-chemical stressors and access to health-promoting environments. Geographic Information Systems (GIS) analysis using primary and secondary existing data will be performed to determine how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. This project will develop a comprehensive list of health-promoting built and natural environments (e.g., parks and community gardens) and polluting sites (e.g., shipping ports and sources of pollution not included in federal regulatory databases) in East Oakland and SELA. California Department of Public Health and U.S. Decennial Census data will also be included for geospatial analysis to overlay the distribution of air pollution-related morbidities (e.g. asthma, diabetes, and cancer) and access to health-promoting built and natural environments and related community assets, exposure to polluting industries, social disorganization, and public health outcomes in the target areas. This research will help identify the spatial and temporal distribution and cumulative impacts of critical pollution hotspots causing community environmental health impacts. The research team will also map how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. 
The

  5. Telefacturing Based Distributed Manufacturing Environment for Optimal Manufacturing Service by Enhancing the Interoperability in the Hubs

    Directory of Open Access Journals (Sweden)

    V. K. Manupati

    2017-01-01

    Full Text Available Recent developments surrounding the manufacturing sector are leading to intense progress towards effective distributed collaborative manufacturing environments. This evolving collaborative manufacturing not only focuses on digitalisation of the environment but also necessitates a service-dependent manufacturing system that offers an uninterrupted approach to a number of diverse, complicated, dynamic manufacturing operations management systems at a common work place (hub). This research presents a novel telefacturing-based distributed manufacturing environment for recommending manufacturing services based on user preferences. The first step in this direction is to deploy the most advanced tools and techniques, that is, Ontology-based Protégé 5.0 software for transforming the huge stored knowledge/information into XML schema of Ontology Language (OWL) documents and Integration of Process Planning and Scheduling (IPPS) for multijobs in a collaborative manufacturing system. Thereafter, we also investigate the possibilities of allocating skilled workers to the best feasible operation sequence. In this context, a mathematical model is formulated for the considered objectives, that is, minimization of makespan and of the total training cost of the workers. With an evolutionary algorithm and a developed heuristic algorithm, the performance of the proposed manufacturing system has been improved. Finally, to manifest the capability of the proposed approach, an illustrative example from the real-time manufacturing industry is validated for optimal service recommendation.
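
    The makespan half of the stated objective is a classic scheduling problem; a common baseline heuristic is Longest-Processing-Time-first, sketched below. The job times and machine count are illustrative, and the paper's own model also weighs worker training cost, which this ignores:

```python
import heapq

def lpt_makespan(job_times, n_machines):
    """Assign each job (longest first) to the currently least-loaded
    machine; return the resulting makespan."""
    loads = [0.0] * n_machines
    heapq.heapify(loads)  # min-heap of machine loads
    for t in sorted(job_times, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + t)
    return max(loads)

makespan = lpt_makespan([3, 3, 2, 2, 2], n_machines=2)  # 7 (the optimum is 6)
```

Metaheuristics such as the evolutionary algorithm mentioned in the abstract search beyond what such greedy rules can reach, at the cost of more computation.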

  6. An Autonomous Mobile Agent-Based Distributed Learning Architecture-A Proposal and Analytical Analysis

    Directory of Open Access Journals (Sweden)

    I. Ahmed M. J. SADIIG

    2005-10-01

    Full Text Available The traditional learning paradigm involving face-to-face interaction with students is shifting to highly data-intensive electronic learning with the advances in Information and Communication Technology. An important component of the e-learning process is the delivery of the learning contents to their intended audience over a network. A distributed learning system is dependent on the network for the efficient delivery of its contents to the user. However, as the demand for information provision and utilization increases on the Internet, the current information service provision and utilization methods are becoming increasingly inefficient. Although new technologies have been employed for efficient learning methodologies within the context of an e-learning environment, the overall efficiency of the learning system is dependent on the mode of distribution and utilization of its learning contents. It is therefore imperative to employ new techniques to meet the service demands of current and future e-learning systems. In this paper, an architecture based on autonomous mobile agents creating a Faded Information Field is proposed. Unlike the centralized information distribution in a conventional e-learning system, the information is decentralized in the proposed architecture, resulting in increased efficiency of the overall system in distributing and utilizing its learning contents efficiently and fairly. This architecture holds the potential to address heterogeneous user requirements as well as the changing conditions of the underlying network.

  7. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
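
    The envelope analysis the diagnosis relies on can be sketched end-to-end on a simulated signal: a resonance amplitude-modulated at the fault repetition frequency, demodulated by rectification and low-pass averaging, with the modulation frequency recovered from the envelope's autocorrelation. All frequencies below are made-up illustration values, and real pipelines typically use the Hilbert transform and an FFT instead of this naive stdlib version:

```python
import math

FS = 8000         # sampling rate, Hz
CARRIER = 1000.0  # structural resonance excited by each impact (assumed)
FAULT = 50.0      # fault repetition frequency to be recovered (assumed)

n = FS  # one second of signal
signal = [(1.0 + 0.8 * math.cos(2 * math.pi * FAULT * t / FS))
          * math.cos(2 * math.pi * CARRIER * t / FS) for t in range(n)]

# Crude envelope: rectify, then moving-average away the carrier.
w = 40  # 5 ms window: removes the 1 kHz carrier, keeps the 50 Hz modulation
rect = [abs(x) for x in signal]
env = [sum(rect[i:i + w]) / w for i in range(n - w)]

# Recover the modulation frequency from the envelope's autocorrelation peak.
mean = sum(env) / len(env)
e = [x - mean for x in env]

def autocorr(lag):
    return sum(e[i] * e[i + lag] for i in range(len(e) - lag))

# search lags corresponding to 20..200 Hz
best_lag = max(range(FS // 200, FS // 20), key=autocorr)
estimated_fault_hz = FS / best_lag
```

For distributed faults the envelope spectrum is broader and less peaky than this clean localized-fault toy, which is exactly the feature difference the proposed diagnostic procedure exploits.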

  8. Taxing energy to improve the environment. Efficiency and distributional effects

    International Nuclear Information System (INIS)

    Heijdra, B.J.; Van der Horst, A.

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs

  9. Taxing energy to improve the environment. Efficiency and distributional effects

    Energy Technology Data Exchange (ETDEWEB)

    Heijdra, B.J.; Van der Horst, A. [Faculty of Economics and Econometrics, University of Amsterdam, Amsterdam (Netherlands)

    1998-02-01

    The effects of environmental tax policy in a dynamic overlapping-generations model of a small open economy with environmental quality incorporated as a durable consumption good have been studied. Raising the energy tax may deliver an efficiency gain if agents care enough about the environment. The benefits are unevenly distributed across generations since capital ownership, and the capital loss induced by a tax increase, rises with age. A suitable egalitarian bond policy can be employed in order to ensure everybody gains to the same extent. With this additional instrument the optimal energy tax can be computed. The authors further considered a tax reform that simultaneously lowers labour taxation and raises the energy tax. This policy delivers qualitatively similar consequences as the first scenario, though all changes are less pronounced. A double dividend may appear soon after the reform but vanishes in the course of the transition. 22 refs.

  10. Application of Markov chains-entropy to analysis of depositional environments

    Energy Technology Data Exchange (ETDEWEB)

    Men Guizhen; Shi Xiaohong; Zhao Shuzhi

    1989-01-01

    The paper systematically and comprehensively discusses the application of Markov chains-entropy to the analysis of depositional environments of the Upper Carboniferous Taiyuan Formation in Anjialing, Pingshuo open-cast mine, Shanxi. Definite geological meanings are given, respectively, to the calculated values of the transition probability matrix, limiting (extremity) probability matrix, substitution matrix and the entropy. The lithologic successions of coarse-fine-coarse grained layers from bottom upwards in the coal-bearing series make up generally symmetric cyclic patterns. It is suggested that the coal-bearing strata were deposited in a coal-forming environment of delta-plain and littoral swamps. A quantitative study of the cyclicity and variation of the formation was conducted. The assemblage relations among stratigraphic sequences and the significance of predicting vertical change are emphasized. The results show that overall Markov chain analysis is an effective method for the analysis of depositional environments of coal-bearing strata. 2 refs., 5 figs.
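As a toy illustration of the Markov chains-entropy method (the paper does not tabulate the Taiyuan Formation data, so the three-facies succession below is invented), the embedded transition probability matrix and its row entropies can be computed like so:

```python
import numpy as np

# Hypothetical lithologic succession coded as facies states
# (S = sandstone, M = mudstone, C = coal), read bottom-up.
sequence = list("SMCMSMCMSSMCMSMC")
states = sorted(set(sequence))
idx = {s: i for i, s in enumerate(states)}

# Upward transition count matrix; self-transitions are excluded, as is
# usual for embedded Markov chain analysis of strata.
n = len(states)
counts = np.zeros((n, n))
for a, b in zip(sequence, sequence[1:]):
    if a != b:
        counts[idx[a], idx[b]] += 1

# Transition probability matrix: each row normalised to sum to 1.
P = counts / counts.sum(axis=1, keepdims=True)

# Row entropies quantify how predictable each facies' successor is
# (0 bits means a fully deterministic succession).
with np.errstate(divide="ignore", invalid="ignore"):
    H = np.nansum(np.where(P > 0, -P * np.log2(P), 0.0), axis=1)

for s in states:
    print(s, P[idx[s]].round(2), f"entropy = {H[idx[s]]:.2f} bits")
```

Symmetric cyclic patterns of the kind the paper describes show up as rows whose dominant transitions mirror each other (here C always returns to M, so its row entropy is zero).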

  11. Thermal analysis of HNPF spent fuel shipping container in torch environments

    International Nuclear Information System (INIS)

    Eggers, P.E.; Crawford, H.L.; Trujillo, A.A.; Pope, R.B.; Yoshimura, H.R.

    1980-01-01

    A thermal analysis of a modified HNPF cask has been performed to predict cask temperatures in simulated torch fire environments. A portion of this analysis also provided a basis for the modifications leading to the design of external cooling fins and a pressure relief valve system. The spent fuel torch fire analyses will provide comparison with experimentally measured temperatures of the HNPF cask in torch fire environments, which will be obtained in the future. The scope of this study was analytical in nature and involved the use of computer-based heat transfer analysis methods. The product of this study is the prediction of cask temperatures and internal pressures under normal and torch-fire accident conditions. This study also involved an analysis of the torch fire literature in order to develop an empirically based thermal model for the torch environment. 8 figures, 3 tables

  12. Determination of the elemental distribution in cigarette components and smoke by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Wu, D.; Landsberger, S.; Larson, S.M.

    1997-01-01

    Cigarette smoking is a major source of particles released in indoor environments. A comprehensive study of the elemental distribution in cigarettes and cigarette smoke has been completed. Specifically, concentrations of thirty elements have been determined for the components of 15 types of cigarettes. Components include tobacco, ash, butts, filters, and cigarette paper. In addition, particulate matter from mainstream smoke (MS) and sidestream smoke (SS) was analyzed. The technique of elemental determination used in the study was instrumental neutron activation analysis. The results show that certain heavy metals, such as As, Cd, K, Sb and Zn, are released into the MS and SS. These metals may then contribute to the health risk of exposure to smoke. Other elements are retained, for the most part, in the cigarette ash and butts. The elemental distribution among the cigarette components and smoke changes for different smoking conditions. (author)

  13. Study of cache performance in distributed environment for data processing

    International Nuclear Information System (INIS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-01-01

    Processing data in a distributed environment has found its application in many fields of science (nuclear and particle physics (NPP), astronomy and biology, to name a few). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing the knowledge of data provenance as well as data placed in the transfer cache to further expand the availability of sources for files and data-sets. Although a great variety of caching algorithms is known, a study is needed to evaluate which one can deliver the best performance in data access considering realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were done in order to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations were done for caches of different sizes within the interval 0.001-90% of the complete data-set and a low-watermark within 0-90%. Records of data access were taken from several experiments and within different time intervals in order to validate the results. In this paper, we discuss the different data caching strategies, from canonical algorithms to hybrid cache strategies, present the results of our simulations for the diverse algorithms, and identify the best algorithm in the context of physics data analysis in NPP. While the results of those studies have been implemented in RIFT, they can also be used when setting up a cache in any other computational work-flow (Cloud processing, for example) or managing data storages with partial replicas of the entire data-set.
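A minimal cache simulation of the kind described can be written in a few lines. Here an LRU policy replays a synthetic Zipf-like access trace; the trace parameters and the choice of LRU are illustrative assumptions, not the RIFT implementation or the actual STAR access records.

```python
from collections import OrderedDict
import random

def simulate_lru(trace, capacity):
    """Replay an access trace through an LRU cache; return the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for item in trace:
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # mark as most recently used
        else:
            cache[item] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict the least recently used
    return hits / len(trace)

# Hypothetical skewed demand pattern: a few "hot" data-sets dominate,
# roughly mimicking physics-analysis access records.
random.seed(1)
files = list(range(1000))
weights = [1.0 / (rank + 1) for rank in files]   # Zipf-like popularity
trace = random.choices(files, weights=weights, k=50_000)

for cap in (10, 100, 500):
    print(f"capacity {cap:>3}: hit ratio {simulate_lru(trace, cap):.2f}")
```

Because LRU is a stack algorithm, the hit ratio is non-decreasing in cache size for a fixed trace, which is the kind of curve the paper's cache-size sweep produces.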

  14. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and for the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing, and it is important that gas distributors design new distribution systems to support this growth, considering the financial constraints of the company as well as local legislation and regulation. In this study some steps of developing a flexible system that attends to those needs will be described. The analysis of distribution requires geographically referenced data for the models, as well as accurate connectivity and equipment attributes. GIS systems are often used as the repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model built from this system thus represents the current network condition. The benefits of this architecture drastically reduce the cost of creating and maintaining the network models, because network component data are conveniently made available to populate the distribution network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  15. Sunflower mega-environments in Serbia revealed by GGE biplot analysis

    Directory of Open Access Journals (Sweden)

    Balalić Igor

    2013-01-01

    Full Text Available Sunflower mega-environment analysis was conducted on grain yield data for 20 hybrids and 19 test locations in 2006, and for 20 hybrids and 16 test locations in 2007. The combined data included 15 hybrids and 9 test locations common to both years and was analyzed as a balanced experiment. The analysis of variance components showed that the hybrid-by-location interaction explained 2.74, 5.8 and 3.72 times more of the variation in grain yield than did hybrid, for 2006, 2007 and the combined data, respectively, indicating the potential existence of mega-environments. Our results showed the existence of two mega-environments in the Serbian sunflower-growing region: (1) Kula Vitovnica, Aleksa Šantić and Sombor, and (2) Rimski Šančevi and Kikinda. It is concluded that, for promising sunflower hybrids to be optimally used, they should be cropped differently in the two identified mega-environments.

  16. Study of the distribution characteristics of rare earth elements in Solanum lycocarpum from different tropical environments in Brazil by neutron activation analysis

    International Nuclear Information System (INIS)

    Maria, Sheila Piorino

    2001-01-01

    In this work, the concentrations of eight rare earth elements (REE) - La, Ce, Nd, Sm, Eu, Tb, Yb and Lu - were determined by instrumental neutron activation analysis (INAA) in plant leaves of Solanum lycocarpum. This species is a typical Brazilian 'cerrado' plant, widely distributed in Brazil. The analysis of the plant reference materials CRM Pine Needles (NIST 1575) and Spruce Needles (BCR 101) proved that the methodology applied was sufficiently accurate and precise for the determination of REE in plants. In order to better evaluate the uptake of the REE from the soil to the plant, the host soil was also analyzed by INAA. The studied areas were Salitre, MG; Serra do Cipó, MG; Lagoa da Pampulha and Mangabeiras, in Belo Horizonte, MG; and Cerrado de Emas, in Pirassununga, SP. The results were analyzed through the calculation of soil-plant transfer factors and by using chondrite-normalized diagrams. The data obtained showed that the soil-to-plant transfer factors change with the substrate. Similar distribution patterns for the soil and the plant were obtained at all the studied sites, presenting an enrichment of the light REE (La to Sm) in contrast to the heavy REE (Eu to Lu), which are less absorbed. These results indicate that the light REE remain available to the plant in the more superficial soil layers. The similarity between the distribution patterns indicates a typical REE absorption by this species, in spite of the significant differences in the substrate. (author)
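The two quantities used in this kind of study, soil-plant transfer factors and chondrite-normalized concentrations, are simple ratios. The sketch below uses invented concentrations and only approximate chondrite normalising constants, purely to show the arithmetic:

```python
# Approximate CI-chondrite normalising values (mg/kg); the soil and plant
# concentrations are hypothetical, not the paper's measurements.
chondrite = {"La": 0.237, "Ce": 0.613, "Sm": 0.148, "Yb": 0.161}
soil  = {"La": 45.0, "Ce": 90.0, "Sm": 6.8,  "Yb": 2.1}
plant = {"La": 1.8,  "Ce": 3.1,  "Sm": 0.15, "Yb": 0.02}

for el in chondrite:
    tf = plant[el] / soil[el]                # soil-to-plant transfer factor
    soil_n = soil[el] / chondrite[el]        # chondrite-normalized values
    plant_n = plant[el] / chondrite[el]
    print(f"{el}: TF = {tf:.4f}  soil_N = {soil_n:.0f}  plant_N = {plant_n:.1f}")
```

With numbers like these, the light REE (La) show a higher transfer factor than the heavy REE (Yb), which is the LREE-enrichment pattern the abstract describes.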

  17. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The needs to manage the resources are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration and interaction of GANGA with the ATLAS data management system DQ2 is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2-supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports, amongst other things, tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data.

  18. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  19. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the procedure of improving and extending the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high-energy physics (HEP) environment. The direction of developing the IHEP computing environment towards distributed computing, based on the present trend in distributed computing, is presented.

  20. The ventilation influence on the spatial distribution of Rn-222 and its decay products in human inhabited environments

    International Nuclear Information System (INIS)

    Munoz, S.N.M.; Hadler, J.C.; Paulo, S.R.

    1996-01-01

    For the determination of the influence of ventilation (a directional flux of air induced by a fan) on the spatial distribution of Rn-222 and its decay products (daughters) present in human inhabited environments, a group of experimental results was obtained by means of the nuclear tracks left by α-particles on suitable plastic detectors (CR-39). The exposure of these detectors was done in a closed environment, considering the influence of ventilation for different angles, velocities and distances from the fan. The results show that a relative quantity of the Rn-222 daughters is removed from the environment due to the effects of ventilation and plate-out.

  1. Investigation on Oracle GoldenGate Veridata for Data Consistency in WLCG Distributed Database Environment

    OpenAIRE

    Asko, Anti; Lobato Pardavila, Lorena

    2014-01-01

    Abstract In a distributed database environment, data divergence can be an important problem: if it is not discovered and correctly identified, incorrect data can lead to poor decision making, errors in the service and operational errors. Oracle GoldenGate Veridata is a product that compares two sets of data and identifies and reports on data that is out of synchronization. IT DB is providing a replication service between databases at CERN and other computer centers worldwide as a par...

  2. Atmospheric Environment Vulnerability Cause Analysis for the Beijing-Tianjin-Hebei Metropolitan Region.

    Science.gov (United States)

    Zhang, Yang; Shen, Jing; Li, Yu

    2018-01-13

    Assessing and quantifying atmospheric vulnerability is a key issue in urban environmental protection and management. This paper integrated the Analytic Hierarchy Process (AHP), fuzzy synthetic evaluation and Geographic Information System (GIS) spatial analysis into an Exposure-Sensitivity-Adaptive capacity (ESA) framework to quantitatively assess atmospheric environment vulnerability in the Beijing-Tianjin-Hebei (BTH) region with spatial and temporal comparisons. Elaborating the relationships between atmospheric environment vulnerability and the exposure, sensitivity and adaptive capacity indices enables analysis of the causes of atmospheric environment vulnerability. Our findings indicate that the atmospheric environment vulnerability of the 13 cities in the BTH region exhibits obvious spatial heterogeneity, which is caused by regional diversity in the exposure, sensitivity and adaptive capacity indices. The results of the atmospheric environment vulnerability assessment and the cause analysis can provide guidance for identifying key control regions and recognizing vulnerable indicators for the study sites. The framework developed in this paper can also be replicated at different spatial and temporal scales using context-specific datasets to support environmental management.
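The AHP step of such a framework reduces to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency. A sketch with an invented 3x3 judgement matrix over the three ESA criteria (not the paper's actual weights) follows:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for exposure, sensitivity and
# adaptive capacity on Saaty's 1-9 scale; the judgements are invented.
A = np.array([
    [1.0,     3.0,     5.0],
    [1.0 / 3, 1.0,     3.0],
    [1.0 / 5, 1.0 / 3, 1.0],
])

# Priority weights: principal eigenvector of A, normalised to sum to 1.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = ((lambda_max - n) / (n - 1)) / RI, with the
# standard random index RI = 0.58 for n = 3; CR < 0.1 is acceptable.
n = A.shape[0]
lam = vals.real[k]
CR = ((lam - n) / (n - 1)) / 0.58
print("weights:", w.round(3), f"CR = {CR:.3f}")
```

The resulting weight vector is what the fuzzy synthetic evaluation would then combine with the normalised indicator scores for each city.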

  3. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with Head and Neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established under the assumption that the gamma distribution, obtained through inter-comparison of 2D dose maps between planned and delivered doses, is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distributions obtained through MapCheck are well under the normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive to identifying 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. The non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors in the variation of gamma distributions among clinical cases. This criterion, derived from clinical statistics, is superior and more accurate than a single-valued criterion for the IMRT QA acceptance procedure. (author)
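The gamma evaluation underlying these histograms combines a dose-difference criterion with a distance-to-agreement criterion. A simplified 1D global-gamma sketch is given below; MapCheck's own algorithm and the clinical dose maps are not public, so the profiles here are invented and the implementation is only a textbook-style illustration.

```python
import numpy as np

def gamma_index(ref, ev, dx_mm, dta_mm=3.0, dd_pct=3.0):
    """Global 1D gamma index of an evaluated dose profile against a reference.

    ref, ev : dose profiles sampled on the same grid with spacing dx_mm.
    A point passes QA when gamma <= 1 (here with 3%/3 mm global criteria).
    """
    x = np.arange(len(ref)) * dx_mm
    dd = dd_pct / 100.0 * ref.max()          # global dose-difference norm
    gam = np.empty(len(ref))
    for i in range(len(ref)):
        dist = (x - x[i]) / dta_mm           # distance term, all eval points
        dose = (ev - ref[i]) / dd            # dose-difference term
        gam[i] = np.sqrt(dist ** 2 + dose ** 2).min()
    return gam

# Hypothetical Gaussian profile; the "delivered" version is shifted by 1 mm
# and scaled by 1%, which should comfortably pass 3%/3 mm.
x = np.arange(0, 100, 1.0)
ref = np.exp(-((x - 50.0) / 15.0) ** 2)
ev = 1.01 * np.exp(-((x - 51.0) / 15.0) ** 2)
gam = gamma_index(ref, ev, dx_mm=1.0)
print(f"pass rate: {100 * np.mean(gam <= 1.0):.1f}%")
```

Histogramming `gam` over many cases is what produces the per-plan gamma distributions that the paper fits with a positive-half normal model.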

  4. Estimation of Speciation and Distribution of {sup 131}I in urban and natural Environments

    Energy Technology Data Exchange (ETDEWEB)

    Hormann, Volker; Fischer, Helmut W. [University of Bremen, Institute of Environmental Physics, Otto-Hahn-Allee 1, 28359 Bremen (Germany)

    2014-07-01

    {sup 131}I is a radionuclide that may be introduced into natural and urban environments via several pathways. As a result of nuclear accidents it may be washed out from air or settle onto the ground by dry deposition. In urban landscapes this is again washed out by rain, partly introduced into the sewer system and thus transported to the next wastewater plant where it may accumulate in certain compartments. In rural landscapes it may penetrate the soil and be more or less available to plant uptake depending on chemical and physical conditions. On a regular basis, {sup 131}I is released into the urban sewer system in the course of therapeutic and diagnostic treatment of patients with thyroid diseases. The speciation of iodine in the environment is complex. Depending on redox state and biological activity, it may appear as I{sup -}, IO{sub 3}{sup -}, I{sub 2} or bound to organic molecules (e.g. humic acids). Moreover, some of these species are bound to surfaces of particles suspended in water or present in soil, e.g. hydrous ferric oxides (HFO). It is to be expected that speciation and solid-liquid distribution of iodine strongly depends on environmental conditions. In this study, the speciation and solid-liquid distribution of iodine in environmental samples such as waste water, sewage sludge and soil are estimated with the help of the geochemical code PHREEQC. The calculations are carried out using chemical equilibrium and sorption data from the literature and chemical analyses of the media. We present the results of these calculations and compare them with experimental results of medical {sup 131}I in waste water and sewage sludge. The output of this study will be used in future work where transport and distribution of iodine in wastewater treatment plants and in irrigated agricultural soils will be modeled. (authors)

  5. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  6. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  7. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  8. Assessing the spatial distribution of Tuta absoluta (Lepidoptera: Gelechiidae) eggs in open-field tomato cultivation through geostatistical analysis.

    Science.gov (United States)

    Martins, Júlio C; Picanço, Marcelo C; Silva, Ricardo S; Gonring, Alfredo Hr; Galdino, Tarcísio Vs; Guedes, Raul Nc

    2018-01-01

    The spatial distribution of insects is due to the interaction between individuals and the environment. Knowledge of the within-field pattern of spatial distribution of a pest is critical to planning control tactics, developing efficient sampling plans, and predicting pest damage. The leaf miner Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae) is the main pest of tomato crops in several regions of the world. Despite the importance of this pest, the pattern of spatial distribution of T. absoluta in open-field tomato cultivation remains unknown. Therefore, this study aimed to characterize the spatial distribution of T. absoluta in 22 commercial open-field tomato cultivations, with plants at three phenological development stages, by using geostatistical analysis. Geostatistical analysis revealed strong evidence of spatially dependent (aggregated) T. absoluta eggs in 19 of the 22 sampled tomato cultivations. The maps obtained demonstrated the aggregated structure of egg densities at the edges of the crops. Further, T. absoluta was found to accomplish egg dispersal along the rows more frequently than between rows. Our results indicate that the greatest egg densities of T. absoluta occur at the edges of tomato crops. These results are discussed in relation to the behavior of T. absoluta distribution within fields and in terms of their implications for improved sampling guidelines and precision-targeting control methods that are essential for effective pest monitoring and management. © 2017 Society of Chemical Industry.
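The first step of such a geostatistical analysis is the empirical semivariogram, whose rise with lag distance is the "spatial dependence" the paper reports. A compact Matheron-style estimator over a hypothetical gridded egg-count field (the field is simulated, not the paper's data) can look like this:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) semivariogram estimator for scattered 2D data."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)          # each pair counted once
    dists = d[iu]
    sqdiff = (values[:, None] - values[None, :])[iu] ** 2
    gamma = []
    for h in lags:
        m = (dists > h - tol) & (dists <= h + tol)  # pairs in this lag bin
        gamma.append(sqdiff[m].mean() / 2.0 if m.any() else np.nan)
    return np.array(gamma)

# Hypothetical egg-count surface: a smooth spatial trend plus noise,
# sampled on a 15 x 15 grid of plants.
rng = np.random.default_rng(0)
gx, gy = np.meshgrid(np.arange(15.0), np.arange(15.0))
coords = np.column_stack([gx.ravel(), gy.ravel()])
values = np.sin(gx.ravel() / 4.0) + 0.1 * rng.standard_normal(len(coords))

lags = np.arange(1.0, 8.0)
gamma = empirical_semivariogram(coords, values, lags, tol=0.5)
print(np.round(gamma, 3))   # semivariance rising with lag => spatial dependence
```

Fitting a model (spherical, exponential, etc.) to this curve and kriging with it would then yield the within-field egg-density maps described above.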

  9. MODELING OF WATER DISTRIBUTION SYSTEM PARAMETERS AND THEIR PARTICULAR IMPORTANCE IN ENVIRONMENT ENGINEERING PROCESSES

    Directory of Open Access Journals (Sweden)

    Agnieszka Trębicka

    2016-05-01

    Full Text Available The object of this study is to present a mathematical model of a water-supply network and the analysis of the basic parameters of a water distribution system with a digital model. The reference area is the village of Kleosin, municipality of Juchnowiec Kościelny in Podlaskie Province, located at the border with Białystok. The study focused on the significance of every change related to the quality and quantity of water delivered to the WDS, by modeling the basic parameters of the water distribution system under different operating variants in order to specify new, more rational modes of exploitation (a decrease in pressure value), and to define conditions for the development and modernization of the water-supply network, with special analysis of the scheme, including identification of the most vulnerable places in the network. The analyzed processes are based on reproducing and developing the existing state of the water distribution sub-system (the WDS) with the use of mathematical modeling that includes the newest accessible computer techniques.
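In steady-state hydraulic models of this kind, each pipe's pressure loss is commonly computed with the Hazen-Williams formula. The sketch below shows the basic calculation; the pipe length, diameter, flow and roughness coefficient are invented for illustration and are not taken from the Kleosin network.

```python
def hazen_williams_headloss(q_m3s, d_m, length_m, c=130.0):
    """Head loss (m) along a pipe, Hazen-Williams formula in SI form:
    h_f = 10.67 * L * Q^1.852 / (C^1.852 * D^4.8704)."""
    return 10.67 * length_m * q_m3s ** 1.852 / (c ** 1.852 * d_m ** 4.8704)

# Hypothetical main: 500 m of DN200 pipe (C = 130) carrying 40 L/s.
hl = hazen_williams_headloss(0.040, 0.200, 500.0)
upstream_head_m = 45.0
print(f"head loss: {hl:.2f} m")
print(f"downstream head: {upstream_head_m - hl:.2f} m")
```

A full network model repeats this per pipe and solves for flows and nodal heads simultaneously, which is what lets such a tool test lower-pressure operating variants before applying them.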

  10. Transition of a Three-Dimensional Unsteady Viscous Flow Analysis from a Research Environment to the Design Environment

    Science.gov (United States)

    Dorney, Suzanne; Dorney, Daniel J.; Huber, Frank; Sheffler, David A.; Turner, James E. (Technical Monitor)

    2001-01-01

    The advent of advanced computer architectures and parallel computing have led to a revolutionary change in the design process for turbomachinery components. Two- and three-dimensional steady-state computational flow procedures are now routinely used in the early stages of design. Unsteady flow analyses, however, are just beginning to be incorporated into design systems. This paper outlines the transition of a three-dimensional unsteady viscous flow analysis from the research environment into the design environment. The test case used to demonstrate the analysis is the full turbine system (high-pressure turbine, inter-turbine duct and low-pressure turbine) from an advanced turboprop engine.

  11. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    Science.gov (United States)

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  12. Textural Maturity Analysis and Sedimentary Environment Discrimination Based on Grain Shape Data

    Science.gov (United States)

    Tunwal, M.; Mulchrone, K. F.; Meere, P. A.

    2017-12-01

    Morphological analysis of clastic sedimentary grains is an important source of information regarding the processes involved in their formation, transportation and deposition. However, a standardised approach for quantitative grain shape analysis is generally lacking. In this contribution we report on a study where fully automated image analysis techniques were applied to loose sediment samples collected from glacial, aeolian, beach and fluvial environments. A range of shape parameters are evaluated for their usefulness in textural characterisation of populations of grains. The utility of grain shape data in ranking textural maturity of samples within a given sedimentary environment is evaluated. Furthermore, discrimination of sedimentary environment on the basis of grain shape information is explored. The data gathered demonstrates a clear progression in textural maturity in terms of roundness, angularity, irregularity, fractal dimension, convexity, solidity and rectangularity. Textural maturity can be readily categorised using automated grain shape parameter analysis. However, absolute discrimination between different depositional environments on the basis of shape parameters alone is less certain. For example, the aeolian environment is quite distinct whereas fluvial, glacial and beach samples are inherently variable and tend to overlap each other in terms of textural maturity. This is most likely due to a collection of similar processes and sources operating within these environments. This study strongly demonstrates the merit of quantitative population-based shape parameter analysis of texture and indicates that it can play a key role in characterising both loose and consolidated sediments. This project is funded by the Irish Petroleum Infrastructure Programme (www.pip.ie)
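Several of the shape parameters named above (solidity, convexity, and circularity as a roundness proxy) can be computed directly from a digitized grain outline. A minimal sketch, assuming an ordered list of boundary points and using the convex hull; the function names are ours, not the study's software:

```python
import numpy as np
from scipy.spatial import ConvexHull

def shape_parameters(outline):
    """Solidity, convexity and circularity of a grain outline.

    outline: (n, 2) array of ordered boundary points.
    """
    x, y = outline[:, 0], outline[:, 1]
    # Shoelace formula for the enclosed area, edge lengths for the perimeter.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    edges = np.diff(np.vstack([outline, outline[:1]]), axis=0)
    perim = np.linalg.norm(edges, axis=1).sum()
    hull = ConvexHull(outline)
    solidity = area / hull.volume       # in 2-D, hull.volume is the hull's area
    convexity = hull.area / perim       # in 2-D, hull.area is the hull's perimeter
    circularity = 4.0 * np.pi * area / perim**2
    return solidity, convexity, circularity

# A square grain is fully convex (solidity = convexity = 1):
sq = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
s, c, r = shape_parameters(sq)
```

Irregular, re-entrant outlines push solidity and convexity below 1, which is how such parameters separate angular from rounded grain populations.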

  13. PROTONEGOCIATIONS - SALES FORECAST AND COMPETITIVE ENVIRONMENT ANALYSIS METHOD

    Directory of Open Access Journals (Sweden)

    Lupu Adrian Gelu

    2009-05-01

    Full Text Available Protonegotiation management, as part of the successful negotiations of organizations, is an extremely important issue of analysis for today's managers in the confrontations generated by the changes of the environments in the period of transition to a market economy.

  14. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles which are compared with size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from WA analysis are explained based on the experimental parameters employed in preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core shell model wherein it is known that Ms=Mbulk(1-6g/Dv) with Mbulk as bulk magnetization of iron oxide and g as magnetic shell disorder thickness.
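The stated core-shell relation Ms = Mbulk(1 - 6g/Dv) is linear in 1/Dv, so Mbulk and the shell thickness g follow from a straight-line fit of Ms against 1/Dv. A sketch with synthetic, illustrative numbers (not data from the paper):

```python
import numpy as np

# Core-shell model: Ms = Mbulk * (1 - 6*g/Dv), i.e. Ms is linear in 1/Dv.
# Synthetic "measurements" (illustrative values, not from the paper):
g_true, m_bulk = 0.9, 480.0                      # shell thickness, bulk magnetization
dv = np.array([8.0, 10.0, 12.0, 15.0, 20.0])     # volume-weighted diameters
ms = m_bulk * (1.0 - 6.0 * g_true / dv)

# Linear fit: intercept recovers Mbulk, slope recovers -6*g*Mbulk.
slope, intercept = np.polyfit(1.0 / dv, ms, 1)
m_bulk_fit = intercept
g_fit = -slope / (6.0 * m_bulk_fit)
```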

  15. Widening the scope of incident analysis in complex work environments

    NARCIS (Netherlands)

    Vuuren, van W.; Kanse, L.; Manser, T.

    2003-01-01

    Incident analysis is commonly used as a tool to provide information on how to improve health and safety in complex work environments. The effectiveness of this tool, however, depends on how the analysis is actually carried out. The traditional view on incident analysis highlights the role of human

  16. Anomalous Traffic Detection and Self-Similarity Analysis in the Environment of ATMSim

    Directory of Open Access Journals (Sweden)

    Hae-Duck J. Jeong

    2017-12-01

    Full Text Available Internet utilisation has steadily increased, predominantly due to the rapid recent development of information and communication networks and the widespread distribution of smartphones. As a result of this increase in Internet consumption, various types of services, including web services, social networking services (SNS), Internet banking, and remote processing systems have been created. These services have significantly enhanced global quality of life. However, as a negative side-effect of this rapid development, serious information security problems have also surfaced, which has led to serious Internet privacy invasions and network attacks. In an attempt to contribute to the process of addressing these problems, this paper proposes a process to detect anomalous traffic using self-similarity analysis in the Anomaly Teletraffic detection Measurement analysis Simulator (ATMSim) environment as a research method. Simulations were performed to measure normal and anomalous traffic. First, normal traffic for each attack, including the Address Resolution Protocol (ARP) and distributed denial-of-service (DDoS) attacks, was measured for 48 h over 10 iterations. Hadoop was used to facilitate processing of the large amount of collected data, after which MapReduce was utilised after storing the data in the Hadoop Distributed File System (HDFS). A new platform on Hadoop, the detection system ATMSim, was used to identify anomalous traffic, after which a comparative analysis of the normal and anomalous traffic was performed through a self-similarity analysis. The collected traffic fell into four categories according to the attack methods used: normal local area network (LAN) traffic, DDoS attack, ARP spoofing, and combined DDoS and ARP attack. ATMSim, the anomaly traffic detection system, was used to determine if real attacks could be identified effectively. To achieve this, the ATMSim was used in simulations for each scenario to test its ability to
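Self-similarity of network traffic is commonly summarized by the Hurst parameter H. A minimal sketch of one standard estimator, the aggregated-variance method (our own illustrative implementation, not ATMSim's): for self-similar traffic the variance of the m-aggregated series scales as m^(2H-2), so H falls out of a log-log regression.

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate the Hurst parameter H by the aggregated-variance method.

    Var of the m-aggregated series scales as m**(2H - 2): H ~ 0.5 for
    Poisson-like traffic, H > 0.5 for long-range-dependent traffic.
    """
    logs_m, logs_v = [], []
    for m in block_sizes:
        n = len(x) // m
        agg = x[: n * m].reshape(n, m).mean(axis=1)  # non-overlapping block means
        logs_m.append(np.log(m))
        logs_v.append(np.log(agg.var()))
    slope, _ = np.polyfit(logs_m, logs_v, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
h = hurst_aggvar(rng.normal(size=100_000))   # i.i.d. noise: expect H close to 0.5
```

An estimate well above 0.5 on measured traffic is the self-similarity signature that such an analysis looks for when separating normal from anomalous traces.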

  17. Analysis of the PPBE Process in the Current Dynamic Political Environment

    Science.gov (United States)

    2008-06-01

    provides a comparative analysis using the Political, Economic, Socio-Cultural, Technological, Ecological and Legal (PESTEL) Analysis model of the... A. PESTEL ANALYSIS OF THE 1960/1970 ERA... B. PESTEL ANALYSIS OF THE POST 9/11 ENVIRONMENT... 1. Political

  18. Distribution and behavior of tritium in the environment near its sources

    International Nuclear Information System (INIS)

    Fukui, Masami

    1994-01-01

    Of the long-lived radionuclides migrating globally through the environment, tritium was investigated because it is a latent source of detectable pollution at the Kyoto University Research Reactor Institute (KURRI). Moreover, mass transport in an in situ situation could be studied as well. Discharge rates of tritium from the operation of reactors and reprocessing plants throughout the world are given, together with data on the background level of tritium in the environment and the resulting collective effective dose equivalent commitment. The behavior of tritium that has leaked from a heavy water facility in the Kyoto University Research Reactor (KURR) is dealt with by modeling its transport between air, water, and building materials. The spatial distributions of tritium in the air within the facilities of the KURR and KUCA (Kyoto University Critical Assembly), as measured by a convenient monitoring method that uses small water basins as passive samplers, are also shown. Tritium discharged as liquid effluents into a reservoir for monitoring and a retention pond at the KURRI site was monitored, and its behavior was clarified by use of a box model and/or a classical dispersion equation. The purpose of this research is to keep radiation exposure to workers and the public as low as reasonably achievable, taking into account economic and social considerations (the ALARA concept). (author)

  19. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows calculation of the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SEDs) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SEDs is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SEDs are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SEDs. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SEDs are presently nonexistent. Therefore, methods are described that allow rough error estimates from estimated SED uncertainties, based on integral SED sensitivities.
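The propagation step described here is the usual first-order "sandwich rule": the relative variance of the response is the sensitivity profile folded with the relative covariance matrix of the underlying data. A toy sketch with hypothetical group values (not SENSIT's implementation):

```python
import numpy as np

# Hypothetical 4-group example: relative sensitivities of a response to a
# secondary energy distribution, folded with an assumed relative covariance
# matrix via the sandwich rule:  (dR/R)^2 = S^T C S.
s = np.array([0.30, 0.15, -0.05, 0.02])        # relative sensitivity per group
c = np.diag([0.10, 0.10, 0.20, 0.20]) ** 2     # assumed 10-20% std dev, uncorrelated
rel_var = s @ c @ s
rel_unc = np.sqrt(rel_var)                     # relative uncertainty of the response
```

With correlated group uncertainties, off-diagonal terms of C enter the same product; the diagonal matrix here is only the simplest assumption.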

  20. Distribution of peanut allergen in the environment.

    Science.gov (United States)

    Perry, Tamara T; Conover-Walker, Mary Kay; Pomés, Anna; Chapman, Martin D; Wood, Robert A

    2004-05-01

    Patients with peanut allergy can have serious reactions to very small quantities of peanut allergen and often go to extreme measures to avoid potential contact with this allergen. The purpose of this study was to detect peanut allergen under various environmental conditions and examine the effectiveness of cleaning agents for allergen removal. A monoclonal-based ELISA for Arachis hypogaea allergen 1 (Ara h 1; range of detection, 30-2000 ng/mL) was used to assess peanut contamination on cafeteria tables and other surfaces in schools, the presence of residual peanut protein after using various cleaning products on hands and tabletops, and airborne peanut allergen during the consumption of several forms of peanut. After hand washing with liquid soap, bar soap, or commercial wipes, Ara h 1 was undetectable. Plain water and antibacterial hand sanitizer left detectable Ara h 1 on 3 of 12 and 6 of 12 hands, respectively. Common household cleaning agents removed peanut allergen from tabletops, except dishwashing liquid, which left Ara h 1 on 4 of 12 tables. Of the 6 area preschools and schools evaluated, Ara h 1 was found on 1 of 13 water fountains, 0 of 22 desks, and 0 of 36 cafeteria tables. Airborne Ara h 1 was undetectable in simulated real-life situations when participants consumed peanut butter, shelled peanuts, and unshelled peanuts. The major peanut allergen, Ara h 1, is relatively easily cleaned from hands and tabletops with common cleaning agents and does not appear to be widely distributed in preschools and schools. We were not able to detect airborne allergen in many simulated environments.

  1. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering analysis algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by the technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China and the results are discussed.
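The TOPSIS ranking step can be sketched compactly: vector-normalize the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The plan data and weights below are illustrative, not from the Gansu case study:

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix: (alternatives x criteria); benefit[j] is True when a
    larger value of criterion j is better (e.g. PV profit), False for costs.
    Returns closeness scores in [0, 1]; higher is better.
    """
    m = decision_matrix / np.linalg.norm(decision_matrix, axis=0)  # vector-normalize
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Three candidate plans scored on (losses, cost, profit, voltage offset):
plans = np.array([[1.2, 300.0, 80.0, 0.03],
                  [1.0, 350.0, 90.0, 0.02],
                  [1.5, 280.0, 70.0, 0.05]])
scores = topsis(plans, np.array([0.3, 0.2, 0.3, 0.2]),
                np.array([False, False, True, False]))
best = int(np.argmax(scores))
```

Here the second plan wins: it dominates on three of the four criteria, and only its higher cost counts against it.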

  2. A Distributed Multi-dimensional SOLAP Model of Remote Sensing Data and Its Application in Drought Analysis

    Directory of Open Access Journals (Sweden)

    LI Jiyuan

    2014-06-01

    Full Text Available SOLAP (Spatial On-Line Analytical Processing) has recently been applied to multi-dimensional analysis of remote sensing data. However, its computation performance faces a considerable challenge from large-scale datasets. A geo-raster cube model extended by Map-Reduce is proposed, which applies Map-Reduce (a data-intensive computing paradigm) to the OLAP field. In this model, the existing methods are modified to adapt to a distributed environment based on multi-level raster tiles. Then multi-dimensional map algebra is introduced to decompose the SOLAP computation into multiple distributed parallel map algebra functions on tiles with the support of Map-Reduce. Drought monitoring by remote sensing data is employed as a case study to illustrate the model construction and application. A prototype is also implemented, and performance testing shows the efficiency and scalability of this model.

  3. Operating Characteristics of Statistical Methods for Detecting Gene-by-Measured Environment Interaction in the Presence of Gene-Environment Correlation under Violations of Distributional Assumptions.

    Science.gov (United States)

    Van Hulle, Carol A; Rathouz, Paul J

    2015-02-01

    Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.

  4. Predictors of nurse manager stress: a dominance analysis of potential work environment stressors.

    Science.gov (United States)

    Kath, Lisa M; Stichler, Jaynelle F; Ehrhart, Mark G; Sievers, Andree

    2013-11-01

    Nurse managers have important but stressful jobs. Clinical or bedside nurse predictors of stress have been studied more frequently, but less has been done on work environment predictors for those in this first-line leadership role. Understanding the relative importance of those work environment predictors could be used to help identify the most fruitful areas for intervention, potentially improving recruitment and retention for nurse managers. Using Role Stress Theory and the Job Demands-Resources Theory, a model was tested examining the relative importance of five potential predictors of nurse manager stress (i.e., stressors). The work environment stressors included role ambiguity, role overload, role conflict, organizational constraints, and interpersonal conflict. A quantitative, cross-sectional survey study was conducted with a convenience sample of 36 hospitals in the Southwestern United States. All nurse managers working in these 36 hospitals were invited to participate. Of the 636 nurse managers invited, 480 responded, for a response rate of 75.5%. Questionnaires were distributed during nursing leadership meetings and were returned in person (in sealed envelopes) or by mail. Because work environment stressors were correlated, dominance analysis was conducted to examine which stressors were the most important predictors of nurse manager stress. Role overload was the most important predictor of stress, with an average of 13% increase in variance explained. The second- and third-most important predictors were organizational constraints and role conflict, with an average of 7% and 6% increase in variance explained, respectively. Because other research has shown deleterious effects of nurse manager stress, organizational leaders are encouraged to help nurse managers reduce their actual and/or perceived role overload and organizational constraints. Copyright © 2013 Elsevier Ltd. All rights reserved.
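Dominance analysis ranks correlated predictors by averaging each predictor's incremental R² over all subsets of the remaining predictors; the resulting general dominance weights sum to the full-model R². A minimal sketch on synthetic data (not the study's survey data):

```python
import itertools
import numpy as np

def r_squared(X, y, cols):
    """R^2 of an OLS regression of y on the predictor columns in `cols`."""
    A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1.0 - (y - A @ beta).var() / y.var()

def general_dominance(X, y):
    """Each predictor's R^2 gain averaged over all subsets of the others
    (within each subset size, then across sizes): general dominance weights."""
    p = X.shape[1]
    dom = np.zeros(p)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        size_means = []
        for r in range(p):
            gains = [r_squared(X, y, s + (j,)) - r_squared(X, y, s)
                     for s in itertools.combinations(others, r)]
            size_means.append(np.mean(gains))
        dom[j] = np.mean(size_means)
    return dom

# Synthetic stressor data with decreasing true effects:
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
y = 1.0 * X[:, 0] + 0.6 * X[:, 1] + 0.2 * X[:, 2] + 0.5 * rng.normal(size=2000)
dom = general_dominance(X, y)
```

Reporting each weight as a share of the total explained variance gives the "percent increase in variance explained" figures quoted in the abstract.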

  5. High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM

    Science.gov (United States)

    Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.

    System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.

  6. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.
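The Rosin-Rammler curve mentioned here is, in cumulative-passing form, a two-parameter Weibull CDF, so it can be fitted directly by nonlinear least squares. A sketch with synthetic sieve data (illustrative values, not the pilot-oven measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def rosin_rammler(d, d63, n):
    """Cumulative fraction passing size d (identical in form to a
    two-parameter Weibull CDF): P = 1 - exp(-(d/d63)**n)."""
    return 1.0 - np.exp(-(d / d63) ** n)

# Synthetic sieve sizes (mm) and cumulative passing fractions generated
# from the model itself, standing in for measured data:
sizes = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 80.0])
passing = rosin_rammler(sizes, 35.0, 1.8)

(d63, n), _ = curve_fit(rosin_rammler, sizes, passing, p0=(30.0, 1.5))
```

A modified Weibull form, as the paper proposes, would add a parameter here; the fitting call stays the same, which is what makes goodness-of-fit comparisons against Rosin-Rammler straightforward.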

  7. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
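The "determining which burst buffer stores the requested metadata" step can be as simple as deterministic hashing of the key, so that every node computes the same owner without consulting a central directory. A toy sketch (the names and routing scheme are ours, not the patented method):

```python
import hashlib

class MetadataRouter:
    """Toy sketch: route a metadata key to one of several burst buffers by
    hashing the key, so any node can compute the owner without a lookup."""

    def __init__(self, buffers):
        self.buffers = list(buffers)

    def owner(self, key):
        # SHA-1 gives a stable, well-spread digest; take 8 bytes as an integer.
        digest = hashlib.sha1(key.encode()).digest()
        idx = int.from_bytes(digest[:8], "big") % len(self.buffers)
        return self.buffers[idx]

router = MetadataRouter(["bb0", "bb1", "bb2"])
node = router.owner("/stripe/0042/meta")   # deterministic: same key, same buffer
```

Production key-value stores typically use consistent hashing instead of a plain modulus so that adding or removing a buffer remaps only a fraction of the keys.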

  8. Analysis of the moral habitability of the nursing work environment.

    Science.gov (United States)

    Peter, Elizabeth H; Macfarlane, Amy V; O'Brien-Pallas, Linda L

    2004-08-01

    Following health reform, nurses have experienced the tremendous stress of heavy workloads, long hours and difficult professional responsibilities. In recognition of these problems, a study was conducted that examined the impact of the working environment on the health of nurses. After conducting focus groups across Canada with nurses and others well acquainted with nursing issues, it became clear that the difficult work environments described had significant ethical implications. The aim of this paper is to report the findings of research that examined the moral habitability of the nursing working environment. A secondary analysis was conducted using the theoretical work of Margaret Urban Walker. Moral practices and responsibilities from Walker's perspective cannot be extricated from other social roles, practices and divisions of labour. Moral-social orders, such as work environments in this research, must be made transparent to examine their moral habitability. Morally habitable environments are those in which differently situated people experience their responsibilities as intelligible and coherent. They also foster recognition, cooperation and shared benefits. Four overarching categories were developed through the analysis of the data: (1) oppressive work environments; (2) incoherent moral understandings; (3) moral suffering and (4) moral influence and resistance. The findings clearly indicate that participants perceived the work environment to be morally uninhabitable. The social and spatial positioning of nurses left them vulnerable to being overburdened by and unsure of their responsibilities. Nevertheless, nurses found meaningful ways to resist and to influence the moral environment. We recommend that nurses develop strong moral identities, make visible the inseparability of their proximity to patients and moral accountability, and further identify what forms of collective action are most effective in improving the moral habitability of their work

  9. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume aims at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and recent Techniques and Environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting Parallel, Grid, and Cloud computing environments.

  10. Comprehensive environment-suitability evaluation model about Carya cathayensis

    International Nuclear Information System (INIS)

    Da-Sheng, W.; Li-Juan, L.; Qin-Fen, Y.

    2013-01-01

    Regarding the relation between suitable environments and the distribution areas of Carya cathayensis Sarg., current studies are mainly committed to qualitative descriptions and do not consider quantitative models. The objective of this study was to establish an environment-suitability evaluation model used to predict potential suitable areas of C. cathayensis. Firstly, data on the three factors of soil type, soil parent material and soil thickness were obtained from a 2-class forest resource survey, and the other factor data, which included elevation, slope, aspect, surface curvature, humidity index, and solar radiation index, were extracted from a DEM (Digital Elevation Model). Additionally, the key affecting factors were defined by PCA (Principal Component Analysis), the weights of the evaluation factors were determined by AHP (Analytic Hierarchy Process), and the quantitative classification of each single factor was determined by membership functions from fuzzy mathematics. Finally, a comprehensive environment-suitability evaluation model was established and used to predict the potential suitable areas of C. cathayensis in Daoshi Town in the study region. The results showed that 85.6% of actual distribution areas were in the most suitable and more suitable regions and 11.5% in the general suitable regions
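The AHP weighting step can be sketched as follows: the factor weights are the normalized principal eigenvector of a pairwise-comparison matrix, with a consistency ratio (CR < 0.1 is conventionally acceptable) as a sanity check. The comparison values below are hypothetical, not the study's:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via its
    principal eigenvector, plus Saaty's consistency ratio."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index
    return w, ci / ri

# Hypothetical comparisons for, say, elevation vs. slope vs. soil thickness:
a = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w, cr = ahp_weights(a)
```

The resulting weights then multiply the fuzzy membership scores of each factor to give the composite suitability value for each map cell.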

  11. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  12. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability and maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides information for preventing, detecting, and correcting defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and reliability activities. Reliability data of Systems, Structures and Components (SSCs) in Nuclear Power Plants is a key input to probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to fail, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in Nuclear Power Plants. An example is given in the paper to present the result of the new method. The Weibull distribution fits reliability data of mechanical equipment in nuclear power plants very well, and it is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better. It can reflect the reliability characteristics of the equipment and is more realistic to the actual situation. (author)
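The two- vs. three-parameter comparison can be reproduced in outline: fit both forms by maximum likelihood and compare log-likelihoods, where the third (location) parameter captures a failure-free period. A sketch on synthetic failure times (illustrative, not plant data):

```python
import numpy as np
from scipy.stats import weibull_min

# Synthetic times-to-failure (hours) with a 500 h failure-free period,
# i.e. a three-parameter Weibull: shape 1.8, location 500, scale 2000.
ttf = weibull_min.rvs(1.8, loc=500.0, scale=2000.0, size=400,
                      random_state=np.random.default_rng(1))

# Two-parameter fit (location fixed at zero) vs. three-parameter fit;
# the extra positional/keyword values are starting guesses for the optimizer.
c2, _, s2 = weibull_min.fit(ttf, floc=0.0)
c3, loc3, s3 = weibull_min.fit(ttf, 1.5, loc=400.0, scale=1500.0)

ll2 = weibull_min.logpdf(ttf, c2, 0.0, s2).sum()
ll3 = weibull_min.logpdf(ttf, c3, loc3, s3).sum()
```

When the data really contain a failure-free period, the three-parameter fit attains a visibly higher log-likelihood, which is the kind of comparison behind the abstract's conclusion.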

  13. An Effective Distributed Model for Power System Transient Stability Analysis

    Directory of Open Access Journals (Sweden)

    MUTHU, B. M.

    2011-08-01

    Full Text Available Modern power systems consist of many interconnected synchronous generators with different inertia constants, connected to a large transmission network, with an ever increasing demand for power exchange. The size of the power system grows exponentially due to the increase in power demand. The data required for various power system applications have been stored in different formats in a heterogeneous environment. The power system applications themselves have been developed and deployed on different platforms and language paradigms. Interoperability between power system applications becomes a major issue because of this heterogeneous nature. The main aim of the paper is to develop a generalized distributed model for carrying out power system stability analysis. The more flexible and loosely coupled JAX-RPC model has been developed for representing transient stability analysis in large interconnected power systems. The proposed model includes Pre-Fault, During-Fault, Post-Fault and Swing Curve services which are accessible to remote power system clients when the system is subjected to large disturbances. A generalized XML-based model for data representation has also been proposed for exchanging data in order to enhance the interoperability between legacy power system applications. The performance measure, Round Trip Time (RTT), is estimated for different power systems using the proposed JAX-RPC model and compared with the results obtained using traditional client-server and Java RMI models.
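A Swing Curve service of the kind named above computes rotor-angle trajectories from the classical swing equation. A single-machine-against-infinite-bus sketch with assumed per-unit values (not the paper's JAX-RPC implementation):

```python
import numpy as np

# Classical swing-equation sketch:  M * d2(delta)/dt2 = Pm - Pmax * sin(delta),
# integrated with a simple Euler scheme; Pmax drops during the fault and
# partially recovers after clearing.
def swing_curve(pm=0.8, pmax_fault=0.4, pmax_post=1.5, m=0.05,
                t_clear=0.15, t_end=1.0, dt=1e-3):
    delta = np.arcsin(pm / 1.8)     # pre-fault equilibrium (pre-fault Pmax = 1.8)
    omega = 0.0
    curve = []
    for k in range(int(t_end / dt)):
        t = k * dt
        pmax = pmax_fault if t < t_clear else pmax_post
        acc = (pm - pmax * np.sin(delta)) / m
        omega += acc * dt
        delta += omega * dt
        curve.append(delta)
    return np.array(curve)

deltas = swing_curve()
stable = deltas.max() < np.pi       # rotor angle stays below 180 degrees
```

The During-Fault and Post-Fault services correspond to switching `pmax` in this loop; a distributed implementation returns `deltas` (the swing curve) to the remote client for stability assessment.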

  14. Working environment in power generation

    International Nuclear Information System (INIS)

    1989-05-01

    The proceedings contain 21 papers, of which 7 are devoted to nuclear power generation. They are concerned with the working environment in the controlled areas of the Bohunice nuclear power plant, the unsuitable design of the control rooms with respect to reliability and safety of operation of the nuclear power plant, optimization of the relation between workers and working conditions, operation of transport facilities, refuelling and fuel element inspection, the human factor and the probability assessment of the nuclear power plant operating safety, a proposal to establish a universal ergonomic programme for the electric power distribution system, and physical factors in the ergonomic analysis of the working environment. (J.B.)

  15. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of its improved version, the auto-compensating quantum key distribution system. Unfortunately, the existing analysis has two drawbacks: only the auto-compensating process is analyzed, and the laser phase imparted by a Faraday mirror (FM) is not fully considered. In this work, we present a detailed analysis, by Jones calculus, of the output of a light pulse transmitted through a plug and play quantum key distribution system that contains only an FM. A similar analysis is made of a home-made auto-compensating system that contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup because the conventional Jones matrix of an FM neglects an additional phase of π on the orthogonal polarization direction. To resolve this problem, we derive a new Jones matrix for an FM according to the coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also suitable for previous analyses of auto-compensating quantum key distribution. (paper)
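
The sign issue described in this record can be reproduced directly with Jones calculus. A short numpy sketch (the sign convention for the round trip is an assumption; a 45° Faraday rotator is traversed twice around an ideal mirror):

```python
import numpy as np

def rot(theta):
    """Jones rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Faraday rotation is non-reciprocal: the rotator is traversed twice with
# an ideal mirror in between; in one common sign convention (assumed here)
# the round trip is R(-45 deg) @ mirror @ R(45 deg).
mirror = np.array([[1.0, 0.0], [0.0, -1.0]])
fm = rot(-np.pi / 4) @ mirror @ rot(np.pi / 4)
# fm equals -[[0, 1], [1, 0]]: every input polarization is mapped to its
# orthogonal, and the overall minus sign is the extra pi phase that the
# conventional FM Jones matrix [[0, 1], [1, 0]] omits.
```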

  16. Virtual environments for scene of crime reconstruction and analysis

    Science.gov (United States)

    Howard, Toby L. J.; Murta, Alan D.; Gibson, Simon

    2000-02-01

    This paper describes research conducted in collaboration with Greater Manchester Police (UK) to evaluate the utility of Virtual Environments for scene of crime analysis, forensic investigation, and law enforcement briefing and training. We present an illustrated case study of the construction of a high-fidelity virtual environment, intended to match a particular real-life crime scene as closely as possible. We describe and evaluate the combination of several approaches including: the use of the Manchester Scene Description Language for constructing complex geometrical models; the application of a radiosity rendering algorithm with several novel features based on human perceptual considerations; texture extraction from forensic photography; and experiments with interactive walkthroughs and large-screen stereoscopic display of the virtual environment implemented using the MAVERIK system. We also discuss the potential applications of Virtual Environment techniques in the Law Enforcement and Forensic communities.

  17. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs) is to deliver electric power to their customers in an efficient and reliable manner - with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG) units on the distribution networks. However, to maximise benefits, it is crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT). The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with a fuzzy genetic algorithm to simultaneously obtain the DG optimal size and location. The accuracy and reliability of the developed tool were validated using several radial test systems, and the results, which are impressive and computationally efficient, were evaluated against a similar existing package cited in the literature.
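
The optimisation problem the tool addresses can be illustrated at toy scale. The sketch below uses plain brute force, not the paper's fuzzy genetic algorithm, and the feeder data are invented; it places a single DG unit on a small radial feeder so as to minimise I²R losses:

```python
from itertools import product

# Toy radial feeder: branch i connects bus i-1 to bus i; loads are lumped
# constant currents in per unit (a crude approximation; data are invented).
branch_r = [0.02, 0.03, 0.04, 0.05]   # branch resistances (pu)
bus_load = [1.0, 0.8, 0.6, 0.4]       # load current at buses 1..4 (pu)

def feeder_losses(dg_bus=None, dg_size=0.0):
    """Total I^2 R losses; DG is modeled as a negative load current."""
    inj = list(bus_load)
    if dg_bus is not None:
        inj[dg_bus] -= dg_size
    loss = 0.0
    for i, r in enumerate(branch_r):
        flow = sum(inj[i:])            # current through branch i
        loss += r * flow ** 2
    return loss

# Exhaustive search over candidate DG locations and sizes
candidates = product(range(len(bus_load)), [0.5, 1.0, 1.5, 2.0])
best_bus, best_size = min(candidates, key=lambda c: feeder_losses(*c))
```

A real tool replaces both the loss model (with a full power flow) and the search (with a metaheuristic such as the fuzzy GA described above), but the structure of the problem is the same.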

  18. A framework for implementing a Distributed Intrusion Detection System (DIDS) with interoperabilty and information analysis

    OpenAIRE

    Davicino, Pablo; Echaiz, Javier; Ardenghi, Jorge Raúl

    2011-01-01

    Computer Intrusion Detection Systems (IDS) are primarily designed to protect the availability, confidentiality and integrity of critical information infrastructures. A Distributed IDS (DIDS) consists of several IDSs deployed over a large network (or networks), all of which communicate with each other, with a central server, or with a cluster of servers that facilitates advanced network monitoring. In a distributed environment, DIDS are implemented using cooperative intelligent sensors distributed across the network(s). ...

  19. AXAF user interfaces for heterogeneous analysis environments

    Science.gov (United States)

    Mandel, Eric; Roll, John; Ackerman, Mark S.

    1992-01-01

    The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. 
For example, the authors

  20. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper analyzes several statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the Exponential distribution is found to be the best fit for modeling the failure rate.
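
Ranking candidate distributions in this way is commonly done by comparing an information criterion across maximum-likelihood fits. A sketch with SciPy (synthetic time-to-failure data stand in for the TAC data set, which is not available here; the distribution shortlist is also illustrative):

```python
import numpy as np
from scipy import stats

# Synthetic time-to-failure data drawn from a Weibull distribution
rng = np.random.default_rng(42)
ttf = stats.weibull_min.rvs(1.8, scale=1000.0, size=200, random_state=rng)

def aic(dist, data, **fitkw):
    """Akaike information criterion for a scipy.stats fit (lower is better)."""
    params = dist.fit(data, **fitkw)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

scores = {
    "weibull": aic(stats.weibull_min, ttf, floc=0),
    "exponential": aic(stats.expon, ttf, floc=0),
    "lognormal": aic(stats.lognorm, ttf, floc=0),
}
best_fit = min(scores, key=scores.get)
```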

  1. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  2. Analysis of Java Distributed Architectures in Designing and Implementing a Client/Server Database System

    National Research Council Canada - National Science Library

    Akin, Ramis

    1998-01-01

    .... Information is scattered throughout organizations and must be easily accessible. A new solution is needed for effective and efficient management of data in today's distributed client/server environment...

  3. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramic and glass depends on specimen size and on the size distribution of flaws in the material. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of material strength failure resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of material reliability were calculated. Statistical calculations supporting the strength analysis of silicon nitride material were performed using MATLAB. (author)
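
The cumulative quantities listed above follow, for a two-parameter Weibull model, from Pf(σ) = 1 − exp[−(σ/σ0)^m]. A small numpy sketch (σ0 and m are illustrative values, not the paper's fitted silicon nitride parameters):

```python
import numpy as np

def weibull_failure_prob(sigma, sigma0, m):
    """Cumulative failure probability for a brittle material under a
    two-parameter Weibull model: Pf = 1 - exp(-(sigma/sigma0)**m)."""
    return 1.0 - np.exp(-(np.asarray(sigma) / sigma0) ** m)

def reliability(sigma, sigma0, m):
    """Survival (reliability) probability R = 1 - Pf."""
    return 1.0 - weibull_failure_prob(sigma, sigma0, m)

# Illustrative parameters: characteristic strength sigma0 (MPa) and
# Weibull modulus m; at sigma = sigma0, Pf = 1 - 1/e by definition.
stresses = np.linspace(100.0, 900.0, 5)
pf = weibull_failure_prob(stresses, sigma0=600.0, m=10.0)
```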

  4. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. Such methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  5. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. Such methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
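
The tiling-and-stitching idea at the heart of this record can be shown in a few lines of numpy. This sketch partitions a raster into tiles, applies a per-tile operation, and stitches the results back together (sequential here; a framework like the one described would ship each tile to a worker node instead):

```python
import numpy as np

def tile_bounds(shape, tile):
    """Yield (row_slice, col_slice) pairs that partition a 2-D raster."""
    rows, cols = shape
    for r0 in range(0, rows, tile):
        for c0 in range(0, cols, tile):
            yield (slice(r0, min(r0 + tile, rows)),
                   slice(c0, min(c0 + tile, cols)))

def stitch(shape, processed_tiles):
    """Reassemble processed tiles into a full raster."""
    out = np.empty(shape)
    for (rs, cs), data in processed_tiles:
        out[rs, cs] = data
    return out

raster = np.arange(36.0).reshape(6, 6)
# Per-tile processing step (placeholder: double every pixel); in a real
# system each (bounds, tile) pair would be dispatched to a processing node.
tiles = [((rs, cs), raster[rs, cs] * 2.0)
         for rs, cs in tile_bounds(raster.shape, 4)]
result = stitch(raster.shape, tiles)
```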

  6. Distributed computing environment for Mine Warfare Command

    OpenAIRE

    Pritchard, Lane L.

    1993-01-01

    Approved for public release; distribution is unlimited. The Mine Warfare Command in Charleston, South Carolina has been converting its information systems architecture from a centralized mainframe-based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May 1992. The building blocks of a distributed architecture are discussed in relation to the choices the Mine Warfare Command has made to date. Ar...

  7. The distribution characteristics of trace elements in airborne particulates from an urban industrial complex area of Korea using instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Lim, Jong Myoung; Lee, Jin Hong; Chung, Yong Sam

    2005-01-01

    An instrumental neutron activation analysis was used to measure the concentrations of about 24 elements associated with airborne particulates (PM10) that were collected in the most polluted urban region of Daejeon city, Korea from 2000 to 2002. Using the measurement data for various elements, both the extent of elemental pollution in the study area and the seasonality in their distribution characteristics were examined. Examination of their distribution patterns indicated that most elements of crustal origin tend to exhibit seasonal peaks during spring, while most elements of anthropogenic origin tend to exhibit seasonal peaks during fall or winter. In order to explain the factors regulating their mobilization properties, the data were processed by a factor analysis. Results of the factor analysis suggested competing roles of both industrial and natural source processes, despite the fact that the study site is located downwind of the industrial complex. Based on the overall results of this study, it is concluded that the site may be strongly impacted by man-made sources, but the general patterns of elemental distributions in the study area, inspected over a seasonal scale, are quite consistent with those typically observed in the natural environment

  8. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    Full Text Available The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surface and weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  9. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area, affected by geometrical inhomogeneity, irregular welded surface and weld toe radius, is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using three different methods: strain gauge measurement, thermal stress analysis and FEM. The obtained results show good agreement - the TSA mutually confirmed the FEM model and the stresses measured by strain gauges. According to the obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.
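
The thermoelastic effect underlying TSA relates the measured temperature amplitude to the amplitude of the first stress invariant, ΔT = −K·T0·Δ(σ1 + σ2). A sketch with nominal aluminium constants (illustrative textbook values, not the paper's calibration):

```python
# Nominal properties for aluminium (assumed, illustrative)
ALPHA = 23e-6        # 1/K, coefficient of thermal expansion
RHO = 2700.0         # kg/m^3, density
CP = 900.0           # J/(kg K), specific heat
K_THERMO = ALPHA / (RHO * CP)   # thermoelastic coefficient, ~9.5e-12 1/Pa

def stress_sum_amplitude(delta_t, t0=293.0):
    """Amplitude of (sigma1 + sigma2) in Pa from the temperature
    amplitude delta_t in K, via dT = -K * T0 * d(sigma1 + sigma2)."""
    return delta_t / (K_THERMO * t0)

def stress_concentration(delta_t_local, delta_t_nominal):
    """SCF from two measured temperature amplitudes; the calibration
    factor K * T0 cancels in the ratio."""
    return delta_t_local / delta_t_nominal
```

The ratio form is why TSA is attractive for stress concentration work: the stress concentration factor can be read off two temperature amplitudes without an absolute calibration.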

  10. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1996-01-01

    This paper discusses the growing needs for distributed system monitoring and compares them to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address local area network (LAN), wide area network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring. (author)

  11. Distributed Scaffolding: Synergy in Technology-Enhanced Learning Environments

    Science.gov (United States)

    Ustunel, Hale H.; Tokel, Saniye Tugba

    2018-01-01

    When technology is employed challenges increase in learning environments. Kim et al. ("Sci Educ" 91(6):1010-1030, 2007) presented a pedagogical framework that provides a valid technology-enhanced learning environment. The purpose of the present design-based study was to investigate the micro context dimension of this framework and to…

  12. Feasibility analysis of EDXRF method to detect heavy metal pollution in ecological environment

    Science.gov (United States)

    Hao, Zhixu; Qin, Xulei

    2018-02-01

    The change of heavy metal content in the water environment, soil and plants can reflect the change of heavy metal pollution in the ecological environment, so it is important to monitor the trend of heavy metal pollution in the eco-environment by using the heavy metal content of the water environment, soil and plants. However, the content of heavy metals in nature is very low, the background elements of water environment, soil and plant samples are complex, and there are many interfering factors in the EDXRF system that will affect the spectral analysis results and reduce the detection accuracy. Through a comparative analysis of several heavy metal detection methods, it is concluded that the EDXRF method is superior to other chemical methods in detection accuracy and feasibility when heavy metal pollution in soil is tested in the ecological environment.

  13. Gerald: a general environment for radiation analysis and design

    International Nuclear Information System (INIS)

    Boyle, Ch.; Oliveira, P.I.E. de; Oliveira, C.R.E. de; Adams, M.L.; Galan, J.M.

    2005-01-01

    Full text of publication follows: This paper describes the status of the GERALD interactive workbench for the analysis of radiation transport problems. GERALD guides the user through the various steps that are necessary to solve a radiation transport problem, and is aimed at education, research and industry. The advantages of such a workbench are many: quality assurance of problem setup, interaction of the user with the problem solution, preservation of theory and legacy research codes, and rapid prototyping and testing of new methods. The environment is of general applicability, catering for analytical, deterministic and stochastic analysis of the radiation problem, and is not tied to one specific solution method or code. However, GERALD is being developed as a portable, modular, open-source framework which lends itself quite naturally to the coupling of existing computational tools through specifically developed plug-ins. By offering a common route for setting up, solving and analyzing radiation transport problems, GERALD offers the possibility of methods intercomparison and validation. Such a flexible radiation transport environment will also facilitate the coupling of radiation physics methods to other physical phenomena and their application to other areas such as medical physics and the environment. (authors)

  14. RECOMMENDATIONS FOR THE DISTRIBUTION STRATEGY IN CHANGING MARKET ENVIRONMENT : Case: Belgian Brewery Van Honsebrouck in Russia

    OpenAIRE

    Louckx, Yulia

    2014-01-01

    The efficient formulation of a distribution strategy becomes vital to the success and survival of any organization, especially when it is involved in international trade. Today’s world is particularly challenging due to rapidly changing market conditions. Therefore, in order to be able to compete, satisfy customers, and meet the needs of other stakeholders profitably, it is crucial for any company to make profound market environment analyses, react to changes in the market and adjust strategies accor...

  15. NEAMS Nuclear Waste Management IPSC: evaluation and selection of tools for the quality environment

    International Nuclear Information System (INIS)

    Bouchard, Julie F.; Stubblefield, William Anthony; Vigil, Dena M.; Edwards, Harold Carter

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Nuclear Waste Management Integrated Performance and Safety Codes (NEAMS Nuclear Waste Management IPSC) is to provide an integrated suite of computational modeling and simulation (M and S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. These M and S capabilities are to be managed, verified, and validated within the NEAMS Nuclear Waste Management IPSC quality environment. M and S capabilities and the supporting analysis workflow and simulation data management tools will be distributed to end-users from this same quality environment. The same analysis workflow and simulation data management tools that are to be distributed to end-users will be used for verification and validation (V and V) activities within the quality environment. This strategic decision reduces the number of tools to be supported, and increases the quality of tools distributed to end users due to rigorous use by V and V activities. This report documents an evaluation of the needs, options, and tools selected for the NEAMS Nuclear Waste Management IPSC quality environment.

  16. Mercury speciation and distribution in a glacierized mountain environment and their relevance to environmental risks in the inland Tibetan Plateau.

    Science.gov (United States)

    Sun, Xuejun; Zhang, Qianggong; Kang, Shichang; Guo, Junming; Li, Xiaofei; Yu, Zhengliang; Zhang, Guoshuai; Qu, Dongmei; Huang, Jie; Cong, Zhiyuan; Wu, Guangjian

    2018-08-01

    Glacierized mountain environments can preserve and release mercury (Hg) and play an important role in regional Hg biogeochemical cycling. However, the behavior of Hg in glacierized mountain environments and its environmental risks remain poorly constrained. In this research, glacier meltwater, runoff and wetland water were sampled in the Zhadang-Qugaqie basin (ZQB), a typical glacierized mountain environment in the inland Tibetan Plateau, to investigate Hg distribution and its relevance to environmental risks. The total mercury (THg) concentrations ranged from 0.82 to 6.98 ng·L⁻¹, and non-parametric pairwise multiple comparisons of the THg concentrations among the three different water samples showed that the THg concentrations were comparable. The total methylmercury (TMeHg) concentrations ranged from 0.041 to 0.115 ng·L⁻¹, and non-parametric pairwise multiple comparisons of the TMeHg concentrations showed a significant difference. Both the THg and MeHg concentrations of water samples from the ZQB were comparable to those of other remote areas, indicating that Hg concentrations in the ZQB watershed are equivalent to the global background level. Particulate Hg was the predominant form of Hg in all runoff samples, and was significantly correlated with the total suspended particle (TSP) concentration but not with the dissolved organic carbon (DOC) concentration. The distribution of mercury in the wetland water differed from that of the other water samples: THg exhibited a significant correlation with DOC as well as with TMeHg, whereas neither THg nor TMeHg was associated with TSP. Based on the above findings and the results from previous work, we propose a conceptual model illustrating the four Hg distribution zones in glacierized environments. We highlight that wetlands may enhance the potential hazards of Hg released from melting glaciers, making them a vital zone for investigating the environmental effects of Hg in glacierized environments and beyond.

  17. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  18. A Distributed Control System Prototyping Environment to Support Control Room Modernization

    Energy Technology Data Exchange (ETDEWEB)

    Lew, Roger Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas Anthony [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs; however, the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft’s Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glasstop digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design. It provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS) specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently have less emphasis on providing functionality to support novel interaction paradigms. Because of WPF’s large user-base, Microsoft can provide an extremely mature tool. Within process control applications, WPF is

  19. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distributions of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids....
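
A common way to extract scaling exponents of this kind is a least-squares fit on the log-log rank-mass (complementary cumulative) plot. A numpy sketch on synthetic Pareto-distributed fragment masses (the real shower data are not reproduced here, and the true exponent of 0.8 is invented for the demonstration):

```python
import numpy as np

def scaling_exponent(masses):
    """Estimate the exponent b of N(>m) ~ m**(-b) by least squares on
    the log-log rank-mass plot (a simple, approximate estimator)."""
    m = np.sort(np.asarray(masses, dtype=float))[::-1]
    rank = np.arange(1, len(m) + 1)      # rank r = count of masses >= m
    slope, _ = np.polyfit(np.log(m), np.log(rank), 1)
    return -slope

# Synthetic Pareto fragments: if U is uniform on (0, 1], then
# U**(-1/b) has P(X > x) = x**(-b) for x >= 1.
rng = np.random.default_rng(0)
frags = (1.0 - rng.random(500)) ** (-1.0 / 0.8)
b_hat = scaling_exponent(frags)
```

Multiple scaling regimes, as reported for half the showers, would show up as distinct linear segments with different slopes on the same plot.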

  20. Radioactivity Distribution in Malaysian Marine Environment. Chapter 4

    International Nuclear Information System (INIS)

    Zal U'yun Wan Mahmood; Abdul Kadir Ishak; Norfaizal Mohamad; Wo, Y.M.; Kamarudin Samuding

    2015-01-01

    The marine radioactivity distribution mapping in Malaysia was developed with the aim of illustrating the pattern of distribution of both anthropogenic and natural radionuclides in seawater and sediments. The data collected will form the basis for an anthropogenic radioactivity.

  1. Investment decisions in environment of uncertainties integrated to the viability analysis of the distribution and transmission projects; Decisao de investimentos em ambiente de incertezas integrada a analise de viabilidade de projetos de transmissao e distribuicao

    Energy Technology Data Exchange (ETDEWEB)

    Gazzi, L.M.P. [Universidade de Sao Paulo (USP), SP (Brazil). Curso de Pos-graduacao em Engenharia Eletrica; Ramos, D.S. [Universidade de Sao Paulo (USP), SP (Brazil)

    2009-07-01

    This work intends to support the investment decisions necessary for the expansion of the 138/88 kV network belonging to electric energy distribution companies. The economic and financial analysis considers the requirements of the Regulator for the tariff recognition of an investment, using economic engineering techniques to evaluate the return on the invested capital, but in an environment of uncertainty, where the best decision depends on variables exogenous to the traditional planning process. To make viable the inclusion of the main random variables that condition the economic and financial performance of the alternatives analyzed for decision making, the methodology based on 'Real Options' was chosen, a technique that has been used in the financial market for some time.

  2. The Usability Analysis of An E-Learning Environment

    Directory of Open Access Journals (Sweden)

    Fulya TORUN

    2015-10-01

    Full Text Available In this research, an e-learning environment was developed for teacher candidates taking the course on Scientific Research Methods. The course contents were adapted to one of the constructivist approach models, referred to as 5E, and an expert opinion was obtained on the compliance of this model. A usability analysis was also performed to determine the usability of the e-learning environment. The participants of the research comprised 42 teacher candidates. The mixed method was used in the research. Three different data collection tools were used in order to measure the three basic dimensions of usability analysis: effectiveness, efficiency, and satisfaction. Two of the data collection tools were scales developed by other researchers and were applied with the approval of the researchers involved. The third tool, a usability test, was prepared by the researchers who conducted this study to determine the participants’ success in handling the twelve tasks assigned to them with respect to the use of the e-learning environment, the time they spent in that environment, and the number of clicks they performed. According to the results of the analyses performed on the data obtained, the usability of the developed e-learning environment proved to be high.

  3. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlation or resonance studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  4. Distributed computing environment monitoring and user expectations

    International Nuclear Information System (INIS)

    Cottrell, R.L.A.; Logg, C.A.

    1995-11-01

    This paper discusses the growing needs for distributed system monitoring and compares it to current practices. It then goes on to identify the components of distributed system monitoring and shows how they are implemented and successfully used at one site today to address the Local Area Network (LAN), network services and applications, the Wide Area Network (WAN), and host monitoring. It shows how this monitoring can be used to develop realistic service level expectations and also identifies the costs. Finally, the paper briefly discusses the future challenges in network monitoring

  5. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    Small hydro systems are an important source of renewable energy and have been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy in flowing water to produce electricity, is often questioned due to the inconsistent and intermittent power generated. Potential analysis of a small hydro system, which is mainly dependent on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in a small hydro system. By considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the significant characteristic of the Pearson system of applying a direct correlation between the first four statistical moments of the distribution. The advantage of applying the various statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.
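
    The Pearson system classifies a distribution from its first four moments via the squared skewness β1 and kurtosis β2, using Pearson's criterion κ to pick the family. A minimal sketch of that selection step, using a hypothetical `pearson_type` helper rather than the method actually used in the paper:

```python
import numpy as np

def pearson_type(flows):
    """Classify a sample into a Pearson distribution family from its
    first four statistical moments (illustrative helper, not the paper's code)."""
    x = np.asarray(flows, dtype=float)
    mu = x.mean()
    m2 = ((x - mu) ** 2).mean()
    m3 = ((x - mu) ** 3).mean()
    m4 = ((x - mu) ** 4).mean()
    beta1 = m3 ** 2 / m2 ** 3  # squared skewness
    beta2 = m4 / m2 ** 2       # kurtosis
    # Pearson's criterion kappa decides the main type
    denom = 4 * (4 * beta2 - 3 * beta1) * (2 * beta2 - 3 * beta1 - 6)
    if denom == 0:
        return "boundary case"
    kappa = beta1 * (beta2 + 3) ** 2 / denom
    if kappa < 0:
        return "Type I (beta)"
    if 0 < kappa < 1:
        return "Type IV"
    if kappa > 1:
        return "Type VI"
    return "boundary case (e.g. normal, Type III or Type V)"

# A symmetric, short-tailed sample falls in the Type I (beta) region
rng = np.random.default_rng(1)
print(pearson_type(rng.uniform(0.0, 10.0, 5000)))
```

    Fitting the selected family to observed stream flow data then gives the availability distribution used in the potential analysis.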

  6. Analysis of the distribution of heavy metals in the soils of Bagega ...

    African Journals Online (AJOL)

    Uncontrolled exploitation and degradation of the environment over the past few decades, as a result of urbanization and poverty, have caused serious damage to lives and properties. The study analysed the spatial distribution of heavy metals (Fe, Cu and Zn) in Bagega, Zamfara state. Three mapping units were identified ...

  7. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  8. WCET Analysis of Java Bytecode Featuring Common Execution Environments

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Thomsen, Bent; Frost, Christian

    2011-01-01

    We present a novel tool for statically determining the Worst Case Execution Time (WCET) of Java Bytecode-based programs called Tool for Execution Time Analysis of Java bytecode (TetaJ). This tool differentiates itself from existing tools by separating the individual constituents of the execution environment into independent components. The prime benefit is that it can be used for execution environments featuring common embedded processors and software implementations of the JVM. TetaJ employs a model checking approach for statically determining WCET, where the Java program, the JVM, and the hardware...

  9. Definition of Distribution Network Tariffs Considering Distributed Generation and Demand Response

    DEFF Research Database (Denmark)

    Soares, Tiago; Faria, Pedro; Vale, Zita

    2014-01-01

    The use of distribution networks in the current scenario of high penetration of Distributed Generation (DG) is a problem of great importance. In the competitive environment of electricity markets and smart grids, Demand Response (DR) is also gaining notable impact, with several benefits for the whole system. The proposed tariff definition is based on the determination of topological distribution factors and the consequent application of the MW-mile method. The application of the proposed tariff definition methodology is illustrated in a distribution network with 33 buses, 66 DG units, and 32 consumers with DR capacity.
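
    The MW-mile principle allocates each network asset's cost to users in proportion to the flow each user imposes on it. A toy sketch of that allocation step, with hypothetical line costs and user flow contributions (the topological distribution factors that would produce those contributions are not computed here):

```python
def mw_mile_tariffs(line_costs, usage):
    """Allocate each line's annual cost to network users in proportion to
    the MW flow each user imposes on that line (MW-mile principle).

    line_costs: {line: annual cost}
    usage: {line: {user: MW contribution of that user on the line}}
    Both data structures are hypothetical, for illustration only.
    """
    tariffs = {}
    for line, cost in line_costs.items():
        total = sum(usage[line].values())  # total MW usage of the line
        for user, mw in usage[line].items():
            # each user pays the line cost in proportion to its usage share
            tariffs[user] = tariffs.get(user, 0.0) + cost * mw / total
    return tariffs

line_costs = {"L1": 1000.0, "L2": 500.0}
usage = {"L1": {"DG1": 4.0, "C1": 6.0}, "L2": {"C1": 5.0}}
print(mw_mile_tariffs(line_costs, usage))
# {'DG1': 400.0, 'C1': 1100.0}
```

    In the paper's setting, the per-line contributions would come from topological distribution factors for each DG unit and DR-capable consumer.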

  10. Disease clusters, exact distributions of maxima, and P-values.

    Science.gov (United States)

    Grimson, R C

    1993-10-01

    This paper presents combinatorial (exact) methods that are useful in the analysis of disease cluster data obtained from small environments, such as buildings and neighbourhoods. Maxwell-Boltzmann and Fermi-Dirac occupancy models are compared in terms of appropriateness of representation of disease incidence patterns (space and/or time) in these environments. The methods are illustrated by a statistical analysis of the incidence pattern of bone fractures in a setting wherein fracture clustering was alleged to be occurring. One of the methodological results derived in this paper is the exact distribution of the maximum cell frequency in occupancy models.
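
    In the Maxwell-Boltzmann occupancy model, n cases fall independently and uniformly into k cells, and P(max cell count <= r) = (n!/k^n) times the coefficient of x^n in (sum_{j<=r} x^j/j!)^k. The sketch below illustrates this general exact-computation technique with rational arithmetic; it is not the paper's derivation, and the final example numbers are hypothetical:

```python
from fractions import Fraction
from math import factorial

def p_max_at_most(n, k, r):
    """Exact P(max cell count <= r) when n distinguishable cases fall
    independently and uniformly into k cells (Maxwell-Boltzmann model)."""
    # coefficients of the truncated exponential polynomial sum_{j<=r} x^j / j!
    base = [Fraction(1, factorial(j)) for j in range(r + 1)]
    poly = [Fraction(1)]
    for _ in range(k):  # raise to the k-th power, truncating beyond x^n
        new = [Fraction(0)] * min(len(poly) + r, n + 1)
        for i, a in enumerate(poly):
            for j, b in enumerate(base):
                if i + j <= n:
                    new[i + j] += a * b
        poly = new
    coeff = poly[n] if n < len(poly) else Fraction(0)
    return coeff * factorial(n) / Fraction(k) ** n

def p_max_at_least(n, k, m):
    """Exact P-value for observing some cell with at least m cases."""
    return 1 - p_max_at_most(n, k, m - 1)

# e.g. 10 fractures over 12 months: chance that some month has >= 4
print(float(p_max_at_least(10, 12, 4)))
```

    Because everything is kept as exact fractions, the resulting P-values carry no floating-point approximation error, which matters for the small counts typical of cluster investigations.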

  11. Spatial Distribution of Viruses Associated with Planktonic and Attached Microbial Communities in Hydrothermal Environments

    Science.gov (United States)

    Nunoura, Takuro; Kazama, Hiromi; Noguchi, Takuroh; Inoue, Kazuhiro; Akashi, Hironori; Yamanaka, Toshiro; Toki, Tomohiro; Yamamoto, Masahiro; Furushima, Yasuo; Ueno, Yuichiro; Yamamoto, Hiroyuki; Takai, Ken

    2012-01-01

    Viruses play important roles in marine surface ecosystems, but little is known about viral ecology and virus-mediated processes in deep-sea hydrothermal microbial communities. In this study, we examined virus-like particle (VLP) abundances in planktonic and attached microbial communities, which occur in physical and chemical gradients in both deep and shallow submarine hydrothermal environments (mixing waters between hydrothermal fluids and ambient seawater and dense microbial communities attached to chimney surface areas or macrofaunal bodies and colonies). We found that viruses were widely distributed in a variety of hydrothermal microbial habitats, with the exception of the interior parts of hydrothermal chimney structures. The VLP abundance and VLP-to-prokaryote ratio (VPR) in the planktonic habitats increased as the ratio of hydrothermal fluid to mixing water increased. On the other hand, the VLP abundance in attached microbial communities was significantly and positively correlated with the whole prokaryotic abundance; however, the VPRs were always much lower than those for the surrounding hydrothermal waters. This is the first report to show VLP abundance in the attached microbial communities of submarine hydrothermal environments, which presented VPR values significantly lower than those in planktonic microbial communities reported before. These results suggested that viral lifestyles (e.g., lysogenic prevalence) and virus interactions with prokaryotes are significantly different among the planktonic and attached microbial communities that are developing in the submarine hydrothermal environments. PMID:22210205

  12. Distributed Resource Energy Analysis and Management System (DREAMS) Development for Real-time Grid Operations

    Energy Technology Data Exchange (ETDEWEB)

    Nakafuji, Dora [Hawaiian Electric Company, Honolulu, HI (United States); Gouveia, Lauren [Hawaiian Electric Company, Honolulu, HI (United States)

    2016-10-24

    This project supports development of the next-generation, integrated energy management system (EMS) infrastructure able to incorporate advanced visualization of behind-the-meter distributed resource information and probabilistic renewable energy generation forecasts to inform real-time operational decisions. The project involves end-users and active feedback from a Utility Advisory Team (UAT) to help inform how information can be used to enhance operational functions (e.g. unit commitment, load forecasting, Automatic Generation Control (AGC) reserve monitoring, ramp alerts) within two major EMS platforms. Objectives include: Engaging utility operations personnel to develop user input on displays, set expectations, test and review; Developing ease-of-use and timeliness metrics for measuring enhancements; Developing prototype integrated capabilities within two operational EMS environments; Demonstrating an integrated decision analysis platform with real-time wind and solar forecasting information and timely distributed resource information; Seamlessly integrating new 4-dimensional information into operations without increasing workload and complexities; Developing sufficient analytics to inform and confidently transform and adopt new operating practices and procedures; Disseminating project lessons learned through industry-sponsored workshops and conferences; Building on collaborative utility-vendor partnership and industry capabilities

  13. Antibiotics in the coastal environment of the Hailing Bay region, South China Sea: Spatial distribution, source analysis and ecological risks.

    Science.gov (United States)

    Chen, Hui; Liu, Shan; Xu, Xiang-Rong; Zhou, Guang-Jie; Liu, Shuang-Shuang; Yue, Wei-Zhong; Sun, Kai-Feng; Ying, Guang-Guo

    2015-06-15

    In this study, the occurrence and spatial distribution of 38 antibiotics in surface water and sediment samples of the Hailing Bay region, South China Sea, were investigated. Twenty-one, 16 and 15 of the 38 antibiotics were detected. The concentrations of antibiotics in the water phase were correlated positively with chemical oxygen demand and nitrate. The source analysis indicated that untreated domestic sewage was the primary source of antibiotics in the study region. Fluoroquinolones showed strong sorption capacity onto sediments due to their high pseudo-partitioning coefficients. Risk assessment indicated that oxytetracycline, norfloxacin and erythromycin-H2O posed high risks to aquatic organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Balancing Uplink and Downlink under Asymmetric Traffic Environments Using Distributed Receive Antennas

    Science.gov (United States)

    Sohn, Illsoo; Lee, Byong Ok; Lee, Kwang Bok

    Recently, multimedia services are increasing with the widespread use of various wireless applications such as web browsers, real-time video, and interactive games, which results in traffic asymmetry between the uplink and downlink. Hence, time division duplex (TDD) systems, which provide advantages in efficient bandwidth utilization under asymmetric traffic environments, have become one of the most important issues in future mobile cellular systems. It is known that two types of intercell interference, referred to as crossed-slot interference, additionally arise in TDD systems; the performances of the uplink and downlink transmissions are degraded by BS-to-BS crossed-slot interference and MS-to-MS crossed-slot interference, respectively. The resulting performance imbalance between the uplink and downlink makes network deployment severely inefficient. Previous works have proposed intelligent time slot allocation algorithms to mitigate the crossed-slot interference problem. However, they require centralized control, which causes large signaling overhead in the network. In this paper, we propose to change the shape of the cellular structure itself. The conventional cellular structure is easily transformed into the proposed cellular structure with distributed receive antennas (DRAs). We set up a statistical Markov chain traffic model and analyze the bit error performances of the conventional and proposed cellular structures under asymmetric traffic environments. Numerical results show that the uplink and downlink performances of the proposed cellular structure become balanced with the proper number of DRAs, and thus the proposed cellular structure is notably cost-effective in network deployment compared to the conventional one. As a result, extending the conventional cellular structure with DRAs is a remarkably cost-effective solution to support asymmetric traffic environments in future mobile cellular

  15. Matrix-specific distribution and diastereomeric profiles of hexabromocyclododecane (HBCD) in a multimedia environment: Air, soil, sludge, sediment, and fish.

    Science.gov (United States)

    Jo, Hyeyeong; Son, Min-Hui; Seo, Sung-Hee; Chang, Yoon-Seok

    2017-07-01

    Hexabromocyclododecane (HBCD) contamination and its diastereomeric profile were investigated in a multi-media environment along a river at the local scale in air, soil, sludge, sediment, and fish samples. The spatial distribution of HBCD in each matrix showed a different result. The highest concentrations of HBCD in air and soil were detected near a general industrial complex; in the sediment and sludge samples, they were detected in the down-stream region (i.e., urban area). Each matrix showed the specific distribution patterns of HBCD diastereomers, suggesting continuous inputs of contaminants, different physicochemical properties, or isomerizations. The particle phases in air, sludge, and fish matrices were dominated by α-HBCD, owing to HBCD's various isomerization processes and different degradation rate in the environment, and metabolic capabilities of the fish; in contrast, the sediment and soil matrices were dominated by γ-HBCD because of the major composition of the technical mixtures and the strong adsorption onto solid particles. Based on these results, the prevalent and matrix-specific distribution of HBCD diastereomers suggested that more careful consideration should be given to the characteristics of the matrices and their effects on the potential influence of HBCD at the diastereomeric level. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Confocal µ-XRF for 3D analysis of element distribution in hot environmental particles

    International Nuclear Information System (INIS)

    Bielewski, M.; Eriksson, M.; Himbert, J.; Simon, R.; Betti, M.; Hamilton, T.F.

    2007-01-01

    Studies on the fate and transport of radioactive contaminants in the environment are often constrained by a lack of knowledge of the elemental distribution and general behavior of particulate-bound radionuclides contained in hot particles. A number of hot particles were previously isolated from soil samples collected at former U.S. nuclear test sites in the Marshall Islands and characterized using non-destructive techniques [1]. The present investigation at HASYLAB is part of a larger research program at ITU regarding the characterization of environmental radioactive particles from different locations and source terms. Radioactive particles in the environment are formed under a number of different release scenarios and, as such, their physicochemical properties may provide a basis for identifying source-term specific contamination regimes. Consequently, studies on hot particles are not only important in terms of studying the elemental composition and geochemical behavior of hot particles but may also lead to advances in assessing the long-term impacts of radioactive contamination on the environment. Six particles isolated from soil samples collected at the Marshall Islands were studied. The element distribution in the particles was determined by confocal µ-XRF analysis using the ANKA FLUO beamline. A CRL (compound refractive lens) was used to focus the exciting beam and a polycapillary half lens to collimate the detector. The dimensions of the confocal spot were measured by the 'knife edge scanning' method with a thin gold structure placed on a Si wafer. Values of 3.1 x 1.4 x 18.4 µm were achieved, defined as the FWHMs of the measured L-line intensity profiles, when 19.1 keV exciting radiation was used. The collected XRF spectra were analyzed offline with the AXIL [2] software to obtain net intensities of element characteristic lines. Further data processing and reconstruction of the element distribution was done with the software 'R' [3], dedicated to statistical computing.

  17. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  18. Radionuclide X-ray fluorescence analysis of components of the environment

    International Nuclear Information System (INIS)

    Toelgyessy, J.; Havranek, E.; Dejmkova, E.

    1983-12-01

    The physical foundations and methodology of radionuclide X-ray fluorescence analysis are described. The sources of air, water and soil pollution are listed, and the transfer of impurities into biological materials is described. A detailed description is presented of the sampling of air, soil and biological materials and their preparation for analysis. Greatest attention is devoted to radionuclide X-ray fluorescence analysis of the components of the environment. (ES)

  19. Neutron activation analysis applied to energy and environment

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1975-01-01

    Neutron activation analysis was applied to a number of problems concerned with energy production and the environment. Burning of fossil fuel, the search for new sources of uranium, possible presence of toxic elements in food and water, and the relationship of trace elements to cardiovascular disease are some of the problems in which neutron activation was used. (auth)

  20. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it]
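
    Scale economies in such a cross-section study are typically read off a log-log cost regression: an output elasticity below one means average cost falls as output grows. A hedged sketch on synthetic data (the actual ENEL data and the paper's specification are not reproduced here; all numbers are made up):

```python
import numpy as np

# Synthetic cross-section of 147 distribution zones (hypothetical values)
rng = np.random.default_rng(42)
n = 147
energy = rng.lognormal(mean=5.0, sigma=0.6, size=n)  # output delivered per zone
area = rng.lognormal(mean=6.0, sigma=0.5, size=n)    # area served per zone
# True output elasticity 0.7 < 1, i.e. economies of scale by construction
cost = 3.0 * energy ** 0.7 * area ** 0.2 * rng.lognormal(0.0, 0.1, n)

# ln C = b0 + b1 ln Q + b2 ln A + error, estimated by cross-section OLS
X = np.column_stack([np.ones(n), np.log(energy), np.log(area)])
beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
print(f"output elasticity: {beta[1]:.2f}")  # below 1 -> economies of scale
```

    In the paper's setting, the zone area and a quality indicator would enter as the exogenous cost drivers alongside output.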

  1. A distributed system for visualizing and analyzing multivariate and multidisciplinary data

    Science.gov (United States)

    Jacobson, Allan S.; Allen, Mark; Bailey, Michael; Blom, Ronald; Blume, Leo; Elson, Lee

    1993-01-01

    The Linked Windows Interactive Data System (LinkWinds) is being developed with NASA support. The objective of this proposal is to adapt and apply that system in a complex network environment containing elements to be found by scientists working in multidisciplinary teams on very large-scale and distributed data sets. The proposed three-year program will develop specific visualization and analysis tools, to be exercised locally and remotely in the LinkWinds environment, to demonstrate visual data analysis, interdisciplinary data analysis, and cooperative, interactive televisualization and analysis of data by geographically separated science teams. These demonstrators will involve at least two science disciplines with the aim of producing publishable results.

  2. Questionnaire on the measurement condition of distribution coefficient

    International Nuclear Information System (INIS)

    Takebe, Shinichi; Kimura, Hideo; Matsuzuru, Hideo

    2001-05-01

    The distribution coefficient is used in various transport models to evaluate the migration behavior of radionuclides in the environment and is a very important parameter in the environmental impact assessment of nuclear facilities. The questionnaire was carried out for the purpose of informing a proposal for a standard method of measuring the distribution coefficient. This report summarizes the results of the questionnaire on sampling methods and storage conditions, pretreatment methods, the analysis items in the physical/chemical characterization of the sample, and the distribution coefficient measurement methods and conditions, at research institutes within the country. (author)

  3. Knowledge Flow Rules of Modern Design under Distributed Resource Environment

    Directory of Open Access Journals (Sweden)

    Junning Li

    2013-01-01

    Full Text Available The process of modern design under the distributed resource environment is interpreted as the process of knowledge flow and integration. As the acquisition of new knowledge strongly depends on resources, knowledge flow can be influenced by technical, economic, social relation factors, and so forth. In order to achieve greater efficiency of knowledge flow and make the product more competitive, the root causes of the above factors should first be identified. In this paper, the authors attempt to reveal the nature of design knowledge flow from the perspectives of fluid dynamics and energy. The knowledge field effect and knowledge agglomeration effect are analyzed, respectively: the knowledge field effect model considering a single task node and the single knowledge energy model in the knowledge flow are established, and the general expression of knowledge energy conservation, taking into account the kinetic and potential energy of knowledge, is built. Then, the knowledge flow rules and their influencing factors, including complete and incomplete transfer of design knowledge, are studied. Finally, the coupling knowledge flows in the knowledge service platform for modern design are analyzed to verify the feasibility of the research work.

  4. Preliminary study of the 129I distribution in environment of La Hague reprocessing plant with the help of a terrestrial moss: Homalotecium sericeum. Study report

    International Nuclear Information System (INIS)

    1999-01-01

    The preliminary study of the 129I distribution has highlighted the limits of using a terrestrial moss of the Homalotecium sericeum type as a biological indicator. Nevertheless, this preliminary study made it possible to map the spatial distribution of this radioelement around the La Hague reprocessing plant (source term), revealing the existence of four geographic areas as a function of the collected activities. The levels are generally under 99 Bq/kg dry. It is recommended to improve our knowledge of the transfers and quantity of iodine-129 from the marine environment to the terrestrial environment, as well as of the factors able to modify the spatial distribution of this radionuclide. (N.C.)

  5. Real-time Control Mediation in Agile Distributed Software Development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Aaen, Ivan; Mathiassen, Lars

    2008-01-01

    Agile distributed environments pose particular challenges related to control of quality and collaboration in software development. Moreover, while face-to-face interaction is fundamental in agile development, distributed environments must rely extensively on mediated interactions. On this backdrop, we analyze how control was mediated over distance by technology through real-time exchanges. Contrary to previous research, the analysis suggests that both formal and informal elements of real-time mediated control were used; that evolving goals and adjustment of expectations were two of the main issues in real-time mediated control exchanges; and that the actors, despite distances in space and culture, developed a clan-like pattern mediated by technology to help control quality and collaboration in software development.

  6. The distributed development environment for SDSS software

    International Nuclear Information System (INIS)

    Berman, E.; Gurbani, V.; Mackinnon, B.; Newberg, H.; Nicinski, T.; Petravick, D.; Pordes, R.; Sergey, G.; Stoughton, C.; Lupton, R.

    1994-04-01

    The authors present an integrated science software development environment and code maintenance and support system for the Sloan Digital Sky Survey (SDSS), now being actively used throughout the collaboration.

  7. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    ... greenhouse gas emissions and the current deregulation of electric energy ... Visual composition and temporal behaviour of GUI ...

  8. Distribution of bat-borne viruses and environment patterns.

    Science.gov (United States)

    Afelt, Aneta; Lacroix, Audrey; Zawadzka-Pawlewska, Urszula; Pokojski, Wojciech; Buchy, Philippe; Frutos, Roger

    2018-03-01

    Environmental modifications are leading to biodiversity changes, loss and habitat disturbance. This in turn increases contacts between wildlife and humans and hence the risk of transmission and emergence of zoonotic diseases. We analyzed the environment and land use using remote spatial data around the sampling locations of bats positive for coronavirus (21 sites) and astrovirus (11 sites) collected in 43 sites. A clear association between viruses and hosts was observed. Viruses associated with synanthropic bat genera, such as Myotis or Scotophilus, were associated with highly transformed habitats with human presence, while viruses associated with fruit bat genera were correlated with natural environments with dense forest, grassland areas and regions of high elevation. In particular, group C betacoronaviruses were associated with mosaic habitats found in anthropized environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains a few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs against that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute the coarse-grained (low-resolution) isotopic distributions with comparable accuracy and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
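The coarse-grained computation underlying polynomial-based approaches like MIDAs's can be illustrated by treating each element's isotope pattern as a polynomial in mass and convolving repeatedly. The sketch below is a naive illustration, not MIDAs's actual algorithm: the isotope masses and abundances are rounded standard values, and the pruning threshold is arbitrary.

```python
from collections import defaultdict

# Isotope tables as (mass, abundance) pairs; rounded standard values,
# for illustration only.
ISOTOPES = {
    "H": [(1.007825, 0.999885), (2.014102, 0.000115)],
    "C": [(12.0, 0.9893), (13.003355, 0.0107)],
    "O": [(15.994915, 0.99757), (16.999132, 0.00038), (17.999160, 0.00205)],
}

def convolve(dist_a, dist_b, prune=1e-12):
    """Multiply two isotopic 'polynomials' given as {mass: probability}."""
    out = defaultdict(float)
    for ma, pa in dist_a.items():
        for mb, pb in dist_b.items():
            out[round(ma + mb, 6)] += pa * pb
    return {m: p for m, p in out.items() if p > prune}

def isotopic_distribution(formula):
    """formula: dict like {'C': 2, 'H': 6, 'O': 1} (ethanol)."""
    dist = {0.0: 1.0}
    for element, count in formula.items():
        single = {m: p for m, p in ISOTOPES[element]}
        for _ in range(count):          # naive repeated convolution
            dist = convolve(dist, single)
    return dict(sorted(dist.items()))

peaks = isotopic_distribution({"C": 2, "H": 6, "O": 1})
```

For ethanol the most abundant peak falls at the monoisotopic mass near 46.0419 Da; production tools replace the repeated convolution with squaring tricks or FFTs for speed.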

  10. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestions that might occur in a distribution network with a high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby the line loading profiles, over infinitely small changes of parameters by differentiating the KKT conditions of the convex quadratic programming over which the DT method is formed. Three case ...
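Differentiating the KKT conditions of a convex QP can be sketched on a toy equality-constrained problem: the same KKT matrix that yields the optimum also yields the sensitivities when solved against a perturbation right-hand side. The matrices below are made-up illustrative data, not a distribution network model.

```python
import numpy as np

# Toy equality-constrained convex QP:  min 0.5 x'Qx + c'x  s.t.  Ax = b
Q = np.array([[2.0, 0.0], [0.0, 4.0]])
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]
KKT = np.block([[Q, A.T], [A, np.zeros((m, m))]])

# Stationarity Qx + c + A'lam = 0 and feasibility Ax = b give
# KKT @ [x; lam] = [-c; b]
sol = np.linalg.solve(KKT, np.concatenate([-c, b]))
x, lam = sol[:n], sol[n:]

# Sensitivity dx/db: differentiate the KKT conditions w.r.t. b.
# Q dx + A' dlam = 0,  A dx = db  ->  same KKT matrix, RHS [0; I].
rhs = np.vstack([np.zeros((n, m)), np.eye(m)])
dxdb = np.linalg.solve(KKT, rhs)[:n]   # column j holds dx/db_j
```

For this problem the optimum is x = (0.5, 0.5) and dx/db = (2/3, 1/3), which matches perturbing b by hand.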

  11. Space Environment Automated Alerts and Anomaly Analysis Assistant (SEA^5) for NASA

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a comprehensive analysis and dissemination system (Space Environment Automated Alerts & Anomaly Analysis Assistant: SEA^5) that will...

  12. Radioactivity in the Norwegian Marine Environment

    International Nuclear Information System (INIS)

    2002-01-01

    The national monitoring programme for radioactivity in the marine environment was established in 1999. The programme is coordinated by the Norwegian Radiation Protection Authority (NRPA) in cooperation with the Institute of Marine Research (IMR). The principal objective of the programme is to document levels, distributions and trends of radionuclides in the marine environment. Data regarding discharges of radionuclides from both Norwegian and other sources are collected, and assessments of the resulting radiation exposures of humans and biota will be carried out. Results from the analysis of environmental samples collected in 1999 are presented in a new NRPA report (NRPA, 2001:9, ''Radioactivity in the Marine Environment 1999''). Some results from the monitoring programme in 1999 are summarised below, along with more recent data concerning concentrations of the radionuclide technetium-99. (author)

  13. An Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    Science.gov (United States)

    Robinson, Jeffrey S.

    2011-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs design and analysis in relative isolation from others. This is possible, in most cases, either because the amount of interdisciplinary coupling is minimal, or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable, as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design and Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary designs for launch vehicle and high speed atmospheric flight configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, packaging, propulsion, trajectory, aerodynamics, aerothermodynamics, engine and airframe subsystem design, thermal and structural analysis, and vehicle closure into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics

  14. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauge data on their own as input carries great uncertainties regarding runoff estimation, especially when the area is large and the rainfall is measured and recorded at irregularly spaced gauging stations. Hence, spatial interpolation is the key to obtaining a continuous and orderly rainfall distribution at unknown points, to serve as the input to the rainfall-runoff processes for distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river. Thus, a good knowledge of rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time-series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods and the average rainfall distribution. Sensitivity analysis for the distance and elevation parameters was conducted to see the variation produced. The accuracy of these interpolated datasets was examined using cross-validation assessment.
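A minimal sketch of the IDW scheme named above (IDEW additionally weights by elevation); the gauge coordinates, rainfall values and power parameter are illustrative, not the Kelantan data.

```python
import math

def idw(stations, target, power=2.0):
    """Inverse-distance weighting.
    stations: list of ((x, y), rainfall) tuples; target: (x, y) grid point.
    Weight each gauge by 1/d**power and return the weighted mean."""
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:                  # target coincides with a gauge
            return value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Three illustrative gauges (coordinates in arbitrary units, rainfall in mm)
gauges = [((0.0, 0.0), 10.0), ((1.0, 0.0), 20.0), ((0.0, 1.0), 30.0)]
estimate = idw(gauges, (0.5, 0.5))    # equidistant from all three gauges
```

Because the target here is equidistant from all three gauges, the estimate reduces to their plain mean (20 mm); the `power` parameter controls how sharply nearby gauges dominate.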

  15. Distributed In-Line Analysis of Water Pollution in a Spanish Lake

    Directory of Open Access Journals (Sweden)

    Juan V. Capella

    2012-01-01

    Full Text Available This paper presents the architecture and implementation of a set of novel sensor nodes designed to measure ammonium, nitrate and chloride in real time, sending the data over a network to the base station in order to control the pollution in a lake. The results obtained are compared with those provided by the corresponding reference methods. Recovery analyses with ion-selective electrodes and standard methods, a study of interferences, and an evaluation of the major sensor features have also been carried out. The use of a wireless system for monitoring purposes will not only reduce the overall monitoring system cost in terms of facility setup and labor costs, but will also provide flexibility in terms of distance. The major advantages of the proposed in-line analysis compared with classical off-line procedures are the elimination of contamination due to sample handling, the minimization of the overall cost of data acquisition, the possibility of real-time analysis allowing the rapid detection of pollutants, the ability to obtain detailed spatial and temporal data sets of complete environments (yielding the spatial distribution of the analyzed parameters as well as their variation over time), and finally the possibility of performing measurements in locations which are difficult to access (in this case a deep lake).

  16. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    Science.gov (United States)

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  17. Solution to food distribution; Shokuhin ryutsu solution

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, K; Shimizu, T [Fuji Electric Co. Ltd., Tokyo (Japan)

    2000-05-10

    The environment surrounding the food industry has changed greatly over the past several years. It has become an important problem of enterprise management to structure a business model that can flexibly follow quality assurance based on hazard analysis and critical control points (HACCP), safety assurance, and changes in consumption behavior (individual preferences and variety). With the theme narrowed down to food safety (an industrial engineering [IE] and HACCP consulting frame) and a distribution infrastructure seen from the standpoint of consumers (how to structure a supply chain management [SCM] system), this paper describes our activities in the food distribution business. (author)

  18. Distributed expert systems for nuclear reactor control

    International Nuclear Information System (INIS)

    Otaduy, P.J.

    1992-01-01

    A network of distributed expert systems is the heart of a prototype supervisory control architecture developed at the Oak Ridge National Laboratory (ORNL) for an advanced multimodular reactor. Eight expert systems encode knowledge on signal acquisition, diagnostics, safeguards, and control strategies in a hybrid rule-based, multiprocessing and object-oriented distributed computing environment. An interactive simulation of a power block consisting of three reactors and one turbine provides a realistic testbed for performance analysis of the integrated control system in real time. Implementation details and representative reactor transients are discussed

  19. Distribution and relevance of iodinated X-ray contrast media and iodinated trihalomethanes in an aquatic environment.

    Science.gov (United States)

    Xu, Zhifa; Li, Xia; Hu, Xialin; Yin, Daqiang

    2017-10-01

    Distribution and relevance of iodinated X-ray contrast media (ICM) and iodinated disinfection byproducts (I-DBPs) in a real aquatic environment have rarely been documented. In this paper, some ICM were shown to be strongly correlated with I-DBPs through an investigation of five ICM and five iodinated trihalomethanes (I-THMs) in surface water and two drinking water treatment plants (DWTPs) of the Yangtze River Delta, China. The total ICM concentrations in Taihu Lake and the Huangpu River ranged from 88.7 to 131 ng/L and from 102 to 252 ng/L, respectively, while the total I-THM concentrations ranged from 128 to 967 ng/L in Taihu Lake and from 267 to 680 ng/L in the Huangpu River. Iohexol, the dominant ICM, showed a significant positive correlation (p < 0.01) with CHClI2 in Taihu Lake. Iopamidol and iomeprol correlated positively (p < 0.01) with some I-THMs in the Huangpu River. The observed pronounced correlations between ICM and I-THMs indicate that ICM play an important role in the formation of I-THMs in a real aquatic environment. Characteristics of the I-THM species distributions indicated that I-THMs may be transformed under natural conditions. Both DWTPs showed negligible removal efficiencies for total ICM (<20%). Strikingly high concentrations of total I-THMs were observed in the finished water (2848 ng/L in the conventional DWTP and 356 ng/L in the advanced DWTP). Obvious transformation of ICM to I-THMs was observed during the chlorination and ozonation processes in the DWTPs. We suggest that ICM are an important source of I-DBP formation in the real aquatic environment.

  20. Security in Distributed Collaborative Environments: Limitations and Solutions

    Science.gov (United States)

    Saadi, Rachid; Pierson, Jean-Marc; Brunie, Lionel

    The main goal of establishing collaboration between heterogeneous environments is to create a pervasive context that provides nomadic users with ubiquitous access to digital information and surrounding resources. However, the constraints of mobility and heterogeneity raise a number of crucial issues related to security, especially authentication, access control and privacy. First of all, in this chapter we explore the trust paradigm, especially its transitive capability, to enable trusted peer-to-peer collaboration. In this manner, when each organization sets its own security policy to recognize (authenticate) users who are members of a trusted community and provide them with local access (access control), trust transitivity between peers allows users to gain broader, controlled access inside the pervasive environment. Next, we study the problem of users' privacy. In pervasive and ubiquitous environments, nomadic users gather and exchange certificates or credentials that provide them rights to access, by transitivity, unknown but trusted environments. These signed documents embed an increasing number of attributes that need to be filtered according to the contextual situation. In this chapter, we propose a new morph signature enabling each certificate owner to preserve his privacy by disclosing or blinding some sensitive attributes according to the situation faced.

  1. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high frequency silicon bipolar oscillator using a common emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillators were reviewed. Transmission line design was carried out using Butterworth LC ...

  2. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior distribution is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
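The conjugate gamma analysis can be sketched in a few lines: for inverse Rayleigh data with density f(x; θ) = (2θ/x³)·exp(−θ/x²), the statistic Σ 1/xᵢ² is sufficient, so a Gamma(a, b) prior on θ gives a gamma posterior in closed form. The prior values, sample size and time point below are illustrative choices, not the paper's; the Bayes estimate shown is for squared-error loss only.

```python
import random, math

random.seed(42)

# Simulate inverse Rayleigh data: F(x) = exp(-theta/x**2),
# so inverse-CDF sampling gives x = sqrt(-theta / ln U).
theta_true, n = 2.0, 500
data = [math.sqrt(-theta_true / math.log(random.random())) for _ in range(n)]

# Gamma(a, b) prior on theta (shape a, rate b); illustrative values.
a, b = 1.0, 1.0

# Conjugacy: likelihood ~ theta**n * exp(-theta * sum(1/x**2)),
# so the posterior is Gamma(a + n, b + sum(1/x**2)).
T = sum(1.0 / x**2 for x in data)
post_shape, post_rate = a + n, b + T

# Bayes estimate of theta under squared-error loss = posterior mean.
theta_hat = post_shape / post_rate

# Posterior mean of the reliability R(t) = 1 - exp(-theta/t**2),
# in closed form via the gamma moment-generating function.
t = 2.0
r_hat = 1.0 - (post_rate / (post_rate + 1.0 / t**2)) ** post_shape
```

Other loss functions (e.g. LINEX or entropy losses, as compared in the article) replace the posterior mean with different functionals of the same gamma posterior.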

  3. Finite element analysis and modeling of temperature distribution in turning of titanium alloys

    Directory of Open Access Journals (Sweden)

    Moola Mohan Reddy

    2018-04-01

    Full Text Available The titanium alloy Ti-6Al-4V has been widely used in aerospace and medical applications, and demand is ever-growing due to its outstanding properties. In this paper, finite element modelling of the machinability of Ti-6Al-4V using cubic boron nitride and polycrystalline diamond tools in a dry turning environment was investigated. This research was carried out to generate mathematical models, at the 95% confidence level, for cutting force and temperature distribution in terms of cutting speed, feed rate and depth of cut. The Box-Behnken design of experiments was used as the response surface model to generate combinations of cutting variables for modelling. Then, finite element simulation was performed using AdvantEdge®. The influence of each cutting parameter on the cutting responses was investigated using analysis of variance. The analysis shows that depth of cut is the most influential parameter on the resultant cutting force, whereas feed rate is the most influential parameter on cutting temperature. The effect of the cutting-edge radius was also investigated for both tools. This research would help to maximize tool life and improve surface finish.
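The Box-Behnken construction used above (pairs of factors at the ±1 coded levels with the remaining factor at the centre, plus replicated centre points) can be sketched as follows; the factor names and centre-run count are illustrative assumptions, since the abstract does not state them.

```python
from itertools import combinations

# Three coded cutting factors from the abstract; actual physical levels
# behind the +/-1 coding are not given there.
factors = ["speed", "feed", "depth"]

def box_behnken(k, center_runs=3):
    """Edge-midpoint runs (each pair of factors at +/-1, the rest at 0)
    plus replicated centre points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(run)
    runs.extend([[0] * k for _ in range(center_runs)])
    return runs

design = box_behnken(len(factors))   # 12 factorial runs + 3 centre points
```

For three factors this yields 15 runs, far fewer than the 27 of a full three-level factorial, which is why Box-Behnken designs are popular for fitting quadratic response surface models.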

  4. Social network analysis of study environment

    Directory of Open Access Journals (Sweden)

    Blaženka Divjak

    2010-06-01

    Full Text Available The student working environment influences student learning and achievement level. In this respect, social aspects of students' formal and non-formal learning play a special role in the learning environment. The main research problem of this paper is to find out whether students' academic performance influences their position in different student social networks, and to identify other predictors of this position. In the process of problem solving we use Social Network Analysis (SNA) based on data collected from students at the Faculty of Organization and Informatics, University of Zagreb. There are two data samples: the basic sample (N=27) and the extended sample (N=52). We collected data on socio-demographic position, academic performance, learning and motivation styles, student status (full-time/part-time), and attitudes towards individual and team work as well as informal cooperation. Afterwards, five different networks (exchange of learning materials, teamwork, informal communication, and the basic and aggregated social networks) were constructed. These networks were analyzed with different metrics, the most important being betweenness, closeness and degree centrality. The main results are, firstly, that a position in a social network cannot be forecast by academic success alone and, secondly, that part-time students tend to form separate groups that are poorly connected with full-time students. In general, the position of a student in the social networks of the study environment can influence student learning as well as her/his future employability, and is therefore worth investigating.
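Two of the centrality measures named above can be hand-rolled for a toy network; the five-student graph below is invented for illustration, not the authors' data.

```python
from collections import deque

# Undirected toy network of five students; edges are invented.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
nodes = sorted({v for e in edges for v in e})
adj = {v: set() for v in nodes}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def degree_centrality(v):
    # Fraction of the other n-1 actors directly tied to v.
    return len(adj[v]) / (len(nodes) - 1)

def closeness_centrality(v):
    # (n-1) / (sum of shortest-path distances from v), connected graph.
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return (len(nodes) - 1) / sum(dist.values())

central = max(nodes, key=closeness_centrality)   # the broker node "C"
```

Node C sits on every path between the A-B cluster and the D-E tail, so it tops both measures; betweenness centrality, also used in the paper, would single it out even more sharply.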

  5. Development of the virtual research environment for analysis, evaluation and prediction of global climate change impacts on the regional environment

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander

    2017-04-01

    A description and the first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at the development of an Internet-accessible computation and information environment providing specialists unskilled in numerical modelling and software design, decision-makers and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and instruments for detailed analysis, assessment and prediction of the impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the leading Russian institution involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The project includes several major directions of research, listed below. 1. Preparation of geo-referenced data sets describing the dynamics of current and possible future climate and environmental changes in detail. 2. Improvement of methods of analysis of climate change. 3. Enhancing the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of applicable detailed description of

  6. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape and ecological analysis of the distribution of the helminth fauna is provided. As a result of studies on 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  8. Study on distribution and behavior of long-lived radionuclides in surface soil environment

    International Nuclear Information System (INIS)

    Morita, Shigemitsu; Watanabe, Hitoshi; Katagiri, Hiromi; Akatsu, Yasuo; Ishiguro, Hideharu

    1996-01-01

    Technetium-99 (99Tc) and neptunium-237 (237Np) are important radionuclides for environmental assessment around nuclear fuel cycle facilities, because they have long half-lives and relatively high mobility in the environment. Therefore, we have studied the determination, distribution and behavior of such long-lived radionuclides in the surface soil environment. A new analytical technique using Inductively Coupled Plasma Mass Spectrometry (ICP-MS) was applied to the determination of long-lived radionuclides in environmental samples. The determination method consists of dry ashing, anion exchange and solvent extraction to eliminate the interfering elements, followed by ICP-MS measurement. The sensitivity of this method was 10 to 100,000 times higher, and the counting time 300 to 100,000 times shorter, than those of conventional radioanalytical methods. Soil samples were collected at nine points, and a core soil sample was collected with an electric core sampler at one point. The core soil sample was divided into eight layers. The depth profiles showed that more than 90% of the 99Tc and 237Np was retained in the surface layer, up to 10 cm in depth, which contained a large amount of organic material. The results suggest that the content of organic material in soil is related to the adsorption of 99Tc and 237Np onto soil. (author)

  9. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coatings. The samples were heat treated at 1150 °C for 190 h, and the data analysis used image-based modelling built on real 3D images measured by computed tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper, as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD (as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling') (Li et al., 2017) [1].

  10. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world's manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate ... a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems ...

  11. Distribution and transfer of radiocesium and radioiodine in the environment following the Fukushima nuclear accident - Distribution and transfer of radiocesium and radioiodine in the environment of Fukushima Prefecture following the nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Yasuyuki; Ohno, Takeshi; Sugiyama, Midori [Gakushuin University, Toshima-ku, Tokyo, 171-8588 (Japan); Sato, Mamoru [Fukushima Agricultural Technology Centre, Koriyama, Fukushima 963-0531 (Japan); Matsuzaki, Hiroyuki [The University of Tokyo, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2014-07-01

    Large quantities of radioiodine and radiocesium were released from the accident at the Fukushima Daiichi Nuclear Power Plant (FDNPP) in March 2011. We have carried out intensive studies on the distribution and behaviour of these nuclides in the environment following the accident. Two topics from our studies are presented. (1) Retrospective estimation of I-131 deposition through the analysis of I-129 in soil: It is necessary to obtain deposition data for radioiodine in Fukushima Prefecture for the assessment of thyroid doses due to the accident. However, the short half-life of I-131 (8 days) made it impossible to obtain adequate sample coverage that would permit direct determination of the regional deposition patterns of I-131 within the prefecture and surrounding areas. On the other hand, I-129 released simultaneously during the accident still remains in soil, due to its long half-life of 1.57×10^7 years. In order to reconstruct the I-131 deposition, we determined I-129 concentrations by accelerator mass spectrometry (AMS). A good correlation was found between the measured concentrations of I-131 and I-129 in soils collected in the vicinity of the FDNPP. We have analyzed I-129 in more than 500 soil samples collected systematically from Fukushima Prefecture. Using the obtained results, the I-131 deposition was calculated in different areas and a deposition map for I-131 was constructed. We also studied the vertical distribution of I-129 in soil. (2) Peculiar accumulation of radiocesium in some plants and mushrooms: The radioactivity levels in agricultural crops decreased markedly in the months following the accident, and their concentrations became lower than the Japanese guideline for foodstuffs (500 Bq/kg in 2011, and 100 Bq/kg after 2012). However, some agricultural products such as tea leaves and citrus fruits showed relatively higher values. Our analytical results on the distribution of radiocesium in tea trees show that the root uptake

  12. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. The algorithm is efficient as it identifies only the necessary and sufficient failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized for the generation of system inequalities, which is useful in reliability estimation of capacitated networks.
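Minimal-cutset generation of the kind the paper builds on can be illustrated by brute force on a small source-to-sink network; the five-edge bridge network below is a textbook example, not the paper's test system, and the paper's algorithm avoids this exponential enumeration.

```python
from itertools import combinations

# Bridge network: source "s", sink "t", five undirected edges.
edges = {"e1": ("s", "a"), "e2": ("s", "b"), "e3": ("a", "b"),
         "e4": ("a", "t"), "e5": ("b", "t")}

def connected(kept, src="s", dst="t"):
    """Is dst reachable from src using only the kept edges?"""
    seen, frontier = {src}, [src]
    while frontier:
        u = frontier.pop()
        for u2, v2 in (edges[e] for e in kept):
            for x, y in ((u2, v2), (v2, u2)):
                if x == u and y not in seen:
                    seen.add(y)
                    frontier.append(y)
    return dst in seen

# Enumerate edge subsets by increasing size; a minimal cutset is a
# disconnecting set with no previously found cutset as a proper subset.
cutsets = []
for r in range(1, len(edges) + 1):
    for cand in combinations(edges, r):
        cut = set(cand)
        if connected(set(edges) - cut):
            continue
        if not any(prev < cut for prev in cutsets):
            cutsets.append(cut)
```

For this network the four minimal cutsets are {e1, e2}, {e4, e5}, {e1, e3, e5} and {e2, e3, e4}; system inequalities for reliability estimation can then be written over exactly these sets.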

  13. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and with regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is in question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, 'potential customer outages' and 'in the event of a failure', identify uncertainty

  14. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  15. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  16. Stratigraphy, facies analysis and depositional environments of the Upper Unit of Abu Roash "E" member in the Abu Gharadig field, Western Desert, Egypt

    Science.gov (United States)

    Hewaidy, Abdel Galil; Elshahat, O. R.; Kamal, Samy

    2018-03-01

    The Abu Roash "E" member is an important hydrocarbon reservoir-producing horizon in the Abu Gharadig Field (north Western Desert, Egypt). This study builds a facies analysis and depositional environment model for the Upper Unit of the Abu Roash "E" member in the Abu Gharadig Field. This target has been achieved through sedimentological, wireline log, lithostratigraphic and biostratigraphic analyses of more than 528 feet of core. The high-resolution biostratigraphic analysis provides a calibration for the paleo-bathymetry and depositional environment interpretations. Biozonation and lithostratigraphic markers are used to constrain stratigraphic correlation. Integration of the core description with petrographic microfacies analysis by microscope examination provides an excellent indication of the rock types and depositional environments. Five depositional facies types are detected, including carbonate inner ramp, tidal flat, tidal channel, supra-tidal and tide-dominated delta facies. This model helps in understanding the reservoir distribution of the Upper Unit of the Abu Roash "E" member, as well as the lateral and vertical facies changes, contributing to the development strategy for the remaining hydrocarbon reserves of this important oil reservoir.

  17. Synthetic environments

    Science.gov (United States)

    Lukes, George E.; Cain, Joel M.

    1996-02-01

    The Advanced Distributed Simulation (ADS) Synthetic Environments Program seeks to create robust virtual worlds from operational terrain and environmental data sources of sufficient fidelity and currency to interact with the real world. While some applications can be met by direct exploitation of standard digital terrain data, more demanding applications -- particularly those supporting operations 'close to the ground' -- are well-served by emerging capabilities for 'value-adding' by the user working with controlled imagery. For users to rigorously refine and exploit controlled imagery within functionally different workstations, they must have a shared framework that allows interoperability within and between these environments in terms of passing image and object coordinates and other information using a variety of validated sensor models. The Synthetic Environments Program is now being expanded to address rapid construction of virtual worlds with research initiatives in digital mapping, softcopy workstations, and cartographic image understanding. The Synthetic Environments Program is also participating in a joint initiative for a sensor model applications programmer's interface (API) to ensure that a common controlled imagery exploitation framework is available to all researchers, developers and users. This presentation provides an introduction to ADS and the associated requirements for synthetic environments to support synthetic theaters of war. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefiting from automated analysis of mapping, earth resources and reconnaissance imagery. And it provides an overview and status of the joint initiative for a sensor model API.

  18. Distributed Energy Systems in the Built Environment - European Legal Framework on Distributed Energy Systems in the Built Environment

    NARCIS (Netherlands)

    Pront-van Bommel, S.; Bregman, A.

    2013-01-01

    This paper aims to outline the stimuli provided by European law for promoting integrated planning of distributed renewable energy installations, on the one hand, and the limitations thereof on the other, also with regard to the role of local governments.

  19. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  20. Bioprospecting and Functional Analysis of Neglected Environments

    DEFF Research Database (Denmark)

    Vogt, Josef Korbinian

    on Biological Diversity (explained in Chapter 3). Proteolytic enzymes – described in Chapter 4 – are the target for bioprospecting due to their high market value. Section II describes methods used for the analysis of metagenomic and RNA-seq datasets, including Manuscript I, which includes the taxonomic...... of arctic marine metagenomes reveals bacterial strategies for deep sea persistence (Manuscript II). Furthermore, this extreme environment is a fertile ground to mine for novel proteolytic enzymes. Manuscript III presents a bioinformatics approach to identify sequences for potential commercialization...

  1. Paleomagnetism.org : An online multi-platform open source environment for paleomagnetic data analysis

    NARCIS (Netherlands)

    Koymans, Mathijs R.; Langereis, C.G.; Pastor-Galán, D.; van Hinsbergen, D.J.J.

    2016-01-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The

  2. LXtoo: an integrated live Linux distribution for the bioinformatics community.

    Science.gov (United States)

    Yu, Guangchuang; Wang, Li-Gen; Meng, Xiao-Hua; He, Qing-Yu

    2012-07-19

    Recent advances in high-throughput technologies have dramatically increased biological data generation. However, many research groups lack computing facilities and specialists; this is an obstacle that remains to be addressed. Here, we present a Linux distribution, LXtoo, to provide a flexible computing platform for bioinformatics analysis. Unlike most existing live Linux distributions for bioinformatics, which limit their usage to sequence analysis and protein structure prediction, LXtoo incorporates a comprehensive collection of bioinformatics software, including data mining tools for microarray and proteomics, protein-protein interaction analysis, and computationally complex tasks like molecular dynamics. Moreover, most of the programs have been configured and optimized for high-performance computing. LXtoo aims to provide a well-supported computing environment tailored for bioinformatics research, reducing duplication of effort in building computing infrastructure. LXtoo is distributed as a Live DVD and freely available at http://bioinformatics.jnu.edu.cn/LXtoo.

  3. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  4. Distributed resistance model for the analysis of wire-wrapped rod bundles

    International Nuclear Information System (INIS)

    Ha, K. S.; Jung, H. Y.; Kwon, Y. M.; Jang, W. P.; Lee, Y. B.

    2003-01-01

    A partial flow blockage within a fuel assembly in a liquid metal reactor may result in localized boiling or a failure of the fuel cladding. A precise analysis of this phenomenon is therefore required for a safe LMR design. The MATRA-LMR code developed by KAERI models the flow distribution in an assembly by using the wire forcing function to account for the effects of wire-wrap spacers, which is important to the analysis of flow blockage. However, the wire forcing function cannot analyze cases in which a flow blockage occurs, so this model was replaced with the distributed resistance model, and a validation calculation was carried out against the FFM 2A experiment.

  5. Qualitative analysis of precipitation distribution in Poland with use of different data sources

    Directory of Open Access Journals (Sweden)

    J. Walawender

    2008-04-01

    Full Text Available Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied to climatic research to manage, investigate and display all kinds of weather data.

    The main objective of this study is to demonstrate that GIS is a useful tool to examine and visualise precipitation distribution obtained from different data sources: ground measurements, satellite and radar data.

    Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
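The reclassification and contingency-table steps of such a geoprocessing chain can be sketched in a few lines. This is a minimal illustration, not the study's ArcGIS workflow; the grids, class boundaries, and function names are made-up example values:

```python
import numpy as np

def reclassify(grid, thresholds):
    """Bin a precipitation raster into classes: np.digitize assigns
    class k to values with thresholds[k-1] <= value < thresholds[k]."""
    return np.digitize(grid, thresholds)

def contingency(a, b, n_classes):
    """Cross-tabulate two reclassified rasters cell by cell."""
    table = np.zeros((n_classes, n_classes), dtype=int)
    for ca, cb in zip(a.ravel(), b.ravel()):
        table[ca, cb] += 1
    return table

# Example: gauge-based vs. radar-derived rainfall on the same 2x2 grid (mm)
gauge = np.array([[0.5, 3.0], [7.0, 12.0]])
radar = np.array([[0.8, 6.0], [7.5, 11.0]])
classes = [1.0, 5.0, 10.0]  # class boundaries in mm (illustrative)
table = contingency(reclassify(gauge, classes), reclassify(radar, classes), 4)
```

Cells on the diagonal of `table` are grid cells where both data sources fall in the same rainfall class, which is exactly what the study's contingency tables summarise.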

  6. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The reliability worth's precision can be affected greatly by the customer interruption cost model used. The choice of the cost models can change system and load point reliability indices....... In this study, a cascade correlation neural network is adopted to further develop two cost models comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  7. Activation analysis in the environment: Science and technology

    International Nuclear Information System (INIS)

    Lenihan, J.

    1989-01-01

    Science is disciplined curiosity. Activation analysis was created more than 50 yr ago by Hevesy's curiosity and Levi's experimental skill. Technology is the exploitation of machines and materials for the fulfillment of human needs or wants. The early history of neutron activation analysis (NAA) was greatly influenced by military requirements. Since then the technique has found applications in many disciplines, including materials science, medicine, archaeology, geochemistry, agriculture, and forensic science. More recently, neutron activation analysts, responding to increasing public interest and concern, have made distinctive contributions to the study of environmental problems. Activation analysis, though it uses some procedures derived from physics, is essentially a chemical technique. The chemical study of the environment may be reviewed under many headings; three are discussed here: (1) occupational medicine, (2) health of the general public, and (3) environmental pollution.

  8. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Full Text Available Studying the content of images is an important topic in which reasonable and accurate analysis of images is required. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these crowded media have made image analysis a prominent research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both colour and grey images, and the final step of the proposed method compares the results obtained in the different cases of the test phase. The statistical parameters are implemented to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean values via the implementation of the system. The major issue addressed in the work is the brightness distribution, studied via statistical measures applied under different types of lighting.
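The mean and standard-deviation measures named above are straightforward to compute. A minimal sketch follows; the luminance weights used to reduce a colour image to grey levels are a common convention, assumed here rather than taken from the paper:

```python
import numpy as np

def brightness_stats(image):
    """Mean and standard deviation of pixel intensity; an RGB image is
    first reduced to grey levels with the usual luminance weights."""
    arr = np.asarray(image, dtype=float)
    if arr.ndim == 3:  # colour image: weight the R, G, B channels
        arr = arr @ np.array([0.299, 0.587, 0.114])
    return arr.mean(), arr.std()

# A 2x2 grey test image: half black, half white
mean, std = brightness_stats([[0, 255], [255, 0]])  # both 127.5
```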

  9. Internal environment analysis and its improvement in company Ltd "German Products Baltics"

    OpenAIRE

    Štekels, Jānis

    2012-01-01

    The topic of the Bachelor's thesis is "Internal environment analysis and its improvement in company Ltd 'German Products Baltics'". The objective of the thesis is to explore and analyze the company's internal environment and to develop proposals for its improvement. The subject is topical, because every company, before starting its business or changing how it operates, should understand its strengths and weaknesses by analyzing its internal environment. The thesis consists of ...

  10. Distributed energy resources and benefits to the environment

    DEFF Research Database (Denmark)

    Akorede, Mudathir Funsho; Hizam, Hashim; Pouresmaeil, Edris

    2010-01-01

    generation in 2030 would be produced from fossil fuels. This global dependence on fossil fuels is dangerous to our environment in terms of their emissions unless specific policies and measures are put in place. Nevertheless, recent research reveals that a reduction in the emissions of these gases is possible...... on fossil fuels to our environment. The study finally justifies how DG technologies could substantially reduce greenhouse gas emissions when fully adopted; hence, reducing the public concerns over human health risks caused by the conventional method of electricity generation....

  11. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend...... on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  12. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utilities and cost them tens of millions in repairs and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications on improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
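The reliability indices named above are simple ratios over the customers served. A minimal sketch, with illustrative numbers that are not from the thesis (EUE, the expected unserved energy, would additionally require per-outage load data):

```python
def reliability_indices(outages, customers_served):
    """outages: list of (customers_interrupted, duration_hours) events.
    SAIFI = total customer interruptions / customers served.
    SAIDI = total customer-hours of interruption / customers served."""
    interruptions = sum(c for c, _ in outages)
    customer_hours = sum(c * d for c, d in outages)
    return interruptions / customers_served, customer_hours / customers_served

# Two outages on a 1000-customer feeder (illustrative numbers)
saifi, saidi = reliability_indices([(500, 2.0), (200, 0.5)], 1000)
# saifi = 0.7 interruptions/customer, saidi = 1.1 h/customer
```

Placing switching devices or DGs so that fewer customers are interrupted, or for shorter durations, lowers both indices, which is how such studies quantify the improvement.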

  13. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Full Text Available Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ) emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  14. Electronic structure and partial charge distribution of Doxorubicin in different molecular environments.

    Science.gov (United States)

    Poudel, Lokendra; Wen, Amy M; French, Roger H; Parsegian, V Adrian; Podgornik, Rudolf; Steinmetz, Nicole F; Ching, Wai-Yim

    2015-05-18

    The electronic structure and partial charge of doxorubicin (DOX) in three different molecular environments-isolated, solvated, and intercalated in a DNA complex-are studied by first-principles density functional methods. It is shown that the addition of solvating water molecules to DOX, together with the proximity to and interaction with DNA, has a significant impact on the electronic structure as well as on the partial charge distribution. Significant improvement in estimating the DOX-DNA interaction energy is achieved. The results are further elucidated by resolving the total density of states and surface charge density into different functional groups. It is concluded that the presence of the solvent and the details of the interaction geometry matter greatly in determining the stability of DOX complexation. Ab initio calculations on realistic models are an important step toward a more accurate description of the long-range interactions in biomolecular systems. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. [Correlative analysis of the diversity patterns of regional surface water, NDVI and thermal environment].

    Science.gov (United States)

    Duan, Jin-Long; Zhang, Xue-Lei

    2012-10-01

    Taking Zhengzhou City, the capital of Henan Province in Central China, as the study area, and using the theories and methodologies of diversity, a discreteness evaluation of the regional surface water, normalized difference vegetation index (NDVI), and land surface temperature (LST) distributions was conducted at a 2 km x 2 km grid scale. Both the NDVI and the LST were divided into 4 levels, their spatial distribution diversity indices were calculated, and their connections were explored. The results showed that the theories and methodologies of diversity are operable and of practical significance for evaluating the discreteness of the spatial distribution of a regional thermal environment. There was a high overlap of location between the distribution of surface water and the lowest-temperature region, and high vegetation coverage was often accompanied by low land surface temperature. From 1988 to 2009, the discreteness of the surface water distribution in the City had an obvious decreasing trend. The discreteness of the surface water distribution had a close correlation with the discreteness of the temperature region distribution, while the discreteness of the NDVI classification distribution had a more complicated correlation with the discreteness of the temperature region distribution. Therefore, more environmental factors need to be included for a better evaluation.
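A common way to quantify the "diversity" of such a classified spatial distribution is the Shannon index over class proportions. The abstract does not state which index the study used, so the following is an assumed, illustrative formulation:

```python
import math
from collections import Counter

def shannon_diversity(class_labels):
    """Shannon diversity H = -sum(p_i * ln p_i) over class proportions,
    e.g. the LST level assigned to each 2 km x 2 km grid cell."""
    counts = Counter(class_labels)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Four grid cells spread evenly over 4 LST levels: maximal diversity ln(4)
h = shannon_diversity([1, 2, 3, 4])
```

A grid in which every cell falls in one class gives H = 0; the more evenly the cells spread across classes, the higher H, which is the intuition behind "discreteness" of a distribution.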

  16. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. Fourteen models known from the literature are thoroughly analysed. Through this analysis, a schematic approach to the categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling, as well as being a manual, or recipe, for constructing such a model....

  17. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  18. APPLICATION OF GIS AND STATISTICAL METHODS IN ESTABLISHMENT OF THE RELATIONSHIPS BETWEEN GEOCHEMICAL CHARACTERISTICS OF THE ENVIRONMENT AND GEOGRAPHICAL DISTRIBUTION OF DISEASE INCIDENCE RATE

    Directory of Open Access Journals (Sweden)

    Ante Kutle

    2012-12-01

    Full Text Available The geochemical environment can influence human health, causing chronic medical problems related to long-term, low-level exposure to toxic agents such as trace elements. Humans can be exposed to toxic substances directly, by inhalation of airborne dust, or indirectly, through the food chain or by consumption of local water for drinking, cooking, personal hygiene and recreational purposes. Chronic medical problems related to the geochemical characteristics of the environment can also be caused by a chronic deficit of chemical elements essential for humans. In this paper we present several applications of GIS and statistical methods for relating the geographical distribution of diseases to the geochemical characteristics of the environment. In addition, we present methods applied for distinguishing the natural distribution of elements from anthropogenic contributions, which is important information for establishing the protective measures necessary to decrease health risk (the paper is published in Croatian).

  19. Evaluation of Occupational Cold Environments: Field Measurements and Subjective Analysis

    Science.gov (United States)

    OLIVEIRA, A. Virgílio M.; GASPAR, Adélio R.; RAIMUNDO, António M.; QUINTELA, Divo A.

    2014-01-01

    The present work is dedicated to the study of occupational cold environments in food distribution industrial units. Field measurements and a subjective assessment based on an individual questionnaire were considered. The survey was carried out in 5 Portuguese companies. The field measurements include 26 workplaces, while a sample of 160 responses was considered for the subjective assessment. In order to characterize the level of cold exposure, the Required Clothing Insulation Index (IREQ) was adopted. The IREQ index highlights that in the majority of the workplaces the clothing ensembles worn are inadequate, namely in the freezing chambers where the protection provided by clothing is always insufficient. The questionnaires results show that the food distribution sector is characterized by a female population (70.6%), by a young work force (60.7% are less than 35 yr old) and by a population with a medium-length professional career (80.1% in this occupation for less than 10 yr). The incidence of health effects which is higher among women, the distribution of protective clothing (50.0% of the workers indicate one garment) and the significant percentage of workers (>75%) that has more difficulties in performing the activity during the winter represent other important results of the present study. PMID:24583510

  20. Evaluation of occupational cold environments: field measurements and subjective analysis.

    Science.gov (United States)

    Oliveira, A Virgílio M; Gaspar, Adélio R; Raimundo, António M; Quintela, Divo A

    2014-01-01

    The present work is dedicated to the study of occupational cold environments in food distribution industrial units. Field measurements and a subjective assessment based on an individual questionnaire were considered. The survey was carried out in 5 Portuguese companies. The field measurements include 26 workplaces, while a sample of 160 responses was considered for the subjective assessment. In order to characterize the level of cold exposure, the Required Clothing Insulation Index (IREQ) was adopted. The IREQ index highlights that in the majority of the workplaces the clothing ensembles worn are inadequate, namely in the freezing chambers where the protection provided by clothing is always insufficient. The questionnaires results show that the food distribution sector is characterized by a female population (70.6%), by a young work force (60.7% are less than 35 yr old) and by a population with a medium-length professional career (80.1% in this occupation for less than 10 yr). The incidence of health effects which is higher among women, the distribution of protective clothing (50.0% of the workers indicate one garment) and the significant percentage of workers (>75%) that has more difficulties in performing the activity during the winter represent other important results of the present study.

  1. Building a foundation to study distributed information behaviour

    Directory of Open Access Journals (Sweden)

    Terry L. von Thaden

    2007-01-01

    Full Text Available Introduction. The purpose of this research is to assess information behaviour as it pertains to operational teams in dynamic safety-critical operations. Method. In this paper, I describe some of the problems faced by crews on modern flight decks and suggest a framework modelled on Information Science, Human Factors, and Activity Theory research to assess the distribution of information actions, namely information identification, gathering and use, by teams of users in a dynamic, safety-critical environment. Analysis. By analysing the information behaviour of crews who have accidents and those who do not, researchers may be able to ascertain how they (fail to) make use of essential, safety-critical information in their information environment. The ultimate goal of this research is to differentiate information behaviour among the distinct outcomes. Results. This research affords the possibility of discerning differences in distributed information behaviour, illustrating that crews who err to the point of an accident appear to practice different distributed information behaviour than those who do not. This foundation serves to operationalise team sense-making by illustrating the social practice of information structuring within the activity of the work environment. Conclusion. The distributed information behaviour framework provides a useful structure to study the patterning and organization of information distributed over space and time to reach a common goal. This framework may allow researchers and investigators alike to identify critical information activity in the negotiation of meaning in high-reliability, safety-critical work, eventually informing safer practice. This framework is applicable to other domains.

  2. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies to distributed systems is difficult, in particular for a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces, to tackle this issue. One of the key features in AspectKE* is the program analysis predicates and functions that provide information on future behavior of a program. With a dual value evaluation mechanism that handles results of static analysis and runtime values at the same time, those functions and predicates enable the users to specify security policies in a uniform manner. Our two-staged implementation strategy gathers fundamental static analysis information at load-time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE*, and successfully implemented security aspects for a distributed chat system and an electronic healthcare record...

  3. Radioactivity in the Marine Environment. Chapter 1

    International Nuclear Information System (INIS)

    Zal U'yun Wan Mahmood; Abdul Kadir Ishak; Norfaizal Mohamad; Wo, Y.M.; Kamarudin Samuding

    2015-01-01

    Radionuclides (radioactive isotopes, or radioisotopes) are widely distributed on the ground, primarily in marine environments. Nowadays, more than 340 isotopes have been identified on our Earth, especially in the marine environment. Of that total, 80 isotopes are radioactive. Radioactivity enters the marine environment through the direct and indirect distribution of radionuclides

  4. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    Science.gov (United States)

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    The increase in complexity of neuronal network models has escalated the efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets amongst multiple processors to achieve better hardware performance. On parallel machines for neuronal networks, interprocessor spike exchange consumes a large section of overall simulation time. In NEURON, the Message Passing Interface (MPI) is used for communication between processors; the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Increasing the number of processors achieves concurrency and better performance, but it inversely affects MPI_Allgather, which increases communication time between processors. This necessitates improving the communication methodology to decrease the spike exchange time over distributed memory systems. This work improved the MPI_Allgather method using Remote Memory Access (RMA) by moving two-sided communication to one-sided communication; the use of a recursive doubling mechanism facilitates efficient communication between the processors in precise steps. This approach enhanced communication concurrency and improved overall runtime, making NEURON more efficient for simulation of large neuronal network models.
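The recursive doubling exchange pattern described in this record can be sketched without any MPI dependency by simulating each rank's memory as a dictionary. This is a minimal illustration of the log2(P)-step allgather, not the paper's NEURON/RMA implementation:

```python
# Minimal simulation of a recursive-doubling allgather: each "rank" starts
# with one spike block and, after log2(P) pairwise exchange rounds, holds
# all P blocks. Pure Python illustration; no MPI or RMA involved.

def recursive_doubling_allgather(blocks):
    """blocks[i] is the data initially held by rank i; len(blocks) must be a power of two."""
    p = len(blocks)
    assert p > 0 and p & (p - 1) == 0, "rank count must be a power of two"
    data = {i: {i: blocks[i]} for i in range(p)}  # rank -> {source rank: block}
    step = 1
    while step < p:
        new_data = {}
        for i in range(p):
            partner = i ^ step            # partner distance doubles each round
            merged = dict(data[i])
            merged.update(data[partner])  # exchange everything held so far
            new_data[i] = merged
        data = new_data
        step *= 2
    return [[data[i][r] for r in range(p)] for i in range(p)]

result = recursive_doubling_allgather(["s0", "s1", "s2", "s3"])
print(result[0])  # every rank ends up with ['s0', 's1', 's2', 's3']
```

In step s, rank i pairs with rank i XOR 2^s, so after log2(P) rounds each rank holds every block, which is the same pattern the paper maps onto one-sided RMA operations.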

  5. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems

    Directory of Open Access Journals (Sweden)

    Danish Shehzad

    2016-01-01

    Full Text Available The increase in complexity of neuronal network models has escalated the efforts to make the NEURON simulation environment efficient. Computational neuroscientists divide the equations into subnets amongst multiple processors to achieve better hardware performance. On parallel machines for neuronal networks, interprocessor spike exchange consumes a large section of overall simulation time. In NEURON, the Message Passing Interface (MPI) is used for communication between processors; the MPI_Allgather collective is exercised for spike exchange after each interval across distributed memory systems. Increasing the number of processors achieves concurrency and better performance, but it inversely affects MPI_Allgather, which increases communication time between processors. This necessitates improving the communication methodology to decrease the spike exchange time over distributed memory systems. This work improved the MPI_Allgather method using Remote Memory Access (RMA) by moving two-sided communication to one-sided communication; the use of a recursive doubling mechanism facilitates efficient communication between the processors in precise steps. This approach enhanced communication concurrency and improved overall runtime, making NEURON more efficient for simulation of large neuronal network models.

  6. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  7. Analysis of Learning Environment Factors Based on Maslow’s Hierarchy of Needs

    Directory of Open Access Journals (Sweden)

    Košir Katja

    2013-09-01

    Full Text Available This paper provides a new analysis of some learning environment factors from the point of view of one of the most established motivational models, i.e. Maslow’s hierarchy of needs. For a teacher, this model can represent a meaningful tool for the analysis of the potential factors of pupils’ inadequate school adjustment. Some psychological constructs that can be conceptualized as learning environment factors are presented at specific levels of needs. As regards the level of physiological needs, this paper provides an overview of research studies on ergonomic factors of the learning environment. As for safety needs, the paper outlines the concepts of classroom management and peer-to-peer violence, and presents some main research findings in both fields. The analysis regarding the level of love and belonging includes aspects of positive classroom climate and the concept of pupils’ social acceptance. Contemporary findings about the development of pupils’ academic self-concept are presented within the self-esteem and achievement needs. Flow is considered to be one of the key factors that help the teacher satisfy self-actualization needs and stimulate pupils’ personal development. On the basis of this analysis, some implications and recommendations are given to help teachers efficiently encourage an integrated approach to pupil development.

  8. Radioanalytical studies of iodine behaviour in the environment

    International Nuclear Information System (INIS)

    Evans, G.J.; Hammad, K.A.

    1995-01-01

    The behaviour of iodine in the environment is of interest both in relation to radioecology and human nutrition. Radiochemical techniques were used to evaluate various aspects of the behaviour of iodine in the environment. The natural iodine content of plant, water and soil samples collected from three sites was determined using preconcentration neutron activation analysis (PNAA). The effect of initial chemical speciation on the distribution of iodine between various soils, sediments and waters was evaluated using I-131 tracer. Iodide was found to adsorb more extensively than iodate, although for most of the solid/water systems examined, a substantial portion of the iodate was slowly reduced to iodide. Experiments involving gamma irradiation suggest that much of the sorption of iodide and reduction of iodate involved microbial processes. Distribution coefficients measured using I-131 were comparable with values based on the natural I-127 content. (author) 18 refs.; 5 tabs

  9. Heavy metal distributions in Peru Basin surface sediments in relation to historic, present and disturbed redox environments

    Science.gov (United States)

    Koschinsky, Andrea

    Heavy metal distributions in deep-sea surface sediments and pore water profiles from five areas in the Peru Basin were investigated with respect to the redox environment and diagenetic processes in these areas. The 10-20-cm-thick Mn oxide-rich and minor metal-rich top layer is underlain by an increase in dissolved Mn and Ni concentrations resulting from the reduction of the MnO2 phase below the oxic zone. The mobilised associated metals like Co, Zn and Cu are partly immobilised by sorption on clay, organic or Fe compounds in the post-oxic environment. Enrichment of dissolved Cu, Zn, Ni, Co, Pb, Cd, Fe and V within the upper 1-5 cm of the oxic zone can be attributed to the degradation of organic matter. In a core from one area at around 22-25 cm depth, striking enrichments of these metals in dissolved and solid forms were observed. Offset distributions between oxygen penetration and Mn reduction and the thickness of the Mn oxide-rich layer indicate fluctuations of the Mn redox boundary on a short-term time scale. Within the objectives of the German ATESEPP research programme, the effect of an industrial impact such as manganese nodule mining on the heavy metal cycle in the surface sediment was considered. If the oxic surface were to be removed or disturbed, oxygen would penetrate deep into the formerly suboxic sediment and precipitate Mn2+ and metals like Ni and Co which are preferably scavenged by MnO2. The solid enrichments of Cd, V, and other metals formed in post-oxic environments would move downward with the new redox boundary until a new equilibrium between oxygen diffusion and consumption is reached.

  10. Analysis of Radioactivity Contamination Level of Kartini Reactor Efluen Gas to the Environment

    International Nuclear Information System (INIS)

    Suratman; Purwanto; Aminjoyo, S

    1996-01-01

    The analysis of the radioactivity contamination level of Kartini reactor effluent gas to the environment was carried out from 13-10-'95 until 8-2-'96. The aim of this research is to determine the radioactivity contamination level in the environment resulting from the release of effluent gas from the Kartini reactor and other facilities at Yogyakarta Nuclear Research Centre through the stack. The analysis methods are the Student t-test, the first count factor test and gamma spectrometry. Gas sampling was carried out in the reactor stack, the reactor room, the environment and in other rooms for comparison. Effluent gas was sucked through a filter by a high-volume vacuum pump. The filter was counted for beta, gamma and alpha activities. The radioactivity contamination level of the effluent gas passing through the stack to the environment was measured between 0.57 - 1.34 Bq/m3, which was comparable to the airborne radioactivity in the environment of 0.69 - 1.12 Bq/m3. This radioactivity comes from radon daughters, decay products of the natural uranium and thorium series in the building materials
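As a minimal illustration of the Student t-test mentioned in this record, the two-sample pooled-variance t statistic can be computed with the standard library alone. The activity readings below are made-up values in Bq/m3, not the study's measurements:

```python
import math
from statistics import mean, variance

def pooled_t_statistic(a, b):
    """Two-sample Student t statistic with pooled (equal) variance."""
    na, nb = len(a), len(b)
    # variance() is the sample variance (n - 1 denominator), as the test requires
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Illustrative (made-up) airborne activity readings in Bq/m3:
stack = [0.57, 0.80, 1.10, 1.34]
background = [0.69, 0.85, 0.95, 1.12]
t = pooled_t_statistic(stack, background)
print(round(t, 3))
```

A small |t| (compared against the t distribution with na + nb - 2 degrees of freedom) would support the record's conclusion that stack effluent activity is comparable to the environmental background.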

  11. Strain distribution of confined Ge/GeO2 core/shell nanoparticles engineered by growth environments

    Science.gov (United States)

    Wei, Wenyan; Yuan, Cailei; Luo, Xingfang; Yu, Ting; Wang, Gongping

    2016-02-01

    The strain distributions of Ge/GeO2 core/shell nanoparticles confined in different host matrices grown by surface oxidation are investigated. Simulation results obtained by the finite element method demonstrate that the strains of the Ge core and the GeO2 shell strongly depend on the growth environments of the nanoparticles. Moreover, there is a transformation of the strain on the Ge core from tensile to compressive during the growth of Ge/GeO2 core/shell nanoparticles, and this transformation is closely related to the Young's modulus of the materials surrounding the Ge/GeO2 core/shell nanoparticles.

  12. Long-term performance of the SwissQuantum quantum key distribution network in a field environment

    International Nuclear Information System (INIS)

    Stucki, D; Gisin, N; Thew, R; Legré, M; Clausen, B; Monat, L; Page, J-B; Ribordy, G; Rochas, A; Robyr, S; Trinkler, P; Buntschu, F; Perroud, D; Felber, N; Henzen, L; Junod, P; Monbaron, P; Ventura, S; Litzistorf, G; Tavares, J

    2011-01-01

    In this paper, we report on the performance of the SwissQuantum quantum key distribution (QKD) network. The network was installed in the Geneva metropolitan area and ran for more than one-and-a-half years, from the end of March 2009 to the beginning of January 2011. The main goal of this experiment was to test the reliability of the quantum layer over a long period of time in a production environment. A key management layer has been developed to manage the key between the three nodes of the network. This QKD-secure network was utilized by end-users through an application layer. (paper)

  13. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in the rigorous analysis of vibrational spectra. To perform PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates; even for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from Gaussian output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements in each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent in any other program performing PED analysis.
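A toy sketch of the PED idea: percent contributions of local mode coordinates to each normal mode, with the kind of per-column dominance the EPM criterion optimizes for. The matrix values are invented for illustration and have nothing to do with VEDA's actual algorithm or file formats:

```python
# Toy PED (Potential Energy Distribution) matrix: rows = normal modes,
# columns = local mode (internal) coordinates. Values are invented.
raw = [
    [0.70, 0.20, 0.10],  # mode 1: dominated by coordinate 0 (e.g. a stretch)
    [0.05, 0.85, 0.10],  # mode 2: dominated by coordinate 1 (e.g. a bend)
    [0.15, 0.15, 0.70],  # mode 3: dominated by coordinate 2 (e.g. a torsion)
]

# Normalize each mode's contributions to percent, as PED tables are reported.
ped = [[round(100 * v / sum(row)) for v in row] for row in raw]

# A well-chosen coordinate set gives each column one clearly dominant mode,
# the kind of column-maximum criterion the EPM parameter captures.
dominant = [max(range(len(ped)), key=lambda m: ped[m][c])
            for c in range(len(ped[0]))]
print(ped)
print(dominant)
```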

  14. Entanglement reactivation in separable environments

    International Nuclear Information System (INIS)

    Pirandola, Stefano

    2013-01-01

    Combining two entanglement-breaking channels into a correlated-noise environment restores the distribution of entanglement. Surprisingly, this reactivation can be induced by the injection of separable correlations from the composite environment. In any dimension (finite or infinite), we can construct classically correlated ‘twirling’ environments which are entanglement-breaking in the transmission of single systems but entanglement-preserving when two systems are transmitted. Here entanglement is simply preserved by the existence of decoherence-free subspaces. Remarkably, even when such subspaces do not exist, a fraction of the input entanglement can still be distributed. This is found in separable Gaussian environments, where distillable entanglement is able to survive the two-mode transmission, despite being broken in any single-mode transmission by the strong thermal noise. In the Gaussian setting, entanglement restoration is a threshold process, occurring only after a critical amount of correlations has been injected. Such findings suggest new perspectives for distributing entanglement in realistic environments with extreme decoherence, identifying separable correlations and classical memory effects as physical resources for ‘breaking entanglement-breaking’. (paper)

  15. FAME, the Flux Analysis and Modeling Environment

    Directory of Open Access Journals (Sweden)

    Boele Joost

    2012-01-01

    Full Text Available Abstract Background The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike.
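The steady-state condition at the heart of stoichiometric modeling (S·v = 0) can be sketched in a few lines. The three-reaction toy network below is illustrative, not a FAME or PySCeS-CBM model:

```python
# Steady-state test S @ v = 0 for a toy stoichiometric model.
# Reactions: R1: -> A ; R2: A -> B ; R3: B ->  (uptake, conversion, secretion)
S = [
    [1, -1, 0],   # metabolite A: produced by R1, consumed by R2
    [0, 1, -1],   # metabolite B: produced by R2, consumed by R3
]

def is_steady_state(S, v, tol=1e-9):
    """True when every metabolite's net production sum(S[m][j] * v[j]) is ~0."""
    return all(abs(sum(s * x for s, x in zip(row, v))) <= tol for row in S)

print(is_steady_state(S, [2.0, 2.0, 2.0]))  # balanced flux through the chain
print(is_steady_state(S, [2.0, 1.0, 1.0]))  # A accumulates, so not steady
```

Flux balance analysis then picks, among all v satisfying S·v = 0 and the flux bounds, one that maximizes an objective such as biomass production; that linear-programming step is what FAME delegates to PySCeS-CBM.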

  16. Spatial distributions of Pseudomonas fluorescens colony variants in mixed-culture biofilms.

    Science.gov (United States)

    Workentine, Matthew L; Wang, Siyuan; Ceri, Howard; Turner, Raymond J

    2013-07-28

    The emergence of colony morphology variants in structured environments is being recognized as important to both niche specialization and stress tolerance. Pseudomonas fluorescens demonstrates diversity in both its natural environment, the rhizosphere, and in laboratory grown biofilms. Sub-populations of these variants within a biofilm have been suggested as important contributors to antimicrobial stress tolerance given their altered susceptibility to various agents. As such it is of interest to determine how these variants might be distributed in the biofilm environment. Here we present an analysis of the spatial distribution of Pseudomonas fluorescens colony morphology variants in mixed-culture biofilms with the wildtype phenotype. These findings reveal that two variant colony morphotypes demonstrate a significant growth advantage over the wildtype morphotype in the biofilm environment. The two variant morphotypes out-grew the wildtype across the entire biofilm and this occurred within 24 h and was maintained through to 96 h. This competitive advantage was not observed in homogeneous broth culture. The significant advantage that the variants demonstrate in biofilm colonization over the wildtype denotes the importance of this phenotype in structured environments.

  17. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying the survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure model. The division at 4 Earth radii separates small exoplanets from large exoplanets above. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...
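The survival function used in this record is the empirical complement of the CDF, S(r) = P(R > r). A minimal sketch with made-up radii in Earth radii, not Kepler data:

```python
def survival_function(values, r):
    """Empirical S(r) = fraction of the sample strictly greater than r."""
    return sum(1 for v in values if v > r) / len(values)

# Made-up planet radii in Earth radii (not Kepler candidates):
radii = [0.8, 1.2, 1.5, 2.5, 3.0, 3.9, 4.5, 6.0, 9.0, 12.0]
print(survival_function(radii, 4.0))   # fraction of planets above 4 Earth radii
print(survival_function(radii, 10.0))  # fraction above 10 Earth radii
```

Plotting S(r) against r on log axes and looking for slope changes is the kind of analysis that reveals natural divisions in the radius distribution.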

  18. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    Directory of Open Access Journals (Sweden)

    Qi Liu

    2016-08-01

    Full Text Available Distributed computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collecting and analysis models, e.g., Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurate estimation of the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data of each task have drawn interest, with a detailed analysis report being made. According to the results, the prediction accuracy of concurrent tasks’ execution time can be improved, in particular for some regular jobs.
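A two-phase (piecewise linear) regression in the spirit of the TPR method can be sketched with the standard library: fit one line on each side of a candidate breakpoint and keep the split with the lowest combined squared error. The data and the exhaustive breakpoint search are illustrative assumptions, not the paper's algorithm:

```python
def linfit(xs, ys):
    """Least-squares line fit; returns (intercept, slope, sum of squared errors)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def two_phase_fit(xs, ys):
    """Try every breakpoint with >= 2 points per phase; keep the lowest total SSE."""
    best = None
    for k in range(2, len(xs) - 1):
        _, _, e1 = linfit(xs[:k], ys[:k])
        _, _, e2 = linfit(xs[k:], ys[k:])
        if best is None or e1 + e2 < best[1]:
            best = (xs[k], e1 + e2)
    return best  # (first x of the second phase, combined SSE)

# Task progress (%) vs elapsed time (s): fast first phase, slower second phase.
xs = [10, 20, 30, 40, 50, 60, 70, 80]
ys = [2, 4, 6, 8, 15, 21, 27, 33]
bp, sse = two_phase_fit(xs, ys)
print(bp)  # the detected phase change
```

Once the two segments are fitted, extrapolating the second-phase line to 100% progress yields a finishing-time estimate for the running task.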

  19. Distribution and enrichment of 210Po and 210Pb in the environment of Mangalore, South West coast of India

    International Nuclear Information System (INIS)

    Prakash, V.; Rajashekara, K.M.; Narayana, Y.

    2013-01-01

    The paper deals with the distribution and enrichment of 210Po and 210Pb in soil samples of Mangalore, South West coast of India. The soil samples collected from the region were analyzed for 210Po and 210Pb activity using radiochemical analytical techniques to understand the distribution and enrichment of those radionuclides. The 210Po activity in soil in the environment of Mangalore varies from 1.5 Bq kg-1 to 26.9 Bq kg-1 with a mean value of 12.6 Bq kg-1, and that of 210Pb varies in the range 7.6 Bq kg-1 to 67.5 Bq kg-1 with a mean value of 38.9 Bq kg-1. The mean 210Po/210Pb ratio observed was 0.3, which shows that the radionuclides 210Po and 210Pb are not in equilibrium and that the accumulation of 210Pb in soil is greater than that of 210Po. A good correlation exists between the activities of 210Po and 210Pb, with correlation coefficient r = 0.7. The absorbed gamma dose in the environment of the region varies from 39.4 nGy h-1 to 78.8 nGy h-1 with a mean value of 48.2 nGy h-1. The results of the systematic studies on the distribution and enrichment of 210Po and 210Pb and the absorbed gamma dose rate in air are presented and discussed in this paper. (author)
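The correlation reported in this record (r = 0.7 between 210Po and 210Pb activities) is an ordinary Pearson coefficient. A minimal sketch with invented paired activities, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

po = [1.5, 5.0, 10.0, 15.0, 26.9]   # 210Po activities, Bq/kg (invented)
pb = [7.6, 20.0, 35.0, 50.0, 67.5]  # 210Pb activities, Bq/kg (invented)
print(round(pearson_r(po, pb), 3))
```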

  20. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    Science.gov (United States)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Communities Survey and combine that data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. 
However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells
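The buffer containment step of such an analysis reduces to a point-in-buffer test. A toy planar sketch (distances in km, invented coordinates, no GIS projection handling) of classifying tracts as near or far from unconventional wells:

```python
import math

def within_buffer(point, wells, radius_km):
    """True if the point lies within radius_km of any well (flat-plane distance)."""
    return any(math.hypot(point[0] - wx, point[1] - wy) <= radius_km
               for wx, wy in wells)

# Invented well locations and tract centroids on a flat plane (km):
wells = [(0.0, 0.0), (10.0, 10.0)]
tracts = {"A": (1.0, 1.0), "B": (5.0, 5.0), "C": (9.5, 10.5)}
near = sorted(name for name, p in tracts.items()
              if within_buffer(p, wells, 2.0))
print(near)  # tracts whose centroid falls inside a 2 km well buffer
```

The letter's dasymetric step then apportions census population to these near/far zones before comparing demographics, a refinement this flat-plane sketch omits.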

  1. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Science.gov (United States)

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
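The core fit that the powerlaw package automates is the continuous maximum-likelihood estimator alpha = 1 + n / Σ ln(x_i/x_min). A stdlib-only sketch (not the package's own API), checked against synthetic Pareto samples:

```python
import math
import random

def mle_alpha(xs, xmin):
    """Continuous power-law MLE: alpha = 1 + n / sum(ln(x / xmin)) over x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

random.seed(42)
# random.paretovariate(a) draws from a power law whose density exponent is a + 1:
samples = [random.paretovariate(1.5) for _ in range(20000)]
alpha_hat = mle_alpha(samples, 1.0)
print(round(alpha_hat, 2))  # close to the true exponent of 2.5
```

The package adds what this sketch leaves out: estimating x_min itself, goodness-of-fit testing, and likelihood-ratio comparisons against alternatives such as the lognormal.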

  2. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Directory of Open Access Journals (Sweden)

    Jeff Alstott

    Full Text Available Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.

  3. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them brings us valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection, this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system of gathering information from all honeypots, it is possible to work with all information at one centralized point. Communication between honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses data from each honeypot. Results of this analysis, and also other statistical data about malicious activity, are simply accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.

  4. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Emma [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McParland, Charles [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Roberts, Ciaran [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid’s increasingly complex loads that include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation, with its own protection and control capabilities. Using µPMU data on change in voltage phase angle between two points in conjunction with new and existing distribution-grid planning and operational tools is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors’ sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve
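The basic quantity µPMU applications work with is the voltage phase-angle difference between two measurement points. A minimal sketch of the angle-wrapping arithmetic (illustrative values, not synchrophasor data):

```python
def angle_difference(theta_a, theta_b):
    """Signed smallest difference theta_a - theta_b in degrees, wrapped to (-180, 180]."""
    d = (theta_a - theta_b) % 360.0
    return d - 360.0 if d > 180.0 else d

# Illustrative phase angles (degrees) reported at two measurement points:
print(angle_difference(-179.0, 179.0))  # wraps to 2.0, not -358.0
print(angle_difference(-10.0, 30.0))    # -40.0
```

Without the wrap, a pair of angles straddling the ±180° boundary would look like a huge swing; with it, the small real difference survives, which matters when phase-angle differences across a distribution feeder are only fractions of a degree.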

  5. Analysis of Virtual Learning Environments from a Comprehensive Semiotic Perspective

    Directory of Open Access Journals (Sweden)

    Gloria María Álvarez Cadavid

    2012-11-01

    Full Text Available Although there is a wide variety of perspectives and models for the study of online education, most of these focus on the analysis of the verbal aspects of such learning, while very few consider the relationship between speech and elements of a different nature, such as images and hypermediality. In a previous article we presented a proposal for a comprehensive semiotic analysis of virtual learning environments, which has more recently been developed and tested for the study of different online training courses without instructional intervention. In this paper we use this same proposal to analyze online learning environments in the framework of courses with instructional intervention. One of the main observations in relation to this type of analysis is that the organizational aspects of the courses are found to be related to the way in which the input elements for the teaching and learning process are constructed.

  6. Adaptive tools in virtual environments: Independent component analysis for multimedia

    DEFF Research Database (Denmark)

    Kolenda, Thomas

    2002-01-01

    The thesis investigates the role of independent component analysis in the setting of virtual environments, with the purpose of finding properties that reflect human context. A general framework for performing unsupervised classification with ICA is presented in extension to the latent semantic indexing ... were compared to investigate computational differences and separation results. The ICA properties were finally implemented in a chat room analysis tool and briefly investigated for visualization of search engine results.

  7. POPULATION STRUCTURE AND SPATIAL DISTRIBUTION OF Ceratozamia mexicana BRONGN. (ZAMIACEAE) IN PRESERVED AND DISTURBED ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Andrés Rivera-Fernández

    2012-11-01

    Full Text Available Plant populations are affected by biotic and abiotic factors that influence regeneration processes. The aims of this study were to determine the population structure of Ceratozamia mexicana under two contrasting conditions (a conserved site and a disturbed site) and to establish whether the sexual structure, population density and spatial distribution of C. mexicana are modified by disturbance. Eight plots of 25 m2 within each site (conserved and disturbed) were used. The structure and spatial distribution of the sites were determined. Methods included analysis of variance and spatial distribution indices; climatic and edaphic factors were determined by conventional methods for comparison. The conserved site showed an inverted-"J" demographic structure, while the disturbed site varied slightly, with a more discontinuous distribution. Population density was 0.78 individuals/m2 in the conserved site and 0.26 individuals/m2 in the disturbed site. The spatial distribution for all development stages of the plant was random, with the exception of the seedling stage, which was aggregated. Results showed that perturbation decreases the density of plants and removes reproductive individuals, which threatens the persistence of the population.
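
The random-versus-aggregated classification reported above is commonly made with a variance-to-mean dispersion index over per-quadrat counts. A minimal sketch with hypothetical plot counts (not the study's data):

```python
def dispersion_index(counts):
    """Variance-to-mean ratio of per-quadrat counts.
    ~1 -> random (Poisson-like), >1 -> aggregated, <1 -> uniform."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

# Invented counts per 25 m2 plot, for illustration only
seedlings = [0, 5, 1, 7, 0, 6, 0, 5]   # clumped: most plants in a few plots
adults = [2, 3, 2, 3, 3, 2, 3, 2]      # evenly spread across plots
```

With these made-up numbers the seedling counts give an index well above 1 (aggregated) and the adult counts an index below 1, mirroring the pattern the abstract describes.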

  8. Cost Analysis In A Multi-Mission Operations Environment

    Science.gov (United States)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single-mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights to International Space Station (ISS) payloads to small, short-duration missions, and have included long-duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and sizes and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology but will be executed using future technology.
Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the

  9. Preliminary application of tapered glass capillary microbeam in MeV-PIXE mapping of longan leaf for elemental concentration distribution analysis

    Science.gov (United States)

    Natyanun, S.; Unai, S.; Yu, L. D.; Tippawan, U.; Pussadee, N.

    2017-09-01

    This study was aimed at understanding the elemental concentration distribution in a local longan leaf to learn how the plant was affected by the environment or agricultural operations. The analysis applied the MeV-microbeam particle-induced X-ray emission (PIXE) mapping technique using a home-developed tapered glass capillary microbeam system at Chiang Mai University. The microbeam was a 2-MeV proton beam, 130 µm in diameter. The interest of the study was in the difference in the elemental concentrations distributed between the leaf midrib and lamina areas. The micro proton beam analyzed the leaf sample across the leaf midrib edge to the leaf lamina area, for a total of 9 data acquisition spots. The resulting data were colored to form a 1D map of the elemental concentration distribution. Seven dominant elements, Al, S, Cl, K, Ca, Sc and Fe, were identified; the first six were found to have higher concentrations in the midrib area than in the lamina area, while the Fe concentration showed the opposite trend.

  10. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    Science.gov (United States)

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  11. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    The distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality. In combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of data following the ATLAS computing model. GANGA supports tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data

  12. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several different types of virtual learning objects in a common pedagogical context.

  13. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    Science.gov (United States)

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks .

  14. [A survey on ecological environment of wild Adiantum reniforme var. sinense in Three Gorges Reservoir region].

    Science.gov (United States)

    Xie, Zhong-de; Jia, Han; Fu, Shao-Zhi; Gao, Wen-Yuan; Zhang, Zhi-Wei; Sun, Wei

    2017-11-01

    The study aims at investigating the ecological environment of Adiantum reniforme var. sinense in the Three Gorges Reservoir region, providing a reference basis for the protection of resources and artificial cultivation of A. reniforme var. sinense. Using investigation, field survey and experimental analysis, the vegetation, natural geographical environment, climate and soil nutrients of A. reniforme var. sinense habitats were studied and analyzed. The survey found that the A. reniforme var. sinense distribution area has shrunk rapidly in the Three Gorges region; many populations have diminished or vanished due to excessive digging, and currently only 4 wild distribution areas remain, in 3 towns of Wanzhou. The growth of A. reniforme var. sinense requires an environment characterized by low altitude, steep slope, thin soil, a northeast-facing slope, high canopy and a warm, humid climate, and the soil in the distribution areas is characterized by high organic matter, available nitrogen and available potassium, and low available phosphorus content. Copyright© by the Chinese Pharmaceutical Association.

  15. Design of a distributed simulation environment for building control applications based on systems engineering methodology

    NARCIS (Netherlands)

    Yahiaoui, Azzedine

    2018-01-01

    The analysis of innovative designs that distribute control to buildings over a network is currently a challenging task, as existing building performance simulation tools do not offer sufficient capabilities and flexibility to respond to the full complexity of Automated Buildings (ABs). For

  16. Patterns in the distribution of digital games via BitTorrent

    DEFF Research Database (Denmark)

    Drachen, Anders; Veitch, Robert W. D.

    2013-01-01

    The distribution of illegal copies of computer games via digital networks forms the centre of one of the most heated debates in the international games environment, but minimal objective information is available. Here, the results of a large-scale, open-method analysis of the distribution...... of computer games via the BitTorrent peer-to-peer file-sharing protocol are presented. 173 games were included, tracked over a period of three months from 2010 to 2011. A total of 12.6 million unique peers were identified across over 200 countries. Analysis indicates that the distribution of illegal copies...... of games follows a distinct pattern, e.g., a few game titles drive the traffic - the 10 most accessed games encompassed 42.7% of the number of peers tracked. The traffic is geographically localised - 20 countries encompassed 76.7% of the total. Geographic patterns in the distribution of BitTorrent peers...

  17. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    Full Text Available The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on the Box scheme, McCormack's method and the Diffusive Scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, with the Courant number differing in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). In reviewing the numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, making their application recommendable in the analysis of water hammer in water distribution systems.
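
For orientation, the magnitude of the transient that these schemes resolve can be bounded by the classical Joukowsky relation, and the per-pipe Courant number mentioned above is the ratio a·Δt/Δx. A small illustrative sketch (function names and values are assumptions for the example, not taken from the article):

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Joukowsky pressure rise (Pa) for an instantaneous velocity
    change delta_v (m/s) in a pipe with the given wave speed (m/s):
    dp = rho * a * dv."""
    return rho * wave_speed * delta_v

def courant_number(wave_speed, dt, dx):
    """Cr = a * dt / dx; explicit schemes such as MOC require Cr <= 1,
    and pipes with different wave speeds yield different Cr values
    for the same time step and reach length."""
    return wave_speed * dt / dx

# Water (1000 kg/m3), 1200 m/s wave speed, 1 m/s flow stoppage
surge = joukowsky_surge(1000.0, 1200.0, 1.0)  # 1.2e6 Pa = 1.2 MPa
```

This back-of-the-envelope bound is why even modest velocity changes in a WDS can produce megapascal-scale transients.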

  18. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    Science.gov (United States)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data-sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from the idea to the point where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central, as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses, minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis.
Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment

  19. Potential urban runoff impacts and contaminant distributions in shoreline and reservoir environments of Lake Havasu, southwestern United States.

    Science.gov (United States)

    Wilson, Doyle C

    2018-04-15

    Heavy metal, nutrient, and hydrocarbon levels in and adjacent to Lake Havasu, a regionally significant water supply reservoir with a highly controlled, dynamic flow regime, are assessed in relation to possible stormwater runoff impacts from an arid urban center. Shallow groundwater and sediment analyses from ephemeral drainage (wash) mouths that convey stormwater runoff from Lake Havasu City, Arizona to the reservoir, provided contaminant control points and correlation ties with the reservoir environment. Fine-grained sediments tend to contain higher heavy metal concentrations whereas nutrients are more evenly distributed, except low total organic carbon levels from young wash mouth surfaces devoid of vegetation. Heavy metal and total phosphate sediment concentrations in transects from wash mouths into the reservoir have mixed and decreasing trends, respectively. Both series may indicate chemical depositional influences from urban runoff, yet no statistically significant concentration differences occur between specific wash mouths and corresponding offshore transects. Heavy metal pollution indices of all sediments indicate no discernible to minor contamination, indicating that runoff impacts are minimal. Nevertheless, several heavy metal concentrations from mid-reservoir sediment sites increase southward through the length of the reservoir. Continual significant water flow through the reservoir may help to disperse locally derived runoff particulates, which could mix and settle down gradient with chemical loads from upriver sources and local atmospheric deposition. Incorporating the shoreline environment with the reservoir investigation provides spatial continuity in assessing contaminant sources and distribution patterns. This is particularly acute in the investigation of energetic, flow-through reservoirs, in which sources may be overlooked if solely analyzing the reservoir environment. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Chase Qishi [New Jersey Inst. of Technology, Newark, NJ (United States); Univ. of Memphis, TN (United States); Zhu, Michelle Mengxia [Southern Illinois Univ., Carbondale, IL (United States)

    2016-06-06

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific

  1. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  2. Space station electrical power distribution analysis using a load flow approach

    Science.gov (United States)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow much as present terrestrial electric power utility systems have. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase at 20 kHz, grow to a level of 300 kW steady state, and must be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load-flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electric power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.
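
A load-flow study of the kind described boils down to repeatedly solving the network's nodal equations until the bus voltages converge. A toy sketch of one such iterative solve, Gauss-Seidel on a two-node resistive network (the matrix and current injections are invented for illustration and are not EDSA's model):

```python
def gauss_seidel_nodal(G, I, iters=200):
    """Solve G @ V = I for node voltages by Gauss-Seidel iteration,
    the kind of sweep a load-flow solver repeats until convergence.
    G is the nodal conductance matrix, I the injected currents."""
    n = len(I)
    V = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Sum contributions of all other node voltages, then
            # update this node's voltage from its own equation.
            s = sum(G[i][j] * V[j] for j in range(n) if j != i)
            V[i] = (I[i] - s) / G[i][i]
    return V

# Hypothetical two-node network: diagonally dominant conductance
# matrix (siemens) and source current injections (amperes)
G = [[1.5, -0.5],
     [-0.5, 1.0]]
I = [10.0, 2.0]
V = gauss_seidel_nodal(G, I)  # converges to [8.8, 6.4] volts
```

Real load-flow tools solve nonlinear AC equations over dozens of busses, but the sweep-until-converged structure is the same.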

  3. Estimation of monthly solar radiation distribution for solar energy system analysis

    International Nuclear Information System (INIS)

    Coskun, C.; Oktay, Z.; Dincer, I.

    2011-01-01

    The concept of probability density frequency, which has been successfully used for analyses of wind speed and outdoor temperature distributions, is now modified and proposed for estimating solar radiation distributions for the design and analysis of solar energy systems. In this study, the global solar radiation distribution is comprehensively analyzed for photovoltaic (PV) panel and thermal collector systems. In this regard, a case study is conducted with actual global solar irradiation data from the last 15 years recorded by the Turkish State Meteorological Service. It is found that the intensity of global solar irradiance greatly affects energy and exergy efficiencies and hence the performance of collectors. -- Research highlights: → The first study to apply the global solar radiation distribution in solar system analyses. → The first study showing the global solar radiation distribution as a parameter of solar irradiance intensity. → Time probability intensity frequency and probability power distribution do not have similar distribution patterns in each month. → There is no relation between the distribution of annual time lapse and solar energy, on the one hand, and the intensity of solar irradiance, on the other.
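
The probability density frequency idea in the abstract amounts to binning irradiance samples and computing the fraction of time spent in each intensity band. A minimal sketch with made-up hourly values (not the Turkish meteorological data):

```python
from collections import Counter

def radiation_distribution(samples, bin_width=100.0):
    """Empirical probability distribution of solar irradiance (W/m2):
    fraction of observations falling in each intensity bin,
    keyed by the bin's lower edge."""
    n = len(samples)
    bins = Counter(int(s // bin_width) for s in samples)
    return {k * bin_width: c / n for k, c in sorted(bins.items())}

# Hypothetical hourly global irradiance readings for one day (W/m2)
hourly = [50, 150, 180, 420, 650, 880, 910, 640, 380, 120]
dist = radiation_distribution(hourly)
```

Applied to multi-year records, the same binning yields the monthly intensity-frequency curves the study feeds into PV and collector performance analysis.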

  4. Research on biomass energy and environment from the past to the future: A bibliometric analysis.

    Science.gov (United States)

    Mao, Guozhu; Huang, Ning; Chen, Lu; Wang, Hongmei

    2018-09-01

    The development and utilization of biomass energy can help to change the ways of energy production and consumption and establish a sustainable energy system that can effectively promote the development of the national economy and strengthen the protection of the environment. Here, we perform a bibliometric analysis of 9514 literature reports in the Web of Science Core Collection searched with the key words "Biomass energy" and "Environment*", dated from 1998 to 2017; hot topics in the research and development of biomass energy utilization, as well as the status and development trends of biomass energy utilization and the environment, were analyzed based on content analysis and bibliometrics. The interaction between biomass energy and the environment began to become a major concern as the research progressively deepened. This work is of great significance for the development and utilization of biomass energy, putting forward specific suggestions and strategies based on the analysis and demonstration of relationships and interactions between biomass energy utilization and the environment. It is also useful to researchers for selecting future research topics. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Analysis of the international distribution of per capita CO2 emissions using the polarization concept

    International Nuclear Information System (INIS)

    Duro, Juan Antonio; Padilla, Emilio

    2008-01-01

    The concept of polarization is linked to the extent to which a given distribution leads to the formation of homogeneous groups with opposing interests. This concept, which is basically different from the traditional one of inequality, is related to the level of potential conflict inherent in a distribution. The polarization approach has been widely applied in the analysis of income distribution. The extension of this approach to the analysis of the international distribution of CO2 emissions is quite useful, as it gives a potent informative instrument for characterizing the state and evolution of the international distribution of emissions and its possible political consequences in terms of tensions and the probability of achieving agreements. In this paper we analyze the international distribution of per capita CO2 emissions between 1971 and 2001 through an adaptation of the polarization concept and measures. We find that the most interesting grouped description deriving from the analysis is one of two groups, which broadly coincide with the Annex B and non-Annex B countries of the Kyoto Protocol; this shows the power of polarization analysis for explaining the formation of groups in the real world. The analysis also shows a significant reduction in the international polarization of per capita CO2 emissions between 1971 and 1995, but not much change since 1995, which might indicate that the polarized distribution of emissions is still one of the important factors leading to difficulties in achieving agreements to reduce global emissions. (author)
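
Polarization measures of the kind adapted here are commonly of the Esteban-Ray form. A hedged sketch of the generic index (the group shares and emission levels are invented; this is not necessarily the authors' exact specification):

```python
def esteban_ray(shares, means, alpha=1.0):
    """Esteban-Ray polarization index for grouped data:
    P = sum_i sum_j pi_i^(1+alpha) * pi_j * |y_i - y_j|,
    where pi are population shares and y are group means.
    alpha > 0 is the polarization-sensitivity parameter;
    alpha = 0 reduces the index to a Gini-type inequality term."""
    return sum(pi ** (1.0 + alpha) * pj * abs(yi - yj)
               for pi, yi in zip(shares, means)
               for pj, yj in zip(shares, means))

# Two equal-sized blocs of countries at 2 and 12 tCO2 per capita
bipolar = esteban_ray([0.5, 0.5], [2.0, 12.0])
# Same overall spread, but split into four groups: lower polarization
spread = esteban_ray([0.25] * 4, [2.0, 5.0, 9.0, 12.0])
```

The comparison illustrates the paper's point: two large, internally homogeneous, mutually distant blocs (Annex B vs. non-Annex B) score higher polarization than the same spread of emissions dispersed over many groups.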

  6. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE), which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of the exponential kernel has enabled the transformation of the non-Markovian model into a Markovian model in an extended state space. For the study of First Passage Time (FPT) with the exponential delay kernel, the model has been transformed into a system of coupled Stochastic Differential Equations (SDEs) in a two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of the weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models, viz. the distributed delay model vis-à-vis the LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV(t) of the ISI distribution with respect to the memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits a decaying auto-correlation structure, with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with the empirically observed pattern of spike counts in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay, with the memory kernel having the form of a Gamma distribution. In contrast to the fast decay of damped oscillations of the ISI distribution for the model with the weak delay kernel, the decay of damped oscillations is found to be slower for the model with the strong delay kernel.
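
The ISI coefficient of variation discussed above is computed from consecutive spike intervals. A minimal sketch with synthetic spike trains (illustrative values, not the model's output):

```python
def isi_cv(spike_times):
    """Coefficient of variation of inter-spike intervals (ISIs):
    CV = std(ISI) / mean(ISI). CV near 0 indicates regular firing,
    near 1 Poisson-like firing, and above 1 suggests bursting."""
    isis = [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]
    mean = sum(isis) / len(isis)
    var = sum((x - mean) ** 2 for x in isis) / len(isis)
    return (var ** 0.5) / mean

# Synthetic spike trains (ms): a metronome-like train and a bursty one
regular = isi_cv([0, 10, 20, 30, 40])        # identical ISIs -> CV = 0
bursty = isi_cv([0, 1, 2, 30, 31, 32, 60])   # short bursts, long gaps
```

A bursting-to-non-bursting switch of the kind the model predicts would show up in exactly this statistic crossing from above 1 toward 1 or below.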

  7. Hierarchically structured distributed microprocessor network for control

    International Nuclear Information System (INIS)

    Greenwood, J.R.; Holloway, F.W.; Rupert, P.R.; Ozarski, R.G.; Suski, G.J.

    1979-01-01

    To satisfy a broad range of control-analysis and data-acquisition requirements for Shiva, a hierarchical, computer-based, modular, distributed control system was designed. This system handles the more than 3000 control elements and 1000 data-acquisition units in a severe high-voltage, high-current environment. The control system design provides a flexible and reliable configuration to meet the development milestones for Shiva within critical time limits

  8. The Elements of Competitive Environment of an Enterprise: A Case of Oligopolic Markets Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Algirdas Krivka

    2011-03-01

    Full Text Available The article raises the problem of the complex analysis of the competitive environment of an enterprise, which is considered to be the main source of factors influencing the enterprise's strategic behaviour and performance. The elements of the competitive environment are derived from "traditional" market structure characteristics developed by the scholars of classical economics and modern microeconomics, with additional factors coming from industrial organization, theoretical oligopoly models, and M. Porter's five competitive forces and diamond. The developed set of elements of the competitive environment is applied to the comparative analysis of three Lithuanian oligopolistic markets. The results obtained confirm the potential for practical application of the developed classification for similar analysis. Article in Lithuanian.

  9. Current, voltage and temperature distribution modeling of light-emitting diodes based on electrical and thermal circuit analysis

    International Nuclear Information System (INIS)

    Yun, J; Shim, J-I; Shin, D-S

    2013-01-01

    We demonstrate a modeling method based on the three-dimensional electrical and thermal circuit analysis to extract current, voltage and temperature distributions of light-emitting diodes (LEDs). In our model, the electrical circuit analysis is performed first to extract the current and voltage distributions in the LED. Utilizing the result obtained from the electrical circuit analysis as distributed heat sources, the thermal circuit is set up by using the duality between Fourier's law and Ohm's law. From the analysis of the thermal circuit, the temperature distribution at each epitaxial film is successfully obtained. Comparisons of experimental and simulation results are made by employing an InGaN/GaN multiple-quantum-well blue LED. Validity of the electrical circuit analysis is confirmed by comparing the light distribution at the surface. Since the temperature distribution at each epitaxial film cannot be obtained experimentally, the apparent temperature distribution is compared at the surface of the LED chip. Also, experimentally obtained average junction temperature is compared with the value calculated from the modeling, yielding a very good agreement. The analysis method based on the circuit modeling has an advantage of taking distributed heat sources as inputs, which is essential for high-power devices with significant self-heating. (paper)
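
    The electro-thermal duality the authors exploit (temperature ↔ voltage, heat flow ↔ current, thermal resistance ↔ electrical resistance) can be sketched as a small nodal analysis. The three-node chip stack, the resistance values, and the heat-source placement below are illustrative assumptions, not the paper's actual LED model.

```python
import numpy as np

# Toy thermal circuit: three stacked layers (junction, substrate, heat sink)
# connected by thermal resistances, with the last node tied to ambient.
# Solving G @ T = P by nodal analysis is the thermal analogue of Ohm's law.
R12, R23, R3a = 2.0, 1.5, 4.0        # K/W, illustrative thermal resistances
T_amb = 25.0                          # deg C, ambient temperature
P = np.array([1.2, 0.3, 0.0])         # W dissipated at each node (distributed sources)

G = np.array([
    [ 1/R12,         -1/R12,   0.0          ],
    [-1/R12,  1/R12 + 1/R23,  -1/R23        ],
    [ 0.0,           -1/R23,   1/R23 + 1/R3a],
])
rhs = P + np.array([0.0, 0.0, T_amb / R3a])   # ambient enters like a voltage source
T = np.linalg.solve(G, rhs)
print("node temperatures (deg C):", np.round(T, 2))   # junction is the hottest node
```

    In the paper's workflow, the `P` vector would come from the preceding electrical circuit analysis; here it is simply assumed.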

  10. Maintaining Traceability in an Evolving Distributed Computing Environment

    Science.gov (United States)

    Collier, I.; Wartel, R.

    2015-12-01

    The management of risk is fundamental to the operation of any distributed computing infrastructure. Identifying the cause of incidents is essential to prevent them from re-occurring. In addition, it is a goal to contain the impact of an incident while keeping services operational. For the response to an incident to be acceptable, it needs to be commensurate with the scale of the problem. The minimum level of traceability for distributed computing infrastructure usage is to be able to identify the source of all actions (executables, file transfers, pilot jobs, portal jobs, etc.) and the individual who initiated them. In addition, sufficiently fine-grained controls, such as blocking the originating user and monitoring to detect abnormal behaviour, are necessary for keeping services operational. It is essential to be able to understand the cause and to fix any problems before re-enabling access for the user. The aim is to be able to answer the basic questions of who, what, where, and when concerning any incident. This requires retaining all relevant information, including timestamps and the digital identity of the user, sufficient to identify, for each service instance, every security event, including at least the following: connect, authenticate, authorize (including identity changes) and disconnect. In traditional grid infrastructures (WLCG, EGI, OSG, etc.) best practices and procedures for gathering and maintaining the information required to maintain traceability are well established. In particular, sites collect and store the information required to ensure traceability of events at their sites. With the increased use of virtualisation and private and public clouds for HEP workloads, established procedures, which are unable to see 'inside' running virtual machines, no longer capture all the information required. Maintaining traceability will at least involve a shift of responsibility from sites to Virtual Organisations (VOs), bringing with it new requirements for their
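
    The who/what/where/when retention requirement can be made concrete as a minimal structured record per security event. The field names and event vocabulary below are an illustrative assumption modelled on the events the abstract lists, not a WLCG/EGI/OSG schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Event types the abstract names as the minimum to retain
ALLOWED_EVENTS = {"connect", "authenticate", "authorize", "disconnect"}

@dataclass(frozen=True)
class TraceEvent:
    """One traceability record: who, what, where, when."""
    timestamp: str          # when (ISO 8601, UTC)
    identity: str           # who (digital identity, e.g. a certificate DN)
    event: str              # what (one of ALLOWED_EVENTS)
    service_instance: str   # where

    def __post_init__(self):
        if self.event not in ALLOWED_EVENTS:
            raise ValueError(f"unknown event type: {self.event}")

rec = TraceEvent(
    timestamp=datetime.now(timezone.utc).isoformat(),
    identity="/DC=org/DC=example/CN=Jane Analyst",   # hypothetical DN
    event="authenticate",
    service_instance="ce01.example.org",             # hypothetical service
)
print(asdict(rec))
```

    Rejecting unknown event types at record-creation time keeps the retained log queryable when an incident must be reconstructed.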

  11. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
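
    The distinction the authors draw between Poisson counting statistics and an underlying lognormal activity distribution can be illustrated numerically: Poisson-sampling a lognormal mean produces counts whose variance exceeds the Poisson expectation. The parameter values below are illustrative, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(42)
n_cells = 100_000

# Underlying cellular activities: lognormal (illustrative mu, sigma)
mu, sigma = 1.5, 0.8
mean_tracks = rng.lognormal(mu, sigma, n_cells)

# Observed alpha-particle track counts: Poisson sampling around each cell's mean
tracks = rng.poisson(mean_tracks)

# Pure Poisson data would have variance ~= mean; a Poisson-lognormal mixture
# is strongly overdispersed, which is the signature of the underlying LN.
m, v = tracks.mean(), tracks.var()
print(f"mean={m:.2f}, var={v:.2f}, var/mean={v/m:.2f}")
```

    A variance-to-mean ratio well above 1 is what signals that the track counts are not distorted into a plain Poisson picture.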

  12. Interesting association rule mining with consistent and inconsistent rule detection from big sales data in distributed environment

    Directory of Open Access Journals (Sweden)

    Dinesh J. Prajapati

    2017-06-01

    Full Text Available Nowadays, there is an increasing demand for mining interesting patterns from big data. Analyzing such a huge amount of data is a computationally complex task when using traditional methods. The purpose of this paper is twofold. First, it presents a novel approach to identify consistent and inconsistent association rules from sales data located in a distributed environment. Second, it overcomes the main-memory bottleneck and computing-time overhead of a single computing system by distributing the computation across a multi-node cluster. The proposed method initially extracts frequent itemsets for each zone using existing distributed frequent pattern mining algorithms. The paper also compares the time efficiency of a MapReduce-based frequent pattern mining algorithm with the Count Distribution Algorithm (CDA) and Fast Distributed Mining (FDM) algorithms. The association rules generated from the frequent itemsets are too numerous to analyze directly. Thus, a MapReduce-based consistent and inconsistent rule detection (MR-CIRD) algorithm is proposed to detect the consistent and inconsistent rules from big data and provide useful and actionable knowledge to the domain experts. These pruned interesting rules also give useful knowledge for better marketing strategy. The extracted consistent and inconsistent rules are evaluated and compared based on different interestingness measures, presented together with experimental results that lead to the final conclusions.
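
    The consistent-versus-inconsistent idea can be sketched on toy data: a rule is consistent when its confidence is similar across all zones and inconsistent when it diverges. This single-process sketch only mimics the intuition behind MR-CIRD; the threshold, the zone data, and the helper names are illustrative assumptions, not the published algorithm.

```python
def confidence(transactions, lhs, rhs):
    """Confidence of the rule lhs -> rhs in one zone's transactions."""
    lhs, both = frozenset(lhs), frozenset(lhs) | frozenset(rhs)
    n_lhs = sum(1 for t in transactions if lhs <= t)
    n_both = sum(1 for t in transactions if both <= t)
    return n_both / n_lhs if n_lhs else 0.0

# Toy sales data for three zones (each transaction is a set of items)
zones = {
    "north": [{"bread", "milk"}, {"bread", "milk", "eggs"}, {"bread", "milk"}],
    "south": [{"bread", "milk"}, {"bread", "milk"}, {"bread", "jam"}],
    "west":  [{"bread", "jam"},  {"bread", "jam"},  {"bread", "milk"}],
}

rule = ({"bread"}, {"milk"})
confs = {zone: confidence(tx, *rule) for zone, tx in zones.items()}
spread = max(confs.values()) - min(confs.values())
label = "consistent" if spread <= 0.25 else "inconsistent"  # illustrative threshold
print(confs, "->", label)
```

    In the distributed setting, each zone's support counts would be computed by its own mappers, and only the per-zone aggregates would need to meet at the reducer.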

  13. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back-to-back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
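
    The optimal-setpoint search can be sketched with a direction-set minimisation in the spirit of Powell's method (here plain coordinate descent with a golden-section line search, not the authors' improved variant). The loss proxy, the capacity value, and the coefficients are illustrative assumptions, not the paper's 33-bus network model.

```python
import numpy as np

S_RATED = 1.0  # per-unit converter capacity per SOP terminal (assumed)

def losses(x):
    """Toy loss proxy for a two-terminal SOP: x = [P, Q1, Q2].

    P is the active power transferred between the two feeders; Q1, Q2 are
    independent reactive injections at each back-to-back converter terminal.
    """
    P, Q1, Q2 = x
    feeder = 0.9 * (P - 0.4) ** 2 + 0.6 * (Q1 - 0.2) ** 2 + 0.5 * (Q2 + 0.3) ** 2
    converter = 0.02 * (2 * P**2 + Q1**2 + Q2**2)    # SOP device losses
    # Penalty enforcing each terminal's apparent-power rating
    pen = sum(max(0.0, np.hypot(P, Q) - S_RATED) ** 2 for Q in (Q1, Q2)) * 1e3
    return feeder + converter + pen

def line_min(f, x, d, lo=-1.5, hi=1.5, iters=60):
    """Golden-section line search for min over a of f(x + a*d)."""
    g = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c1, c2 = b - g * (b - a), a + g * (b - a)
        if f(x + c1 * d) < f(x + c2 * d):
            b = c2
        else:
            a = c1
    return x + 0.5 * (a + b) * d

x = np.zeros(3)                 # start with the SOP switched off
for _ in range(20):             # sweep the coordinate directions repeatedly
    for d in np.eye(3):
        x = line_min(losses, x, d)

print(np.round(x, 3), round(losses(x), 4))
```

    Powell's full method would additionally build conjugate search directions from these sweeps; for this separable toy objective the plain coordinate sweeps already converge.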

  14. A MODEL OF HETEROGENEOUS DISTRIBUTED SYSTEM FOR FOREIGN EXCHANGE PORTFOLIO ANALYSIS

    Directory of Open Access Journals (Sweden)

    Dragutin Kermek

    2006-06-01

    Full Text Available The paper investigates the design of a heterogeneous distributed system for foreign exchange portfolio analysis. The proposed model includes a few separate, dislocated but connected parts linked through distributed mechanisms. Making the system distributed brings new perspectives on performance boosting, where a software-based load balancer plays a very important role. The desired system should spread over multiple, heterogeneous platforms in order to fulfil the open-platform goal. Building such a model incorporates different patterns, from GOF design patterns, business patterns, J2EE patterns, integration patterns, enterprise patterns and distributed design patterns to Web services patterns. The authors try to find as many appropriate patterns as possible for the planned tasks in order to capture best modelling and programming practices.

  15. Distributed intelligent urban environment monitoring system

    Science.gov (United States)

    Du, Jinsong; Wang, Wei; Gao, Jie; Cong, Rigang

    2018-02-01

    Current environmental pollution and destruction have developed into a worldwide major social problem that threatens human survival and development. Environmental monitoring is the prerequisite and basis of environmental governance, but overall, the current environmental monitoring system is facing a series of problems. Based on electrochemical sensors, this paper designs a small, low-cost, easy-to-deploy urban environmental quality monitoring terminal, with multiple terminals constituting a distributed network. The system has been used in small-scale demonstration applications, which have confirmed that it is suitable for large-scale promotion.

  16. DISTRIBUTED SYSTEM FOR HUMAN MACHINE INTERACTION IN VIRTUAL ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Abraham Obed Chan-Canche

    2017-07-01

    Full Text Available Communication networks built from multiple devices and sensors are becoming more common. These device networks enable the development of human-machine interaction that aims to improve human performance by generating an adaptive environment in response to the information the network provides. The problem addressed in this work is the quick integration of a device network that allows the development of a flexible immersive environment for different uses.

  17. BioVLAB-MMIA: a cloud environment for microRNA and mRNA integrated analysis (MMIA) on Amazon EC2.

    Science.gov (United States)

    Lee, Hyungro; Yang, Youngik; Chae, Heejoon; Nam, Seungyoon; Choi, Donghoon; Tangchaisin, Patanachai; Herath, Chathura; Marru, Suresh; Nephew, Kenneth P; Kim, Sun

    2012-09-01

    MicroRNAs, by regulating the expression of hundreds of target genes, play critical roles in developmental biology and the etiology of numerous diseases, including cancer. As a vast amount of microRNA expression profile data is now publicly available, the integration of microRNA expression data sets with gene expression profiles is a key research problem in life science research. However, genome-wide microRNA-mRNA (gene) integration currently requires sophisticated, high-end informatics tools and significant expertise in bioinformatics and computer science to carry out the complex integration analysis. In addition, increased computing infrastructure capabilities are essential in order to accommodate large data sets. In this study, we have extended the BioVLAB cloud workbench to develop an environment for the integrated analysis of microRNA and mRNA expression data, named BioVLAB-MMIA. The workbench facilitates computations on Amazon EC2 and S3 resources orchestrated by the XBaya Workflow Suite. The advantages of BioVLAB-MMIA over the web-based MMIA system include: 1) it is readily expanded as new computational tools become available; 2) it is easily modified by re-configuring graphic icons in the workflow; 3) on-demand cloud computing resources can be used on an "as needed" basis; and 4) distributed orchestration supports complex and long-running workflows asynchronously. We believe that BioVLAB-MMIA will be an easy-to-use computing environment for researchers who plan to perform genome-wide microRNA-mRNA (gene) integrated analysis tasks.

  18. Analysis of Effect of Education Entrepreneurship and Family Environment Towards Interest Students Entrepreneurs

    Directory of Open Access Journals (Sweden)

    Periansya Periansya

    2018-03-01

    Full Text Available This research aims to analyze the effect of entrepreneurship education and the family environment on entrepreneurial interest. The population in this research is the students of the State Polytechnic of Sriwijaya, Palembang. The sampling technique uses proportional sampling, and the sample consists of 375 students. The analysis method uses multiple linear regression. The results show that, partially, entrepreneurship education and the family environment each have a positive and significant effect on the entrepreneurial interest of State Polytechnic of Sriwijaya students. Simultaneously, entrepreneurship education and the family environment also have a positive and significant effect on entrepreneurial interest. The conclusion of this research is that the education needs to be oriented towards practice, case studies, and inviting interviewees from companies or industries. The existence of industrial practice based on student competency can also enhance the knowledge and insight of students where

  19. Implementing GIS regression trees for generating the spatial distribution of copper in Mediterranean environments

    DEFF Research Database (Denmark)

    Bou Kheir, Rania; Greve, Mogens Humlekrog; Deroin, Jean-Paul

    2013-01-01

    Soil contamination by heavy metals has become a widespread, dangerous problem in many parts of the world, including the Mediterranean environments. This is closely related to the increasing irrigation with waste waters, to the uncontrolled application of sewage sludge, industrial effluents, pesticides... and fertilizers, to rapid urbanization, to the atmospheric deposition of dust and aerosols, to vehicular emissions and to many other negative human activities. In this context, this paper predicts the spatial distribution and concentration level of copper (Cu) in the 195 km2 of the Nahr el-Jawz watershed...H, hydraulic conductivity, organic matter, stoniness ratio, soil depth, slope gradient, slope aspect, slope curvature, land cover/use, distance to drainage line, proximity to roads, nearness to cities, and surroundings to waste areas) were generated from satellite imageries, Digital Elevation Models (DEMs

  20. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in our country has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics in the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performing a SWOT analysis of the current state of its self-built logistics system, identifying the problems existing in the current Jingdong Mall logistics distribution, and giving appropriate recommendations.

  1. Proceedings of the scientific meeting on 'behavior and distributions of trace substances in the environment'

    International Nuclear Information System (INIS)

    Fukui, M.; Matsuzuru, H.

    1998-02-01

    The scientific meeting was held at the Research Reactor Institute, Kyoto University, on December 11-12, 1997. This single report covers all aspects concerning the association of trace substances such as pesticides/herbicides, organic chemicals and radionuclides in the environment. The purpose of the meeting was to describe the distribution and behavior of trace substances, with the emphasis directed towards the dynamic interaction between the soil-sediment-water system and the contaminants. The Chernobyl accident raised attention on the fate of radionuclides released into the environment and stimulated many scientists, who carry out large-scale 'field experiments' without using a tracer in a laboratory. Of course, fundamental laboratory studies are necessary to give direction to, and to understand, observations from field studies. These activities have brought a lot of knowledge and understanding towards revealing a part of the complexity of the transport processes. It is hoped that the assembled experts will not only dwell on distinct scientific issues, but also be able to draw firm conclusions with respect to the effective environmental management of the ecological aspects of hazardous materials. Twenty-five of the presented papers are indexed individually. (J.P.N.)

  2. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Grum-Grzhimailo, A.N.; Lucchese, R.R.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    A projection method is developed for extracting the nondipole contribution from the molecular frame photoelectron angular distributions of linear molecules. A corresponding convenient parametric form for the angular distributions is derived. The analysis was performed for the N 1s photoionization of the NO molecule a few eV above the ionization threshold. No detectable nondipole contribution was found for the photon energy of 412 eV

  3. CMS distributed analysis infrastructure and operations: experience with the first LHC data

    International Nuclear Information System (INIS)

    Vaandering, E W

    2011-01-01

    The CMS distributed analysis infrastructure represents a heterogeneous pool of resources distributed across several continents. The resources are harnessed using gLite and glidein-based workload management systems (WMS). We describe the operational experience of the analysis workflows using CRAB-based servers interfaced with the underlying WMS. The automated interaction of the server with the WMS provides a successful analysis workflow. We present the operational experience as well as the methods used in CMS to analyze the LHC data. The interaction with the CMS Run Registry for run and luminosity block selections via CRAB is discussed. The variations of different workflows during the LHC data-taking period and the lessons drawn from this experience are also outlined.

  4. Monitoring of bioaerosol inhalation risks in different environments using a six-stage Andersen sampler and the PCR-DGGE method.

    Science.gov (United States)

    Xu, Zhenqiang; Yao, Maosheng

    2013-05-01

    Increasing evidence shows that inhalation of indoor bioaerosols has caused numerous adverse health effects and diseases. However, the bioaerosol size distribution, composition, and concentration level, representing different inhalation risks, can vary with different living environments. The six-stage Andersen sampler is designed to simulate the sampling of different human lung regions. Here, the sampler was used to investigate bioaerosol exposure in six different environments (student dorm, hospital, laboratory, hotel room, dining hall, and outdoor environment) in Beijing. During the sampling, the Andersen sampler was operated for 30 min for each sample, and three independent experiments were performed for each of the environments. The air samples collected onto each of the six stages of the sampler were incubated on agar plates directly at 26 °C, and the colony forming units (CFU) were manually counted and statistically corrected. In addition, the developed CFUs were washed off the agar plates and subjected to polymerase chain reaction (PCR)-denaturing gradient gel electrophoresis (DGGE) for diversity analysis. Results revealed that for most environments investigated, the culturable bacterial aerosol concentrations were higher than those of culturable fungal aerosols. The culturable bacterial and fungal aerosol fractions, concentration, size distribution, and diversity were shown to vary significantly with the sampling environments. PCR-DGGE analysis indicated that different environments had different culturable bacterial aerosol compositions, as revealed by distinct gel band patterns. For most environments tested, larger (>3 μm) culturable bacterial aerosols with a skewed size distribution were shown to prevail, accounting for more than 60 %, while for culturable fungal aerosols, which had a normal size distribution, those 2.1-4.7 μm dominated, accounting for 20-40 %. Alternaria, Cladosporium, Chaetomium, and Aspergillus were found abundant in most

  5. Distributed activation energy model for kinetic analysis of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.; Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry

    2003-07-01

    Based on a new analysis of the distributed activation energy model, a bicentral distribution model was introduced for the analysis of multi-stage hydropyrolysis of coal. The hydropyrolysis under linear temperature programming, with and without a holding stage, was mathematically described and the corresponding kinetic expressions were derived. Based on these kinetics, the hydropyrolysis (HyPy) and multi-stage hydropyrolysis (MHyPy) of Xundian brown coal were simulated. The results show that both the Mo catalyst and 2-stage holding can lower the apparent activation energy of hydropyrolysis and narrow the activation energy distribution. Besides, there exists an optimum Mo loading of 0.2% for HyPy of Xundian lignite. 10 refs.
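
    The distributed activation energy model evaluates conversion as an integral of first-order decomposition over an activation-energy distribution f(E), here a single Gaussian under linear heating; a bicentral model would simply replace f(E) with a weighted sum of two Gaussians. The kinetic parameters below are illustrative assumptions, not the fitted values for Xundian coal.

```python
import numpy as np

R = 8.314                 # J/(mol K)
k0 = 1.0e13               # 1/s, assumed frequency factor
E0, sE = 220e3, 25e3      # J/mol, assumed mean and spread of activation energies
beta = 10.0 / 60.0        # heating rate: 10 K/min expressed in K/s
T0 = 300.0                # K, start of the ramp

E = np.linspace(E0 - 4 * sE, E0 + 4 * sE, 400)   # activation-energy grid
fE = np.exp(-0.5 * ((E - E0) / sE) ** 2) / (sE * np.sqrt(2 * np.pi))

def trapezoid(y, x, axis=-1):
    """Simple trapezoidal integration along one axis."""
    dx = np.diff(x)
    yy = np.moveaxis(y, axis, -1)
    return ((yy[..., 1:] + yy[..., :-1]) * 0.5 * dx).sum(axis=-1)

def conversion(T_end):
    """DAEM: x(T) = 1 - integral f(E) exp(-integral k0 e^(-E/RT) dt) dE."""
    T = np.linspace(T0, T_end, 2000)
    k = k0 * np.exp(-E[:, None] / (R * T[None, :]))   # rate for each (E, T) pair
    psi = trapezoid(k, T, axis=1) / beta              # time integral, dt = dT / beta
    return 1.0 - trapezoid(fE * np.exp(-psi), E)

for T_end in (600.0, 800.0, 1000.0):
    print(f"x({T_end:.0f} K) = {conversion(T_end):.3f}")
```

    Narrowing `sE` (as the abstract reports for the catalysed, 2-stage case) sharpens the conversion curve over temperature, which is what a narrower apparent activation-energy distribution means in practice.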

  6. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
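
    The time-dependence challenge can be seen in a toy sequential loop: each step's power flow depends on controller state carried over from previous steps (here a regulator tap with a time-delay counter), so the time series cannot simply be split and solved in parallel. The two-bus model, the tap rules, and all parameter values are illustrative assumptions, not an actual feeder model.

```python
import math

# Toy 2-bus feeder: source -> line impedance -> load bus with a regulator tap.
# The tap changes only after the voltage has been out of band for DELAY steps,
# so step t depends on controller state accumulated in steps t-1, t-2, ...
R_LINE = 0.05          # p.u. line resistance (assumed)
BAND, DELAY = 0.02, 3  # +/- voltage band (p.u.) and time-delay, in steps
tap, counter = 0, 0
taps = []

# A mock daily load shape: per-unit demand at each of 96 time steps
load = [0.4 + 0.5 * math.sin(2 * math.pi * t / 48) ** 2 for t in range(96)]

for p in load:
    v = (1.0 + 0.00625 * tap) - R_LINE * p   # "power flow": crude voltage drop
    err = v - 1.0
    if abs(err) > BAND:
        counter += 1                          # out of band: run down the delay
        if counter >= DELAY:
            tap += 1 if err < 0 else -1       # raise the tap on low voltage
            counter = 0
    else:
        counter = 0                           # back in band: reset the timer

    taps.append(tap)

print("final tap:", taps[-1],
      "tap changes:", sum(1 for a, b in zip(taps, taps[1:]) if a != b))
```

    Scaled to a year at 1-second resolution this loop becomes ~31.5 million strictly sequential power flows, which is exactly the computational burden the report analyses.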

  7. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  8. PID temperature controller in pig nursery: spatial characterization of thermal environment

    Science.gov (United States)

    de Souza Granja Barros, Juliana; Rossi, Luiz Antonio; Menezes de Souza, Zigomar

    2018-05-01

    The use of enhanced temperature-control technologies can improve the thermal conditions in livestock facilities. The objective of this study was to evaluate, based on geostatistical analysis, the spatial distribution of thermal environment variables in a pig nursery with a heating system using two temperature-control technologies. The following systems were evaluated: overhead electrical resistance with a Proportional, Integral, and Derivative (PID) controller, and overhead electrical resistance with a thermostat. We evaluated the climatic variables dry bulb temperature (Tbs), air relative humidity (RH), temperature and humidity index (THI), and enthalpy in the winter, at 7:00, 12:00, and 18:00 h. The spatial distribution of these variables was mapped by kriging. The results showed that the resistance heating system with the PID controller improved the thermal comfort conditions in the pig nursery during the coldest hours, maintaining a more homogeneous spatial distribution of air temperature in the pen. During the hottest weather, neither system provided comfort.
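
    The PID control loop that distinguishes the first system from the simple thermostat can be sketched against a toy first-order thermal model of a heated pen. The plant model, gains, and setpoint below are illustrative assumptions, not the nursery's actual parameters.

```python
# Discrete PID loop regulating a toy first-order thermal model of a heated pen.
T_AMB, SETPOINT = 16.0, 28.0   # deg C, ambient and target temperatures
TAU, K_HEATER = 50.0, 0.5      # room time constant (s), heater gain (deg C/s at u=1)
KP, KI, KD = 0.5, 0.05, 0.1    # illustrative PID gains
DT, STEPS = 1.0, 2000

temp, integral, prev_err = T_AMB, 0.0, SETPOINT - T_AMB
for _ in range(STEPS):
    err = SETPOINT - temp
    deriv = (err - prev_err) / DT
    u = KP * err + KI * integral + KD * deriv
    u_clamped = min(max(u, 0.0), 1.0)   # heater duty cycle limited to [0, 1]
    if u == u_clamped:                  # anti-windup: integrate only when unsaturated
        integral += err * DT
    prev_err = err
    # first-order plant: heat loss to ambient plus heater input
    temp += ((T_AMB - temp) / TAU + K_HEATER * u_clamped) * DT

print(f"temperature after {STEPS} s: {temp:.2f} deg C")
```

    Unlike the thermostat's on/off cycling, the integral term settles the heater at the partial duty cycle that holds the setpoint, which is the behaviour behind the more homogeneous temperatures reported for the PID system.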

  11. Neutron activation analysis capability of natural objects' estimation for Latvian environment

    International Nuclear Information System (INIS)

    Damburg, N.A.; Mednis, I.V.; Taure, I.Ya.; Virtsavs, M.V.

    1989-01-01

    A review of literature data and the NAA techniques developed by the authors for the analysis of environmental samples (aerosols, fly ash, soil, pine needles, natural and technological waters) is presented. The methods are used for the routine analysis of samples from the environment of industrial and power plants of Latvia to investigate and control the local pollution with heavy metals, arsenic and halogens

  12. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system model built in the Transient System Simulation Tool (TRNSYS) has been validated against field monitoring data and then exercised in a number of climates to understand the impact of climate on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations in climate, load, distribution type, insulation, and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful for informing future water heating best practices guides as well as more accurate (and computationally efficient) distribution models for annual whole-house simulation programs.

  13. Privacy-Preserving and Scalable Service Recommendation Based on SimHash in a Distributed Cloud Environment

    Directory of Open Access Journals (Sweden)

    Yanwei Xu

    2017-01-01

    With the increasing volume of web services in the cloud environment, Collaborative Filtering (CF) based service recommendation has become one of the most effective techniques to alleviate the heavy burden on the service selection decisions of a target user. However, the service recommendation bases, that is, historical service usage data, are often distributed across different cloud platforms. Two challenges are present in such a cross-cloud service recommendation scenario. First, a cloud platform is often not willing to share its data with other cloud platforms due to privacy concerns, which severely decreases the feasibility of cross-cloud service recommendation. Second, the historical service usage data recorded in each cloud platform may update over time, which significantly reduces recommendation scalability. In view of these two challenges, a novel privacy-preserving and scalable service recommendation approach based on SimHash, named SerRecSimHash, is proposed in this paper. Finally, through a set of experiments deployed on a real distributed service quality dataset, WS-DREAM, we validate the feasibility of our proposal in terms of recommendation accuracy and efficiency while guaranteeing privacy preservation.
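    The core idea of SimHash-based similarity, on which such an approach rests, can be sketched in a few lines. The feature names and weights below are hypothetical stand-ins for per-service usage records; this is not the SerRecSimHash algorithm itself, only the underlying fingerprinting primitive:

    ```python
    import hashlib

    def simhash(features, bits=64):
        """Compute a SimHash fingerprint from (feature, weight) pairs.

        Each feature is hashed; each of its bits votes +weight or -weight
        per position, and the sign of each accumulated position yields
        the corresponding fingerprint bit.
        """
        v = [0.0] * bits
        for feat, weight in features:
            h = int(hashlib.md5(feat.encode()).hexdigest(), 16)
            for i in range(bits):
                v[i] += weight if (h >> i) & 1 else -weight
        fp = 0
        for i in range(bits):
            if v[i] > 0:
                fp |= 1 << i
        return fp

    def hamming(a, b):
        """Hamming distance between two fingerprints; small = similar."""
        return bin(a ^ b).count("1")

    # Hypothetical per-user service usage profiles: platforms could exchange
    # only these fingerprints, never the raw usage data.
    u1 = simhash([("svc_a:fast", 2.0), ("svc_b:slow", 1.0)])
    u2 = simhash([("svc_a:fast", 2.0), ("svc_b:slow", 1.0)])
    u3 = simhash([("svc_c:fast", 1.0), ("svc_d:slow", 3.0)])
    ```

    Identical profiles always collide (distance 0), while dissimilar profiles land far apart with high probability, which is what makes hash-only exchange usable for cross-platform neighbor search.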

  14. Cross-layer TCP Performance Analysis in IEEE 802.11 Vehicular Environments

    Directory of Open Access Journals (Sweden)

    T. Janevski

    2014-06-01

    In this paper we provide a performance analysis of TCP in IEEE 802.11 vehicular environments for different well-known TCP versions: Tahoe, Reno, New Reno, Vegas, and Sack. The parameters of interest on the TCP side are the number of Duplicate Acknowledgements (DupAck) and the number of Delayed Acknowledgements (DelAck), while on the wireless network side the analyzed parameter is the interface queue (IFQ). We performed the analysis for the worst-case distance scenario in both single-hop and worst-case multihop vehicular environments. The results show that the number of wireless hops in vehicular environments significantly reduces TCP throughput. The best average performance across all scenarios was obtained with TCP Vegas. However, the results also show that the interface queue at wireless nodes should hold at least five packets. Due to the shorter distances in a vehicular wireless network, the results show flexibility in choosing different DupAck values without degrading TCP throughput, while introducing the DelAck parameter enhances the average TCP throughput for all TCP versions.

  15. Taking the distributed nature of cooperative work seriously

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Simone, Carla

    1998-01-01

    For CSCW facilities to be effective and viable, the inherently distributed nature of cooperative work must be matched by a radically distributed environment. On the basis of a scenario derived from field studies, the paper describes a CSCW environment which supports the distributed con...

  16. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has become more significant than ever since official policy stipulated 'three red lines' to strictly control water usage and water pollution, accelerating the promotion of the 'River Chief Policy' throughout China. The policy launches creative approaches that include people from different administrative levels and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged western leadership theory with an innovative perspective suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor's critical policy analysis framework.

  17. Rapid establishment of a regular distribution of adult tropical Drosophila parasitoids in a multi-patch environment by patch defence behaviour.

    Science.gov (United States)

    de Jong, Peter W; Hemerik, Lia; Gort, Gerrit; van Alphen, Jacques J M

    2011-01-01

    Females of the larval parasitoid of Drosophila, Asobara citri, from sub-Saharan Africa, defend patches with hosts by fighting and chasing conspecific females upon encounter. Females of the closely related, palearctic species Asobara tabida do not defend patches and often search simultaneously in the same patch. The effect of patch defence by A. citri females on their distribution in a multi-patch environment was investigated, and their distributions were compared with those of A. tabida. For both species 20 females were released from two release-points in replicate experiments. Females of A. citri quickly reached a regular distribution across 16 patches, with a small variance/mean ratio per patch. Conversely, A. tabida females initially showed a clumped distribution, and after gradual dispersion, a more Poisson-like distribution across patches resulted (variance/mean ratio was closer to 1 and higher than for A. citri). The dispersion of A. tabida was most probably an effect of exploitation: these parasitoids increasingly made shorter visits to already exploited patches. We briefly discuss hypotheses on the adaptive significance of patch defence behaviour or its absence in the light of differences in the natural history of both parasitoid species, notably the spatial distribution of their hosts.
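    The variance/mean ratio used above to distinguish the two species' distributions (regular vs. Poisson-like vs. clumped) is the classic index of dispersion, sketched here with hypothetical per-patch counts rather than the study's data:

    ```python
    def dispersion_index(counts):
        """Variance-to-mean ratio of per-patch counts.

        ~1 suggests a Poisson (random) distribution; < 1 a regular
        (evenly spaced) one; > 1 a clumped one.
        """
        n = len(counts)
        mean = sum(counts) / n
        var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
        return var / mean

    # Hypothetical counts of 10 wasps over 8 patches:
    regular = [1, 1, 1, 2, 1, 1, 2, 1]   # A. citri-like even spread
    clumped = [6, 0, 0, 4, 0, 0, 0, 0]   # initial A. tabida-like clustering
    ```

    For the even spread the index falls well below 1, and for the clustered counts well above 1, matching the qualitative contrast drawn in the abstract.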

  18. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and cause substantial drops in performance. At the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly or inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.

  19. Relationships between the Raindrop Size Distribution and Properties of the Environment and Clouds Inferred from TRMM

    Science.gov (United States)

    Munchak, Stephen Joseph; Kummerow, Christian; Elsaesser, Gregory

    2013-01-01

    Variability in the raindrop size distribution (DSD) has long been recognized as a source of uncertainty in relationships between radar reflectivity Z and rain rate R. In this study, we analyze DSD retrievals from two years of data gathered by the Tropical Rainfall Measuring Mission (TRMM) satellite and processed with a combined radar-radiometer retrieval algorithm over the global oceans equatorward of 35°. Numerous variables describing properties of each reflectivity profile, large-scale organization, and the background environment are examined for relationships to the reflectivity-normalized median drop diameter, εDSD. In general, we find that higher freezing levels and relative humidities are associated with smaller εDSD. Within a given environment, the mesoscale organization of precipitation and the vertical profile of reflectivity are associated with DSD characteristics. In the tropics, the smallest εDSD values are found in large but shallow convective systems, where warm rain formation processes are thought to be predominant, whereas larger sizes are found in the stratiform regions of organized deep convection. In the extratropics, the largest εDSD values are found in the scattered convection that occurs when cold, dry continental air moves over the much warmer ocean after the passage of a cold front. The geographical distribution of the retrieved DSDs is consistent with many of the observed regional Z-R relationships found in the literature, as well as with discrepancies between the TRMM radar-only and radiometer-only precipitation products. In particular, mid-latitude and tropical regions near land tend to have larger drops for a given reflectivity, whereas the smallest drops are found in the eastern Pacific Intertropical Convergence Zone.
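    The DSD-driven uncertainty in Z-R relationships mentioned above can be made concrete with the standard power-law form Z = a·R^b. The sketch below inverts it for rain rate using the well-known Marshall-Palmer coefficients (a = 200, b = 1.6); the alternative coefficient pair is an illustrative stand-in for a different DSD regime, not one fitted in this study:

    ```python
    def rain_rate_from_z(dbz, a=200.0, b=1.6):
        """Invert Z = a * R**b for rain rate R (mm/h).

        dbz is radar reflectivity in dBZ; defaults are the
        Marshall-Palmer coefficients.
        """
        z = 10.0 ** (dbz / 10.0)      # dBZ -> linear Z (mm^6 m^-3)
        return (z / a) ** (1.0 / b)

    # The same 40 dBZ echo maps to different rain rates under different
    # (a, b) pairs -- the DSD variability discussed in the abstract:
    r_mp = rain_rate_from_z(40.0)                    # Marshall-Palmer
    r_alt = rain_rate_from_z(40.0, a=300.0, b=1.4)   # hypothetical alternative
    ```

    At 40 dBZ the Marshall-Palmer relation gives roughly 11.5 mm/h, while the alternative pair gives a different rate from the identical reflectivity, which is exactly why regional Z-R relationships diverge.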

  20. Distributed Language and Dialogism

    DEFF Research Database (Denmark)

    Steffensen, Sune Vork

    2015-01-01

    This article takes as its starting point Per Linell’s (2013) review article on the book Distributed Language (Cowley, 2011a) and other contributions to the field of ‘Distributed Language’, including Cowley et al. (2010) and Hodges et al. (2012). The Distributed Language approach is a naturalistic and anti-representational approach to language that builds on recent developments in the cognitive sciences. Starting from Linell’s discussion of the approach, the article aims to clarify four aspects of a distributed view of language vis-à-vis the tradition of Dialogism, as presented by Linell. It addresses Linell’s critique of Distributed Language as rooted in biosemiotics and in theories of organism-environment systems, and argues that Linell’s sense-based approach entails an individualist view of how conspecific Others acquire their status as prominent parts of the sense-maker’s environment