WorldWideScience

Sample records for distributed workflow traces

  1. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  2. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we discuss the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  3. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we discuss the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  4. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business

  5. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events......-organizational case management. The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals....

  6. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs and automated management of remote services among a large set of grid facilities.

  7. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs and automated management of remote services among a large set of grid facilities.

  8. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  9. Modeling Workflow Management in a Distributed Computing System

    African Journals Online (AJOL)

    Dr Obe

    ... communication system, which allows for computerized support. ... Keywords: distributed computing system; Petri nets; workflow management.

  10. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot efficiently describe distributed workflow services because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study shows that the proposed method is appropriate for modeling workflow business services in distributed environments.
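
    As a hedged illustration of the kind of mapping such an approach involves (the actual rules are given in the paper; the notation here is the standard CTR notation rather than necessarily the authors'), WS-BPEL structured activities can be rendered with CTR's serial and concurrent conjunctions:

```latex
% Illustrative only: BPEL <sequence> as CTR serial conjunction, <flow> as
% concurrent conjunction. A full mapping would also need to cover the other
% structured activities.
\[
  \mathrm{sequence}(a_1,\dots,a_n) \;\mapsto\; a_1 \otimes a_2 \otimes \dots \otimes a_n ,
  \qquad
  \mathrm{flow}(a_1,\dots,a_n) \;\mapsto\; a_1 \mid a_2 \mid \dots \mid a_n .
\]
```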

  11. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and which they listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database that collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles and this approach is validated in this paper through simulations that show that suitable scalability is maintained throughout the system as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  12. Provenance for distributed biomedical workflow execution

    NARCIS (Netherlands)

    Madougou, S.; Santcroos, M.; Benabdelkader, A.; van Schaik, B.D.; Shahand, S.; Korkhov, V.; van Kampen, A.H.C.; Olabarriaga, S.D.

    2012-01-01

    Scientific research has become very data and compute intensive because of the progress in data acquisition and measurement devices, which is particularly true in Life Sciences. To cope with this deluge of data, scientists use distributed computing and storage infrastructures. The use of such

  13. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  14. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    Science.gov (United States)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be alleviated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which solves the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are analyzed.

  15. Data distribution method of workflow in the cloud environment

    Science.gov (United States)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing for workflow applications provides the required high-efficiency computation and large storage capacity, but it also brings challenges to the protection of trade secrets and other private data. Because protecting private data increases the data transmission time, this paper presents a new data allocation algorithm based on the collaborative damage degree of data, improving the existing data allocation strategy in which confidential data are kept in the private cloud and the remaining computation relies on the public cloud. In the initial stage, a static allocation method partitions only the non-confidential data; in the operational phase, the data that continue to be generated are used to dynamically adjust the data distribution scheme. The experimental results show that the improved method is effective in reducing the data transmission time.

  16. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Background: We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results: We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple-CPU machines including Beowulf-class clusters and Grids. Conclusion: Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  17. Dynamic work distribution in workflow management systems : how to balance quality and performance

    NARCIS (Netherlands)

    Kumar, Akhil; Aalst, van der W.M.P.; Verbeek, H.M.W.

    2002-01-01

    Today's workflow management systems offer work items to workers using rather primitive mechanisms. Although most workflow systems support a role-based distribution of work, they have problems dealing with unavailability of workers as a result of vacation or illness, overloading, context-dependent

  18. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what was their runtime, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and by showing results of querying the system for overall workflow performance information.
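
    To make the log-mining idea concrete, here is a minimal sketch in plain Python (it is not the Pegasus-WMS or NetLogger tooling itself) that aggregates per-site runtime statistics from a JSON-lines execution log; the field names ("site", "start_ts", "end_ts") and the log file name are assumptions for illustration only.

```python
# Hypothetical sketch: summarise per-site task runtimes from a JSON-lines
# execution log. The record fields and file name are assumed, not taken from
# the Pegasus/NetLogger tools described above.
import json
from collections import defaultdict
from statistics import mean

def summarize(log_path):
    runtimes = defaultdict(list)            # site -> list of task runtimes (s)
    with open(log_path) as f:
        for line in f:
            rec = json.loads(line)
            runtimes[rec["site"]].append(rec["end_ts"] - rec["start_ts"])
    for site, values in sorted(runtimes.items()):
        print(f"{site}: {len(values)} tasks, "
              f"mean {mean(values):.1f}s, max {max(values):.1f}s")

if __name__ == "__main__":
    summarize("cybershake_tasks.log")       # assumed file name
```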

  19. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, has led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface, and a web interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potential on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit users both with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and comprehensive documentation are freely available.

  20. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata....
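
    The core condition/response semantics of DCR Graphs can be sketched in a few lines of Python. This is a simplified illustration only: it omits the include/exclude and milestone relations, nesting, and the distribution technique that is the subject of the paper.

```python
# Minimal sketch of core DCR Graph semantics (conditions and responses only).
class DCRGraph:
    def __init__(self, events, conditions, responses):
        self.events = set(events)
        self.conditions = conditions      # {event: events that must have run first}
        self.responses = responses        # {event: events that become pending}
        self.executed, self.pending = set(), set()

    def enabled(self, e):
        # An event is enabled when all its conditions have been executed.
        return self.conditions.get(e, set()) <= self.executed

    def execute(self, e):
        if not self.enabled(e):
            raise ValueError(f"{e} is not enabled")
        self.executed.add(e)
        self.pending.discard(e)
        self.pending |= self.responses.get(e, set())

    def accepting(self):                  # no outstanding obligations
        return not self.pending

# Toy hospital-style workflow: a prescription requires a sign-off as response.
g = DCRGraph({"prescribe", "sign"}, {"sign": {"prescribe"}}, {"prescribe": {"sign"}})
g.execute("prescribe")
print(g.enabled("sign"), g.accepting())   # True False
g.execute("sign")
print(g.accepting())                      # True
```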

  1. A history-tracing XML-based provenance framework for workflows

    NARCIS (Netherlands)

    Gerhards, M; Belloum, A.; Berretz, F.; Sander, V.; Skorupa, S.

    2010-01-01

    The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with respect to the incorporation of human tasks to standard workflows by emerging standards

  2. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
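
    The finite-state-with-retries idea can be illustrated with a small Python sketch; it is not the STAR production framework, and the state names, retry limit and step functions are assumptions for illustration.

```python
# Illustrative sketch of finite-state checking with bounded retries.
from enum import Enum, auto

class State(Enum):
    STAGED = auto()
    RECONSTRUCTED = auto()
    ARCHIVED = auto()
    FAILED = auto()

MAX_RETRIES = 3

def run(file_id, steps):
    """steps: ordered (target_state, step_fn) pairs; step_fn returns True on success."""
    state = State.STAGED
    for target, step in steps:
        for attempt in range(1, MAX_RETRIES + 1):
            if step(file_id):
                state = target
                break                       # step succeeded, move to next state
            print(f"{file_id}: attempt {attempt} towards {target.name} failed, retrying")
        else:
            return State.FAILED             # retries exhausted
    return state

# Usage with dummy steps standing in for reconstruction and archiving:
steps = [(State.RECONSTRUCTED, lambda f: True), (State.ARCHIVED, lambda f: True)]
print(run("raw_file_001.daq", steps))       # State.ARCHIVED
```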

  3. Automated Finite State Workflow for Distributed Data Production

    International Nuclear Information System (INIS)

    Hajdu, L; Didenko, L; Lauret, J; Betts, W; Amol, J; Jang, H J; Noh, S Y

    2016-01-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ∼400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure. (paper)

  4. Modeling Workflow Management in a Distributed Computing System ...

    African Journals Online (AJOL)

    Distributed computing is becoming increasingly important in our daily life. This is because it enables the people who use it to share information more rapidly and increases their productivity. A major characteristic feature of distributed computing is the explicit representation of process logic within a communication system, ...

  5. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    ... capabilities has focused on the web services approach as exemplified by the OGC's Web Processing Service and by Grid computing. The approach to leveraging distributed computing resources described in this paper instead uses remote objects via RPy...

  6. Alert Messaging in the CMS Distributed Workflow System

    International Nuclear Information System (INIS)

    Maxa, Zdenek

    2012-01-01

    WMAgent is the core component of the CMS workload management system. One of the features of this job-managing platform is a configurable messaging system aimed at generating, distributing and processing alerts: short messages describing a given piece of alert-worthy information or a pathological condition. Apart from the framework's sub-components running within the WMAgent instances, there is a stand-alone application collecting alerts from all WMAgent instances running across the CMS distributed computing environment. The alert framework has a versatile design that allows for receiving alert messages also from other CMS production applications, such as the PhEDEx data transfer manager. We present implementation details of the system, including its Python implementation using ZeroMQ, CouchDB message storage and future visions, as well as operational experiences. Interoperation with monitoring platforms such as Dashboard or Lemon is described.
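
    A minimal sketch of ZeroMQ-based alert delivery of the kind described, written against the pyzmq API; it is not the WMAgent implementation, and the collector address, port and alert fields are assumptions for illustration.

```python
# Hypothetical sketch: agents push short JSON alerts to a central collector.
import time
import zmq

def send_alert(component, level, message, collector="tcp://localhost:5557"):
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PUSH)             # agent side pushes alerts
    sock.connect(collector)
    sock.send_json({
        "source": component,
        "level": level,                     # e.g. "soft" or "critical"
        "message": message,
        "timestamp": time.time(),
    })
    sock.close()

def collect(bind_addr="tcp://*:5557"):
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.PULL)             # collector side receives them
    sock.bind(bind_addr)
    while True:
        alert = sock.recv_json()
        print(f"[{alert['level']}] {alert['source']}: {alert['message']}")
```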

  7. A remote tracing facility for distributed systems

    International Nuclear Information System (INIS)

    Ehm, F.; Dworak, A.

    2012-01-01

    Today, CERN's control system is built upon a large number of C++ and Java services producing log events. In such a largely distributed environment these log messages are essential for problem recognition and tracing. Tracing is therefore vital for operation as understanding an issue in a subsystem means analysing log events in an efficient and fast manner. At present 3150 device servers are deployed on 1600 disk-less front-ends and they send their log messages via the network to an in-house developed central server which, in turn, saves them to files. However, this solution is not able to provide several highly desired features and has performance limitations which led to the development of a new solution. The new distributed tracing facility fulfills these requirements by taking advantage of the Streaming Text Oriented Messaging Protocol (STOMP) and ActiveMQ as the transport layer. The system not only allows storing critical log events centrally in files or in a database but also allows other clients (e.g. graphical interfaces) to read the same events concurrently by using the provided Java API. Thanks to the ActiveMQ broker technology the system can easily be extended to clients implemented in other languages and it is highly scalable in terms of performance. Long running tests have shown that the system can handle up to 10,000 messages/second. (authors)

  8. Supporting the Construction of Workflows for Biodiversity Problem-Solving Accessing Secure, Distributed Resources

    Directory of Open Access Journals (Sweden)

    J.S. Pahwa

    2006-01-01

    In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources such as data sets and analytical tools are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables in order to explain past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats and lack inherent interoperability. The present BDW system brings all these disparate units together so that the user can combine tools with little thought as to their original availability, data formats and interoperability. The new prototype BDW system architecture not only brings together heterogeneous resources but also enables utilisation of computational resources and provides a secure access to BDW resources via a federated security model. We describe features of the new BDW system and its security model which enable user authentication from a workflow application as part of workflow execution.

  9. Trace elements distribution in environmental compartments

    International Nuclear Information System (INIS)

    Queiroz, Juliana C. de; Peres, Sueli da Silva; Godoy, Maria Luiza D.P.

    2017-01-01

    The term trace elements refers to metals present in the environment at low concentrations. Some of them, such as Co, Cu and Mn, are considered biologically essential; others, such as Pb, Cd, Hg, As, Ti and U, can harm the environment and human health. Many of them have radioactive isotopes, so the evaluation of risks to human health should follow the precepts of environmental radiological protection. Ecosystem pollution with trace elements changes the geochemical cycles of these elements and degrades environmental quality. Soils have unique characteristics compared with the other components of the biosphere (air, water and biota), because they act not only as a sink for contaminants but also as a natural buffer that controls the transport of chemical elements and other substances to the atmosphere, hydrosphere and biota. The main purpose of an environmental monitoring program is to evaluate the levels of contaminants, natural or anthropogenic, in the various compartments of the environment and to assess the contribution of a potential contaminant source. The main objective of this work was to establish baseline concentrations of trace elements of interest in environmental samples of water, sediment and soil from the Environmental Monitoring Program of the Instituto de Radioprotecao e Dosimetria (IRD). The samples were analyzed with an inductively coupled plasma mass spectrometer (ICP-MS) at IRD. From the measured trace element concentrations, the environmental quality parameters of the studied ecosystems could be evaluated. The data allowed some relevant aspects of the study of trace elements in soil and aquatic systems to be assessed, with emphasis on the distribution, concentration and identification of the main anthropogenic sources of contamination in the environment. (author)

  10. Trace elements distribution in environmental compartments

    Energy Technology Data Exchange (ETDEWEB)

    Queiroz, Juliana C. de; Peres, Sueli da Silva; Godoy, Maria Luiza D.P., E-mail: suelip@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    The term trace elements refers to metals present in the environment at low concentrations. Some of them, such as Co, Cu and Mn, are considered biologically essential; others, such as Pb, Cd, Hg, As, Ti and U, can harm the environment and human health. Many of them have radioactive isotopes, so the evaluation of risks to human health should follow the precepts of environmental radiological protection. Ecosystem pollution with trace elements changes the geochemical cycles of these elements and degrades environmental quality. Soils have unique characteristics compared with the other components of the biosphere (air, water and biota), because they act not only as a sink for contaminants but also as a natural buffer that controls the transport of chemical elements and other substances to the atmosphere, hydrosphere and biota. The main purpose of an environmental monitoring program is to evaluate the levels of contaminants, natural or anthropogenic, in the various compartments of the environment and to assess the contribution of a potential contaminant source. The main objective of this work was to establish baseline concentrations of trace elements of interest in environmental samples of water, sediment and soil from the Environmental Monitoring Program of the Instituto de Radioprotecao e Dosimetria (IRD). The samples were analyzed with an inductively coupled plasma mass spectrometer (ICP-MS) at IRD. From the measured trace element concentrations, the environmental quality parameters of the studied ecosystems could be evaluated. The data allowed some relevant aspects of the study of trace elements in soil and aquatic systems to be assessed, with emphasis on the distribution, concentration and identification of the main anthropogenic sources of contamination in the environment. (author)

  11. Trace element distribution in the rat cerebellum

    International Nuclear Information System (INIS)

    Kwiatek, W.M.; Long, G.J.; Pounds, J.G.; Reuhl, K.R.; Hanson, A.L.; Jones, K.W.

    1989-10-01

    Spatial distributions and concentrations of trace elements (TE) in the brain are important because TE perform catalytic and structural functions in enzymes which regulate brain function and development. We have investigated the distributions of TE in rat cerebellum. Structures were sectioned and analyzed by the Synchrotron Radiation Induced X-ray Emission (SRIXE) method using the NSLS X-26 white-light microprobe facility. Advantages important for TE analysis of biological specimens with x-ray microscopy include short measurement time, high brightness and flux, good spatial resolution, multielemental detection, good sensitivity, and non-destructive irradiation. Trace elements were measured in thin rat brain sections of 20-micrometer thickness. The analyses were performed on sample volumes as small as 0.2 nl with Minimum Detectable Limits (MDL) of 50 ppb wet weight for Fe, 100 ppb wet weight for Cu and Zn, and 1 ppm wet weight for Pb. The distribution of TE in the molecular cell layer, granule cell layer and fiber tract of rat cerebella was investigated. Both point analyses and two-dimensional semi-quantitative mapping of the TE distribution in a section were used.

  12. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.; Zhang, L.J.; Watson, T.J.; Yang, J.; Hung, P.C.K.

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of

  13. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    Energy Technology Data Exchange (ETDEWEB)

    Kuznetsov, Valentin [Cornell U.]; Fischer, Nils Leif [Heidelberg U.]; Guo, Yuyi [Fermilab]

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and the Hadoop Spark cluster. The system leverages modern technologies such as a document-oriented database and the Hadoop ecosystem to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short- and long-term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
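
    As a hedged sketch of the kind of aggregation described (not the WMArchive code itself), the following PySpark job computes per-site metrics from job report documents assumed to be stored as JSON on HDFS; the path layout and the "site", "exit_code" and "cpu_time" field names are assumptions.

```python
# Hypothetical sketch: per-site aggregation of framework job report documents.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fwjr-aggregation").getOrCreate()

# Assumed layout: one JSON document per framework job report, per day on HDFS.
reports = spark.read.json("hdfs:///wmarchive/fwjr/2018/03/19/*.json")

summary = (reports
           .groupBy("site")
           .agg(F.count("*").alias("jobs"),
                F.avg("cpu_time").alias("avg_cpu_time"),
                F.sum(F.when(F.col("exit_code") != 0, 1).otherwise(0))
                 .alias("failed_jobs")))

summary.orderBy(F.desc("jobs")).show(truncate=False)
```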

  14. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill the needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while the user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  15. Distributed trace using central performance counter memory

    Science.gov (United States)

    Satterfield, David L.; Sexton, James C.

    2013-01-22

    A plurality of processing cores and a central storage unit having at least one memory are connected in a daisy chain manner, forming a daisy chain ring layout on an integrated chip. At least one of the plurality of processing cores places trace data on the daisy chain connection for transmitting the trace data to the central storage unit, and the central storage unit detects the trace data and stores the trace data in the memory co-located with the central storage unit.

  16. Distributed late-binding micro-scheduling and data caching for data-intensive workflows

    International Nuclear Information System (INIS)

    Delgado Peris, A.

    2015-01-01

    Today's world is flooded with vast amounts of digital information coming from innumerable sources. Moreover, it seems clear that this trend will only intensify in the future. Industry, society and remarkably science are not indifferent to this fact. On the contrary, they are struggling to get the most out of this data, which means that they need to capture, transfer, store and process it in a timely and efficient manner, using a wide range of computational resources. And this task is not always simple. A very representative example of the challenges posed by the management and processing of large quantities of data is that of the Large Hadron Collider experiments, which handle tens of petabytes of physics information every year. Based on the experience of one of these collaborations, we have studied the main issues involved in the management of huge volumes of data and in the completion of sizeable workflows that consume it. In this context, we have developed a general-purpose architecture for the scheduling and execution of workflows with heavy data requirements: the Task Queue. This new system builds on the late-binding overlay model, which has helped experiments to successfully overcome the problems associated with the heterogeneity and complexity of large computational grids. Our proposal introduces several enhancements to the existing systems. The execution agents of the Task Queue architecture share a Distributed Hash Table (DHT) and perform job matching and assignment cooperatively. In this way, scalability problems of centralized matching algorithms are avoided and workflow execution times are improved. Scalability makes fine-grained micro-scheduling possible and enables new functionalities, like the implementation of a distributed data cache on the execution nodes and the integration of data location information in the scheduling decisions... (Author)
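
    One well-known building block for this kind of cooperative, DHT-based assignment is consistent hashing, sketched below in Python; this illustrates the general idea only, not the Task Queue implementation, and the agent and file names are made up.

```python
# Illustrative sketch: a consistent hash ring lets cooperating execution
# agents agree, without a central service, on which node caches a given file.
import bisect
import hashlib

def _h(key: str) -> int:
    return int(hashlib.sha1(key.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes, vnodes=64):
        # Each node gets several virtual points on the ring for balance.
        self._ring = sorted((_h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes))
        self._keys = [k for k, _ in self._ring]

    def node_for(self, item: str) -> str:
        idx = bisect.bisect(self._keys, _h(item)) % len(self._ring)
        return self._ring[idx][1]

ring = HashRing(["agent01", "agent02", "agent03"])
print(ring.node_for("/store/data/run_001/file_0042.root"))  # assumed file name
```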

  17. Particulate trace metals in Cochin backwaters: Distribution of seasonal indices

    Digital Repository Service at National Institute of Oceanography (India)

    Sankaranarayanan, V.N.; Jayalakshmy, K.V.; Joseph, T.

    that surface distribution pattern of the trace metal concentration of cobalt, nickel and iron was almost similar at the four stations thereby stressing the fact that seasonal fluctuations contributed a major part in the surface distribution of these metals...

  18. Distribution of Trace Elements in Muscle and Organs ...

    African Journals Online (AJOL)

    ... revealed organ-specific distribution of trace metals in Tilapia, which has been discussed ... The concentrations of copper (Table 2) varied from 1.68–4.95 in muscle ... The lead concentrations in muscle and organs of Tilapia from both lakes were comparable.

  19. Trace element distribution in geological crystals

    Energy Technology Data Exchange (ETDEWEB)

    Den Besten, J L; Jamieson, D N; Weiser, P S [Melbourne Univ., Parkville, VIC (Australia). School of Physics]

    1997-12-31

    Channelling is a useful microprobe technique for determining the structure of crystals, but until now has not been performed on geological crystals. The composition has been investigated rather than the structure, which can further explain the origin of the crystal and provide useful information on the substitutionality of trace elements. This may then lead to applications in the extraction of valuable metals and in semiconductor electronics. Natural crystals of pyrite (FeS₂) containing a substantial concentration of gold were channelled and examined to identify the channel axis orientation. Rutherford Backscattering (RBS) and Particle Induced X-Ray Emission (PIXE) spectra using MeV ions were obtained in the experiment to provide a comparison of lattice and non-lattice trace elements. 3 figs.

  20. Trace element distribution in geological crystals

    Energy Technology Data Exchange (ETDEWEB)

    Den Besten, J.L.; Jamieson, D.N.; Weiser, P.S. [Melbourne Univ., Parkville, VIC (Australia). School of Physics]

    1996-12-31

    Channelling is a useful microprobe technique for determining the structure of crystals, but until now has not been performed on geological crystals. The composition has been investigated rather than the structure, which can further explain the origin of the crystal and provide useful information on the substitutionality of trace elements. This may then lead to applications in the extraction of valuable metals and in semiconductor electronics. Natural crystals of pyrite (FeS₂) containing a substantial concentration of gold were channelled and examined to identify the channel axis orientation. Rutherford Backscattering (RBS) and Particle Induced X-Ray Emission (PIXE) spectra using MeV ions were obtained in the experiment to provide a comparison of lattice and non-lattice trace elements. 3 figs.

  1. Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System

    Science.gov (United States)

    Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan

    2009-01-01

    A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, as well as elaborates on the solutions that were implemented. The first is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as a change in the sequence of programs being called or in a parameter value of a program being automated should not result in code changes or redelivery.

  2. A Distributed Collaborative Workflow Based Approach to Data Collection and Analysis

    National Research Council Canada - National Science Library

    Gerecke, William; Enas, Douglas; Gottschlich, Susan

    2004-01-01

    ...) and Modeling and Simulation (M&S) systems and architectures. In our work we have found that in order to be maximally effective, these capabilities must be designed with the military user workflow process in mind...

  3. Subcellular trace element distribution in Geosiphon pyriforme

    International Nuclear Information System (INIS)

    Maetz, Mischa; Schuessler, Arthur; Wallianos, Alexandros; Traxel, Kurt

    1999-01-01

    Geosiphon pyriforme is a unique endosymbiotic consortium consisting of a soil-dwelling fungus and the cyanobacterium Nostoc punctiforme. At present this symbiosis has become very interesting because of its phylogenetic relationship to the arbuscular mycorrhizal (AM) fungi. Geosiphon pyriforme could be an important model system for these obligate symbiotic fungi, which supply 80-90% of all land plant species with nutrients, in particular phosphorus and trace elements. Combined PIXE and STIM analyses of the various compartments of Geosiphon give hints about the matter exchange between the symbiotic partners and their environment and the kind of nutrient storage and acquisition, in particular related to nitrogen fixation and metabolism. To determine the quality of our PIXE results we analysed several geological and biological standards over a time period of three years. This led to an overall precision of about 6% and an accuracy of 5-10% for nearly all detectable elements. In combination with the correction model for the mass loss occurring during the analyses, this holds true even for biological targets.

  4. Subcellular trace element distribution in Geosiphon pyriforme

    Energy Technology Data Exchange (ETDEWEB)

    Maetz, Mischa E-mail: mischa.maetz@mpi-hd.mpg.de; Schuessler, Arthur; Wallianos, Alexandros; Traxel, Kurt

    1999-04-02

    Geosiphon pyriforme is a unique endosymbiotic consortium consisting of a soil-dwelling fungus and the cyanobacterium Nostoc punctiforme. At present this symbiosis has become very interesting because of its phylogenetic relationship to the arbuscular mycorrhizal (AM) fungi. Geosiphon pyriforme could be an important model system for these obligate symbiotic fungi, which supply 80-90% of all land plant species with nutrients, in particular phosphorus and trace elements. Combined PIXE and STIM analyses of the various compartments of Geosiphon give hints about the matter exchange between the symbiotic partners and their environment and the kind of nutrient storage and acquisition, in particular related to nitrogen fixation and metabolism. To determine the quality of our PIXE results we analysed several geological and biological standards over a time period of three years. This led to an overall precision of about 6% and an accuracy of 5-10% for nearly all detectable elements. In combination with the correction model for the mass loss occurring during the analyses, this holds true even for biological targets.

  5. Using SensorML to describe scientific workflows in distributed web service environments

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    ... for increased collaboration through workflow sharing. The Sensor Web is an open complex adaptive system that pervades the internet and provides access to sensor resources. One mechanism for describing sensor resources is through the use of SensorML. It is shown...

  6. Using SensorML to describe scientific workflows in distributed web service environments

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    ... for increased collaboration through workflow sharing. The Sensor Web is an open complex adaptive system that pervades the internet and provides access to sensor resources. One mechanism for describing sensor resources is through the use of SensorML. It is shown...

  7. Ray tracing the Wigner distribution function for optical simulations

    NARCIS (Netherlands)

    Mout, B.M.; Wick, Michael; Bociort, F.; Petschulat, Joerg; Urbach, Paul

    2018-01-01

    We study a simulation method that uses the Wigner distribution function to incorporate wave optical effects in an established framework based on geometrical optics, i.e., a ray tracing engine. We use the method to calculate point spread functions and show that it is accurate for paraxial systems

  8. Trace elements distribution in the Amazon floodplain soils

    International Nuclear Information System (INIS)

    Fernandes, E.A.N.; Ferraz, E.S.B.; Oliveira, H.

    1994-01-01

    Neutron activation analysis was performed on alluvial soil samples from several sites on the floodplains of the Amazon River and its major tributaries for trace element determination. The spatial and temporal variations of the chemical composition of floodland sediments in the Amazon basin are discussed. No significant difference was found in the trace element distribution in the floodland soils along the Amazon main channel, even after the source material has been progressively diluted with that from lowland-draining tributaries. It was also seen that the average chemical composition of floodplain soils compares well with that of the suspended sediments. (author) 12 refs.; 5 figs.; 2 tabs

  9. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Messaging Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, a single dedicated zoo keeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on the received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).
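
    A minimal sketch of posting one JSON task message to such a broker, using the stomp.py client library under stated assumptions (local ActiveMQ broker on the default STOMP port, placeholder credentials, hypothetical queue name and message fields); it is not the authors' framework code.

```python
# Hypothetical sketch: send one JSON-encoded task to an ActiveMQ queue over STOMP.
import json
import stomp

conn = stomp.Connection([("localhost", 61613)])   # assumed broker host/port
conn.connect("admin", "admin", wait=True)         # placeholder credentials

task = {
    "workflow": "seismic-daily",                  # assumed workflow identifier
    "job": "detect-events",
    "machine": "node07",
    "args": {"input": "/data/2014/05/01/waveforms.mseed"},
}
conn.send(destination="/queue/node07.tasks",      # assumed queue naming pattern
          body=json.dumps(task),
          content_type="application/json")
conn.disconnect()
```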

  10. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses the nature of concentration distributions of trace elements in biomedical samples, which were measured using X-ray fluorescence techniques (XRF, TXRF). Our earlier observation that the lognormal distribution describes the measured concentration distributions well is explained here on more general grounds. In particular, the role of the random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, which appears when the multiplicative process is driven by a normal distribution, can be generalized to the so-called log-stable distribution. Such a distribution describes the random multiplicative process driven not by a normal distribution but by a more general stable distribution, known as Lévy flights. The presented ideas are exemplified by the results of a study of trace element concentration distributions in selected biomedical samples, obtained using the conventional (XRF) and total-reflection (TXRF) X-ray fluorescence methods. In particular, the first observation of a log-stable concentration distribution of trace elements is reported and discussed here in detail.
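
    The multiplicative-process argument can be summarized in one line (standard textbook form, not necessarily the authors' exact notation):

```latex
% A concentration built up by N random multiplicative steps has a sum of
% random terms in its logarithm:
\[
  c_N = c_0 \prod_{i=1}^{N} \xi_i
  \qquad\Longrightarrow\qquad
  \ln c_N = \ln c_0 + \sum_{i=1}^{N} \ln \xi_i .
\]
% If the \(\ln\xi_i\) have finite variance, the central limit theorem makes
% \(\ln c\) approximately normal and hence \(c\) lognormal; if the \(\ln\xi_i\)
% are heavy-tailed (Levy-flight steps), the generalized CLT gives a stable law
% for \(\ln c\), i.e. a log-stable concentration distribution.
```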

  11. Ray tracing the Wigner distribution function for optical simulations

    Science.gov (United States)

    Mout, Marco; Wick, Michael; Bociort, Florian; Petschulat, Joerg; Urbach, Paul

    2018-01-01

    We study a simulation method that uses the Wigner distribution function to incorporate wave optical effects in an established framework based on geometrical optics, i.e., a ray tracing engine. We use the method to calculate point spread functions and show that it is accurate for paraxial systems but produces unphysical results in the presence of aberrations. The cause of these anomalies is explained using an analytical model.
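
    For reference, the Wigner distribution function of a scalar field U(x) in one transverse dimension is commonly defined as follows (the conventions may differ slightly from the authors'):

```latex
% Standard definition: x is the transverse position, u the spatial frequency.
\[
  W(x,u) \;=\; \int U\!\left(x + \tfrac{x'}{2}\right)
               U^{*}\!\left(x - \tfrac{x'}{2}\right)
               e^{-i 2\pi u x'}\, \mathrm{d}x' .
\]
% Each point (x, u) can be propagated like a geometrical ray with position x
% and direction proportional to u, which is what lets a ray tracing engine
% carry the wave-optical information.
```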

  12. Distribution of Selected Trace Elements in the Bayer Process

    Directory of Open Access Journals (Sweden)

    Johannes Vind

    2018-05-01

    The aim of this work was to achieve an understanding of the distribution of selected bauxite trace elements (gallium (Ga), vanadium (V), arsenic (As), chromium (Cr), rare earth elements (REEs), scandium (Sc)) in the Bayer process. The assessment was designed as a case study in an alumina plant in operation to provide an overview of the trace elements behaviour in an actual industrial setup. A combination of analytical techniques was used, mainly inductively coupled plasma mass spectrometry and optical emission spectroscopy as well as instrumental neutron activation analysis. It was found that Ga, V and As as well as, to a minor extent, Cr are principally accumulated in Bayer process liquors. In addition, Ga is also fractionated to alumina at the end of the Bayer processing cycle. The rest of these elements pass to bauxite residue. REEs and Sc have the tendency to remain practically unaffected in the solid phases of the Bayer process and, therefore, at least 98% of their mass is transferred to bauxite residue. The interest in such a study originates from the fact that many of these trace constituents of bauxite ore could potentially become valuable by-products of the Bayer process; therefore, the understanding of their behaviour needs to be expanded. In fact, Ga and V are already by-products of the Bayer process, but their distribution patterns have not been provided in the existing open literature.

  13. Main-, minor- and trace elements distribution in human brain

    International Nuclear Information System (INIS)

    Zoeger, N.; Streli, C.; Wobrauschek, P.; Jokubonis, C.; Pepponi, G.; Roschger, P.; Bohic, S.; Osterode, W.

    2004-01-01

    Lead (Pb) is known to induce adverse health effects in humans. In fact, cognitive deficits are repeatedly described with Pb exposure, but little is known about the distribution of lead in the brain. Measuring the distribution of Pb in the human brain and studying whether Pb is associated with the distribution of other chemical elements, such as zinc (Zn) and iron (Fe), is of great interest and could reveal some hints about the metabolism of Pb in the brain. To determine the local distribution of lead (Pb) and other trace elements, X-ray fluorescence spectroscopy (XRF) measurements have been performed using a microbeam setup and highest-flux synchrotron radiation. Experiments have been carried out at ID-22, ESRF, Grenoble, France. The installed microprobe setup provides a monochromatic beam (17 keV) from an undulator station focused by Kirkpatrick-Baez X-ray optics to a spot size of 5 μm x 3 μm. Brain slices (20 μm thickness, embedded in paraffin and mounted on Kapton foils) from areas of the frontal cortex, thalamus and hippocampus have been investigated. Generally, no significant increase in fluorescence intensities could be detected in any of the investigated brain compartments. However, Pb and other (trace) elements (e.g. S, Ca, Fe, Cu, Zn, Br) could be detected in all samples and showed strong inhomogeneities across the analyzed areas. While S, Ca, Fe, Cu, Zn and Br could be clearly assigned to the investigated brain structures (vessels, etc.), Pb showed a very different behavior. In some cases (e.g. plexus choroidei) Pb was located at the walls of the vessel, whereas for other structures (e.g. blood vessels) this correlation was not found. Moreover, the Pb detected in different brain areas was individually correlated with various elements. The local distribution of the detected elements in various brain structures is discussed in this work. (author)

  14. Wireless remote control of clinical image workflow: using a PDA for off-site distribution and disaster recovery.

    Science.gov (United States)

    Documet, Jorge; Liu, Brent J; Documet, Luis; Huang, H K

    2006-07-01

    This paper describes a picture archiving and communication system (PACS) tool based on Web technology that remotely manages medical images between a PACS archive and remote destinations. Successfully implemented in a clinical environment and also demonstrated for the past 3 years at the conferences of various organizations, including the Radiological Society of North America, this tool provides a very practical and simple way to manage a PACS, including off-site image distribution and disaster recovery. The application is robust and flexible and can be used on a standard PC workstation or a Tablet PC, but more important, it can be used with a personal digital assistant (PDA). With a PDA, the Web application becomes a powerful wireless and mobile image management tool. The application's quick and easy-to-use features allow users to perform Digital Imaging and Communications in Medicine (DICOM) queries and retrievals with a single interface, without having to worry about the underlying configuration of DICOM nodes. In addition, this frees up dedicated PACS workstations to perform their specialized roles within the PACS workflow. This tool has been used at Saint John's Health Center in Santa Monica, California, for 2 years. The average number of queries per month is 2,021, with 816 C-MOVE retrieve requests. Clinical staff members can use PDAs to manage image workflow and PACS examination distribution conveniently for off-site consultations by referring physicians and radiologists and for disaster recovery. This solution also improves radiologists' effectiveness and efficiency in health care delivery both within radiology departments and for off-site clinical coverage.
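    The DICOM query step at the heart of such a tool can be sketched with the open-source pynetdicom library; the snippet below is a generic C-FIND illustration under assumed host, port, AE titles and query values, and is not the Web/PDA application described above.

    ```python
    # Generic DICOM C-FIND sketch with pynetdicom (not the tool described above).
    # Host, port, AE titles and query values are placeholder assumptions.
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

    ae = AE(ae_title="WORKSTATION")
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)

    assoc = ae.associate("pacs.example.org", 11112, ae_title="PACS_ARCHIVE")
    if assoc.is_established:
        query = Dataset()
        query.QueryRetrieveLevel = "STUDY"
        query.PatientID = "12345"
        query.StudyInstanceUID = ""   # requested return attribute

        # Iterate over pending responses; 0xFF00/0xFF01 are "pending" statuses.
        for status, identifier in assoc.send_c_find(
                query, StudyRootQueryRetrieveInformationModelFind):
            if status and status.Status in (0xFF00, 0xFF01) and identifier:
                print(identifier.StudyInstanceUID)
        assoc.release()
    ```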

  15. Sources and distribution of trace elements in Estonian peat

    Science.gov (United States)

    Orru, Hans; Orru, Mall

    2006-10-01

    This paper presents the results of the distribution of trace elements in Estonian mires. Sixty-four mires, representative of the different landscape units, were analyzed for the content of 16 trace elements (Cr, Mn, Ni, Cu, Zn, and Pb using AAS; Cd by GF-AAS; Hg by the cold vapour method; and V, Co, As, Sr, Mo, Th, and U by XRF) as well as other peat characteristics (peat type, degree of humification, pH and ash content). The results of the research show that concentrations of trace elements in peat are generally low: V 3.8 ± 0.6, Cr 3.1 ± 0.2, Mn 35.1 ± 2.7, Co 0.50 ± 0.05, Ni 3.7 ± 0.2, Cu 4.4 ± 0.3, Zn 10.0 ± 0.7, As 2.4 ± 0.3, Sr 21.9 ± 0.9, Mo 1.2 ± 0.2, Cd 0.12 ± 0.01, Hg 0.05 ± 0.01, Pb 3.3 ± 0.2, Th 0.47 ± 0.05, U 1.3 ± 0.2 μg g-1, and S 0.25 ± 0.02%. Statistical analyses of this large database showed that Co has the highest positive correlations with many elements and with ash content. As, Ni, Mo, ash content and pH are also significantly correlated. The lowest abundance of most trace elements was recorded in mires fed only by precipitation (ombrotrophic), and the highest in mires fed by groundwater and springs (minerotrophic), which are situated in the flood plains of river valleys. Concentrations usually differ between the superficial, middle and bottom peat layers, but the significance decreases depending on the type of mire in the following order: transitional mires - raised bogs - fens. Differences among mire types are highest for the superficial but not significant for the basal peat layers. The use of peat with high concentrations of trace elements in agriculture, horticulture, as fuel, for water purification, etc., may pose a risk for humans via the food chain, through inhalation, drinking water, etc.

  16. Molecular simulation workflows as parallel algorithms: the execution engine of Copernicus, a distributed high-performance computing platform.

    Science.gov (United States)

    Pronk, Sander; Pouya, Iman; Lundborg, Magnus; Rotskoff, Grant; Wesén, Björn; Kasson, Peter M; Lindahl, Erik

    2015-06-09

    Computational chemistry and other simulation fields are critically dependent on computing resources, but few problems scale efficiently to the hundreds of thousands of processors available in current supercomputers, particularly for molecular dynamics. This has turned into a bottleneck as new hardware generations primarily provide more processing units rather than making individual units much faster, which simulation applications are addressing by increasingly focusing on sampling with algorithms such as free-energy perturbation, Markov state modeling, metadynamics, or milestoning. All of these rely on combining results from multiple simulations into a single observation. They are potentially powerful approaches that aim to predict experimental observables directly, but this comes at the expense of added complexity in selecting sampling strategies and keeping track of dozens to thousands of simulations and their dependencies. Here, we describe how the distributed execution framework Copernicus allows the expression of such algorithms in generic workflows: dataflow programs. Because dataflow algorithms explicitly state the dependencies of each constituent part, algorithms only need to be described at a conceptual level, after which the execution is maximally parallel. The fully automated execution facilitates the optimization of these algorithms with adaptive sampling, where undersampled regions are automatically detected and targeted without user intervention. We show how several such algorithms can be formulated for computational chemistry problems, and how they are executed efficiently with many loosely coupled simulations using either distributed or parallel resources with Copernicus.
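    The dataflow idea summarized above (state only the dependencies; the engine then runs everything that is ready, in parallel) can be imitated with standard Python; the sketch below is a schematic stand-in under invented task names, not the Copernicus API.

    ```python
    # Minimal dataflow-style executor: tasks declare their inputs, and the
    # engine runs every task whose inputs are ready, in parallel waves.
    # Schematic stand-in only; this is not the Copernicus API.
    from concurrent.futures import ThreadPoolExecutor

    def run_dataflow(tasks, deps, max_workers=4):
        """tasks: name -> callable(list of dep results); deps: name -> dep names."""
        results, remaining = {}, dict(deps)
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            while remaining:
                ready = [t for t, d in remaining.items() if all(x in results for x in d)]
                if not ready:
                    raise ValueError("cyclic or unsatisfiable dependencies")
                futures = {t: pool.submit(tasks[t], [results[x] for x in remaining[t]])
                           for t in ready}
                for t, f in futures.items():
                    results[t] = f.result()
                    del remaining[t]
        return results

    # Toy sampling-style workflow: several independent runs, then one combiner.
    tasks = {f"sim{i}": (lambda inputs, i=i: i * 0.1) for i in range(4)}
    tasks["combine"] = lambda inputs: sum(inputs) / len(inputs)
    deps = {f"sim{i}": [] for i in range(4)}
    deps["combine"] = [f"sim{i}" for i in range(4)]
    print(run_dataflow(tasks, deps)["combine"])
    ```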

  17. Implementation of the electronic DDA workflow for NSSS system design

    International Nuclear Information System (INIS)

    Eom, Young Sam; Kim, Yeon Sung; Lee, Suk Hee; Kim, Mi Kyung

    1996-06-01

    To improve NSSS design quality and productivity, several integrated management systems of nuclear-developed nations, such as Mitsubishi's NUWINGS (Japan), AECL's CANDID (Canada) and Duke Power's (USA), were investigated, and this report studies the system implementation of NSSS design document computerization and the major workflow process of the DDA (Document Distribution for Agreement). On the basis of the requirements of design document computerization, which cover preparation, review, approval and distribution of the engineering documents, the KAERI Engineering Information Management System (KEIMS) was implemented. The major achievements of this work are the implementation of GUI panels for input and retrieval of document index information, the setup of an electronic document workflow, and the provision of quality assurance verification by tracing the workflow history. The main effects of NSSS design document computerization are improved efficiency and reliability and reduced engineering cost by means of a fast document verification capability and an electronic document transfer system. 2 tabs., 16 figs., 9 refs. (Author)

  18. Oceanic distribution and geochemistry of several trace elements at GEOSECS stations

    International Nuclear Information System (INIS)

    Robertson, D.E.

    1975-01-01

    The biogeochemical and physical processes operating in the oceans create substantial geographical and vertical variations in the oceanic distribution of many trace elements. These variations are brought about by diverse mechanisms and involve trace elements of a wide spectrum of physicochemical and biological behavior. Thus, a knowledge of these trace element distributions can help characterize some of the ocean processes in which they participate. (auth)

  19. Distribution of uranium and some trace elements in groundwater of eastern delta, egypt

    International Nuclear Information System (INIS)

    Hamza, M.S.; Aly, A.I.M.; Swailem, F.M.; Elreedy, W.; Nada, A.

    1986-01-01

    The distribution patterns of uranium and some trace elements in groundwater of the eastern Nile delta indicate a general trend of increasing trace element concentration from west and south to north and east. This trend is most probably due to extensive leaching from the soil caused by recharge from irrigation water. The geochemical correlation among trace elements was also investigated. Possible industrial pollution in the Bahtim area was detected. 1 fig., 4 tabs

  20. 12 Trace Metals Distribution in Fish Tissues, Bottom Sediments and ...

    African Journals Online (AJOL)


    Abstract. Water samples, bottom sediments, Tilapia, and Cat Fish from Okumeshi River in Delta State of Nigeria were analysed ... Keywords: Trace metals, Fish Tissues, Water, Bottom sediments, Okumeshi River. Introduction ..... Grey Mangrove Avicennia marina (Forsk). ... sewage treatment plant outlet pipe extension on.

  1. Distributions of traces of metals on sorption from solutions of vanadium(V)

    International Nuclear Information System (INIS)

    Evseeva, N.K.; Turnaov, A.N.; Telegin, G.F.; Kremenskaya, I.N.

    1983-01-01

    A study is made of the distribution of traces of metals between aqueous solutions of vanadium(V) and a solid reagent made by introducing di-2-ethylhexylphosphoric acid into an inert matrix: a nonionic macroporous copolymer of polystyrene with divinylbenzene (Wofatit Y 29). As regards the degree of extraction, the trace components fall in the series zinc > cadmium > manganese > copper > cobalt, which resembles the extractability series. The vanadium content of the solution and the concentrations of the trace components have virtually no effect on the sorption. The process is effective for concentrating trace components from solutions containing vanadium(V).

  2. Trace elements distribution in bottom sediments from Amazon River estuary

    International Nuclear Information System (INIS)

    Lara, L.B.L.S.; Nadai Fernandes, E. de; Oliveira, H. de; Bacchi, M.A.

    1994-01-01

    The Amazon River discharges into a dynamic marine environment in which many interactive processes affect dissolved and particulate solids, whether they settle on the shelf or reach the ocean. Trace element concentrations, especially of the rare earth elements, have been determined by neutron activation analysis in sixty bottom sediment samples from the Amazon River estuary, providing information for the study of the spatial and temporal variation of those elements. (author). 16 refs, 6 figs, 3 tabs

  3. Study of trace elements distribution in various tissues structures

    International Nuclear Information System (INIS)

    Kwiatek, W.M.; Marczewska, E.

    1994-01-01

    Many papers have been written during the past ten years about trace element (TE) studies in cancerous and normal tissues, describing the use of different methods for the detection of trace elements. The concentration of TE depends strongly on the sample measured. However, to our knowledge, the role of TE in cancerous tissue is still not known. Therefore, we propose to perform an experiment which will hopefully give us more information about the relationship between the concentrations of elements in different tissues. The developing industry located near Cracow has become a serious danger to the health of its inhabitants. The negative influence of air pollution on living organisms is seen not only in nature but also in humans. Therefore, we want to analyse the trace element content of the air. Such an investigation will give information about the pollution level in the city. The pollution has an obvious negative influence on health and on the toxic element concentration level in blood. It is interesting to check whether the placenta plays an effective role in protecting the foetus against toxic metals. In order to study this problem, trace element analysis of placenta tissues will be done by means of a synchrotron microbeam. (author). 1 ref

  4. A study about trace element distribution in cancer tissue and serum of cancer patients

    International Nuclear Information System (INIS)

    Lee, Jong In; Lee, Eun Joo; Jung, Young Joo

    1993-01-01

    The authors analyzed the trace element distributions of cancer tissue and its corresponding normal tissue, as well as of serum at the preoperative and postoperative stages, in gastric, colon and breast cancer patients. Zinc and rubidium were higher in concentration in breast cancer tissue than in normal tissue. As for the distribution of trace elements in serum, bromine became about 10 times higher after gastric resection. This result can be applied to experimental carcinogenesis and to relationships with other prognostic factors. (Author)

  5. Radioactivity and concentration of some trace elements in sponges distributed along the Syrian coast

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Mamish, S.; Haleem, M. A.; Ammar, I.

    2009-07-01

    Natural and artificial radionuclides (210Po, 210Pb, 40K, 137Cs, 234U, 238U) and the concentrations of some trace elements (Zn, Cu, Pb, Cd) in several types of sponges distributed along the Syrian coast have been studied. The samples were collected from four stations distributed along the Syrian coast (Al-Basset, Lattakia, Banise, Tartous). Concentration factors (CF) for the studied radionuclides and trace elements have been calculated in order to determine which sponge types can be used as biomonitors for the radionuclides and trace elements. (authors)

  6. Distribution and Potential Toxicity of Trace Metals in the Surface Sediments of Sundarban Mangrove Ecosystem, Bangladesh

    Science.gov (United States)

    Kumar, A.; Ramanathan, A.; Mathukumalli, B. K. P.; Datta, D. K.

    2014-12-01

    The distribution, enrichment and ecotoxicity potential of eight trace metals (As, Cd, Cr, Cu, Fe, Mn, Pb and Zn) in the Bangladesh part of the Sundarban mangrove was investigated using sediment quality assessment indices. The average concentrations of trace metals in the sediments exceeded the crustal abundance, suggesting sources other than natural in origin. Additionally, the trace metal profile may be a reflection of socio-economic development in the vicinity of the Sundarban, which further attributes the trace metal abundance to anthropogenic inputs. The geoaccumulation index suggests moderately polluted sediment quality with respect to Ni and As and background concentrations for Al, Fe, Mn, Cu, Zn, Pb, Co, As and Cd. Contamination factor analysis suggested low contamination by Zn, Cr, Co and Cd, moderate contamination by Fe, Mn, Cu and Pb, while Ni and As show considerable and high contamination, respectively. Enrichment factors for Ni, Pb and As suggest high contamination from either biota or anthropogenic inputs besides natural enrichment. As per the three sediment quality guidelines, Fe, Mn, Cu, Ni, Co and As would be more of a concern with respect to ecotoxicological risk in the Sundarban mangroves. The correlation between various physicochemical variables and trace metals suggested a significant role of fine-grained particles (clay) in trace metal distribution, whereas, owing to the low organic carbon content in the region, organic complexation may not play a significant role in trace metal distribution in the Sundarban mangroves.

  7. Log-stable concentration distributions of trace elements in biomedical samples

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Kuternoga, E.; Braziewicz, J.; Pajek, M.

    2004-01-01

    In the present paper, which follows our earlier observation that the asymmetric and long-tailed concentration distributions of trace elements in biomedical samples, measured by X-ray fluorescence techniques, can be modeled by log-stable distributions, further specific aspects of this observation are discussed. First, we demonstrate that, typically, for a quite substantial fraction (10-20%) of trace elements studied in different kinds of biomedical samples, the measured concentration distributions are in fact described by 'symmetric' log-stable distributions, i.e. asymmetric distributions that are described by symmetric stable distributions. This observation is, in fact, expected for the random multiplicative process which models the concentration distributions of trace elements in biomedical samples. The log-stable nature of the concentration distributions of trace elements results in several problems of a statistical nature, which have to be addressed in XRF data analysis practice. Consequently, in the present paper the following problems are discussed in detail: (i) the estimation of parameters for stable distributions and (ii) the testing of the log-stable nature of the concentration distribution by using the Anderson-Darling (A2) test, especially for symmetric stable distributions. In particular, maximum likelihood estimation and Monte Carlo simulation techniques were used, respectively, for the estimation of stable distribution parameters and the calculation of critical values for the Anderson-Darling test. The discussed ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, which were obtained by using the X-ray fluorescence (XRF, TXRF) methods.
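    The estimation-and-testing recipe described above can be sketched as follows: a maximum likelihood fit of the log-concentrations, the Anderson-Darling A2 statistic against the fitted model, and a Monte Carlo critical value. For speed the sketch uses the normal driver (the lognormal special case); scipy.stats.levy_stable could be substituted for the general stable case. The data values are synthetic assumptions.

    ```python
    # Sketch of the fit-then-test recipe: ML fit of log-concentrations, A^2
    # statistic, Monte Carlo critical value. Shown for the normal (lognormal)
    # special case; scipy.stats.levy_stable is the general-case analogue.
    import numpy as np
    from scipy import stats

    def anderson_darling(sample, cdf):
        x = np.sort(sample)
        n = len(x)
        u = np.clip(cdf(x), 1e-12, 1 - 1e-12)
        i = np.arange(1, n + 1)
        return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

    rng = np.random.default_rng(1)
    conc = rng.lognormal(mean=1.0, sigma=0.6, size=120)   # synthetic concentrations
    logc = np.log(conc)

    mu, sigma = stats.norm.fit(logc)                      # maximum likelihood fit
    a2 = anderson_darling(logc, lambda x: stats.norm.cdf(x, mu, sigma))

    # Monte Carlo critical value (parameters re-fitted on each simulated sample).
    sims = []
    for _ in range(500):
        s = stats.norm.rvs(mu, sigma, size=len(logc), random_state=rng)
        m, sd = stats.norm.fit(s)
        sims.append(anderson_darling(s, lambda x: stats.norm.cdf(x, m, sd)))
    crit95 = np.quantile(sims, 0.95)
    print(f"A^2 = {a2:.3f}, 95% critical value = {crit95:.3f}, reject = {a2 > crit95}")
    ```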

  8. Distribution of trace elements in moss biomonitors near Mumbai

    International Nuclear Information System (INIS)

    Chakrabortty, S.; Paratkar, G.T.; Jha, S.K.; Puranik, V.D.

    2004-01-01

    The elemental composition of mosses from Mahabaleshwar, a remote hill station near Mumbai, was measured. The trace element profiles of two different species of mosses were compared. Chemical analysis of washed and unwashed moss samples was done using Energy Dispersive X-ray Fluorescence Spectrometry (EDXRF) and Instrumental Neutron Activation Analysis (INAA) techniques in an attempt to understand the variation. The comparative concentrations of Al, Sr, Zn and Rb in both mosses reflected the order of abundance of the metals in the soil. The enrichment factor of Pb was found to be higher in Pinnatella alopecuroides than in the other species, whereas the enrichment factor of Cr was higher in Pterobryopsis flexiceps than in Pinnatella alopecuroides. Thus they can be preferentially used as bioindicators for the respective elements. (author)

  9. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment grid workflow system routinely manages 250 to 500 thousand concurrently running production and analysis jobs to process simulation and detector data. In total, more than 300 PB of data is distributed over more than 150 sites in the WLCG. At this scale, small improvements in software and computing performance and workflows can lead to significant resource usage gains. ATLAS is reviewing, together with CERN IT experts, several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation will review and explain the performance gains of several ATLAS simulation and data processing workflows and present analytics studies of the ATLAS grid workflows.

  10. Distribution of trace elements in the coastal sea sediments of Maslinica Bay, Croatia

    Science.gov (United States)

    Mikulic, Nenad; Orescanin, Visnja; Elez, Loris; Pavicic, Ljiljana; Pezelj, Durdica; Lovrencic, Ivanka; Lulic, Stipe

    2008-02-01

    Spatial distributions of trace elements in the coastal sea sediments and water of Maslinica Bay (Southern Adriatic), Croatia, and possible changes in marine flora and foraminifera communities due to pollution were investigated. Macro-, micro- and trace-element distributions in five granulometric fractions were determined for each sediment sample. Bulk sediment samples were also subjected to leaching tests. Elemental concentrations in sediments, sediment extracts and seawater were measured by source-excited energy dispersive X-ray fluorescence (EDXRF). Concentrations of the elements Cr, Cu, Zn, and Pb in bulk sediment samples taken in Maslinica Bay were enriched from 2.1 to over six times when compared with the background level determined for coarse-grained carbonate sediments. The low degree of trace element leaching determined for bulk sediments pointed to strong bonding of trace elements to sediment mineral phases. The analyses of marine flora pointed to increased eutrophication, which disturbs the balance between communities and natural habitats.

  11. Evaluation of distribution patterns and decision of distribution coefficients of trace elements in high-purity aluminium by INAA

    International Nuclear Information System (INIS)

    Hayakawa, Yasuhiro; Suzuki, Shogo; Hirai, Shoji

    1986-01-01

    Recently, high-purity aluminium has been used in semiconductor devices and similar applications. It is required that trace impurities be reduced and that their content be quantitatively evaluated. In this study, the distribution patterns of many trace impurities in 99.999% aluminium ingots, purified using the normal freezing method, were evaluated by INAA. The effective distribution coefficient k for each detected element was calculated using the theoretical distribution equation of the normal freezing method; the element found with k > 1 was Hf. In particular, La, Sm, U and Th could be effectively purified, but Sc and Hf could scarcely be purified. Furthermore, it was found that slower freezing gave effective distribution coefficients closer to the equilibrium distribution coefficient, and that the effective distribution coefficient became smaller with larger atomic radius. (author)
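    For reference, the "theoretical distribution equation of the normal freezing method" mentioned above is usually taken to be the standard Pfann/Scheil normal-freezing relation; the abstract does not reproduce it, so the form quoted below is an assumption of this note.

    ```latex
    % Standard normal-freezing (Pfann/Scheil) relation assumed for such fits:
    % C_s(g) is the impurity concentration in the solid at frozen fraction g,
    % C_0 the initial melt concentration, k the effective distribution coefficient.
    \[
      C_s(g) \;=\; k\, C_0\, (1 - g)^{\,k - 1},
      \qquad
      k \;=\; \frac{C_\mathrm{solid}}{C_\mathrm{liquid}}
    \]
    % Impurities with k < 1 are rejected into the remaining melt (purification),
    % while k close to 1 gives essentially no segregation.
    ```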

  12. Distribution of trace and minor elements in Hungarian spice paprika plants

    Energy Technology Data Exchange (ETDEWEB)

    Sziklai, I L; Oerdoegh, M; Szabo, E; Molnar, E

    1988-06-01

    Detailed investigations were carried out to study the distribution of trace and minor elements in different parts (fruit, seed and rib, peduncle, stem, leaf, root) of ripe Hungarian spice paprika plants. Two varieties were analyzed for their Cl, Co, Fe, K, Mg, Mn, Na, Rb, Sc, V and Zn content by non-destructive neutron activation analysis. The results showed that the iron content of the samples was much higher than that of the other trace elements. For the trace elements Co, Fe, Mn, Sc, V and Zn a considerable enrichment was observed in the leaf, while Rb, K, Na and Mg showed accumulation mainly in the peduncle. (author) 8 refs.; 3 tabs.

  13. Distribution of particulate trace metals in the western Bay of Bengal

    Digital Repository Service at National Institute of Oceanography (India)

    Satyanarayana, D.; Murty, P.V.S.P.; Sarma, V.V.

    continuous increase from surface to bottom in the case of Fe and Ni, which appeared to be related to a combination of factors such as authigenic precipitation/scavenging, resuspension of metal-rich bottom sediments, and diffusion followed by precipitation at the sediment... ), most of these studies do not provide information on the interaction of trace elements with particulate matter. The present study deals with the distribution of particulate trace metals (Fe, Mn, Co, Ni, Cu, Pb, Zn and Cd) and their possible interactions...

  14. A Lagrangian View of Stratospheric Trace Gas Distributions

    Science.gov (United States)

    Schoeberl, M. R.; Sparling, L.; Dessler, A.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

    As a result of photochemistry, some relationship between the stratospheric age-of-air and the amount of tracer contained within an air sample is expected. The existence of such a relationship allows inferences about transport history to be made from observations of chemical tracers. This paper lays down the conceptual foundations for the relationship between age and tracer amount, developed within a Lagrangian framework. In general, the photochemical loss depends not only on the age of the parcel but also on its path. We show that, under the "average path approximation", path variations are less important than parcel age. The average path approximation then allows us to develop a formal relationship between the age spectrum and the tracer spectrum. Using the relation between the tracer and age spectra, tracer-tracer correlations can be interpreted as resulting from mixing which connects parts of the single-path photochemistry curve, which is formed purely from the action of photochemistry on an irreducible parcel. This geometric interpretation of mixing gives rise to constraints on trace gas correlations, and explains why some observations do not fall on rapid-mixing curves. This effect is seen in the ATMOS observations.

  15. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  16. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs.The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  17. Interpretation of aerosol trace metal particle size distributions

    International Nuclear Information System (INIS)

    Johansson, T.B.; Van Grieken, R.E.; Winchester, J.W.

    1974-01-01

    Proton-induced X-ray emission (PIXE) analysis is capable of rapid, routine, simultaneous determination of 10-15 elements present in amounts greater than or equal to 1 ng in aerosol size fractions as collected by single-orifice impactors over short periods of time. This enables detailed study of complex relationships between the elements detected. Since absolute elemental concentrations may be strongly influenced by meteorological and topographical conditions, it is useful to normalize to a reference element. Comparison between the concentration ratios within the aerosol and corresponding values for anticipated sources may lead to the identification of important sources for the elements. Further geochemical insights may be found through linear correlation coefficients, regression analysis, and cluster analysis. By calculating correlations for elemental pairs, an indication of the degree of covariance between the elements is obtained. Preliminary results indicate that correlations may be particle-size dependent. A high degree of covariance may be caused either by a common source or may only reflect the conservative nature of the aerosol. In a regression analysis, by plotting elemental pairs and estimating the regression coefficients, we may be able to conclude whether there is more than one source operating for a given element in a certain size range. Analysis of clustering of several elements, previously investigated for aerosol filter samples, can be applied to the analysis of aerosol size fractions. Careful statistical treatment of elemental concentrations as a function of aerosol particle size may thus yield significant information on the generation, transport and deposition of trace metals in the atmosphere.
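    The statistical treatment sketched above (elemental-pair correlations within a size fraction, followed by clustering of elements) might look as follows; the element names and all numbers are synthetic assumptions, not data from the paper.

    ```python
    # Sketch: per-size-fraction correlation of elemental concentrations and a
    # simple hierarchical clustering of elements. Data are synthetic assumptions.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(2)
    elements = ["S", "K", "Ca", "Fe", "Zn", "Pb"]
    n_samples = 40                      # impactor samples for one size fraction

    # Synthetic concentrations: Fe/Ca/K share a "soil" source, Zn/Pb a "traffic" one.
    soil = rng.lognormal(0, 0.4, n_samples)
    traffic = rng.lognormal(0, 0.4, n_samples)
    data = np.column_stack([
        rng.lognormal(0, 0.3, n_samples),            # S (independent)
        0.7 * soil + rng.normal(0, 0.1, n_samples),  # K
        soil + rng.normal(0, 0.1, n_samples),        # Ca
        soil + rng.normal(0, 0.1, n_samples),        # Fe
        traffic + rng.normal(0, 0.1, n_samples),     # Zn
        traffic + rng.normal(0, 0.1, n_samples),     # Pb
    ])

    corr = np.corrcoef(data, rowvar=False)           # element-element correlations
    # Cluster elements on correlation distance (1 - r), condensed upper triangle.
    z = linkage(1 - corr[np.triu_indices(len(elements), k=1)], method="average")
    labels = fcluster(z, t=2, criterion="maxclust")
    print(dict(zip(elements, labels)))
    ```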

  18. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert; Cho, Junsang; Fang, Charlie; Salihoglu, Semih; Torikai, Satoshi; Widom, Jennifer

    2012-01-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs

  19. The failure trace archive : enabling comparative analysis of failures in diverse distributed systems

    NARCIS (Netherlands)

    Kondo, D.; Javadi, B.; Iosup, A.; Epema, D.H.J.

    2010-01-01

    With the increasing functionality and complexity of distributed systems, resource failures are inevitable. While numerous models and algorithms for dealing with failures exist, the lack of public trace data sets and tools has prevented meaningful comparisons. To facilitate the design, validation,

  20. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    In this paper, we propose a notion of concurrency for declarative process models, formulated in the context of Dynamic Condition Response (DCR) graphs, and exploiting the so-called "true concurrency" semantics of Labelled Asynchronous Transition Systems. We demonstrate how this semantic underpinning of concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example; moreover, the theoretical...

  1. Evaluation of Circle Diameter by Distributed Tactile Information in Active Tracing

    Directory of Open Access Journals (Sweden)

    Hiroyuki Nakamoto

    2013-01-01

    Active touch with voluntary movement on the surface of an object is important for humans to obtain the local and detailed features of it. In addition, active touch is considered to enhance human spatial resolution. In order to improve the dexterity performance of multifinger robotic hands, it is necessary to study an active touch method for robotic hands. In this paper, we first define four requirements of a tactile sensor for active touch and design a distributed tactile sensor model which can measure a distribution of compressive deformation. Second, we suggest a measurement process with the sensor model and a synthesis method for the distributed deformations. In the experiments, a five-finger robotic hand with tactile sensors traces the surface of cylindrical objects and evaluates their diameters. We confirm that the hand can obtain more information about the diameters by tracing with its fingers.

  2. Smallest eigenvalue distribution of the fixed-trace Laguerre beta-ensemble

    International Nuclear Information System (INIS)

    Chen Yang; Liu Dangzheng; Zhou Dasheng

    2010-01-01

    In this paper we study the entanglement of the reduced density matrix of a bipartite quantum system in a random pure state. It transpires that this involves the computation of the smallest eigenvalue distribution of the fixed-trace Laguerre ensemble of N x N random matrices. We show that for finite N the smallest eigenvalue distribution may be expressed in terms of Jack polynomials. Furthermore, based on the exact results, we find a limiting distribution when the smallest eigenvalue is suitably scaled with N, followed by a large N limit. Our results turn out to be the same as the smallest eigenvalue distribution of the classical Laguerre ensembles without the fixed-trace constraint. This suggests that, in a broad sense, the global constraint does not influence local correlations, at least in the large N limit. Consequently, we have solved an open problem: the determination of the smallest eigenvalue distribution of the reduced density matrix, obtained by tracing out the environmental degrees of freedom, for a bipartite quantum system of unequal dimensions.
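    The claim that the fixed-trace constraint leaves the (suitably scaled) smallest-eigenvalue law essentially unchanged at large N can be checked numerically by brute force; the sketch below, with arbitrary small dimensions, is only an illustration and is not taken from the paper.

    ```python
    # Monte Carlo sketch: smallest eigenvalue of Wishart (Laguerre) matrices,
    # with and without a fixed-trace normalisation. Dimensions are arbitrary.
    import numpy as np

    rng = np.random.default_rng(3)
    N, M, n_trials = 8, 12, 20000   # bipartite dimensions, N <= M

    def smallest_eigs(fixed_trace):
        vals = np.empty(n_trials)
        for t in range(n_trials):
            G = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))
            W = G @ G.conj().T                 # Laguerre / Wishart matrix
            if fixed_trace:
                W = W / np.trace(W).real       # reduced density matrix: trace = 1
            vals[t] = np.linalg.eigvalsh(W).min()
        return vals

    free = smallest_eigs(fixed_trace=False)
    fixed = smallest_eigs(fixed_trace=True)

    # Compare shapes after rescaling each ensemble to unit mean smallest eigenvalue.
    print("free ensemble, scaled variance:      ", np.var(free / free.mean()))
    print("fixed-trace ensemble, scaled variance:", np.var(fixed / fixed.mean()))
    ```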

  3. Trace element distributions in aquatic sediments of Danang - Hoian area, Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Thuy, H.T.T.; Tobschall, H.J. [Erlangen-Nuernberg Univ., Erlangen (Germany). Inst. fuer Geologie und Mineralogie; An, P.V. [University of Mining and Geology, Hanoi (Viet Nam)

    2000-05-01

    The distribution of the trace elements Cr, Cu, Ni, Pb and Zn in surficial sediments of the river/sea environment in the Danang - Hoian area (Vietnam) was investigated to examine the degree of metal pollution caused by anthropogenic activities. Point sources from domestic and industrial wastes are identified as dominant contributors to trace element accumulation. Surficial sediments of the Hoian River show extremely high total concentrations of Cu (average concentration, AC, 295 μg/g), Ni (AC 112 μg/g), Pb (AC 396 μg/g) and Zn (AC 429 μg/g) that exceed the assigned ER-M safety levels. Similarly, the sediments of the Han River show high Pb (AC 188 μg/g) and Zn (AC 282 μg/g) contents. In marine sediments of Thanhbinh beach, Pb is also enriched (138 μg/g) above guideline levels. In contrast, the sediments of the Cude River are dominated by trace element concentrations close to background values. (orig.)

  4. WS-VLAM: A GT4 based workflow management system

    NARCIS (Netherlands)

    Wibisono, A.; Vasyunin, D.; Korkhov, V.; Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.

    2007-01-01

    Generic Grid middleware, e.g., Globus Toolkit 4 (GT4), provides basic services for scientific workflow management systems to discover, store and integrate workflow components. Using state-of-the-art Grid services can advance the functionality of a workflow engine in orchestrating distributed Grid

  5. Trace element composition and distribution in micron area of dinosaur eggshell fossils determined by proton microprobe

    International Nuclear Information System (INIS)

    Chen Youhong; Zhu Jieqing; Wang Xiaohong; Wang Yimin

    1997-01-01

    The scanning proton microprobe and micro-PIXE quantitative analysis technique have been used to determine composition and distribution of the trace elements in micron areas of dinosaur eggshell fossils from the stratum of Upper Cretaceous system at Nanxiong Basin in Guangdong Province, China. The study shows that the trace elements mainly include Ti, V, Cr, Mn, Co, Ni, Cu, Zn, As, Rb, Sr, Y, Zr, Sb, Ba and Pb in the micron area, but they present different distributions. While the element Sr is mainly enriched in the near surface layer, others mainly reside in the near inner layer. A preliminary discussion on the reason of the dinosaur extinction is given based on the above study

  7. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
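    The attribute-mapping-plus-filter style of logical provenance can be made concrete with a small backward-tracing sketch; the transformation names, mappings and records below are invented for illustration and do not reproduce the paper's specification language.

    ```python
    # Sketch of backward provenance tracing through a two-stage workflow whose
    # logical provenance is given as attribute mappings plus filters.
    # All transformation names, mappings and records are invented.
    specs = {
        "clean": {"maps": {"id": ["raw_id"], "value": ["raw_value"]},
                  "filter": lambda rec: rec["raw_value"] is not None},
        "aggregate": {"maps": {"id": ["id"], "total": ["value"]},
                      "filter": lambda rec: True},
    }
    workflow = ["clean", "aggregate"]          # clean -> aggregate

    def trace(output_attrs, inputs_by_stage):
        """Return, per stage, the input records that could have contributed."""
        attrs, contributing = set(output_attrs), {}
        for stage in reversed(workflow):
            spec = specs[stage]
            in_attrs = {a for out in attrs for a in spec["maps"].get(out, [])}
            records = [r for r in inputs_by_stage[stage] if spec["filter"](r)]
            contributing[stage] = records
            attrs = in_attrs                   # continue tracing one stage back
        return contributing

    inputs_by_stage = {
        "aggregate": [{"id": 1, "value": 3.0}, {"id": 2, "value": 5.0}],
        "clean": [{"raw_id": 1, "raw_value": 3.0}, {"raw_id": 3, "raw_value": None}],
    }
    print(trace(["total"], inputs_by_stage))
    ```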

  8. Distribution of trace species in power plant streams: A European perspective

    International Nuclear Information System (INIS)

    Meij, R.

    1994-01-01

    In the Netherlands, only pulverized coal-fired dry bottom boilers are installed. The flue gases are cleaned by high-efficiency cold-side electrostatic precipitators (ESPs) and, in all large coal-fired power plants, by flue-gas desulfurization (FGD) installations of the lime(stone)/gypsum process. KEMA has performed a large research program on the fate of (trace) elements at coal-fired power plants. A great deal of attention has been paid to the concentrations and distribution of trace elements in coal, in ash and in the vapor phase in the flue gases. Sixteen balance studies of coal-fired power plants, where coal imported from various countries is fired, have been performed. With the information provided by these studies, the enrichment factors for the trace elements in ash and the vaporization percentages of the minor and trace elements in the flue gases have been calculated. Using these enrichment factors and vaporization percentages combined with data on the concentrations in the coal, the concentrations in the ash and in the vapor phase in the flue gases can be predicted. The emission into the air of trace elements occurs in the solid state (fly ash) and in the gaseous state. The emissions in the solid state are low due to the high degree of removal in the ESPs. The emissions in the gaseous phase are, relatively speaking, more important. In an FGD, both emissions are further diminished. In the next section the behavior of elements in the boiler and ESP will be discussed. The influence of the electrostatic precipitators will be reviewed in the section thereafter, followed by the fate of gaseous minor and trace elements. Finally, the behavior of elements in the FGD will be treated in the last section.

  9. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings computer scientists' and engineers' skills together to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  10. A Multilevel Secure Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Sheth, Amit P; Kochut, Krys J; Miller, John A

    1999-01-01

    The Department of Defense (DoD) needs multilevel secure (MLS) workflow management systems to enable globally distributed users and applications to cooperate across classification levels to achieve mission critical goals...

  11. Direct observation of two dimensional trace gas distributions with an airborne Imaging DOAS instrument

    Directory of Open Access Journals (Sweden)

    K.-P. Heue

    2008-11-01

    In many investigations of tropospheric chemistry, information about the two-dimensional distribution of trace gases on a small scale (e.g. tens to hundreds of metres) is highly desirable. An airborne instrument based on imaging Differential Optical Absorption Spectroscopy has been built to map the two-dimensional distribution of a series of relevant trace gases including NO2, HCHO, C2H2O2, H2O, O4, SO2, and BrO on a scale of 100 m.

    Here we report on the first tests of the novel aircraft instrument over the industrialised South African Highveld, where large variations in NO2 column densities in the immediate vicinity of several sources, e.g. power plants or steel works, were measured. The observed patterns in the trace gas distribution are interpreted with respect to flux estimates, and it is seen that the fine resolution of the measurements allows separate sources in close proximity to one another to be distinguished.

  12. Distribution characteristics of available trace elements in soil from a reclaimed land in a mining area of north Shaanxi, China

    Directory of Open Access Journals (Sweden)

    Li Zhanbin

    2013-06-01

    Through field and laboratory tests we studied the temporal and spatial variation in the soil content of four available trace elements: copper (Cu), iron (Fe), manganese (Mn) and zinc (Zn), to analyze their distribution characteristics in reclaimed mining land under different reclamation conditions. The available trace element content varied considerably with different land reclamation patterns. Extended reclamation time was helpful for the recovery of the available trace element content in the soil, and after more than eight years of soil reclamation, the content of available trace elements was close to or greater than that in soil under natural conditions. Various treatment measures significantly influenced the content and distribution of available trace elements in the soil, and reasonable artificial treatments, including covering the soil and growing shrubs and herbaceous plants, increased the content of available trace elements.

  13. Preliminary investigation on determination of radionuclide distribution in field tracing test site

    International Nuclear Information System (INIS)

    Tanaka, Tadao; Mukai, Masayuki; Takebe, Shinichi; Guo Zede; Li Shushen; Kamiyama, Hideo.

    1993-12-01

    Field tracing tests of radionuclide migration have been conducted using 3H, 60Co, 85Sr and 134Cs in the natural unsaturated loess zone at the field test site of the China Institute for Radiation Protection. It is necessary to obtain reliable distribution data for the radionuclides in the test site in order to evaluate exactly the migration behavior of the radionuclides in situ. An available method to determine the distribution was proposed on the basis of a preliminary discussion of the soil sampling method at the test site and the method for analyzing radioactivity in the soils. (author)

  14. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies, such as TCP/IP-based Web technologies and CORBA, to implement the underlying communications. Very often this has been considered only from a theoretical point of view, mainly because of the lack of concrete possibilities for flexible execution. MAT (Mobile Agent Technology) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow system strategies. This paper mainly focuses on improving the performance of workflow systems by using MAT. First, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed; second, a performance comparison is presented by introducing the mathematical model of each kind of data interaction process. Last, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  15. Study of particle size and trace metal distribution in atmospheric aerosols of Islamabad

    International Nuclear Information System (INIS)

    Shah, M.H.; Shaheen, N.

    2009-01-01

    Atmospheric aerosol samples were collected on glass fibre filters using high-volume air samplers. Half of each aerosol sample was solubilized by a nitric acid/hydrochloric acid based wet digestion method, and the concentrations of trace metals were determined by flame atomic absorption spectrophotometry. Among the eight trace metals analyzed, the mean concentrations recorded for Zn (844 ng/m3), Fe (642 ng/m3) and Pb (253 ng/m3) were found to be higher than the mean levels of Mn, Cr and Co. The size distribution of the collected particulate samples was determined on a Mastersizer, which revealed PM100-10 as the major fraction (55%), followed by PM2.5-10 (28%). The correlation study evidenced a strong tendency of trace metals to be associated with the fine particulate fractions. The atmospheric trace metal levels showed that the mean metal concentrations in the atmosphere of Islamabad are far higher than those at background and European urban sites, mainly due to anthropogenic emissions. (author)

  16. Spatial Distribution of Trace Elements in Rice Field at Prafi District Manokwari

    Directory of Open Access Journals (Sweden)

    Aplena Elen S. Bless

    2016-08-01

    Mapping the spatial variability of trace elements in rice fields is necessary to obtain soil quality information to enhance rice production. This study aimed to measure the concentration and distribution of Zn, Cu, Fe, Pb, and Cd at two different sites (SP1, SP2) of the Prafi rice fields in Manokwari, West Papua. The 26 representative soil samples were analysed for their available trace metal concentrations (DTPA), soil pH, C-organic and soil texture. The results indicated that Fe toxicity and Zn deficiency problems were encountered at both sites. The rice field at SP2 was more deficient in Zn than that at SP1. The site with the highest trace element (Zn, Fe, Cu, and Cd) concentrations had low soil pH and high C-organic. Acidic soil has a higher solubility of metals, while high C-organic content could improve the formation of dissolved organic carbon-metal complexes, hence increasing the trace metal concentrations in the soil solution.

  17. Workflow in Almaraz NPP

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    2000-01-01

    Almaraz NPP decided to incorporate Workflow into its information system in response to the need to provide exhaustive follow-up and monitoring of each phase of the different procedures it manages. Oracle's Workflow was chosen for this purpose and it was integrated with previously developed applications. The objectives to be met in the incorporation of Workflow were as follows: Strict monitoring of procedures and processes. Detection of bottlenecks in the flow of information. Notification of those affected by pending tasks. Flexible allocation of tasks to user groups. Improved monitoring of management procedures. Improved communication. Similarly, special care was taken to: Integrate workflow processes with existing control panels. Synchronize workflow with installation procedures. Ensure that the system reflects use of paper forms. At present the Corrective Maintenance Request module is being operated using Workflow and the Work Orders and Notice of Order modules are about to follow suit. (Author)

  18. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  19. A virtual data language and system for scientific workflow management in data grid environments

    Science.gov (United States)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.

  20. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  1. The flux distribution from a 1.25m2 target aligned heliostat: comparison of ray tracing and experimental results

    CSIR Research Space (South Africa)

    Maliage, M

    2012-05-01

    The purpose of this paper is to validate SolTrace for concentrating solar investigations at CSIR by means of a test case: the comparison of the flux distribution in the focal spot of a 1.25 m2 target-aligned heliostat predicted by the ray tracing...

  2. Selected trace elements in the Sacramento River, California: Occurrence and distribution

    Science.gov (United States)

    Taylor, Howard E.; Antweiler, Ronald C.; Roth, David A.; Dileanis, Peter D.; Alpers, Charles N.

    2012-01-01

    The impact of trace elements from the Iron Mountain Superfund site on the Sacramento River and selected tributaries is examined. The concentration and distribution of many trace elements—including aluminum, arsenic, boron, barium, beryllium, bismuth, cadmium, cerium, cobalt, chromium, cesium, copper, dysprosium, erbium, europium, iron, gadolinium, holmium, potassium, lanthanum, lithium, lutetium, manganese, molybdenum, neodymium, nickel, lead, praseodymium, rubidium, rhenium, antimony, selenium, samarium, strontium, terbium, thallium, thulium, uranium, vanadium, tungsten, yttrium, ytterbium, zinc, and zirconium—were measured using a combination of inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry. Samples were collected using ultraclean techniques at selected sites in tributaries and the Sacramento River from below Shasta Dam to Freeport, California, at six separate time periods from mid-1996 to mid-1997. Trace-element concentrations in dissolved (ultrafiltered [0.005-μm pore size]) and colloidal material, isolated at each site from large volume samples, are reported. For example, dissolved Zn ranged from 900 μg/L at Spring Creek (Iron Mountain acid mine drainage into Keswick Reservoir) to 0.65 μg/L at the Freeport site on the Sacramento River. Zn associated with colloidal material ranged from 4.3 μg/L (colloid-equivalent concentration) in Spring Creek to 21.8 μg/L at the Colusa site on the Sacramento River. Virtually all of the trace elements exist in Spring Creek in the dissolved form. On entering Keswick Reservoir, the metals are at least partially converted by precipitation or adsorption to the particulate phase. Despite this observation, few of the elements are removed by settling; instead the majority is transported, associated with colloids, downriver, at least to the Bend Bridge site, which is 67 km from Keswick Dam. Most trace elements are strongly associated with the colloid phase going

  3. Spatial Distribution and Fuzzy Health Risk Assessment of Trace Elements in Surface Water from Honghu Lake.

    Science.gov (United States)

    Li, Fei; Qiu, Zhenzhen; Zhang, Jingdong; Liu, Chaoyang; Cai, Ying; Xiao, Minsi

    2017-09-04

    Previous studies revealed that Honghu Lake was polluted by trace elements due to anthropogenic activities. This study investigated the spatial distribution of trace elements in Honghu Lake, and identified the major pollutants and control areas based on a fuzzy health risk assessment at screening level. The mean total content of trace elements in surface water decreased in the order Zn (18.04 μg/L) > Pb (3.42 μg/L) > Cu (3.09 μg/L) > Cr (1.63 μg/L) > As (0.99 μg/L) > Cd (0.14 μg/L), within the limits of the Drinking Water Guidelines. The results of the fuzzy health risk assessment indicated that there was no obvious non-carcinogenic risk to human health, while carcinogenic risk was observed in the descending order As > Cr > Cd > Pb. As was regarded as having the highest carcinogenic risk among the selected trace elements because it generally accounted for 64% of the integrated carcinogenic risk. The potential carcinogenic risk of trace elements at each sampling site was approximately at the medium risk level (10⁻⁵ to 10⁻⁴). The areas in the south (S4, S13, and S16) and northeast (S8, S18, and S19) of Honghu Lake were regarded as the risk priority control areas. However, the corresponding maximum memberships of integrated carcinogenic risk at S1, S3, S10-S13, S15, and S18 were of relatively low credibility (50-60%), and may mislead decision-makers in identifying the risk priority areas. The results of the fuzzy assessment presented the subordinate grade and corresponding reliability of risk, and provided more complete results for decision-makers, which made up for the deficiency of a deterministic assessment to a certain extent.
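    A screening-level fuzzy grading of a single carcinogenic risk value, broadly of the kind used above, can be sketched with trapezoidal membership functions on the log-risk scale; the grade boundaries and the example value below are illustrative assumptions, not the study's actual membership functions.

    ```python
    # Sketch: fuzzy grading of a carcinogenic risk value with trapezoidal
    # membership functions on log10(risk). Grade boundaries are assumptions.
    import numpy as np

    def trapezoid(x, a, b, c, d):
        """Membership rising on [a, b], flat on [b, c], falling on [c, d]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    GRADES = {               # defined on log10(risk); illustrative breakpoints only
        "low":    (-np.inf, -np.inf, -6.0, -5.0),
        "medium": (-6.0, -5.0, -4.0, -3.0),
        "high":   (-4.0, -3.0, np.inf, np.inf),
    }

    def fuzzy_grade(risk):
        lx = np.log10(risk)
        memberships = {g: trapezoid(lx, *p) for g, p in GRADES.items()}
        best = max(memberships, key=memberships.get)
        return best, memberships[best], memberships

    grade, credibility, all_m = fuzzy_grade(3.2e-5)   # e.g. an As-dominated site
    print(grade, round(credibility, 2), {k: round(v, 2) for k, v in all_m.items()})
    ```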

  4. Concentrations and distributions of trace and minor elements in Chinese and Canadian coals and ashes

    International Nuclear Information System (INIS)

    Sun Jingxin; Jervis, R.E.

    1987-01-01

    A total of 35 trace and minor elements including some of environmental significance were determined in each of a selection of 15 Chinese and 6 Canadian thermal coals and their ashes by using the SLOWPOKE-2 nuclear reactor facility of the University of Toronto. The concentrations and distributions of these constituents among the coals and their combustion products (viz. ash and volatile matter) are presented. The detailed results showed wide variations in trace impurity concentrations (up to a factor of 100 and more) among the coals studied. Values for elemental enrichment factors (EF) relative to normal crustal abundances indicated that only As(EF=13), Br(5.7), I(16), S(230), Sb(11) and Se(320) were appreciably enriched in coal. (author) 14 refs.; 5 tabs
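
    The enrichment factors quoted above are conventionally computed by normalizing each element's concentration to a crustal reference element. A minimal sketch of that calculation is given below; the choice of Fe as reference and all concentration values are illustrative assumptions, not data from the paper.

```python
# Sketch: crustal enrichment factor EF(x) = (C_x/C_ref)_sample / (C_x/C_ref)_crust.
# The reference element (here Fe) and all concentrations are illustrative assumptions.

CRUST_PPM = {"Fe": 50_000.0, "As": 1.8, "Se": 0.05, "Sb": 0.2}   # assumed crustal abundances
COAL_PPM  = {"Fe": 10_000.0, "As": 4.7, "Se": 3.2,  "Sb": 0.44}  # hypothetical coal sample

def enrichment_factor(element: str, sample: dict, crust: dict, ref: str = "Fe") -> float:
    """EF relative to crustal abundance, normalized to a reference element."""
    return (sample[element] / sample[ref]) / (crust[element] / crust[ref])

for el in ("As", "Se", "Sb"):
    print(f"EF({el}) = {enrichment_factor(el, COAL_PPM, CRUST_PPM):.1f}")
```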

  5. Vertical distribution of particulate trace elements in a street canyon determined by PIXE analysis

    International Nuclear Information System (INIS)

    Raunemaa, T.; Hautojaervi, A.; Kaisla, K.; Gerlander, M.

    1981-01-01

    Suspended particles in a street canyon were investigated by collecting air particulate matter on thin filters at heights of 2.3 to 20.5 m. The weather parameters and traffic characteristics were registered during the collection. Quantitative analysis of 15 trace elements from Al to Pb was carried out by the PIXE method using 1.8-2.0 MeV protons. The concentration of lead was found to decrease exponentially when going from street level to roof level. Almost all the trace elements analyzed were found to fall into two groups with different vertical distributions. The matter collected above 10 m height was found to be due mainly to combustion-originated motor vehicle exhaust, and the matter below 10 m to soil-originated dust. (orig.)

  6. Distribution of trace elements between clays and zeolites and aqueous solutions similar to sea water

    International Nuclear Information System (INIS)

    Berger, G.

    1992-01-01

    The mechanisms of solid-solution partitioning during mineral crystallization in sea water have been investigated for Rb, Cs, Co, Sr, U, Th and lanthanides as trace elements, and Fe, Mg-chlorite/smectites and Na-zeolites as solid phases. These minerals have been synthesized by alteration at 40 °C in saline solutions of silicate glasses of appropriate compositions. The variation of the distribution coefficients (D) with the concentration of the elements as well as competition mechanisms between elements of analogous crystallochemical properties have been studied. The ''trapping'' of trace elements is shown to be governed by two mechanisms, according to D values or to water-rock ratios. At low values of D the incorporation of elements is controlled only by D, whereas at high values it is controlled by the number of available crystallochemical sites. (Author)

  7. Distribution, provenance and early diagenesis of major and trace metals in sediment cores from the Mandovi estuary, western India

    Digital Repository Service at National Institute of Oceanography (India)

    Prajith, A.; Rao, V.P.; Chakraborty, P.

    Major elements and trace metals were analyzed in four sediment cores recovered along a transect in the Mandovi estuary for their distribution, provenance and early diagenesis. The sediments were clayey silts in cores from the upper/lower estuary...

  8. Distribution and seasonal variation of trace metals in surface sediments of the Mandovi estuary, west coast of India

    Digital Repository Service at National Institute of Oceanography (India)

    Alagarsamy, R.

    The concentration and distribution of selected trace metals in surface sediments of the Mandovi estuary were studied to determine the extent of anthropogenic inputs from mining activities and to estimate the effects of monsoon on geochemical...

  9. A Strategy for an MLS Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Eppinger, Brian J; Moskowitz, Ira S

    1999-01-01

    .... Therefore, DoD needs MLS workflow management systems (WFMS) to enable globally distributed users and existing applications to cooperate across classification domains to achieve mission critical goals...

  10. Can liming change root anatomy, biomass allocation and trace element distribution among plant parts of Salix × smithiana in trace element-polluted soils?

    Science.gov (United States)

    Vondráčková, Stanislava; Tlustoš, Pavel; Száková, Jiřina

    2017-08-01

    Willows (Salix spp.) are considered to be effective for the phytoremediation of trace elements from contaminated soils, but their efficiency is limited in heavily polluted soils because of poor growth. Liming can be a desirable measure to decrease the plant availability of elements, resulting in improved plant development. Notably, large root area and maximum soil penetration are basic parameters that improve the efficiency of phytoremediation. The impact of soil chemical properties on willow root anatomy and the distribution of trace elements below-ground have rarely been studied. The effect of liming on root parameters, biomass allocation and trace element distribution in non-harvestable (coarse roots, fine roots, stumps) and harvestable plant parts (twigs and leaves) of Salix × smithiana was assessed at the end of a 4-year pot experiment with two trace element-polluted soils that differed in terms of soil pH. Stump biomass predominated in weakly acidic soil. In neutral soil, the majority of biomass was located in fine roots and stumps; the difference from other plant parts was minor. Trace elements were the most concentrated in fine roots. Translocation to above-ground biomass increased as follows: Pb roots roots). Lime application decreased the concentrations of mobile Cd and Zn and related levels in plants, improved biomass production and root parameters and increased the removal of all trace elements in weakly acidic soil. None or minimum differences in the monitored parameters were recorded for dolomite treatments in both soils. The dose and source of liming had crucial effects on root anatomy. Growing willows in limed trace element-polluted soils is a suitable measure for combination of two remediation strategies, i.e. phytoextraction of Cd and Zn and assisted phytostabilization of As and Pb.

  11. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  12. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
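
    As a rough illustration of the catalog-search step described above, the sketch below uses the owslib Python package to query an OGC CSW endpoint for sea-water-temperature records and list the data URLs it finds. The endpoint URL and search keyword are placeholders for illustration, not the exact ones used in the IOOS notebooks.

```python
# Minimal sketch of a CSW catalog search with owslib (assumed endpoint and keyword).
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical CSW endpoint; the IOOS notebooks point at their own catalog URL.
CSW_URL = "https://example.org/csw"

csw = CatalogueServiceWeb(CSW_URL, timeout=60)
query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
csw.getrecords2(constraints=[query], maxrecords=10, esn="full")

for rec_id, rec in csw.records.items():
    print(rec.title)
    for ref in rec.references:          # each reference holds a scheme and a URL
        print("   ", ref.get("scheme"), ref.get("url"))
```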

  13. Spatial distribution and potential sources of trace metals in insoluble particles of snow from Urumqi, China.

    Science.gov (United States)

    Li, Xiaolan; Jiang, Fengqing; Wang, Shaoping; Turdi, Muyesser; Zhang, Zhaoyong

    2015-01-01

    The purpose of this work is to characterize trace elements in snow along an urban-suburb gradient over Urumqi city, China. The spatial distribution patterns of 11 trace metals in insoluble particulate matter of snow were revealed using 102 snow samples collected in and around urban areas of Urumqi, a city suffering from severe wintertime air pollution in China. Similar spatial distributions were found for Mn, Cu, Zn, Ni, and Pb, with two significant high-value areas located in the west and east, respectively, and a high-value area in the south; these were correlated with factory emissions, traffic activities, and construction fugitive dust. The high-value areas of Cr, Ni, and V occurred in the northeast corner and along main traffic paths, which were linked to oil refinery and vehicular emissions. High values of Be appeared in the west of the city. The high-value area of Co in the northeast could be related to local soil. Cd and U displayed relatively even spatial patterns in the urban area. With respect to distance from the urban center, the contents of the metals other than Be, V, Cd, and U generally decreased from the first circular belt to the fourth circular belt, clearly implying the effect of human activity. Additionally, prevailing northwesterly winds and occasional southeasterly winds in winter were associated with generally decreasing trace-metal concentrations in snow from the urban center to the southern suburb along a northwest-southeast transect. The information on concentrations and spatial distributions of these metals in insoluble particles of snow in winter will be valuable for further environmental protection and planning.

  14. Speciation and Distribution of Trace Metals and Organic Matter in Marine Lake as In Situ Laboratory

    Science.gov (United States)

    Mlakar, M.; Fiket, Ž.; Cuculić, V.; Cukrov, N.; Geček, S.

    2016-02-01

    Marine lakes are unique, isolated marine systems, also recognized as in situ "laboratories" in which geochemical processes on a different scale compared to the open sea, can be observed. Impact of organic matter cycling on distribution of trace metals in the marine lake Mir, located on Dugi Otok Island, in the central part of the eastern Adriatic Sea, was investigated. Intense spatial and seasonal variations of physico-chemical parameters and organic matter concentrations in the water column of the Lake are governed predominantly by natural processes. Enhanced oxygen consumption in the Lake during summer season, high organic carbon concentrations and low redox potential result in occasional occurrence of anoxic conditions in the bottom layers. Speciation modelling showed that dissolved trace metals Cu, Pb and Zn, are mostly bound to organic matter, while Cd, Co and Ni are present predominantly as free ions and inorganic complexes. Trace metals removal from the water column and their retention in the sediment was found to depend on the nature of the relationship between specific metal and high proportion of organic matter (up to 9%) and inorganic phases, Fe-oxyhydroxydes or biogenic calcite. Surrounding karstic background, with occasional occurrences of red soil characterize deposited sediments as coarse grained and carbonate rich, whose elemental composition is affected by bathymetry of the basin and overall biological production.

  15. Spatial distribution of the trace elements zinc, strontium and lead in human bone tissue.

    Science.gov (United States)

    Pemmer, B; Roschger, A; Wastl, A; Hofstaetter, J G; Wobrauschek, P; Simon, R; Thaler, H W; Roschger, P; Klaushofer, K; Streli, C

    2013-11-01

    Trace elements are chemical elements in minute quantities, which are known to accumulate in the bone. Cortical and trabecular bones consist of bone structural units (BSUs) such as osteons and bone packets of different mineral content and are separated by cement lines. Previous studies investigating trace elements in bone lacked resolution and therefore very little is known about the local concentration of zinc (Zn), strontium (Sr) and lead (Pb) in BSUs of human bone. We used synchrotron radiation induced micro X-ray fluorescence analysis (SR μ-XRF) in combination with quantitative backscattered electron imaging (qBEI) to determine the distribution and accumulation of Zn, Sr, and Pb in human bone tissue. Fourteen human bone samples (10 femoral necks and 4 femoral heads) from individuals with osteoporotic femoral neck fractures as well as from healthy individuals were analyzed. Fluorescence intensity maps were matched with BE images and correlated with calcium (Ca) content. We found that Zn and Pb had significantly increased levels in the cement lines of all samples compared to the surrounding mineralized bone matrix. Pb and Sr levels were found to be correlated with the degree of mineralization. Interestingly, Zn intensities had no correlation with Ca levels. We have shown for the first time that there is a differential accumulation of the trace elements Zn, Pb and Sr in BSUs of human bone indicating different mechanisms of accumulation. © 2013. Published by Elsevier Inc. All rights reserved.

  16. Spatial distribution of the trace elements zinc, strontium and lead in human bone tissue☆

    Science.gov (United States)

    Pemmer, B.; Roschger, A.; Wastl, A.; Hofstaetter, J.G.; Wobrauschek, P.; Simon, R.; Thaler, H.W.; Roschger, P.; Klaushofer, K.; Streli, C.

    2013-01-01

    Trace elements are chemical elements in minute quantities, which are known to accumulate in the bone. Cortical and trabecular bones consist of bone structural units (BSUs) such as osteons and bone packets of different mineral content and are separated by cement lines. Previous studies investigating trace elements in bone lacked resolution and therefore very little is known about the local concentration of zinc (Zn), strontium (Sr) and lead (Pb) in BSUs of human bone. We used synchrotron radiation induced micro X-ray fluorescence analysis (SR μ-XRF) in combination with quantitative backscattered electron imaging (qBEI) to determine the distribution and accumulation of Zn, Sr, and Pb in human bone tissue. Fourteen human bone samples (10 femoral necks and 4 femoral heads) from individuals with osteoporotic femoral neck fractures as well as from healthy individuals were analyzed. Fluorescence intensity maps were matched with BE images and correlated with calcium (Ca) content. We found that Zn and Pb had significantly increased levels in the cement lines of all samples compared to the surrounding mineralized bone matrix. Pb and Sr levels were found to be correlated with the degree of mineralization. Interestingly, Zn intensities had no correlation with Ca levels. We have shown for the first time that there is a differential accumulation of the trace elements Zn, Pb and Sr in BSUs of human bone indicating different mechanisms of accumulation. PMID:23932972

  17. Microwave transport in EBT distribution manifolds using Monte Carlo ray-tracing techniques

    International Nuclear Information System (INIS)

    Lillie, R.A.; White, T.L.; Gabriel, T.A.; Alsmiller, R.G. Jr.

    1983-01-01

    Ray-tracing Monte Carlo calculations have been carried out using an existing Monte Carlo radiation transport code to obtain estimates of the microwave power exiting the torus coupling links in EBT microwave manifolds. The microwave power loss and polarization at surface reflections were accounted for by treating the microwaves as plane waves reflecting off plane surfaces. Agreement on the order of 10% was obtained between the measured and calculated output power distribution for an existing EBT-S toroidal manifold. A cost-effective iterative procedure utilizing the Monte Carlo history data was implemented to predict design changes which could produce increased manifold efficiency and improved output power uniformity
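
    A minimal sketch of the kind of lossy specular ray tracing described here is given below. The box geometry, reflectivity, and exit-port definition are invented for illustration and bear no relation to the actual manifold model or transport code.

```python
# Toy Monte Carlo ray trace: rays bounce specularly inside a 2-D rectangular cavity,
# losing a fixed fraction of power per wall reflection, until they hit an exit port.
# Geometry, reflectivity and port location are illustrative assumptions only.
import math, random

WIDTH, HEIGHT = 4.0, 1.0        # cavity size (arbitrary units)
REFLECTIVITY = 0.95             # power retained per wall bounce (assumed)
PORT = (3.5, 4.0)               # exit port on the top wall, x in [3.5, 4.0]

def trace_one(max_bounces: int = 200) -> float:
    x, y = 0.1, 0.5                           # launch point
    theta = random.uniform(-math.pi / 2, math.pi / 2)
    dx, dy = math.cos(theta), math.sin(theta)
    power = 1.0
    for _ in range(max_bounces):
        # distance to the nearest wall along (dx, dy)
        tx = ((WIDTH if dx > 0 else 0.0) - x) / dx if dx else math.inf
        ty = ((HEIGHT if dy > 0 else 0.0) - y) / dy if dy else math.inf
        t = min(tx, ty)
        x, y = x + t * dx, y + t * dy
        if ty <= tx and dy > 0 and PORT[0] <= x <= PORT[1]:
            return power                      # escaped through the port
        if tx < ty:
            dx = -dx                          # specular reflection off a side wall
        else:
            dy = -dy                          # specular reflection off top/bottom
        power *= REFLECTIVITY                 # wall absorption
    return 0.0                                # treated as lost

histories = 100_000
exiting = sum(trace_one() for _ in range(histories))
print(f"fraction of launched power reaching the port: {exiting / histories:.3f}")
```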

  18. Distribution of trace elements in whole blood of Syrian lymphomas patients using instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Bakir, M. A.; Serhil, A.; Mohammad, A.; Habil, K.

    2013-12-01

    In recent years, there has been much interest in the concentrations of trace metals occurring in human and animal tissues and in the manner in which these concentrations may alter in malignant and other diseases. Neutron activation analysis is considered one of several methods that have been described for the determination of trace elements in biological materials. This method possesses the sensitivity and specificity necessary for estimation at the concentrations existing naturally in most tissues, particularly when only small samples are available for analysis. The purpose of this study was to compare blood concentrations of the trace elements Co, Cr, Fe, Rb, Sc, Se, Th, and Zn of Syrian lymphoma patients with those of healthy volunteers, and to determine the relationships between trace element concentrations and the histological type of lymphoma. The blood samples were collected from 39 healthy volunteers and 49 patients with histologically confirmed lymphomas (29 Hodgkin's lymphomas, HL, and 20 non-Hodgkin's lymphomas, NHL), and analyzed to obtain the concentration of the trace elements in blood. Then, a comparison between the healthy volunteers and lymphoma patients (both HL and NHL) was made to elucidate differences in the concentration distributions of the elements in blood. Statistical analysis using Student's t-test revealed significantly elevated concentrations of Co, Cr, Sc, and Th in lymphoma patients, whereas Fe and Rb were significantly decreased in lymphoma patients compared to the control group. Changes in the concentrations of Se and Zn in lymphoma patients were not significant. Comparison between the healthy volunteers and non-Hodgkin's lymphoma patients revealed that Co, Cr, Sc, and Th were significantly elevated, whereas Rb was the only trace element that was decreased, and the changes in concentrations of Se and Zn were not significant. Comparison between the healthy volunteers and Hodgkin

  19. Fraction-specific controls on the trace element distribution in iron formations : Implications for trace metal stable isotope proxies

    NARCIS (Netherlands)

    Oonk, Paul B.H.; Tsikos, Harilaos; Mason, Paul R.D.; Henkel, Susann; Staubwasser, Michael; Fryer, Lindi; Poulton, Simon W.; Williams, Helen M.

    2017-01-01

    Iron formations (IFs) are important geochemical repositories that provide constraints on atmospheric and ocean chemistry, prior to and during the onset of the Great Oxidation Event. Trace metal abundances and their Mo-Cr-U isotopic ratios have been widely used for investigating ocean redox processes

  20. Washability and Distribution Behaviors of Trace Elements of a High-Sulfur Coal, SW Guizhou, China

    Directory of Open Access Journals (Sweden)

    Wei Cheng

    2018-02-01

    Full Text Available The float-sink test is a commonly used technology for the study of coal washability, which determines the optimal separation density for coal washing based on the desired sulfur and ash yield of the cleaned coal. In this study, the float-sink test is adopted for a high-sulfur Late Permian coal from Hongfa coalmine (No.26, southwestern Guizhou, China), to investigate its washability, and to analyze the organic affinities and distribution behaviors of some toxic and valuable trace elements. Results show that the coal is difficult to separate in terms of desulfurization. A cleaned coal could theoretically be obtained with a yield of 75.50%, sulfur 2.50%, and ash yield 11.33% when the separation density is 1.57 g/cm3. Trace elements' distribution behaviors during the gravity separation were evaluated by correlation analysis and calculation. It was found that Cs, Ga, Ta, Th, Rb, Sb, Nb, Hf, Ba, Pb, In, Cu, and Zr are of significant inorganic affinity, while Sn, Co, Re, U, Mo, V, Cr, Ni, and Be are of relatively strong organic affinity. LREE (light rare earth elements), however, seem to have weaker organic affinity than HREE (heavy rare earth elements), which can probably be attributed to lanthanide contraction. When the separation density is 1.60 g/cm3, a large proportion of Sn, Be, Cr, U, V, Mo, Ni, Cd, Pb, and Cu migrates to the cleaned coal, but most of the Mn, Sb and Th stays in the gangue. Coal preparation thus provides an alternative route for either toxic-element removal or valuable-element preconcentration, in addition to desulfurization and deashing. The enrichment of trace elements in the cleaned coal depends on the predetermined separation density, which will influence the yields and ash yields of the cleaned coal.

  1. Distribution of trace elements in the brain of EL (epilepsy) mice.

    Science.gov (United States)

    Hirate, Maki; Takeda, Atsushi; Tamano, Haruna; Enomoto, Shuichi; Oku, Naoto

    2002-09-01

    The association of essential trace elements with epileptic seizures is poorly understood. On the basis of the evidence that the release of zinc from the brain of epilepsy (EL) mice, an animal model of genetically determined epilepsy, is enhanced by the induction of seizures and that alteration of zinc homeostasis is responsive to susceptibility to seizures, the distribution of trace elements in the brain was studied using EL mice and ddY mice, which form the genetic background for the inbred EL mice. The multitracer technique was applied to determine the distribution of trace elements. Twenty-four hours after intravenous injection of the multitracer, the concentration of 65Zn and 56Co in the brain of untreated EL mice was higher than in ddY mice, while the concentration of 65Zn and 56Co in the brain was decreased in seized EL mice. The 75Se concentration in the hippocampus, cerebral cortex and cerebellum of untreated EL mice was lower than in ddY mice, while the 75Se concentration in the hippocampus was increased in seized EL mice. The concentration of 83Rb, an element in the same homologous series as potassium, in the hippocampus and cerebral cortex of untreated EL mice was lower than in ddY mice, and the 83Rb concentration in the cerebral cortex was decreased in seized EL mice. The movement of zinc, cobalt and selenium in the brain may be altered by enhancement of susceptibility to seizures. These results suggest that alteration of the homeostasis of zinc, cobalt and selenium in the brain may be involved in the susceptibility, development or termination of seizures in EL mice. Copyright 2002 Elsevier Science B.V.

  2. Integrating prediction, provenance, and optimization into high energy workflows

    Energy Technology Data Exchange (ETDEWEB)

    Schram, M.; Bansal, V.; Friese, R. D.; Tallent, N. R.; Yin, J.; Barker, K. J.; Stephan, E.; Halappanavar, M.; Kerbyson, D. J.

    2017-10-01

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
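
    As an illustration of the optimization-based scheduling idea sketched in this abstract, the snippet below greedily assigns tasks to the resource with the earliest predicted completion time, using a toy performance model. The task runtimes and resource speeds are hypothetical, and the real framework's scheduler is certainly more sophisticated than this.

```python
# Toy earliest-completion-time scheduler: a stand-in for the optimization-based
# assignment described above. Predicted runtimes and resource speeds are invented.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    speed: float                      # relative throughput (assumed)
    busy_until: float = 0.0
    assigned: list = field(default_factory=list)

def schedule(tasks: dict, resources: list) -> None:
    """Assign each task to the resource that finishes it earliest (greedy)."""
    # Longest predicted tasks first, a common greedy heuristic.
    for task, base_runtime in sorted(tasks.items(), key=lambda kv: -kv[1]):
        best = min(resources, key=lambda r: r.busy_until + base_runtime / r.speed)
        best.busy_until += base_runtime / best.speed
        best.assigned.append(task)

tasks = {"simulate_A": 120.0, "simulate_B": 90.0, "reconstruct": 60.0, "ntuple": 15.0}
resources = [Resource("site1", speed=1.0), Resource("site2", speed=2.0)]
schedule(tasks, resources)
for r in resources:
    print(r.name, r.assigned, f"makespan={r.busy_until:.1f}")
```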

  3. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on “incident patterns” with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
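
    The paper's pattern algebra is not reproduced here, but the sketch below hints at what direct querying of a workflow log can look like: a tiny matcher that scans event records for a "followed-by" relation between two task names. The event schema and the single operator are invented for illustration and are not the operators defined in the paper.

```python
# Illustrative only: a "followed-by" query over a flat workflow log.
# The event schema (case_id, task, timestamp) and the operator are assumptions.
from collections import defaultdict

log = [  # (case_id, task, timestamp)
    ("c1", "receive_order", 1), ("c1", "check_credit", 2), ("c1", "ship", 5),
    ("c2", "receive_order", 1), ("c2", "cancel", 3),
]

def followed_by(log, first, then):
    """Return case ids in which `first` occurs and is later followed by `then`."""
    by_case = defaultdict(list)
    for case, task, ts in sorted(log, key=lambda e: e[2]):
        by_case[case].append(task)
    hits = []
    for case, tasks in by_case.items():
        if first in tasks and then in tasks[tasks.index(first) + 1:]:
            hits.append(case)
    return hits

print(followed_by(log, "receive_order", "ship"))   # -> ['c1']
```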

  4. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
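
    As a loose illustration of how a ProvONE-style provenance graph stored in Neo4j could be queried from Python, the sketch below uses the official neo4j driver with an invented node and relationship naming scheme; the labels, property names, connection URI, and credentials are all assumptions rather than PBase's actual schema or API.

```python
# Hypothetical query against a ProvONE-like provenance graph stored in Neo4j.
# Labels ("Workflow", "Execution", "Data") and relationship names are invented;
# adjust them to the actual schema of the repository you are using.
from neo4j import GraphDatabase

URI, AUTH = "bolt://localhost:7687", ("neo4j", "password")   # assumed connection details

CYPHER = """
MATCH (w:Workflow {title: $title})<-[:EXECUTION_OF]-(e:Execution)-[:USED]->(d:Data)
RETURN e.id AS execution, collect(d.uri) AS inputs
"""

def executions_and_inputs(title: str):
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            return [(rec["execution"], rec["inputs"])
                    for rec in session.run(CYPHER, title=title)]

for execution, inputs in executions_and_inputs("ocean_skill_assessment"):
    print(execution, inputs)
```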

  5. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  6. Foundations for Survivable System Development: Service Traces, Intrusion Traces, and Evaluation Models

    National Research Council Canada - National Science Library

    Linger, Richard

    2001-01-01

    .... On the system side, survivability specifications can be defined by essential-service traces that map essential-service workflows, derived from user requirements, into system component dependencies...

  7. Distribution of some artificial and natural radionuclides and trace elements in Syrian soils

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Shaik Khalil, H.; Amin, Y.; Ibrahim, S.; Hassan, M.

    2004-07-01

    Within the environmental monitoring program in Syria, about 115 surface soil and 38 profile soil samples were collected and analyzed during the period 1998 to 2003 in order to determine the levels of natural and artificial radionuclides and some trace elements (Cu, Zn, Cd, Pb). The concentrations of the natural radionuclides in the surface samples were found to vary from area to area and ranged from 2-50 Bq/kg, 4-228 Bq/kg, 4-55 Bq/kg, 1-143 Bq/kg and 96-672 Bq/kg for 224 Ra, 226 Ra, 228 Ra, 137 Cs and 40 K, respectively. The concentrations of the studied trace elements varied between 0.5-5.6 mg/kg for U, 3.2-31.7 mg/kg for Pb, 14-141 mg/kg for Zn, 1.6-114 mg/kg for Cu and 0.25-2.7 mg/kg for Cd. Most of the values reported in this study were in the range of natural uncontaminated surface soil concentrations and of values published for many countries in the world. The results showed that the relation between the distribution of the natural radionuclides and depth was approximately the same for all radionuclides except 137 Cs, which was strongly bound in the upper layers of soil. In addition, some differences in the concentrations of the studied trace elements with depth were observed. These differences may be due to differences in average rainfall and the existence of some potential sources of contamination by such elements. However, the results of this study can be considered as a database for the natural background in Syria that helps to establish the radiation map of the country. (author)

  8. Distribution of trace elements in land plants and botanical taxonomy with special reference to rare earth elements and actinium

    International Nuclear Information System (INIS)

    Koyama, Mutsuo

    1989-01-01

    Distribution profiles of trace elements in land plants were studied by neutron activation analysis and radioactivity measurements without activation. More than three thousand botanical samples, covering more than three hundred botanical species, were analyzed. New accumulator plants of Co, Cr, Zn, Cd, rare earth elements, Ac, U, etc., were found. The capability of accumulating trace elements can be related to botanical taxonomy. Discussion is given from the viewpoints of inorganic chemistry as well as botanical physiology.

  9. Proceedings of the scientific meeting on 'behavior and distributions of trace substances in the environment'

    International Nuclear Information System (INIS)

    Fukui, M.; Matsuzuru, H.

    1998-02-01

    The scientific meeting was held at the Research Reactor Institute, Kyoto University, on December 11-12, 1997. This single report covers all aspects concerning the association of trace substances such as pesticides/herbicides, organic chemicals and radionuclides in the environment. The purpose of the meeting was to describe the distribution and behavior of trace substances, with emphasis directed towards the dynamic interaction between the soil-sediment-water system and the contaminants. The Chernobyl accident raised attention on the fate of radionuclides released into the environment and stimulated many scientists, who carry out large-scale 'field experiments' without using a tracer in a laboratory. Of course, fundamental laboratory studies are necessary to give direction to and to understand observations from field studies. These activities have brought a great deal of knowledge and understanding towards revealing a part of the complexity of the transport processes. It is hoped that the assembled experts will not only dwell on distinct scientific issues, but also be able to draw firm conclusions with respect to the effective environmental management of the ecological aspects of hazardous materials. The 25 presented papers are indexed individually. (J.P.N.)

  10. Distribution of siderophile and other trace elements in melt rock at the Chicxulub impact structure

    Science.gov (United States)

    Schuraytz, B. C.; Lindstrom, D. J.; Martinez, R. R.; Sharpton, V. L.; Marin, L. E.

    1994-01-01

    Recent isotopic and mineralogical studies have demonstrated a temporal and chemical link between the Chicxulub multiring impact basin and ejecta at the Cretaceous-Tertiary boundary. A fundamental problem yet to be resolved, however, is identification of the projectile responsible for this cataclysmic event. Drill core samples of impact melt rock from the Chicxulub structure show Ir and Os abundances and Re-Os isotopic ratios indicating the presence of up to approx. 3 percent meteoritic material. We have used a technique involving microdrilling and high-sensitivity instrumental neutron activation analysis (INAA) in conjunction with electron microprobe analysis to further characterize the distribution of siderophile and other trace elements among phases within the C1-N10 melt rock.

  11. Geochemical distribution of trace metals and organochlorine contaminants of a lake ontario shoreline marsh

    Energy Technology Data Exchange (ETDEWEB)

    Glooschenko, W A; Capocianco, J; Coburn, J; Glooschenko, V

    1981-02-01

    Rattray Marsh, an 8 ha marsh on the Lake Ontario shoreline at Mississauga, Ontario, is an important local habitat for waterfowl and shorebirds during spring and fall migration. A study was conducted to determine the distribution of nutrients (carbon, nitrogen, and phosphorus) and potential trace metal and organochlorine pollutants in the marsh as evidenced by the sedimentary concentrations of these compounds. Generally, copper, zinc, lead, and mercury were higher in concentration in local soils than in Lake Ontario sediments. Metals and organic carbon levels did not correlate, and the metals appeared to be associated with silts and clays. Organochlorine contaminants include p,p′-DDE, p,p′-DDD, p,p′-DDT, alpha-chlordane, PCB, mirex, and HCB.

  12. Distribution of uranium and some selected trace metals in human scalp hair from Balkans.

    Science.gov (United States)

    Zunic, Z S; Tokonami, S; Mishra, S; Arae, H; Kritsananuwat, R; Sahoo, S K

    2012-11-01

    The possible consequences for the people and the environment of the depleted uranium (DU) used in the Balkan conflicts of 1995 and 1999 need attention. The heavy metal content of human hair may serve as a good indicator of dietary, environmental and occupational exposures to metal compounds. The present work summarises the distribution of uranium and some selected trace metals such as Mn, Ni, Cu, Zn, Sr, Cd and Cs in the scalp hair of inhabitants of the Balkans exposed to DU directly and indirectly, i.e. in the Han Pijesak, Bratoselce and Gornja Stubla areas. Except for U and Cs, all other metals were compared with worldwide reported values for occupationally unexposed persons. Uranium concentrations show a wide variation, ranging from 0.9 ± 0.05 to 449 ± 12 µg kg(-1). Although the hair samples were collected from Balkan conflict zones, uranium isotopic measurement ((235)U/(238)U) shows a natural origin rather than DU.

  13. Major and trace element distribution in soil and sediments from the Egyptian central Nile Valley

    Science.gov (United States)

    Badawy, W. M.; Ghanim, E. H.; Duliu, O. G.; El Samman, H.; Frontasyeva, M. V.

    2017-07-01

    The distributions of 32 major and trace elements in 72 surface soil and sediment samples collected from the Asyut to Cairo section of the Nile were determined by epithermal neutron activation analysis and compared with corresponding data for the Upper Continental Crust, North American Shale Composite, Average Soil and Average Sediment, as well as suspended sediments from the Congo and Upper Niger Rivers, in order to establish to what extent the Nile sedimentary material can be related to similar material all over the world as well as to local geology. Their relative distributions indicate the presence of detrital material of igneous origin, most probably resulting from weathering of the Ethiopian Highlands and transported by the Blue Nile, the Nile's main tributary. The distributions of nickel, zinc, and arsenic suggest that the lower part of the Nile and its surroundings, including the Nile Delta, are not seriously polluted with heavy metals, so that, in spite of human activity lasting four millennia, the Nile River remains little affected by anthropogenic contamination.

  14. Content and distribution of trace metals in pristine permafrost environments of Northeastern Siberia, Russia

    Science.gov (United States)

    Antcibor, I.; Eschenbach, A.; Kutzbach, L.; Bolshiyanov, D.; Pfeiffer, E.-M.

    2012-04-01

    Arctic regions are one of the most sensitive areas with respect to climatic changes and human impacts. Research is required to discover how the function of permafrost soils as a buffering system for metal pollutants could change in response to the predicted changes. The goal of this work is to determine the background levels of trace metals in the pristine arctic ecosystems of the Lena River Delta in Northeastern Siberia and to evaluate the possible effect of human impacts on this arctic region. The Lena River Delta represents areas with different dominating geomorphologic processes that can generally be divided between accumulation and erosion sites. Frequent changes of the river water level create different periods of sedimentation and result in the formation of stratified soils and sediment layers which are dominated either by mineral substrates with allochthonous organic matter or pure autochthonous peat. The deposited sediments that have formed the delta islands are mostly composed of sand fractions; therefore the buffering effects of clay materials can be neglected. Samoylov Island is representative of the south-central and eastern modern delta surfaces of the Lena River Delta and is selected as a pilot study site. We determined total element contents of Fe, Mn, Zn, Cd, Ni, Cu, As, Pb, Co and Hg in soil horizons from different polygonal elevated rims, polygonal depressed centers and the middle floodplain. High gravimetric concentrations (related to dry mass of soil material) of Mn and Fe are found within all soil profiles and vary from 0.14 to 1.39 g kg-1 and from 10.7 to 41.2 g kg-1, respectively. While the trace element concentrations do not exceed typical crustal abundances, the maximum values of most of the metals are observed within the soil profile situated at the middle floodplain. This finding suggests that apart from the parent material the second potential source of trace metals is due to allochthonous substance input during annual flooding of the

  15. A STUDY OF LEAKAGE OF TRACE METALS FROM CORROSION OF THE MUNICIPAL DRINKING WATER DISTRIBUTION SYSTEM

    Directory of Open Access Journals (Sweden)

    M.R SHA MANSOURI

    2003-09-01

    Full Text Available Introduction: A high proportion of the lead and copper concentration in municipal drinking water is related to the metallic components of the distribution system and faucets. Corrosive water in pipes and faucets causes dissolution of metals such as Pb, Cu, Cd, Zn, Fe and Mn into the water. Due to the lack of research work in this area, a study of trace metals was performed in the drinking water distribution systems of Zarin Shahr and Mobareke in Isfahan province. Methods: Based on the United States Environmental Protection Agency (USEPA) guidelines for cities with populations over 50,000, such as Zarin Shahr and Mobareke, 30 water samples were collected from home faucets with a minimum 6-hour retention time of water in the pipes. Lead and cadmium concentrations were determined using flameless atomic absorption; copper, zinc, iron and manganese were determined using atomic absorption. Results: The average concentrations of Pb, Cd, Zn, Fe and Mn in the water distribution system of Zarin Shahr were 5.7, 0.1, 80, 3042, 23065 and in Mobareke were 7.83, 0.8, 210, 3100, 253, 17 µg respectively. The concentrations of Pb, Cd and Zn were zero in water samples from the beginning of the municipal drinking water distribution system in both cities. Conclusion: The study showed that the corrosion by-products (such as Pb, Cd and Zn) were the result of dissolution of the galvanized pipes and brass faucets. The lead concentration in over 10 percent of the water samples in Zarin Shahr exceeded the drinking water standard level, which emphasizes the need for evaluation and control of corrosion in drinking water distribution systems.

  16. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    International Nuclear Information System (INIS)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A.

    1999-01-01

    The evaluation of the distribution of trace elements in water, sediment, soil and cassava plants in the Muria peninsula by the NAA method was carried out. A nuclear power plant (NPP) and a coal power plant (CPP) will be built on the Muria peninsula, so the Muria peninsula is an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The analytical results for the trace elements were used as contributed data for environmental monitoring before and after the NPP is built. The trace elements in the river-water, sediment, dryland-soil and cassava plant samples were analyzed by the INAA method. It was found that the trace elements were not evenly distributed. The percentages of the trace element distribution in river water, sediment, dryland soil and cassava leaves were 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  17. Evaluation of trace elements distribution in water, sediment, soil and cassava plant in Muria peninsula environment by NAA method

    Energy Technology Data Exchange (ETDEWEB)

    Muryono, H.; Sumining; Agus Taftazani; Kris Tri Basuki; Sukarman, A. [Yogyakarta Nuclear Research Center, Yogyakarta (Indonesia)

    1999-10-01

    The evaluation of the distribution of trace elements in water, sediment, soil and cassava plants in the Muria peninsula by the NAA method was carried out. A nuclear power plant (NPP) and a coal power plant (CPP) will be built on the Muria peninsula, so the Muria peninsula is an important site for sample collection and environmental monitoring. River water, sediment, dryland soil and cassava plants were chosen as specimen samples from the Muria peninsula environment. The analytical results for the trace elements were used as contributed data for environmental monitoring before and after the NPP is built. The trace elements in the river-water, sediment, dryland-soil and cassava plant samples were analyzed by the INAA method. It was found that the trace elements were not evenly distributed. The percentages of the trace element distribution in river water, sediment, dryland soil and cassava leaves were 0.00026-0.037% in water samples, 0.49-62.7% in sediment samples, 36.29-99.35% in soil samples and 0.21-99.35% in cassava leaves. (author)

  18. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  19. Trace elements in particulate matter from metropolitan regions of Northern China: Sources, concentrations and size distributions.

    Science.gov (United States)

    Pan, Yuepeng; Tian, Shili; Li, Xingru; Sun, Ying; Li, Yi; Wentworth, Gregory R; Wang, Yuesi

    2015-12-15

    Public concerns over airborne trace elements (TEs) in metropolitan areas are increasing, but long-term and multi-site observations of size-resolved aerosol TEs in China are still lacking. Here, we identify highly elevated levels of atmospheric TEs in megacities and industrial sites in the Beijing-Tianjin-Hebei urban agglomeration relative to background areas, with the annual mean values of As, Pb, Ni, Cd and Mn exceeding the acceptable limits of the World Health Organization. Despite the spatial variability in concentrations, the size distribution pattern of each trace element was quite similar across the region. The crustal elements Al and Fe were mainly found in coarse particles (2.1-9 μm), whereas the main fraction of toxic metals, such as Cu, Zn, As, Se, Cd and Pb, was found in submicron particles; these metals were enriched by over 100-fold relative to the Earth's crust. The size distributions of Na, Mg, K, Ca, V, Cr, Mn, Ni, Mo and Ba were bimodal, with two peaks at 0.43-0.65 μm and 4.7-5.8 μm. The combination of the size distribution information, principal component analysis and an air mass back trajectory model offered a robust technique for distinguishing the main sources of airborne TEs, e.g., soil dust, fossil fuel combustion and industrial emissions, at different sites. In addition, higher elemental concentrations coincided with westerly flow, indicating that polluted soil and fugitive dust were major sources of TEs on the regional scale. However, the contribution of coal burning, the iron industry/oil combustion and non-ferrous smelters to atmospheric metal pollution in Northern China should be given more attention. Considering that the concentrations of heavy metals associated with fine particles in the target region were significantly higher than those at other Asian sites, the implementation of strict environmental standards in China is required to reduce the amounts of these hazardous pollutants released into the atmosphere. Copyright © 2015 Elsevier B.V.
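
    The source-apportionment step mentioned above can be illustrated with a very small PCA sketch over an element-concentration matrix. The sample data and the interpretation of the components are fabricated, and real studies combine far larger datasets with back-trajectory analysis.

```python
# Toy PCA over a (samples x elements) concentration matrix, standardized first.
# The numbers are fabricated for illustration; in a real study the loadings would
# be interpreted as candidate source profiles (e.g., crustal vs. combustion).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

elements = ["Al", "Fe", "Pb", "Cd", "As"]
X = np.array([          # hypothetical concentrations, ng/m^3
    [900, 700, 12, 0.5, 2.1],
    [1100, 850, 10, 0.4, 1.8],
    [300, 260, 55, 2.2, 9.5],
    [280, 240, 60, 2.5, 10.2],
])

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
for comp, loadings in enumerate(pca.components_, start=1):
    top = sorted(zip(elements, loadings), key=lambda p: -abs(p[1]))
    print(f"PC{comp} dominant loadings:", [(el, round(w, 2)) for el, w in top[:3]])
```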

  20. Multilevel Workflow System in the ATLAS Experiment

    International Nuclear Information System (INIS)

    Borodin, M; De, K; Navarro, J Garcia; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs are executed across more than a hundred distributed computing sites by PanDA - the ATLAS job-level workload management system. On the outer level, the Database Engine for Tasks (DEfT) empowers production managers with templated workflow definitions. On the next level, the Job Execution and Definition Interface (JEDI) is integrated with PanDA to provide dynamic job definition tailored to the sites capabilities. We report on scaling up the production system to accommodate a growing number of requirements from main ATLAS areas: Trigger, Physics and Data Preparation. (paper)
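
    The layering described here, where templated workflow definitions expand into tasks that in turn spawn jobs over dataset files, can be mimicked with a few dataclasses. The structure below is a loose, hypothetical analogue of the DEfT/JEDI split, not the actual ProdSys2 data model.

```python
# Hypothetical, simplified analogue of a multilevel workflow: a templated
# workflow expands into per-step tasks, and each task splits a dataset into jobs.
from dataclasses import dataclass

@dataclass
class Job:
    task: str
    input_files: list

@dataclass
class Task:
    name: str
    dataset: list              # list of input file names
    files_per_job: int = 2

    def split(self) -> list:
        """Split the dataset into jobs of at most files_per_job files each."""
        n = self.files_per_job
        return [Job(self.name, self.dataset[i:i + n])
                for i in range(0, len(self.dataset), n)]

MC_TEMPLATE = ["generate", "simulate", "digitize", "reconstruct"]   # assumed step names

def expand(template: list, dataset: list) -> list:
    """Turn a workflow template into an ordered list of tasks over one dataset."""
    return [Task(f"{step}.{i}", dataset) for i, step in enumerate(template)]

dataset = [f"evgen.{i:04d}.root" for i in range(5)]
for task in expand(MC_TEMPLATE, dataset):
    print(task.name, "->", len(task.split()), "jobs")
```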

  1. Capturing sunlight into a photobioreactor: Ray tracing simulations of the propagation of light from capture to distribution into the reactor

    NARCIS (Netherlands)

    Zijffers, J.F.; Janssen, M.G.J.; Tramper, J.; Wijffels, R.H.; Salim, S.

    2008-01-01

    The Green Solar Collector (GSC), a photobioreactor designed for area efficient outdoor cultivation of microalgae uses Fresnel lenses and light guides to focus, transport and distribute direct light into the algae suspension. Calculating the path of rays of light, so-called ray tracing, is used to

  2. Workflow management: an overview

    NARCIS (Netherlands)

    Ouyang, C.; Adams, M.; Wynn, M.T.; Hofstede, ter A.H.M.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Workflow management has its origin in the office automation systems of the seventies, but it is not until fairly recently that conceptual and technological breakthroughs have led to its widespread adoption. In fact, nowadays, process-awareness has become an accepted and integral part of various types

  3. Distributed late-binding micro-scheduling and data caching for data-intensive workflows; Microplanificación de asignación tardía y almacenamiento temporal distribuidos para flujos de trabajo intensivos en datos

    Energy Technology Data Exchange (ETDEWEB)

    Delgado Peris, A.

    2015-07-01

    Today's world is flooded with vast amounts of digital information coming from innumerable sources. Moreover, it seems clear that this trend will only intensify in the future. Industry, society and remarkably science are not indifferent to this fact. On the contrary, they are struggling to get the most out of this data, which means that they need to capture, transfer, store and process it in a timely and efficient manner, using a wide range of computational resources. And this task is not always simple. A very representative example of the challenges posed by the management and processing of large quantities of data is that of the Large Hadron Collider experiments, which handle tens of petabytes of physics information every year. Based on the experience of one of these collaborations, we have studied the main issues involved in the management of huge volumes of data and in the completion of sizeable workflows that consume it. In this context, we have developed a general-purpose architecture for the scheduling and execution of workflows with heavy data requirements: the Task Queue. This new system builds on the late-binding overlay model, which has helped experiments to successfully overcome the problems associated to the heterogeneity and complexity of large computational grids. Our proposal introduces several enhancements to the existing systems. The execution agents of the Task Queue architecture share a Distributed Hash Table (DHT) and perform job matching and assignment cooperatively. In this way, scalability problems of centralized matching algorithms are avoided and workflow execution times are improved. Scalability makes fine-grained micro-scheduling possible and enables new functionalities, like the implementation of a distributed data cache on the execution nodes and the integration of data location information in the scheduling decisions...(Author)
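
    As a rough, hypothetical illustration of late-binding matching with a node-local data cache, the sketch below has pilot agents pull, at the last moment, the queued task whose input data best overlaps their cache. The task and cache structures are invented and do not reflect the actual Task Queue protocol or its shared DHT.

```python
# Toy late-binding pull model: idle agents pick the queued task with the best
# overlap with their local data cache. Invented structures; the real system
# coordinates job matching cooperatively through a distributed hash table.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    inputs: set

@dataclass
class Agent:
    name: str
    cache: set = field(default_factory=set)

    def pull(self, queue: list):
        """Late binding: choose the task whose inputs are already cached the most."""
        if not queue:
            return None
        best = max(queue, key=lambda t: len(t.inputs & self.cache))
        queue.remove(best)
        self.cache |= best.inputs          # inputs stay cached for later tasks
        return best

queue = [Task("t1", {"a", "b"}), Task("t2", {"b", "c"}), Task("t3", {"x", "y"})]
agents = [Agent("node1", {"a", "b"}), Agent("node2", {"x"})]

for agent in agents * 2:                   # each agent pulls twice, round-robin
    task = agent.pull(queue)
    if task:
        print(agent.name, "runs", task.name)
```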

  4. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  5. Distribution of Major and Trace Elements in a Tropical Hydroelectric Reservoir in Sarawak, Malaysia.

    Science.gov (United States)

    Sim, Siong Fong; Ling, Teck Yee; Nyanti, Lee; Ean Lee, Terri Zhuan; Mohd Irwan Lu, Nurul Aida Lu; Bakeh, Tomy

    2014-01-01

    This paper reports the metal content in water, sediment, macroalgae, aquatic plants, and fish of the Batang Ai Hydroelectric Reservoir in Sarawak, Malaysia. The samples were acid digested and subjected to atomic absorption spectrometry analysis for Na, K, Mn, Cr, Ni, Zn, Mg, Fe, Sn, Al, Ca, As, Se, and Hg. The total Hg content was analysed on a mercury analyser. Results showed that the metal contents of water, sediment, macroalgae, aquatic plants, and fish are distinguishable, with sediment and biota samples more susceptible to metal accumulation. The distributions of heavy metals in water, specifically Se, Sn, and As, could be associated with the input of fish feed, boating, and construction activities. The accumulation of heavy metals in sediment, macroalgae, and aquatic plants, on the other hand, might be largely influenced by the redox conditions in the aquatic environment. According to the contamination factor and the geoaccumulation index, sediment in the Batang Ai Reservoir poses a low risk of contamination. The average metal contents in sediment and river water are consistently lower than reported literature values and well below the limits of various guidelines. In fish, the trace element Hg was detected; however, the concentration was below the permissible level suggested by the Food and Agriculture Organization.
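
    The contamination factor and geoaccumulation index referred to above are simple ratios against a geochemical background. The sketch below computes both under their common definitions (CF = C/B and Igeo = log2(C / (1.5·B))), using made-up concentrations and background values rather than the study's data.

```python
# Contamination factor and geoaccumulation index for sediment metals.
# Concentrations and background values below are illustrative, not from the study.
import math

BACKGROUND = {"Zn": 95.0, "Ni": 68.0, "Pb": 20.0}      # assumed background, mg/kg
sediment   = {"Zn": 110.0, "Ni": 40.0, "Pb": 35.0}     # hypothetical measured values

def contamination_factor(c: float, b: float) -> float:
    return c / b

def geoaccumulation_index(c: float, b: float) -> float:
    # Igeo = log2(C / (1.5 * B)); the 1.5 factor allows for background variation.
    return math.log2(c / (1.5 * b))

for metal, conc in sediment.items():
    cf = contamination_factor(conc, BACKGROUND[metal])
    igeo = geoaccumulation_index(conc, BACKGROUND[metal])
    print(f"{metal}: CF = {cf:.2f}, Igeo = {igeo:.2f}")
```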

  6. Distribution of energy levels of quantum free particle on the Liouville surface and trace formulae

    International Nuclear Information System (INIS)

    Bleher, P.M.; Kosygin, D.V.; Sinai, Y.G.

    1995-01-01

    We consider the Weyl asymptotic formula #{E_n ≤ R²} = Area(Q)·R²/(4π) + n(R) for the eigenvalues of the Laplace-Beltrami operator on a two-dimensional torus Q with a Liouville metric, which is in a sense the most general case of an integrable metric. We prove that if the surface Q is non-degenerate then the remainder term n(R) has the form n(R) = R^(1/2) θ(R), where θ(R) is an almost periodic function of the Besicovitch class B¹, and the Fourier amplitudes and the Fourier frequencies of θ(R) can be expressed via the lengths of closed geodesics on Q and other simple geometric characteristics of these geodesics. We then prove that if the surface Q is generic then the limit distribution of θ(R) has a density p(t), which is an entire function of t possessing the asymptotics log p(t) ∝ −C_± t⁴ as t → ±∞ on the real line. An explicit expression for the Fourier transform of p(t) via the Fourier amplitudes of θ(R) is also given. We obtain the analogue of the Guillemin-Duistermaat trace formula for Liouville surfaces and discuss its accuracy. (orig.)

  7. A trace-driven analysis of name and attribute caching in a distributed system

    Science.gov (United States)

    Shirriff, Ken W.; Ousterhout, John K.

    1992-01-01

    This paper presents the results of simulating file name and attribute caching on client machines in a distributed file system. The simulation used trace data gathered on a network of about 40 workstations. Caching was found to be advantageous: a cache on each client containing just 10 directories had a 91 percent hit rate on name look ups. Entry-based name caches (holding individual directory entries) had poorer performance for several reasons, resulting in a maximum hit rate of about 83 percent. File attribute caching obtained a 90 percent hit rate with a cache on each machine of the attributes for 30 files. The simulations show that maintaining cache consistency between machines is not a significant problem; only 1 in 400 name component look ups required invalidation of a remotely cached entry. Process migration to remote machines had little effect on caching. Caching was less successful in heavily shared and modified directories such as /tmp, but there weren't enough references to /tmp overall to affect the results significantly. We estimate that adding name and attribute caching to the Sprite operating system could reduce server load by 36 percent and the number of network packets by 30 percent.
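
    To make the cache model concrete, the sketch below is an illustrative, hypothetical directory-level name cache with LRU replacement of whole directories (not the paper's simulator): the structure whose hit rate the trace-driven simulation measures.

        # Illustrative directory-level name cache with LRU eviction and hit-rate tracking.
        from collections import OrderedDict

        class DirectoryNameCache:
            def __init__(self, capacity=10):
                self.capacity = capacity
                self.entries = OrderedDict()   # directory path -> {name: file handle}
                self.hits = self.lookups = 0

            def lookup(self, directory, name):
                self.lookups += 1
                listing = self.entries.get(directory)
                if listing is not None and name in listing:
                    self.entries.move_to_end(directory)    # LRU touch
                    self.hits += 1
                    return listing[name]
                return None                                 # miss: would go to the file server

            def insert(self, directory, listing):
                self.entries[directory] = dict(listing)
                self.entries.move_to_end(directory)
                if len(self.entries) > self.capacity:
                    self.entries.popitem(last=False)        # evict least recently used directory

            def hit_rate(self):
                return self.hits / self.lookups if self.lookups else 0.0

        cache = DirectoryNameCache(capacity=10)
        cache.insert("/usr/bin", {"ls": 101, "cat": 102})
        cache.lookup("/usr/bin", "ls")      # hit
        cache.lookup("/tmp", "scratch")     # miss
        print(f"hit rate: {cache.hit_rate():.2f}")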

  8. Distribution and significance of trace element pollutants in hair of the Iraqi population

    International Nuclear Information System (INIS)

    Al-Shahristani, H.; Shihab, K.M.; Jalil, M.

    1979-01-01

    Hair is an excellent indicator of man's exposure to trace element environmental pollutants. Several hundred human head-hair samples were randomly collected from various regions of Iraq representing the general population. These were analysed by thermal neutron activation analysis and the following elements were instrumentally determined: Cr, Fe, Co, Ni, Zn, As, Se, Br, Ag, Cd, Sb, La, Au, Hg, Th and U. The average concentrations of these elements and the frequency distributions among the population are given and compared with concentrations from other regions of the world. Except for Br, the elemental concentrations determined for this population are, in general, similar to those reported for other areas. The low consumption of sea foods in Iraq is perhaps the cause of this anomaly. For certain population groups, high levels of Hg, Au, Cr and Se have been measured and the causes are discussed. In certain cases when exposure to a pollutant has taken place, it is shown that the biological half-life of the element in man can be determined by following the concentration variation along the hair strand. The ratio of the concentration in hair to the average body concentration as well as the total body-burden of the element are also determined from these curves. For methylmercury, the average biological half-life in man was found to be 72 days and the ratio of concentration in hair to average body concentration to be 137. (author)
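
    The half-life estimate described in the record follows from fitting an exponential decay to the concentration profile along the hair strand. Assuming, as is standard (though not stated in the record), a roughly constant hair growth rate g, a segment at distance x from the scalp records exposure at time t ≈ x/g, so

        C(x) = C_{0}\, e^{-\lambda x / g}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}

    and fitting the measured along-strand profile yields λ and hence the biological half-life (72 days for methylmercury in this study).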

  9. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  10. Investigation of exposure rates and radionuclide and trace metal distributions along the Hanford Reach of the Columbia River

    International Nuclear Information System (INIS)

    Cooper, A.T.; Woodruff, R.K.

    1993-09-01

    Studies have been conducted to investigate exposure rates, and radionuclide and trace metal distributions along the Columbia River where it borders the Hanford Site. The last major field study was conducted in 1979. With recently renewed interest in various land use and resource protection alternatives, it is important to have data that represent current conditions. Radionuclides and trace metals were surveyed in Columbia River shoreline soils along the Hanford Site (Hanford Reach). The work was conducted as part of the Surface Environmental Surveillance Project, Pacific Northwest Laboratory. The survey consisted of taking exposure rate measurements and soil samples primarily at locations known or expected to have elevated exposure rates

  11. Chemometrics in biomonitoring: Distribution and correlation of trace elements in tree leaves

    Energy Technology Data Exchange (ETDEWEB)

    Deljanin, Isidora [Innovation Center of the Faculty of Technology and Metallurgy, Karnegijeva 4, 11120 Belgrade (Serbia); Antanasijević, Davor, E-mail: dantanasijevic@tmf.bg.ac.rs [Innovation Center of the Faculty of Technology and Metallurgy, Karnegijeva 4, 11120 Belgrade (Serbia); Bjelajac, Anđelika [Innovation Center of the Faculty of Technology and Metallurgy, Karnegijeva 4, 11120 Belgrade (Serbia); Urošević, Mira Aničić [Institute of Physics, University of Belgrade, Pregrevica 118, 11080 Belgrade (Serbia); Nikolić, Miroslav [Institute for Multidisciplinary Research, University of Belgrade, Kneza Viseslava 1, 11030 Belgrade (Serbia); Perić-Grujić, Aleksandra; Ristić, Mirjana [University of Belgrade, Faculty of Technology and Metallurgy, Karnegijeva 4, 11120 Belgrade (Serbia)

    2016-03-01

    The concentrations of 15 elements were measured in the leaf samples of Aesculus hippocastanum, Tilia spp., Betula pendula and Acer platanoides collected in May and September of 2014 from four different locations in Belgrade, Serbia. The objective was to assess the chemical characterization of leaf surface and in-wax fractions, as well as the leaf tissue element content, by analyzing untreated, washed with water and washed with chloroform leaf samples, respectively. The combined approach of self-organizing networks (SON) and Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) aided by Geometrical Analysis for Interactive Aid (GAIA) was used in the interpretation of multiple element loads on/in the tree leaves. The morphological characteristics of the leaf surfaces and the elemental composition of particulate matter (PM) deposited on tree leaves were studied by using scanning electron microscopy (SEM) with energy dispersive spectroscopy (EDS) detector. The results showed that the amounts of retained and accumulated element concentrations depend on several parameters, such as chemical properties of the element and morphological properties of the leaves. Among the studied species, Tilia spp. was found to be the most effective in the accumulation of elements in leaf tissue (70% of the total element concentration), while A. hippocastanum had the lowest accumulation (54%). After water and chloroform washing, the highest percentages of removal were observed for Al, V, Cr, Cu, Zn, As, Cd and Sb (> 40%). The PROMETHEE/SON ranking/classifying results were in accordance with the results obtained from the GAIA clustering techniques. The combination of the techniques enabled extraction of additional information from datasets. Therefore, the use of both the ranking and clustering methods could be a useful tool to be applied in biomonitoring studies of trace elements. - Highlights: • Surface and in-wax fractions showed different trace element

  12. Chemometrics in biomonitoring: Distribution and correlation of trace elements in tree leaves

    International Nuclear Information System (INIS)

    Deljanin, Isidora; Antanasijević, Davor; Bjelajac, Anđelika; Urošević, Mira Aničić; Nikolić, Miroslav; Perić-Grujić, Aleksandra; Ristić, Mirjana

    2016-01-01

    The concentrations of 15 elements were measured in the leaf samples of Aesculus hippocastanum, Tilia spp., Betula pendula and Acer platanoides collected in May and September of 2014 from four different locations in Belgrade, Serbia. The objective was to assess the chemical characterization of leaf surface and in-wax fractions, as well as the leaf tissue element content, by analyzing untreated, washed with water and washed with chloroform leaf samples, respectively. The combined approach of self-organizing networks (SON) and Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) aided by Geometrical Analysis for Interactive Aid (GAIA) was used in the interpretation of multiple element loads on/in the tree leaves. The morphological characteristics of the leaf surfaces and the elemental composition of particulate matter (PM) deposited on tree leaves were studied by using scanning electron microscopy (SEM) with energy dispersive spectroscopy (EDS) detector. The results showed that the amounts of retained and accumulated element concentrations depend on several parameters, such as chemical properties of the element and morphological properties of the leaves. Among the studied species, Tilia spp. was found to be the most effective in the accumulation of elements in leaf tissue (70% of the total element concentration), while A. hippocastanum had the lowest accumulation (54%). After water and chloroform washing, the highest percentages of removal were observed for Al, V, Cr, Cu, Zn, As, Cd and Sb (> 40%). The PROMETHEE/SON ranking/classifying results were in accordance with the results obtained from the GAIA clustering techniques. The combination of the techniques enabled extraction of additional information from datasets. Therefore, the use of both the ranking and clustering methods could be a useful tool to be applied in biomonitoring studies of trace elements. - Highlights: • Surface and in-wax fractions showed different trace element

  13. Concentrations of 137Cs and trace elements in zooplankton, and their vertical distributions off Rokkasho, Japan

    International Nuclear Information System (INIS)

    Kaeriyama, Hideki; Ishii, Toshiaki; Watabe, Teruhisa; Kusakabe, Masashi

    2007-01-01

    Zooplankton samples were collected at about 50 m depth with a large ring net (160-cm mouth diameter, 0.5-mm mesh) in May, June and October 2005 and June 2006 off Rokkasho, Japan, where a nuclear fuel reprocessing plant will be in full-scale operation in the near future. Plankters in each sample were separated based on their species. Eight samples were used for the determination of ¹³⁷Cs concentration and the other 21 samples were used for the determination of its stable isotope, Cs, along with some other trace elements. All the samples were characterized by five dominant species, i.e. euphausiids, chaetognaths, copepods (Neocalanus spp.) and amphipods (Themisto spp. and Cyphocaris sp.). Plankton samples were also taken at three to five discrete depths between the surface and ≤ 1,000 m depth during daytime and nighttime for analysis of vertical distribution patterns of biomass and for assessment of daily vertical migration activity. Integrated net zooplankton biomass at nighttime ranged from 0.85 to 8.74 g DW m⁻² in the 0-150 m layer, without any appreciable day-night differences in the vertical distribution; below that layer, it decreased significantly. Only in spring were appreciable day-night differences in the vertical distribution observed, at the shallowest station. Concentrations of Cs and Co did not show significant differences among the five species. However, higher concentrations of Sr were observed in the two amphipods. It is likely that amphipods have a biological process of Sr metabolism different from the others. The concentration of ¹³⁷Cs in zooplankton was usually very low and sometimes under the detection limit. In the present study, the highest concentration of ¹³⁷Cs in zooplankton was 24 mBq kg⁻¹ WW, corresponding to a concentration factor (CF) of 14 if the value of 1.7 mBq L⁻¹ is taken as the ¹³⁷Cs concentration in seawater. The water-column inventory of ¹³⁷Cs in the zooplankton community is calculated to be 0.29 to 1.95 mBq m⁻², based on the data on

  14. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web service environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  15. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  16. Distribution of trace elements in tissues of shrimp species Litopenaeus vannamei (Boone, 1931 from Bahia, Brazil

    Directory of Open Access Journals (Sweden)

    E. Silva

    Full Text Available Abstract In this study, concentrations of trace elements were investigated in tissues of the shrimp species Litopenaeus vannamei from a farm and a natural coastal zone located in northeastern Brazil. The elements were determined by optical emission spectrometry with inductively coupled plasma (ICP OES). The following concentration ranges were obtained in the tissues, in µg g⁻¹ dry weight: Al: 13.4-886.5; Cd: 0.93-1.80; Cu: 24.8-152; Fe: 3.2-410.9; Mn: 0.36-24.4; Se: 0.094-9.81; and Zn: 20.3-109.4. The shrimp muscle can be a good iron source (about 88.9 mg–1g dry weight). The distribution of Se concentration in tissues showed much variation between locations, and the levels found in the muscles of wild samples were high: Se levels in 67% of muscle and 50% of other tissue samples exceeded the ANVISA limit, indicating evidence of selenium bioaccumulation. Significant correlations were observed between the following pairs of elements: Fe-Zn (r = –0.70), Mn-Cu (r = –0.74), Se-Cu (r = –0.68) and Se-Mn (r = 0.82) in the muscles; Fe-Al (r = 0.99), Mn-Al (r = 0.62), Mn-Fe (r = 0.62), Se-Al (r = 0.88), Se-Fe (r = 0.87) and Se-Mn (r = 0.58) in the exoskeleton; and Cu-Zn (r = 0.68), Al-Cu (r = 0.88), Fe-Cu (r = 0.95) and Fe-Al (r = 0.97) in the viscera.

  17. Trace metal distribution in sediments of the Pearl River Estuary and the surrounding coastal area, South China

    International Nuclear Information System (INIS)

    Ip, Carman C.M.; Li Xiangdong; Zhang Gan; Wai, Onyx W.H.; Li, Y.-S.

    2007-01-01

    Surface sediments and sediment cores collected at the Pearl River Estuary (PRE) and its surrounding coastal area were analysed for total metal concentrations, chemical partitioning, and Pb isotopic compositions. The distribution of Cu, Cr, Pb, and Zn demonstrated a typical diffusion pattern from the land toward the sea. Two hotspots of trace metal contamination were located in the mixing zone between fresh and marine waters. The enrichment of metals in the sediments could be attributed to the deposition of dissolved and particulate trace metals from the water column in the estuarine area. The similar Pb isotopic signatures of the sediments at the PRE and its surrounding coastal area offered strong evidence that the PRE was a major source of trace metals to the adjacent coastal area. Slightly lower ²⁰⁶Pb/²⁰⁷Pb ratios in the coastal sediments may indicate other inputs of Pb in addition to the PRE sources, including inputs from Hong Kong and other parts of the region. - The distribution of trace metals in sediments reflected contaminant sources and physical and chemical deposition processes

  18. Trace element distribution in different chemical fractions of False Bay sediments

    International Nuclear Information System (INIS)

    Rosental, R.

    1984-05-01

    Trace metals in the aquatic environment are generally concentrated on solid geochemical phases which eventually become incorporated into estuarine and marine sediments. The mechanism of trace metal concentration is believed to be adsorption on various geochemical phases, such as hydrous metal oxides, clays and organic matter. Metals in estuarine sediments can thus be expected to be partitioned between different phases, depending on the concentration of the phase and the strength of the adsorption bond. The bioavailability of sediment-bound metals to deposit-feeding organisms will depend on trace metal partitioning and the kinetics of biological metal uptake from each geochemical phase. The major objective of this study was to establish an analytical procedure involving sequential chemical extractions for the partitioning of particulate trace metals in sediment samples, collected from False Bay. Eight metals were examined, i.e. Cd, Cu, Cr, Fe, Mn, Ni, Pb and Zn. X-ray diffraction was also used in the study

  19. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle, based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  20. Distribution of trace gases and aerosols in the Siberian air shed during wildfires of summer 2012

    Science.gov (United States)

    Belan, Boris D.; Paris, Jean-Daiel; Nedelec, Philippe; Antokhin, Pavel N.; Arshinova, Victoriya; Arshinov, Mikhail Yu.; Belan, Sergey B.; Davydov, Denis K.; Ivlev, Georgii A.; Fofonov, Alexandre V.; Kozlov, Artem V.; Rasskazchikova, Tatyana M.; Savkin, Denis E.; Simonenkov, Denis V.; Sklyadneva, Tatyana K.; Tolmachev, Gennadii N.

    2017-04-01

    During the last two decades, three strong biomass burning events have been observed in Russia: two of them, in 2002 and 2010, in the European part of Russia, and another one in 2012 in West and East Siberia. In this paper we present results of an extensive airborne study of the vertical distribution of trace gases and aerosols carried out during the strong wildfire event that occurred in summer 2012 in Siberia. For this purpose, the Optik TU-134 aircraft laboratory was used as a research platform. A large-scale airborne campaign was undertaken along the route Novosibirsk-Mirny-Yakutsk-Bratsk-Novosibirsk on 31 July and 1 August 2012. The flight pattern consisted of a number of ascents and descents between near-ground level and 8 km altitude, which enabled 20 vertical profiles to be obtained. The campaign was conducted under the weather conditions of a low-gradient pressure field, which determined the slow transport of air masses as well as the accumulation of biomass burning emissions in the region under study. The highest concentrations of CO2, CH4 and CO over wildfire spots reached 432 ppm, 2367 ppb, and 4036 ppb, respectively. If we exclude from the analysis the data obtained when crossing smoke plumes, we can find a difference between background concentrations measured in the atmosphere over regions affected by biomass burning and over clean areas. The enhancement of CO2 over the wildfire areas changed with altitude. On average, it was 10.5 ppm in the atmospheric boundary layer (ABL) and 5-6 ppm in the free troposphere. Maximum CO2 enhancements reached 27 ppm and 24 ppm, respectively. The averaged CH4 enhancement varied from 75 ppb in the boundary layer to 30 ppb in the upper troposphere, and slightly less than 30 ppb in the middle troposphere. Maximum CH4 enhancements reached 202 ppb, 108 ppb, and 50-60 ppb, respectively. The averaged and maximum enhancements of CO differed by an order of magnitude. Thus, in the ABL the maximum difference in concentration between

  1. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring

  2. The trace formula and the distribution of eigenvalues of Schroedinger operators on manifolds all of whose geodesics are closed

    International Nuclear Information System (INIS)

    Schubert, R.

    1995-05-01

    We investigate the behaviour of the remainder term R(E) in the Weyl formula $\#\{n \mid E_n \le E\} = \mathrm{Vol}(M)\,E^{d/2}/[(4\pi)^{d/2}\,\Gamma(d/2+1)] + R(E)$ for the eigenvalues $E_n$ of a Schroedinger operator on a d-dimensional compact Riemannian manifold all of whose geodesics are closed. We show that R(E) is of the form $E^{(d-1)/2}\,\Theta(\sqrt{E})$, where $\Theta(x)$ is an almost periodic function of Besicovitch class $B^2$ which has a limit distribution whose density is a box-shaped function. Furthermore we derive a trace formula and study higher order terms in the asymptotics of the coefficients related to the periodic orbits. The periodicity of the geodesic flow leads to a very simple structure of the trace formula, which is the reason why the limit distribution can be computed explicitly. (orig.)

  3. Distribution of trace metals at Hopewell Furnace National Historic Site, Berks and Chester Counties, Pennsylvania

    Science.gov (United States)

    Sloto, Ronald A.; Reif, Andrew G.

    2011-01-01

    Hopewell Furnace, located approximately 50 miles northwest of Philadelphia, was a cold-blast, charcoal iron furnace that operated for 113 years (1771 to 1883). The purpose of this study by the U.S. Geological Survey, in cooperation with the National Park Service, was to determine the distribution of trace metals released to the environment from an historical iron smelter at Hopewell Furnace National Historic Site (NHS). Hopewell Furnace used iron ore from local mines that contained abundant magnetite and accessory sulfide minerals enriched in arsenic, cobalt, copper, and other metals. Ore, slag, cast iron furnace products, soil, groundwater, stream base flow, streambed sediment, and benthic macroinvertebrates were sampled for this study. Soil samples analyzed in the laboratory had concentrations of trace metals low enough to meet Pennsylvania Department of Environmental Protection standards for non-residential use. Groundwater samples from the supply well met U.S. Environmental Protection Agency drinking-water regulations. Concentrations of metals in surface-water base flow at the five stream sampling sites were below continuous concentration criteria for protection of aquatic organisms. Concentrations of metals in sediment at the five stream sites were below probable effects level guidelines for protection of aquatic organisms except for copper at site HF-3. Arsenic, copper, lead, zinc, and possibly cobalt were incorporated into the cast iron produced by Hopewell Furnace. Manganese was concentrated in slag along with iron, nickel, and zinc. The soil near the furnace has elevated concentrations of chromium, copper, iron, lead, and zinc compared to background soil concentrations. Concentrations of toxic elements were not present at concentrations of concern in water, soil, or stream sediments, despite being elevated in ore, slag, and cast iron furnace products. The base-flow surface-water samples indicated good overall quality. The five sampled sites generally had

  4. The influence of ethanol addition on the spatial emission distribution of traces in a vertical argon stabilized DC arc plasma

    Directory of Open Access Journals (Sweden)

    MARIJA TODOROVIC

    2004-05-01

    Full Text Available The plasma of a vertical argon stabilized DC arc at atmospheric pressure is applied as a spectrochemical source. The lateral distributions of relative spectral line intensities of some trace elements (Zn, Pt, Cd, Mg, Ca and Al introduced into the plasma in the form of aqueous and ethanol–aqueous solutions were experimentally determined. These distributions were correlated with the calculated equilibrium plasma composition of the arc plasma. On the basis of the obtained results, an explanation of the influence of ethanol addition on the radiation densities from an arc plasma is given.

  5. Marine lake as in situ laboratory for studies of organic matter influence on speciation and distribution of trace metals

    Science.gov (United States)

    Mlakar, Marina; Fiket, Željka; Geček, Sunčana; Cukrov, Neven; Cuculić, Vlado

    2015-07-01

    Karst marine lakes are unique marine systems, also recognized as in situ "laboratories" in which geochemical processes can be observed on a different scale than in the open sea. In this study, the organic matter cycle and its impact on the distribution of trace metals in the marine lake Mir, located on Dugi Otok Island in the central part of the eastern Adriatic Sea, was investigated for the first time. The studied marine lake is a small, isolated, shallow basin with limited communication with the open sea. Intense spatial and seasonal variations of dissolved and particulate organic matter (DOC, POC) and of dissolved trace metal concentrations in the water column of the lake are governed predominantly by natural processes. Enhanced oxygen consumption in the lake during the summer season, high DOC and POC concentrations and low redox potential result in the occasional occurrence of anoxic conditions in the bottom layers, with the appearance of sulfur species. Speciation modeling showed that the dissolved trace metals Cu, Pb and Zn are mostly bound to organic matter, while Cd, Co and Ni are present predominantly as free ions and inorganic complexes. Trace metal removal from the water column and retention in the sediment was found to depend on the nature of the relationship between the specific metal and organic or inorganic phases, sulfides, Fe-oxyhydroxides or biogenic calcite. The above is reflected in the composition of the sediments, which, in addition to the influence of the karstic background and the bathymetry of the basin, are significantly affected by the accumulation of detritus at the bottom of the lake.

  6. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    Science.gov (United States)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for providing continuous integration (automated image builds) and image storing and sharing. In this model, all the software required for running scientific applications (workflow systems and execution engines) is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters, respectively. Although both phases could be executed

  7. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from the new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of

  8. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Abstract Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from the new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  9. Altitudinal patterns and controls of trace metal distribution in soils of a remote high mountain, Southwest China.

    Science.gov (United States)

    Li, Rui; Bing, Haijian; Wu, Yanhong; Zhou, Jun; Xiang, Zhongxiang

    2018-02-01

    The aim of this study is to reveal the effects of regional human activity on trace metal accumulation in remote alpine ecosystems under long-distance atmospheric transport. Trace metals (Cd, Pb, and Zn) in soils of Mt. Luoji, Southwest China, were investigated along a large altitudinal gradient [2200-3850 m above sea level (a.s.l.)] to elucidate the key factors controlling their distribution, using Pb isotopic compositions and statistical models. The concentrations of Cd, Pb, and Zn in the surface soils (O and A horizons) were relatively low at altitudes of 3500-3700 m a.s.l. The enrichment factors of the trace metals in the surface soils increased with altitude. After normalization for soil organic matter, the concentrations of Cd still increased with altitude, whereas those of Pb and Zn did not show a clear altitudinal trend. The effects of vegetation and cold trapping (CTE; pollutant enrichment due to decreasing temperature with increasing altitude) mainly determined the distribution of Cd and Pb in the O horizon, whereas CTE and bedrock weathering (BW) controlled that of Zn. In the A horizon, the distribution of Cd and Pb depended on regulation by vegetation, whereas that of Zn was mainly related to BW. Human activity, including ore mining and fossil fuel combustion, increased trace metal deposition in the surface soils. The anthropogenic fractions of Cd, Pb, and Zn were quantified as 92.4, 67.8, and 42.9% in the O horizon, and 74.5, 33.9, and 24.9% in the A horizon, respectively. The anthropogenic metals deposited at the high altitudes of Mt. Luoji reflect the impact of long-range atmospheric transport from southern and southwestern regions on this remote alpine ecosystem.

  10. Spatial distribution of trace metals in water resources impacted by PGM activities

    CSIR Research Space (South Africa)

    Walters, Chavon R

    2012-05-01

    Full Text Available the aquatic environment from natural and anthropogenic sources (such as industrial effluents and mining wastes). Trace metals can accumulate in fish (which are often at the top of the aquatic food chain) either through water or the food chain (1), and metals...

  11. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  12. Effects of statistical distribution of joint trace length on the stability of tunnel excavated in jointed rock mass

    Directory of Open Access Journals (Sweden)

    Kayvan Ghorbani

    2015-12-01

    Full Text Available The rock masses at the construction site of an underground cavern are generally not continuous, due to the presence of discontinuities such as bedding, joints, faults, and fractures. The performance of an underground cavern is principally governed by the mechanical behavior of the discontinuities in the vicinity of the cavern. During underground excavation, many surrounding rock failures are closely related to joints. The study of tunnel stability in jointed rock masses is of importance to rock engineering, especially tunneling and underground space development. In this study, using negative exponential, log-normal and normal probability density functions for the joint trace length, we investigated the effect of the joint trace length distribution on stability parameters, such as stress and displacement, of a tunnel constructed in a jointed rock mass using UDEC (Universal Distinct Element Code). It was found that a normal distribution of joint trace length is the most critical for tunnel stability, whereas an exponential distribution has the least effect compared with the other two distribution functions.
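
    As an illustration of the three trace-length models compared in the record, the sketch below draws synthetic joint trace lengths from each distribution (the mean length, spread and sample size are invented parameters, chosen only so the distributions can be compared numerically; this is not the study's UDEC input).

        # Illustrative sampling of joint trace lengths from the three compared distributions.
        import numpy as np

        rng = np.random.default_rng(42)
        mean_length = 3.0          # m, assumed mean joint trace length
        n_joints = 1000

        exponential = rng.exponential(scale=mean_length, size=n_joints)
        lognormal = rng.lognormal(mean=np.log(mean_length) - 0.5 * 0.5**2,
                                  sigma=0.5, size=n_joints)   # same mean; sigma assumed
        normal = np.clip(rng.normal(loc=mean_length, scale=1.0, size=n_joints), 0.0, None)

        for name, sample in [("exponential", exponential),
                             ("log-normal", lognormal),
                             ("normal", normal)]:
            print(f"{name:12s} mean = {sample.mean():.2f} m, "
                  f"95th percentile = {np.percentile(sample, 95):.2f} m")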

  13. Effects of cooking and subcellular distribution on the bioaccessibility of trace elements in two marine fish species.

    Science.gov (United States)

    He, Mei; Ke, Cai-Huan; Wang, Wen-Xiong

    2010-03-24

    In current human health risk assessment, the maximum acceptable concentrations of contaminants in food are mostly based on total concentrations. However, the total concentration of a contaminant may not always reflect the available amount. Bioaccessibility determination is thus required to improve the risk assessment of contaminants. This study used an in vitro digestion model to assess the bioaccessibility of several trace elements (As, Cd, Cu, Fe, Se, and Zn) in the muscles of two farmed marine fish species (seabass Lateolabrax japonicus and red seabream Pagrosomus major) of different body sizes. The total concentrations and subcellular distributions of these trace elements in fish muscles were also determined. The bioaccessibility of these trace elements was generally high (>45%), and the lowest bioaccessibility was observed for Fe. Cooking processes, including boiling, steaming, frying, and grilling, generally decreased the bioaccessibility of these trace elements, especially for Cu and Zn. The influences of frying and grilling were greater than those of boiling and steaming. The relationship between bioaccessibility and total concentration varied with the element: a positive correlation was found for As and Cu and a negative correlation for Fe, whereas no correlation was found for Cd, Se, and Zn. A significant positive relationship was demonstrated between bioaccessibility and the elemental partitioning in the heat-stable protein fraction and in the trophically available fraction, and a negative correlation was observed between bioaccessibility and the elemental partitioning in the metal-rich granule fraction. Subcellular distribution may thus affect the bioaccessibility of metals and should be considered in risk assessment for seafood safety.
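
    Bioaccessibility in such in vitro digestion studies is usually reported as the fraction of an element released into the simulated digestive fluid relative to the total amount in the tissue (this is the customary operational definition, not a formula quoted from the paper):

        \mathrm{Bioaccessibility}\,(\%) = \frac{C_{\mathrm{released}}}{C_{\mathrm{total}}} \times 100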

  14. LabelFlow Framework for Annotating Workflow Provenance

    Directory of Open Access Journals (Sweden)

    Pinar Alper

    2018-02-01

    Full Text Available Scientists routinely analyse and share data for others to use. Successful data (re)use relies on having metadata describing the context of analysis of the data. In many disciplines the creation of contextual metadata is referred to as reporting. One method of implementing analyses is with workflows. A stand-out feature of workflows is their ability to record provenance from executions. Provenance is useful when analyses are executed with changing parameters (changing contexts) and results need to be traced to the respective parameters. In this paper we investigate whether provenance can be exploited to support reporting. Specifically, we outline a case study based on a real-world workflow and a set of reporting queries. We observe that provenance, as collected from workflow executions, is of limited use for reporting, as it supports the queries only partially. We identify that this is due to the generic nature of provenance and its lack of domain-specific contextual metadata. We observe that the required information is available in implicit form, embedded in data. We describe LabelFlow, a framework comprised of four Labelling Operators for decorating provenance with domain-specific Labels. LabelFlow can be instantiated for a domain by plugging in domain-specific metadata extractors. We provide a tool that takes as input a workflow and produces as output a Labelling Pipeline for that workflow, comprised of Labelling Operators. We revisit the case study and show how Labels provide a more complete implementation of the reporting queries.

  15. Biogeochemical and hydrological controls on fate and distribution of trace metals in oiled Gulf salt marshes

    Science.gov (United States)

    Keevan, J.; Natter, M.; Lee, M.; Keimowitz, A.; Okeke, B.; Savrda, C.; Saunders, J.

    2011-12-01

    On April 20, 2010, the drilling rig Deepwater Horizon exploded in the Gulf of Mexico, resulting in the release of approximately 5 million barrels of crude oil into the environment. Oil and its associated trace metals have been demonstrated to have a detrimental effect on coastal wetland ecosystems. Wetlands are particularly susceptible to oil contamination because they are composed largely of fine-grained sediments, which have a high capacity to adsorb organic matter and metals. The biogeochemical cycling of trace metals can be strongly influenced by microbial activity, specifically that of sulfate- and iron-reducing bacteria. Microbial activity may be enhanced by an increase in amounts of organic matter such as oil. This research incorporates an assessment of levels of trace metals and associated biogeochemical changes at ten coastal marshes in Alabama, Mississippi, and Louisiana. These sampling sites range in their pollution levels from pristine to highly contaminated. A total digestion analysis of wetland sediments shows higher concentrations of certain trace metals (e.g., Ni, Cu, Pb, Zn, Sr, Co, V, Ba, Hg, As) in heavily-oiled areas compared to less-affected and pristine sites. Due to chemical complexation among organic compounds and metals, crude oils often contain elevated levels (up to hundreds of mg/kg) of trace metals. At the heavily-oiled Louisiana sites (e.g., Bay Jimmy, Bayou Dulac, Bay Batiste), elevated levels of metals and total organic carbon have been found in sediments down to depths of 30 cm. Clearly the contamination is not limited to shallow sediments, and oil, along with various associated metals, may be invading deeper (pre-industrial) portions of the marsh sediments. Pore-waters extracted from contaminated sediments are characterized by very high levels of reduced sulfur (up to 80 mg/kg), in contrast to fairly low ferrous iron concentrations (<0.02 mg/kg). The influx of oil into the wetlands might provide the initial substrate and

  16. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software packages and the opportunities they offer to engage in parametric acoustics workflows and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed...... and interoperability with Grasshopper 3d. The focus is placed on the benchmarking of three different acoustic analysis tools based on raytracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters...... included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining the most suitable acoustic tool at different design stages. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used......
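
    For orientation, two of the parameters in the benchmark set have simple energy-ratio definitions in terms of the squared room impulse response p(t) (standard room-acoustics definitions, stated here for context rather than quoted from the paper):

        C_{50} = 10\,\log_{10}\frac{\int_{0}^{50\,\mathrm{ms}} p^{2}(t)\,dt}{\int_{50\,\mathrm{ms}}^{\infty} p^{2}(t)\,dt}\ \mathrm{dB}, \qquad
        D_{50} = \frac{\int_{0}^{50\,\mathrm{ms}} p^{2}(t)\,dt}{\int_{0}^{\infty} p^{2}(t)\,dt}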

  17. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site located on the border of France and Switzerland near Geneva along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of our newest scientific instrument called the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not complete until 2005. Like many businesses in the current economic climate CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence, do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  18. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Full Text Available Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  19. Trace elements in loggerhead turtles (Caretta caretta) stranded in mainland Portugal: Bioaccumulation and tissue distribution.

    Science.gov (United States)

    Nicolau, Lídia; Monteiro, Sílvia S; Pereira, Andreia T; Marçalo, Ana; Ferreira, Marisa; Torres, Jordi; Vingada, José; Eira, Catarina

    2017-07-01

    Pollution is among the most significant threats that endanger sea turtles worldwide. Waters off the Portuguese mainland are acknowledged as important feeding grounds for juvenile loggerheads. However, there are no data on trace element concentrations in marine turtles occurring in these waters. We present the first assessment of trace element concentrations in loggerhead turtles (Caretta caretta) occurring off the coast of mainland Portugal. Also, we compare our results with those from other areas and discuss parameters that may affect element concentrations. Trace element concentrations (As, Cd, Cu, Pb, Mn, Hg, Ni, Se, Zn) were determined in kidney, liver and muscle samples from 38 loggerheads stranded between 2011 and 2013. As was the only element with higher concentrations in muscle (14.78 μg g⁻¹ ww) than in liver or kidney. Considering non-essential elements, Cd presented the highest concentrations in kidney (34.67 μg g⁻¹) and liver (5.03 μg g⁻¹). Only a weak positive link was found between renal Cd and turtle size. Inter-elemental correlations were observed in both liver and kidney tissues. Hepatic Hg values (0.30 ± 0.03 μg g⁻¹) were higher than values reported in loggerheads from the Canary Islands but lower than in Mediterranean loggerheads. Cd concentrations in the present study were only exceeded by values found in turtles from the Pacific. Although many endogenous and exogenous parameters related to complex life cycle changes and a wide geographic range may influence trace element accumulation, the concentrations of Cd are probably related to the importance of crustaceans in the loggerhead diet along the Portuguese coast. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Trace elements in home-produced eggs in Belgium: Levels and spatiotemporal distribution

    International Nuclear Information System (INIS)

    Waegeneers, Nadia; Hoenig, Michel; Goeyens, Leo; De Temmerman, Ludwig

    2009-01-01

    The purpose of this study was to evaluate the levels of arsenic, cadmium, lead, copper and zinc in home-produced eggs, soils and kitchen waste samples of private chicken owners in Belgium, and to determine spatiotemporal differences in trace element contents in eggs. Eggs were sampled in all provinces of Belgium in autumn 2006 and spring 2007. A total number of 59 private chicken owners participated in the study. Trace elements were determined by inductively coupled plasma-mass spectrometry except for mercury, which was determined by atomic absorption of mercury vapour. The mean fresh weight concentrations in eggs in autumn and spring respectively were < 8.0 and < 8.0 μg/kg for arsenic, 0.5 and < 0.5 μg/kg for cadmium, 116 and 74 μg/kg for lead, 0.43 and 0.52 mg/kg for copper, 20.3 and 19.2 mg/kg for zinc, and 3.15 and 4.44 μg/kg for mercury. Analysis of variance determined significant differences in some trace element concentrations in eggs among seasons and regions in Belgium. Average concentrations of arsenic, cadmium and mercury corresponded well with values measured in other countries, while copper and zinc concentrations were within the same order of magnitude as in other countries. Average lead concentrations were high compared to concentrations in eggs from other countries and correlated well with lead concentrations in soil, indicating that the soil is an important source. Other sources of trace elements in eggs might be home-grown vegetables and forage (grass and herbs), and indirectly, air pollution.

  1. Trace element distribution in the snow cover from an urban area in central Poland.

    Science.gov (United States)

    Siudek, Patrycja; Frankowski, Marcin; Siepak, Jerzy

    2015-05-01

    This work presents the first results from winter field campaigns focusing on trace metal and metalloid chemistry in the snow cover of an urbanized region in central Poland. Samples were collected between January and March 2013 and trace element concentrations were determined using GF-AAS. Large inter-seasonal variability was observed, depending on anthropogenic emissions, depositional processes, and meteorological conditions. The highest concentration (in μg L⁻¹) was reported for Pb (34.90), followed by Ni (31.37), Zn (31.00), Cu (13.71), Cr (2.36), As (1.58), and Cd (0.25). In addition, several major anthropogenic sources were identified based on principal component analysis (PCA), among which the most significant were industrial activity and coal combustion for residential heating. Elevated concentrations of some trace metals in the snow samples were associated with the frequent occurrence of southerly and southeasterly advection of highly polluted air masses toward the sampling site, suggesting a large impact of regional urban/industrial pollution plumes.
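
    The source-identification step mentioned above rests on PCA of the element-by-sample concentration matrix. The sketch below is a generic, illustrative version (the concentration matrix is random stand-in data; the element names are taken from the record, everything else is assumed) showing how elements that load together on the first component would hint at a common source.

        # Illustrative PCA on a (samples x elements) concentration matrix with synthetic data.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        elements = ["Pb", "Ni", "Zn", "Cu", "Cr", "As", "Cd"]
        rng = np.random.default_rng(0)
        concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=(30, len(elements)))

        # Log-transform and standardize so no single element dominates the components.
        scaled = StandardScaler().fit_transform(np.log10(concentrations))
        pca = PCA(n_components=2)
        scores = pca.fit_transform(scaled)

        print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
        for name, loading in zip(elements, pca.components_[0]):
            # Elements with similar, strong PC1 loadings suggest a common emission source.
            print(f"PC1 loading {name}: {loading:+.2f}")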

  2. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.
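
    The core of such a converter is static analysis that turns assignments and the names they reference into a dependency graph. The fragment below is a loose illustration of that idea using Python's ast module rather than Ruby (the toy script and the load/preprocess/analyse names are invented), mapping each assigned variable to the earlier results it depends on.

        # Illustrative dependency extraction from a toy script (Python ast, not the paper's Ruby analyser).
        import ast
        import textwrap

        source = textwrap.dedent("""
            raw = load('input.dat')
            cleaned = preprocess(raw)
            result = analyse(cleaned, raw)
        """)

        dependencies = {}
        for node in ast.parse(source).body:
            if isinstance(node, ast.Assign) and isinstance(node.targets[0], ast.Name):
                target = node.targets[0].id
                used = {n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)}
                # Keep only references to previously assigned results (drops function names).
                dependencies[target] = sorted(used & set(dependencies))

        print(dependencies)   # {'raw': [], 'cleaned': ['raw'], 'result': ['cleaned', 'raw']}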

  3. Rare earth element distributions in the West Pacific: Trace element sources and conservative vs. non-conservative behavior

    Science.gov (United States)

    Behrens, Melanie K.; Pahnke, Katharina; Paffrath, Ronja; Schnetger, Bernhard; Brumsack, Hans-Jürgen

    2018-03-01

    Recent studies suggest that transport and water mass mixing may play a dominant role in controlling the distribution of dissolved rare earth element concentrations ([REE]) at least in parts of the North and South Atlantic and the Pacific Southern Ocean. Here we report vertically and spatially high-resolution profiles of dissolved REE concentrations ([REE]) along a NW-SE transect in the West Pacific and examine the processes affecting the [REE] distributions in this area. Surface water REE patterns reveal sources of trace element (TE) input near South Korea and in the tropical equatorial West Pacific. Positive europium anomalies and middle REE enrichments in surface and subsurface waters are indicative of TE input from volcanic islands and fingerprint in detail small-scale equatorial zonal eastward transport of TEs to the iron-limited tropical East Pacific. The low [REE] of North and South Pacific Tropical Waters and Antarctic Intermediate Water are a long-range (i.e., preformed) laterally advected signal, whereas increasing [REE] with depth within North Pacific Intermediate Water result from release from particles. Optimum multiparameter analysis of deep to bottom waters indicates a dominant control of lateral transport and mixing on [REE] at the depth of Lower Circumpolar Deep Water (≥3000 m water depth; ∼75-100% explained by water mass mixing), allowing the northward tracing of LCDW to ∼28°N in the Northwest Pacific. In contrast, scavenging in the hydrothermal plumes of the Lau Basin and Tonga-Fiji area at 1500-2000 m water depth leads to [REE] deficits (∼40-60% removal) and marked REE fractionation in the tropical West Pacific. Overall, our data provide evidence for active trace element input both near South Korea and Papua New Guinea, and for a strong lateral transport component in the distribution of dissolved REEs in large parts of the West Pacific.

  4. Brain-wide mapping of axonal connections: workflow for automated detection and spatial analysis of labeling in microscopic sections

    Directory of Open Access Journals (Sweden)

    Eszter Agnes ePapp

    2016-04-01

    Full Text Available Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data.
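
    The automatic detection step lends itself to a compact illustration. The sketch below is not the published pipeline, only a hedged stand-in: a threshold tuned on a few representative sections is applied across a whole image series to yield binary maps of labeling, with a small-area filter to suppress spurious detections.

      # Illustrative sketch: threshold an image series into binary labeling maps.
      import numpy as np

      def binary_labeling_maps(sections, threshold, min_pixels=20):
          """sections: iterable of 2-D grayscale arrays; threshold: intensity
          cut-off chosen from representative images."""
          maps = []
          for img in sections:
              mask = img > threshold              # candidate labeled pixels
              if mask.sum() < min_pixels:         # discard spurious detections
                  mask = np.zeros_like(mask, dtype=bool)
              maps.append(mask)
          return maps

      # toy usage with random "sections"
      rng = np.random.default_rng(0)
      demo = [rng.random((64, 64)) for _ in range(3)]
      print([int(m.sum()) for m in binary_labeling_maps(demo, threshold=0.95)])

    Labeling detection in the actual workflow is considerably richer (and weak labeling still needs manual curation, as noted above), but the binary-map output has this general form.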

  5. Distribution of trace levels of therapeutic gallium in bone as mapped by synchrotron X-ray microscopy

    International Nuclear Information System (INIS)

    Bockman, R.S.; Repo, M.A.; Warrell, R.P. Jr.; Pounds, J.G.; Schidlovsky, G.; Gordon, B.M.; Jones, K.W.

    1990-01-01

    Gallium nitrate, a drug that inhibits calcium release from bone, has been proven a safe and effective treatment for the accelerated bone resorption associated with cancer. Though bone is a target organ for gallium, the kinetics, sites, and effects of gallium accumulation in bone are not known. The authors have used synchrotron X-ray microscopy to map the distribution of trace levels of gallium in bone. After short-term in vivo administration of gallium nitrate to rats, trace (nanogram) amounts of gallium preferentially localized to the metabolically active regions in the metaphysis as well as the endosteal and periosteal surfaces of diaphyseal bone, regions where new bone formation and modeling were occurring. The amounts measured were well below the levels known to be cytotoxic. Iron and zinc, trace elements normally found in bone, were decreased in amount after in vivo administration of gallium. These studies represent a first step toward understanding the mechanism(s) of action of gallium in bone by suggesting the possible cellular, structural, and elemental targets of gallium

  6. Tropospheric chemistry over the lower Great Plains of the United States. 2. Trace gas profiles and distributions

    Science.gov (United States)

    Luke, Winston T.; Dickerson, Russell R.; Ryan, William F.; Pickering, Kenneth E.; Nunnermacker, Linda J.

    1992-12-01

    Convective clouds and thunderstorms redistribute air pollutants vertically, and by altering the chemistry and radiative balance of the upper troposphere, these local actions can have global consequences. To study these effects, measurements of trace gases ozone, O3, carbon monoxide, CO, and odd nitrogen were made aboard the NCAR Sabreliner on 18 flights over the southern Great Plains during June 1985. To demonstrate chemical changes induced by vertical motions in the atmosphere and to facilitate comparison with computer model calculations, these data were categorized according to synoptic flow patterns. Part 1 of this two-part paper details the alternating pulses of polar and maritime air masses that dominate the vertical mixing in this region. In this paper, trace gas measurements are presented as altitude profiles (0-12 km) with statistical distributions of mixing ratios for each species in each flow pattern. The polar flow regime is characterized by northwesterly winds, subsiding air, and convective stability. Concentrations of CO and total odd nitrogen (NOy) are relatively high in the shallow planetary boundary layer (PBL) but decrease rapidly with altitude. Ozone, on the other hand, is uniformly distributed, suggesting limited photochemical production; in fact, nitric oxide, NO, mixing ratios fell below 10 ppt (parts per 1012 by volume) in the midtroposphere. The maritime regime is characterized by southerly surface winds, convective instability, and a deep PBL; uniformly high concentrations of trace gases were found up to 4 km on one flight. Severe storms occur in maritime flow, especially when capped by a dry layer, and they transport large amounts of CO, O3, and NOy into the upper troposphere. Median NO levels at high altitude exceeded 300 ppt. Lightning produces spikes of NO (but not CO) with mixing ratios sometimes exceeding 1000 ppt. This flow pattern tends to leave the midtroposphere relatively clean with concentrations of trace gases similar to those

  7. Distribution of sulphur and trace elements in peat. A literature survey with some additional sulphur analyses

    Energy Technology Data Exchange (ETDEWEB)

    Huttunen, S; Karhu, M

    1981-01-01

    A survey on the literature and contemporary research was made on peat sulphur and trace element studies. Marked variance between different peatlands and peat types has been noted. The available information is still inadequate for generalizations or statistical analysis, mainly due to methodological variations and temporal and spatial variations in results. At the moment, the criteria applied in peatland inventories and evaluations are inadequate with respect to peat quality determinations. To some extent the quality of fuel peat should be determined in a mire inventory prior to peatland utilization. The areas over sulphide clay and some sulphate depositions may considerably increase the peat sulphur content. A proposal has been made to include sulphur content monitoring in cases where it exceeds 0.3 per cent. The trace elements may also bring about an increase in peat emissions if the deepest peat layers or polluted layers are burnt. The most important elements in this respect are Al, Fe, Mn, Pb, Zn, V, Ni, Hg, Cu, Cr, as well as As and U; the first ten because of their relatively high concentrations and the last two because of pollution or toxicity and ore deposit factors. The peat hydrogen ion concentration has a positive correlation with copper and vanadium. The correlation is positive with the cobalt and nickel contents when the pH is low, and negative at a higher pH. A general peat type correlation shows maximum trace element contents in basal Carex peats with subsoil effects. The peat ash content and the Ti, Pb, V, Cr, Ni, and S contents have positive correlations. (Refs. 290).

  8. Distribution of sulphur and trace elements in peat. [Literature survey with some additional sulphur analyses

    Energy Technology Data Exchange (ETDEWEB)

    Huttunen, S; Karhu, M

    1981-01-01

    A survey on the literature and contemporary research was made on peat sulphur and trace element studies. Marked variance between different peatlands and peat types has been noted. The available information is still inadequate for generalizations or statistical analysis, mainly due to methodological variations and temporal and spatial variations in results. At the moment, the criteria applied in peatland inventories and evaluations are inadequate with respect to peat quality determinations. To some extent the quality of fuel peat should be determined in a mire inventory prior to peatland utilization. The areas over sulphide clay and some sulphate depositions may considerably increase the peat sulphur content. A proposal has been made to include sulphur content monitoring in cases where it exceeds 0.3 per cent. The trace elements may also bring about an increase in peat emissions if the deepest peat layers or polluted layers are burnt. The most important elements in this respect are Al, Fe, Mn, Pb, Zn, V, Ni, Hg, Cu, Cr, as well as As and U; the first ten because of their relatively high concentrations and the last two because of pollution or toxicity and ore deposit factors. The peat hydrogen ion concentration has a positive correlation with copper and vanadium. The correlation is positive with the cobalt and nickel contents when the pH is low, and negative at a higher pH. A general peat type correlation shows maximum trace element contents in basal Carex peats with subsoil effects. The peat ash content and the Ti, Pb, V, Cr, Ni and S contents have positive correlations.

  9. Distribution of trace elements in selected pulverized coals as a function of particle size and density

    Science.gov (United States)

    Senior, C.L.; Zeng, T.; Che, J.; Ames, M.R.; Sarofim, A.F.; Olmez, I.; Huggins, Frank E.; Shah, N.; Huffman, G.P.; Kolker, A.; Mroczkowski, S.; Palmer, C.; Finkelman, R.

    2000-01-01

    Trace elements in coal have diverse modes of occurrence that will greatly influence their behavior in many coal utilization processes. Mode of occurrence is important in determining the partitioning during coal cleaning by conventional processes, the susceptibility to oxidation upon exposure to air, as well as the changes in physical properties upon heating. In this study, three complementary methods were used to determine the concentrations and chemical states of trace elements in pulverized samples of four US coals: Pittsburgh, Illinois No. 6, Elkhorn and Hazard, and Wyodak coals. Neutron Activation Analysis (NAA) was used to measure the absolute concentration of elements in the parent coals and in the size- and density-fractionated samples. Chemical leaching and X-ray absorption fine structure (XAFS) spectroscopy were used to provide information on the form of occurrence of an element in the parent coals. The composition differences between size-segregated coal samples of different density mainly reflect the large density difference between minerals, especially pyrite, and the organic portion of the coal. The heavy density fractions are therefore enriched in pyrite and the elements associated with pyrite, as also shown by the leaching and XAFS methods. Nearly all the As is associated with pyrite in the three bituminous coals studied. The sub-bituminous coal has a very low content of pyrite and arsenic; in this coal arsenic appears to be primarily organically associated. Selenium is mainly associated with pyrite in the bituminous coal samples. In two bituminous coal samples, zinc is mostly in the form of ZnS or associated with pyrite, whereas it appears to be associated with other minerals in the other two coals. Zinc is also the only trace element studied that is significantly more concentrated in the smaller (45 to 63 μm) coal particles.

  10. Notes on saltwater intrusion and trace element distribution in Metro Manila groundwaters

    International Nuclear Information System (INIS)

    Santos, G. Jr.; Ramos, A.F.; Fernandez, L.G.; Almoneda, R.V.; Garcia, T.Y.; Cruz, C.C.; Petrache, C.A.; Andal, T.T.; Alcantara, E.

    1989-01-01

    Preliminary analyses of waters for uranium and other trace elements from deepwells operated by the Metropolitan Waterworks and Sewerage System (MWSS) in Metro Manila were performed. Uranium, which ranged from 0.2 ppb to 6 ppb, was correlated with saltwater intrusion. Values >=0.8 ppb for uranium were considered indicative of saline water intrusion in the aquifers. Saline water intrusions in Malabon, Navotas, Paranaque, Las Pinas, Bacoor, Imus, Kawit, Pasig, Antipolo, San Mateo, Taguig, Cainta, Taytay, Alabang and Muntinlupa were noted. Most of these areas were also identified by MWSS as being affected by saltwater intrusion. Tritium values ranged from 0 (below detection limits) to 44 tritium units. Except for one well in Muntinlupa, all the values obtained were below the lower limit of detection of 30.83 T.U. Mercury contents in six well locations had values above the maximum limit set by the National Standards for Drinking Water. Four wells exceeded the permissible level for manganese while two wells had iron concentrations greater than the National Standards. Other trace element concentrations such as Cr, Pb, Zn, Co and Ni either did not exceed their permissible levels or were not included in the National Standards. (Auth.). 6 refs.; 1 tab.; 3 figs

  11. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    2008-01-01

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  12. Geochemical distribution of major, trace and rare elements in chromite ores of Neyriz ophiolite

    International Nuclear Information System (INIS)

    Karimi, M.; Hosseini, S. Z.; Khankahdani, K. N.

    2016-01-01

    The chromite deposits in the Neyriz area have lenticular and sometimes vein-like shapes and are hosted in serpentinized dunite and harzburgite. Chromite and serpentinized olivines are the major minerals, and hematite and magnetite are minor minerals in the chromite ores. Except for chromite, the other minerals have a secondary origin related to serpentinization processes, whereas along with chromite only a few minerals, such as pentlandite, have a primary origin. Native copper and sulfides such as chalcopyrite and bornite formed secondarily in microfractures of chromite grains filled by serpentine. The geochemical data from the chromite ores indicate that the chromite is of alpine type. Although the LREE are more abundant than the HREE, only six elements (Dy, Eu, La, Lu, Nd, and Y) are common relative to the other elements. Finally, the chromite ore in the area is economic, but the abundance of trace elements is minimal and not economic.

  13. Distributed Sensor Particles for Remote Fluorescence Detection of Trace Analytes: UXO/CW; TOPICAL

    International Nuclear Information System (INIS)

    SINGH, ANUP K.; GUPTA, ALOK; MULCHANDANI, ASHOK; CHEN, WILFRED; BHATIA, RIMPLE B.; SCHOENIGER, JOSEPH S.; ASHLEY, CAROL S.; BRINKER, C. JEFFREY; HANCE, BRADLEY G.; SCHMITT, RANDAL L.; JOHNSON, MARK S.; HARGIS JR. PHILIP J.; SIMONSON, ROBERT J.

    2001-01-01

    This report summarizes the development of sensor particles for remote detection of trace chemical analytes over broad areas, e.g., residual trinitrotoluene from buried landmines or other unexploded ordnance (UXO). We also describe the potential of the sensor particle approach for the detection of chemical warfare (CW) agents. The primary goal of this work has been the development of sensor particles that incorporate sample preconcentration, analyte molecular recognition, chemical signal amplification, and fluorescence signal transduction within a "grain of sand". Two approaches for particle-based chemical-to-fluorescence signal transduction are described: (1) enzyme-amplified immunoassays using biocompatible inorganic encapsulants, and (2) oxidative quenching of a unique fluorescent polymer by TNT

  14. Surface distribution of dissolved trace metals in the oligotrophic ocean and their influence on phytoplankton biomass and productivity

    KAUST Repository

    Pinedo-González, Paulina

    2015-10-25

    The distribution of bioactive trace metals has the potential to enhance or limit primary productivity and carbon export in some regions of the world ocean. To study these connections, the concentrations of Cd, Co, Cu, Fe, Mo, Ni, and V were determined for 110 surface water samples collected during the Malaspina 2010 Circumnavigation Expedition (MCE). Total dissolved Cd, Co, Cu, Fe, Mo, Ni, and V concentrations averaged 19.0 ± 5.4 pM, 21.4 ± 12 pM, 0.91 ± 0.4 nM, 0.66 ± 0.3 nM, 88.8 ± 12 nM, 1.72 ± 0.4 nM, and 23.4 ± 4.4 nM, respectively, with the lowest values detected in the Central Pacific and increased values at the extremes of all transects near coastal zones. Trace metal concentrations measured in surface waters of the Atlantic Ocean during the MCE were compared to previously published data for the same region. The comparison revealed little temporal changes in the distribution of Cd, Co, Cu, Fe, and Ni over the last 30 years. We utilized a multivariable linear regression model to describe potential relationships between primary productivity and the hydrological, biological, trace nutrient and macronutrient data collected during the MCE. Our statistical analysis shows that primary productivity in the Indian Ocean is best described by chlorophyll a, NO3, Ni, temperature, SiO4, and Cd. In the Atlantic Ocean, primary productivity is correlated with chlorophyll a, NO3, PO4, mixed layer depth, Co, Fe, Cd, Cu, V, and Mo. The variables salinity, temperature, SiO4, NO3, PO4, Fe, Cd, and V were found to best predict primary productivity in the Pacific Ocean. These results suggest that some of the lesser studied trace elements (e.g., Ni, V, Mo, and Cd) may play a more important role in regulating oceanic primary productivity than previously thought and point to the need for future experiments to verify their potential biological functions.
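
    The regression step is straightforward to sketch. The example below is only an illustration under invented data (the predictor list mirrors the Indian Ocean model named above, but the values and coefficients are not from the MCE dataset): an ordinary least-squares fit relating primary productivity to a mixed set of hydrographic, macronutrient and trace-metal predictors.

      # Hedged sketch of a multivariable linear regression; data are synthetic.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      n = 40
      # invented predictors: chlorophyll a, NO3, Ni, temperature, SiO4, Cd
      X = rng.random((n, 6))
      y = 2.0 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.1, n)

      model = LinearRegression().fit(X, y)
      print(model.coef_)        # fitted weight for each predictor
      print(model.score(X, y))  # R^2 of the fit

    In practice the substantive result is the predictor selection (which variables "best describe" productivity in each basin); the fit itself is routine.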

  15. Surface distribution of dissolved trace metals in the oligotrophic ocean and their influence on phytoplankton biomass and productivity

    KAUST Repository

    Pinedo-González, Paulina; West, A. Joshua; Tovar-Sánchez, Antonio; Duarte, Carlos M.; Marañón, Emilio; Cermeño, Pedro; González, Natalia; Sobrino, Cristina; Huete-Ortega, María; Fernández, Ana; López-Sandoval, Daffne C.; Vidal, Montserrat; Blasco, Dolors; Estrada, Marta; Sañudo-Wilhelmy, Sergio A.

    2015-01-01

    The distribution of bioactive trace metals has the potential to enhance or limit primary productivity and carbon export in some regions of the world ocean. To study these connections, the concentrations of Cd, Co, Cu, Fe, Mo, Ni, and V were determined for 110 surface water samples collected during the Malaspina 2010 Circumnavigation Expedition (MCE). Total dissolved Cd, Co, Cu, Fe, Mo, Ni, and V concentrations averaged 19.0 ± 5.4 pM, 21.4 ± 12 pM, 0.91 ± 0.4 nM, 0.66 ± 0.3 nM, 88.8 ± 12 nM, 1.72 ± 0.4 nM, and 23.4 ± 4.4 nM, respectively, with the lowest values detected in the Central Pacific and increased values at the extremes of all transects near coastal zones. Trace metal concentrations measured in surface waters of the Atlantic Ocean during the MCE were compared to previously published data for the same region. The comparison revealed little temporal changes in the distribution of Cd, Co, Cu, Fe, and Ni over the last 30 years. We utilized a multivariable linear regression model to describe potential relationships between primary productivity and the hydrological, biological, trace nutrient and macronutrient data collected during the MCE. Our statistical analysis shows that primary productivity in the Indian Ocean is best described by chlorophyll a, NO3, Ni, temperature, SiO4, and Cd. In the Atlantic Ocean, primary productivity is correlated with chlorophyll a, NO3, PO4, mixed layer depth, Co, Fe, Cd, Cu, V, and Mo. The variables salinity, temperature, SiO4, NO3, PO4, Fe, Cd, and V were found to best predict primary productivity in the Pacific Ocean. These results suggest that some of the lesser studied trace elements (e.g., Ni, V, Mo, and Cd) may play a more important role in regulating oceanic primary productivity than previously thought and point to the need for future experiments to verify their potential biological functions.

  16. Privacy-aware workflow management

    NARCIS (Netherlands)

    Alhaqbani, B.; Adams, M.; Fidge, C.J.; Hofstede, ter A.H.M.; Glykas, M.

    2013-01-01

    Information security policies play an important role in achieving information security. Confidentiality, Integrity, and Availability are classic information security goals attained by enforcing appropriate security policies. Workflow Management Systems (WfMSs) also benefit from inclusion of these

  17. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  18. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
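
    Record-level backward tracing can be pictured with a toy sketch; this is not Panda's API, only an illustration of the idea that each processing node logs which input records produced each output record, and tracing walks those mappings from a result back to its sources.

      # Toy backward tracing over logged record-level provenance.
      def backward_trace(record, provenance):
          """provenance: dict mapping an output record id to the list of
          input record ids that produced it."""
          frontier, sources = [record], set()
          while frontier:
              r = frontier.pop()
              parents = provenance.get(r)
              if parents:                # intermediate record: keep walking back
                  frontier.extend(parents)
              else:                      # nothing recorded: an original input
                  sources.add(r)
          return sources

      # hypothetical provenance log for a two-step workflow
      prov = {"out_1": ["mid_3"], "mid_3": ["in_7", "in_9"]}
      print(backward_trace("out_1", prov))   # {'in_7', 'in_9'} (order may vary)

    Forward tracing is the mirror image: start from an input record and follow the same mappings downstream to every result it influenced.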

  19. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require the joint use of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automating simulations like these requires implementing a workflow in which tool execution and data exchange are controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment when an error is encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement on the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow when intermediate result data are missing. Error handling rules are formulated at the basic pattern level and at the level of a composite task that can combine several basic patterns as next-level subtasks. The work also notes the cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the error handling options that can be specified by the user.
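
    One of the simplest rules of this kind can be sketched as follows; the function and its parameters are invented for illustration and are not from the paper: retry a failed tool invocation a bounded number of times, then fall back to a user-specified substitute result so that missing intermediate data does not abort the entire integration workflow.

      # Hedged sketch of a retry-with-fallback error handling rule.
      import time

      def run_with_error_handling(tool, inputs, retries=2, delay=1.0, fallback=None):
          """tool: callable standing in for an integrated software tool."""
          for attempt in range(retries + 1):
              try:
                  return tool(inputs)
              except Exception as exc:        # any abnormal tool behaviour
                  print(f"attempt {attempt + 1} failed: {exc}")
                  time.sleep(delay)
          return fallback                     # keep the downstream data flow alive

      # toy tool that always fails, handled via a fallback result
      print(run_with_error_handling(lambda x: 1 / 0, inputs=None, fallback="skipped"))

    Whether to retry, substitute, or propagate the failure is exactly the kind of user-selectable option the paper formulates at the pattern and composite-task levels.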

  20. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203
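
    The configuration-to-application step can be illustrated with a deliberately tiny sketch; it is not Kronos code, and the registry, task names and config layout below are invented: a declarative list of component names is compiled into a runnable pipeline by looking each name up in a registry and chaining the calls.

      # Illustrative compile-a-config-into-a-pipeline sketch (not Kronos itself).
      REGISTRY = {
          "align":  lambda data: data + ["aligned"],
          "call":   lambda data: data + ["variants"],
          "report": lambda data: data + ["report"],
      }

      config = {"workflow": ["align", "call", "report"]}   # stand-in for a config file

      def compile_workflow(cfg, registry):
          steps = [registry[name] for name in cfg["workflow"]]
          def run(data):
              for step in steps:        # execute components in declared order
                  data = step(data)
              return data
          return run

      pipeline = compile_workflow(config, REGISTRY)
      print(pipeline(["reads"]))   # ['reads', 'aligned', 'variants', 'report']

    Kronos adds the parts that matter in practice on top of this idea: a run manager, cluster or cloud execution, task parallelization and full runtime logging.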

  1. HisT/PLIER: A two-fold Provenance Approach for Grid-enabled Scientific Workflows using WS-VLAM

    NARCIS (Netherlands)

    Gerhards, M.; Sander, V.; Belloum, A.; Vasunin, D.; Benabdelkader, A.; Jha, S.; Felde, N.G.; Buyya, R.; Fedak, G.

    2011-01-01

    Large scale scientific applications are frequently modeled as a workflow that is executed under the control of a workflow management system. One crucial requirement is the validation of the generated results, e.g., the traceability of the experiment execution path. The automated tracking and storage

  2. Dose distributions of a proton beam for eye tumor therapy: Hybrid pencil-beam ray-tracing calculations

    International Nuclear Information System (INIS)

    Rethfeldt, Ch.; Fuchs, H.; Gardey, K.-U.

    2006-01-01

    For the case of eye tumor therapy with protons, improvements are introduced compared to the standard dose calculation, which assumes straight-line optics and constant density for the eye and its surroundings. The progress consists of (i) taking account of the lateral scattering of the protons in tissue by folding the entrance fluence distribution with the pencil-beam distribution, which widens with growing depth in the tissue, and (ii) rescaling the spread-out Bragg peak dose distribution in water with the radiological path length calculated voxel by voxel on ray traces through a realistic density matrix for the treatment geometry, yielding a trajectory dependence of the geometrical range. Distributions calculated for some specific situations are compared to measurements and/or standard calculations, and differences to the latter are discussed with respect to the requirements of therapy planning. The most pronounced changes appear for wedges placed in front of the eye, causing additional widening of the lateral falloff. The more accurate prediction of the dose behaviour at the field borders is of interest with respect to side effects in the organs at risk of the eye.
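
    The ray-tracing ingredient (ii) reduces to a density-weighted line integral, which the hedged sketch below approximates with discrete steps; the voxel geometry and numbers are invented and this is not the authors' implementation.

      # Sketch: water-equivalent (radiological) path length along a straight ray
      # through a voxelized relative-density matrix, as a discrete line integral.
      import numpy as np

      def radiological_path_length(density, start, direction, step=0.5, n_steps=200):
          """density: 3-D array of densities relative to water; start: point in
          voxel coordinates; direction: ray direction (need not be normalized)."""
          pos = np.asarray(start, dtype=float)
          d = np.asarray(direction, dtype=float)
          d /= np.linalg.norm(d)
          wepl = 0.0
          for _ in range(n_steps):
              idx = tuple(np.round(pos).astype(int))
              if any(i < 0 or i >= s for i, s in zip(idx, density.shape)):
                  break                       # ray has left the volume
              wepl += density[idx] * step     # density-weighted step length
              pos += d * step
          return wepl

      # toy volume: water everywhere except a denser slab across the ray
      vol = np.ones((32, 32, 32))
      vol[10:15] = 1.5
      print(radiological_path_length(vol, start=(0, 16, 16), direction=(1, 0, 0)))

    Rescaling the measured spread-out Bragg peak in water by this trajectory-dependent path length is what makes the geometrical range depend on the actual tissue traversed.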

  3. The distribution of radionuclides and some trace metals in the water columns of the Japan and Bonin trenches

    International Nuclear Information System (INIS)

    Nozari, Y.; Yamada, M.; Shitashima, K.; Tsubota, H.

    1998-01-01

    Presented here is the first geochemical data on the U/Th series Th, Pa, Ac, and Pb isotopes and artificial fallout radionuclides (90Sr, 137Cs, and Pu isotopes), and some trace elements (V, Zn, Cd, Cu, Mn, and Ni) in two water columns of the Japan and Bonin trenches down to the bottom depths of 7585 m and 9750 m, respectively. Hydrographic properties such as temperature, salinity, dissolved oxygen, and nutrient content within the trench valley remain constant at the same levels as those in the bottom water of the Northwest Pacific basin (typically ∼6000 m in depth). The radionuclide activities and most trace metal concentrations are also not very different from those in the overlying water at depths of around 5000-6000 m. This means that any chemical alteration which sea water undergoes during its residence within the trench was not detectable by the techniques used here. It follows that the trench water is rather freely communicating by isopycnal mixing with the bottom water overlying the Northwest Pacific abyssal plain. The trench waters contain high 239,240Pu activities throughout, indicating that Pu is actively regenerating from rapidly sinking, large particles at the bottom interface, probably due to a change in the oxidation state. On the other hand, the vertical profiles of 210Pb and 231Pa show lower activities within the trench than those in the overlying deep waters, suggesting that the effect of boundary and bottom scavenging is significant in controlling their oceanic distributions. However, none of the trace metals studied here obviously follows the behaviour of the above nuclides. The 228Th data show scattering within the Bonin Trench that is largely ascribable to analytical errors. If, however, we accept that the scatter of the 228Th data is real and that the variation is caused solely by decay of its parent 228Ra, we can set an upper limit of ∼5 years for the renewal time of the trench water. (authors)

  4. Distribution of trace elements in the natural waters of Bacino del Cordevole (Dolomite Alps, Agordo, Belluno, Italy)

    International Nuclear Information System (INIS)

    Brondi, M.; De Cassan, M.; Gragnani, R.; Orlandi, C.; Paganin, G.

    1989-12-01

    This work deals with the distribution and circulation of trace elements in Italian aquifers corresponding to a wide range of environmental conditions, such as water chemism, lithology, hydrogeology, geochemical conditions, and level of contamination. During 1985 a hydrogeochemical study was carried out on springs and surface waters from an area located in the Eastern Alpine Range (Agordo, BL). The salinity of the waters is generally low (100-200 mg/l of t.d.s.). Some springs, leaching levels of gypsum in the Bellerophon formation, present salinity values higher than 1 g/l. In the studied area the chemism of the water is largely influenced by lithology. In fact, different percentage ratios of principal cations and anions were observed for groups of samples: waters leaching carbonatic rocks show the highest Ca/Mg and HCO3 ratios; waters coming from volcanic or metamorphic formations have the lowest Ca+Mg ratio; waters leaching gypsum horizons show the highest sulphate ratio. The contents of trace elements are generally very low and show no significant contamination of the examined area. The high Zn contents of springs 36 and 37 are due to the presence of heavy metal mineralization near the sampling sites. Vanadium reaches relatively high contents in samples 22 (11 μg/l) and 35 (9 μg/l), which flow in volcanic rocks. In general, the higher vanadium contents correspond to waters in volcanites. Uranium and molybdenum exhibit significant correlation coefficients with the electric conductance, r=0.87 and r=0.51 respectively. These two elements are characterized by high geochemical mobility, and generally their concentrations increase with increasing salinity if precipitation processes do not occur. The geochemical characteristics of the elements and peculiar geochemical processes affect trace element concentrations more than lithology does. In fact, only the vanadium contents show a significant correlation with volcanic rocks. (author)

  5. Trace Element Removal in Distributed Drinking Water Treatment Systems by Cathodic H2O2 Production and UV Photolysis.

    Science.gov (United States)

    Barazesh, James M; Prasse, Carsten; Wenk, Jannis; Berg, Stephanie; Remucal, Christina K; Sedlak, David L

    2018-01-02

    As water scarcity intensifies, point-of-use and point-of-entry treatment may provide a means of exploiting locally available water resources that are currently considered to be unsafe for human consumption. Among the different classes of drinking water contaminants, toxic trace elements (e.g., arsenic and lead) pose substantial operational challenges for distributed drinking water treatment systems. Removal of toxic trace elements via adsorption onto iron oxides is an inexpensive and robust treatment method; however, the presence of metal-complexing ligands associated with natural organic matter (NOM) often prevents the formation of iron precipitates at the relatively low concentrations of dissolved iron typically present in natural water sources, thereby requiring the addition of iron, which complicates the treatment process and results in a need to dispose of relatively large amounts of accumulated solids. A point-of-use treatment device consisting of a cathodic cell that produced hydrogen peroxide (H2O2) followed by an ultraviolet (UV) irradiation chamber was used to decrease the colloid stabilization and metal-complexing capacity of NOM present in groundwater. Exposure to UV light altered NOM, converting ∼6 μM of iron oxides into settleable forms that removed between 0.5 and 1 μM of arsenic (As), lead (Pb), and copper (Cu) from solution via adsorption. After treatment, changes in NOM consistent with the loss of iron-complexing carboxylate ligands were observed, including decreases in UV absorbance and shifts in the molecular composition of NOM to higher H/C and lower O/C ratios. Chronoamperometric experiments conducted in synthetic groundwater revealed that the presence of Ca2+ and Mg2+ inhibited intramolecular charge-transfer within photoexcited NOM, leading to substantially increased removal of iron and trace elements.

  6. Dissolved trace metal (Cu, Cd, Co, Ni, and Ag) distribution and Cu speciation in the southern Yellow Sea and Bohai Sea, China

    Science.gov (United States)

    Li, Li; Xiaojing, Wang; Jihua, Liu; Xuefa, Shi

    2017-02-01

    Trace metals play an important role in biogeochemical cycling in ocean systems. However, because the use of trace metal clean sampling and analytical techniques has been limited in coastal China, there are few accurate trace metal data for that region. This work studied spatial distribution of selected dissolved trace metals (Ag, Cu, Co, Cd, and Ni) and Cu speciation in the southern Yellow Sea (SYS) and Bohai Sea (BS). In general, the average metal (Cu, Co, Cd, and Ni) concentrations found in the SYS were lower by a factor of two than those in BS, and they are comparable to dissolved trace metal concentrations in coastal seawater of the United States and Europe. Possible sources and sinks and physical and biological processes that influenced the distribution of these trace metals in the study region were further examined. Close relationships were found between the trace metal spatial distribution with local freshwater discharge and processes such as sediment resuspension and biological uptake. Ag, owing to its extremely low concentrations, exhibited a unique distribution pattern that magnified the influences from the physical and biological processes. Cu speciation in the water column showed that, in the study region, Cu was strongly complexed with organic ligands and concentrations of free cupric ion were in the range of 10(-12.6) to 10(-13.2) mol L(-1). The distribution of Cu-complexing ligand, indicated by values of the side reaction coefficient α', was similar to the Chl a distribution, suggesting that in situ biota production may be one main source of Cu-complexing organic ligand.

  7. Distributions of dissolved trace metals (Cd, Cu, Mn, Pb, Ag in the southeastern Atlantic and the Southern Ocean

    Directory of Open Access Journals (Sweden)

    M. Boye

    2012-08-01

    Full Text Available Comprehensive synoptic datasets (surface water down to 4000 m) of dissolved cadmium (Cd), copper (Cu), manganese (Mn), lead (Pb) and silver (Ag) are presented along a section between 34° S and 57° S in the southeastern Atlantic Ocean and the Southern Ocean to the south off South Africa. The vertical distributions of Cu and Ag display nutrient-like profiles similar to silicic acid, and of Cd similar to phosphate. The distribution of Mn shows a subsurface maximum in the oxygen minimum zone, whereas Pb concentrations are rather invariable with depth. Dry deposition of aerosols is thought to be an important source of Pb to surface waters close to South Africa, and dry deposition and snowfall may have been significant sources of Cu and Mn at the higher latitudes. Furthermore, the advection of water masses enriched in trace metals following contact with continental margins appeared to be an important source of trace elements to the surface, intermediate and deep waters in the southeastern Atlantic Ocean and the Antarctic Circumpolar Current. Hydrothermal inputs may have formed a source of trace metals to the deep waters over the Bouvet Triple Junction ridge crest, as suggested by relatively enhanced dissolved Mn concentrations. The biological utilization of Cu and Ag was proportional to that of silicic acid across the section, suggesting that diatoms formed an important control over the removal of Cu and Ag from surface waters. However, uptake by dino- and nano-flagellates may have influenced the distribution of Cu and Ag in the surface waters of the subtropical Atlantic domain. Cadmium correlated strongly with phosphate (P), yielding lower Cd/P ratios in the subtropical surface waters where phosphate concentrations were below 0.95 μM. The greater depletion of Cd relative to P observed in the Weddell Gyre compared to the Antarctic Circumpolar Current could be due to increased Cd uptake induced by iron-limiting conditions in these high

  8. Distribution of trace elements in a modified and grain refined aluminium-silicon hypoeutectic alloy.

    Science.gov (United States)

    Faraji, M; Katgerman, L

    2010-08-01

    The influence of a modifier and a grain refiner on the nucleation process of a commercial hypoeutectic Al-Si foundry alloy (A356) was investigated using optical microscopy, scanning electron microscopy (SEM) and the electron probe microanalysis technique (EPMA). Filtering was used to improve the casting quality; however, it compromised the modification of silicon. The effect of filtering on strontium loss was also studied using the aforementioned techniques. EPMA was used to trace the modifying and grain-refining agents inside the matrix and the eutectic Si, in order to help understand the mechanisms of nucleation and modification in this alloy. Using EPMA, the negative interaction of Sr and Al3TiB was closely examined. In the modified structure, the maximum of the Sr concentration was found to be in line with the silicon peak; however, in the case of just 0.1 wt% added Ti, the peak of the Ti concentration was not in line with aluminium (but it was close to the Si peak). Furthermore, EPMA results showed that using a filter during the casting process lowered the strontium content, although it produced a cleaner melt. (c) 2010 Elsevier Ltd. All rights reserved.

  9. Distribution of toxic trace elements in soil/sediment in post-Katrina New Orleans and the Louisiana Delta

    International Nuclear Information System (INIS)

    Su Tingzhi; Shu Shi; Shi Honglan; Wang Jianmin; Adams, Craig; Witt, Emitt C.

    2008-01-01

    This study provided a comprehensive assessment of seven toxic trace elements (As, Pb, V, Cr, Cd, Cu, and Hg) in the soil/sediment of the Katrina-affected greater New Orleans region one month after the recession of the flood water. Results indicated significant contamination by As and V and non-significant contamination by Cd, Cr, Cu, Hg and Pb at most sampling sites. Compared to the reported EPA Region 6 soil background inorganic levels, the concentrations of the six elements other than As had greatly increased throughout the studied area; St. Bernard Parish and Plaquemines Parish showed greater contamination than other regions. Comparison between pre- and post-Katrina data in similar areas, and between data for surface, shallow, and deep samples, indicated that the trace element distribution in post-Katrina New Orleans could not obviously be attributed to the flooding. This study suggests that a more detailed study of As and V contamination at the identified locations is needed. - This article provides an in-depth assessment of the contamination of As, Pb, V, Cr, Cd, Cu, and Hg in the post-Katrina greater New Orleans region

  10. Spatial distribution of trace metals in sediments from urban streams of Semarang, Central Java, Indonesia.

    NARCIS (Netherlands)

    Widianarko, B.; Verweij, R.A.; van Gestel, C.A.M.; van Straalen, N.M.

    2000-01-01

    Elevated environmental concentrations of metals are usually associated with the impact of urbanization. The present study is focused on metal contamination in urban sediments. A field survey was carried out to determine the distribution of four metals, i.e., cadmium (Cd), lead (Pb), copper (Cu), and

  11. Urbanization effects on sediment and trace metals distribution in an urban winter pond (Netanya, Israel)

    NARCIS (Netherlands)

    Zohar, I.; Teutsch, N.; Levin, N.; Mackin, G.; de Stigter, H.; Bookman, R.

    2017-01-01

    Purpose: This paper aims to elucidate urban development-induced processes affecting the sediment and the distribution of contaminating metals in a seasonal pond located in the highly populated Israeli Coastal Plain. The paper demonstrates how an integrated approach, including geochemical,

  12. Supporting Trust in Globally Distributed Software Teams: The Impact of Visualized Collaborative Traces on Perceived Trustworthiness

    Science.gov (United States)

    Trainer, Erik Harrison

    2012-01-01

    Trust plays an important role in collaborations because it creates an environment in which people can openly exchange ideas and information with one another and engineer innovative solutions together with less perceived risk. The rise in globally distributed software development has created an environment in which workers are likely to have less…

  13. Distribution of essential trace elements in animals. Manganese and vanadium ion

    International Nuclear Information System (INIS)

    Sakurai, Hiromu; Nishida, Mikio; Koyama, Mutsuo; Takada, Jitsuya.

    1994-01-01

    We determined the tissue and subcellular distributions of Mn(II) by ESR and of total Mn by neutron activation analysis combined with chemical separation. The Mn(II) contents of the thyroid, hypophysis, adrenal, pancreas, liver and kidney tissues were low. In animals treated with Mn(II) chloride, the total Mn content of all tissues increased, but the Mn(II) content remained low. In the subcellular distribution, the total Mn content was high in nuclear and mitochondrial fractions of liver and kidney, and in the microsomal and supernatant fractions of the pancreas. The ratio of Mn(II) to total Mn was relatively high in microsomes of the liver and kidney of control rats, and in the nuclear fraction of the pancreas of Mn-treated rats. Partially purified liver mitochondria were found to contain higher levels of Mn than the crude compartments, indicating that Mn is tightly bound in each cellular compartment. The distribution of Mn in organs and subcellular fractions of rats was investigated. Treatment with STZ resulted in unchanged Mn levels in most organs; the Mn content, however, was decreased in the liver mitochondrial fraction and increased in the supernatant fraction. Mn levels in both the liver and kidney of rats treated with cisplatin were increased after 7 days of drug administration. The distributions of vanadyl(+4) species, estimated by ESR, and total V, determined by neutron activation analysis, were examined in organs and subcellular fractions of the liver of rats treated with vanadyl sulfate or sodium vanadate(+5). Both V compounds distributed in a similar manner, in the following order: kidney>serum>liver≅blood>pancreas>testis>lung≅spleen. The ratio of vanadyl ion to total V in the whole homogenate was almost the same after both treatments, but the ratios in subcellular fractions vary depending on the V compound and the fraction. Approximately 30-70% of the vanadium was reduced to the vanadyl form in the subcellular fractions of the liver. (J.P.N.)

  14. Chemometrics in biomonitoring: Distribution and correlation of trace elements in tree leaves.

    Science.gov (United States)

    Deljanin, Isidora; Antanasijević, Davor; Bjelajac, Anđelika; Urošević, Mira Aničić; Nikolić, Miroslav; Perić-Grujić, Aleksandra; Ristić, Mirjana

    2016-03-01

    The concentrations of 15 elements were measured in leaf samples of Aesculus hippocastanum, Tilia spp., Betula pendula and Acer platanoides collected in May and September of 2014 from four different locations in Belgrade, Serbia. The objective was to assess the chemical characterization of the leaf surface and in-wax fractions, as well as the leaf tissue element content, by analyzing untreated leaf samples, samples washed with water, and samples washed with chloroform, respectively. The combined approach of self-organizing networks (SON) and the Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE), aided by Geometrical Analysis for Interactive Aid (GAIA), was used in the interpretation of multiple element loads on/in the tree leaves. The morphological characteristics of the leaf surfaces and the elemental composition of particulate matter (PM) deposited on tree leaves were studied using scanning electron microscopy (SEM) with an energy dispersive spectroscopy (EDS) detector. The results showed that the retained and accumulated element concentrations depend on several parameters, such as the chemical properties of the element and the morphological properties of the leaves. Among the studied species, Tilia spp. was found to be the most effective in the accumulation of elements in leaf tissue (70% of the total element concentration), while A. hippocastanum had the lowest accumulation (54%). After water and chloroform washing, the highest percentages of removal were observed for Al, V, Cr, Cu, Zn, As, Cd and Sb (>40%). The PROMETHEE/SON ranking/classifying results were in accordance with the results obtained from the GAIA clustering techniques. The combination of the techniques enabled extraction of additional information from the datasets. Therefore, the use of both ranking and clustering methods could be a useful tool in biomonitoring studies of trace elements. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Study of distribution and differential accumulation of trace elements in plant leaves using neutron activation analysis

    International Nuclear Information System (INIS)

    Koyama, Mutsuo; Takada, Jitsuya; Shirakawa, Masahiro; Katayama, Koshi.

    1983-01-01

    Plant leaves were collected from geologically different forests three times: from April to May, from August to September, and from October to November. Although the concentrations of inorganic elements showed a constant distribution pattern within the same trees, the pattern was peculiar to each plant species and element. Group Ia elements and halogens were prominent in herbaceous plants, while group IIa elements, except for Ba, which accumulated in pteridophyta, were prominent in woody plants. Of the transition metal elements, Mn was highly accumulated in Tea senensis. The high concentration of Mn was even more marked in Araliaceae than in Tea senensis. Specific high concentrations of Fe and Co were noted in Ecephorbiaceae, Zn and Cd in Aquifoliaceae, and Al, rare earth elements and Ra in Gleichenia japonica and Dicranopteris dichotome. (Namekawa, K.)

  16. Distribution of trace elements in sediment and soil from river Vardar Basin, Macedonia/Greece.

    Science.gov (United States)

    Popov, Stanko Ilić; Stafilov, Trajče; Šajn, Robert; Tănăselia, Claudiu

    2016-01-01

    A systematic study was carried out to investigate the distribution of 59 elements in sediment and soil samples collected from the river Vardar (Republic of Macedonia and Greece) and its major tributaries. The samples were collected from 28 sampling sites. Analyses were performed by inductively coupled plasma-mass spectrometry. R-mode factor analysis (FA) was used to identify and characterise element associations. Seven associations of elements were determined by the method of multivariate statistics. Each factor (Factors 1-3, 6 and 7 as geogenic and Factors 4 and 5 as anthropogenic associations of elements) is examined and explained separately. The distribution of the various elements showed the presence of anthropogenic elements (Ag, Cd, Cu, Ge, Pb, Sn and Zn) introduced into the river sediments and soils by the mining, metallurgical, industrial and agricultural activities in the Vardar River Basin, which covers most of the Republic of Macedonia and the central-northern part of Greece.

  17. Activation analysis study on subcellular distribution of trace elements in human brain tumor

    International Nuclear Information System (INIS)

    Zheng Jian; Zhuan Guisun; Wang Yongji; Dong Mo; Zhang Fulin

    1992-01-01

    The concentrations of up to 11 elements in subcellular fractions of human brain (normal and malignant tumor) have been determined by a combination of gradient centrifugation and INAA methods. Samples of human brain were homogenized in a glass homogenizer tube, the homogenate was separated into nuclei, mitochondrial, myelin and synaptosome fractions, and these fractions were then analyzed using the INAA method. The subcellular distributions of elements in human brain malignant tumor are discussed in this paper. (author) 11 refs.; 2 figs.; 4 tabs

  18. Determination and correlation of spatial distribution of trace elements in normal and neoplastic breast tissues evaluated by μ-XRF

    International Nuclear Information System (INIS)

    Silva, M.P.; Oliveira, M.A.; Poletti, M.E.

    2012-01-01

    Full text: Some trace elements, naturally present in breast tissues, participate in a large number of biological processes, which include, among others, activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and tumor progression of breast cancers. Concentrations of trace elements like Ca, Fe, Cu and Zn, previously studied at LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared to normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the μ-XRF technique. We analyzed 22 samples of normal and neoplastic breast tissues (malignant and benign) obtained from paraffin blocks available for study at the Department of Pathology HC-FMRP/USP. From the blocks, a small fraction of material was removed and cut into histological sections of 60 μm thickness with a microtome. The slices were placed in sample holders and covered with Ultralene film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45 degrees with respect to the incident beam on a table with three degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 μm microcapillary and the samples were fully scanned. At each step, a spectrum was acquired for 10 s. The fluorescence emitted by the elements present in the sample was detected by a Si(Li) detector with an energy resolution of 165 eV at 5.9 keV, placed at 90 degrees with respect to the incident beam. Results reveal that the trace element pairs Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by Spearman

  19. Groundwaters of Florence (Italy): Trace element distribution and vulnerability of the aquifers

    Science.gov (United States)

    Bencini, A.; Ercolanelli, R.; Sbaragli, A.; Verrucchi, C.

    1993-11-01

    of nitrogenous species, and the sorption capacity of clay minerals and organic matter with respect to trace metals.

  20. The distribution pattern of main and trace elements in phosphatic laterites from Pirocaua Plateau (MA)

    International Nuclear Information System (INIS)

    Siqueira, N.V.M. de.

    1982-01-01

    The phosphate laterite of Pirocaua (state of Maranhao) was studied on the basis of several lines of evidence, namely the structure of the deposit, mineral distribution, variations in chemical composition and the chemistry of ground waters in the region. The distribution of elements during the formation of the deposit is interpreted and the conditioning factors are analysed. Air and water samples were studied by X-ray diffraction and atomic absorption spectroscopy, respectively. The results show that fluctuations of the hydrostatic level were important during the formation of the phosphate horizon. When the deposit was formed there was also a decrease in the activity of silicic acid and a parallel increase in acidity towards the top of the cross-section studied. Under these conditions, Fe²⁺ migrated towards the top of the deposit and was precipitated as Fe³⁺ in the oxidizing zone. Migration of phosphate was due in part to its affinity for clay minerals, a mechanism in which ground water played a major role. Mass balance calculations indicate that, if the parent rock is a phyllite, an extreme enrichment in P and Sr is necessary to produce the composition of the phosphate horizon. On the basis of these observations we conclude that: 1) the parent rock must have P and Sr contents higher than the average for phyllites; or 2) the phosphate has some other source. (author)

  1. On the distribution of trace element concentrations in multiple bone elements in 10 Danish medieval and post-medieval individuals.

    Science.gov (United States)

    Lund Rasmussen, Kaare; Skytte, Lilian; D'imporzano, Paolo; Orla Thomsen, Per; Søvsø, Morten; Lier Boldsen, Jesper

    2017-01-01

    The differences in trace element concentrations among 19 different bone elements procured from 10 archaeologically derived human skeletons have been investigated. The 10 individuals are dated archaeologically, and some by radiocarbon dating, to the medieval and post-medieval period, an interval from ca. AD 1150 to ca. AD 1810. This study is relevant for two reasons. First, most archaeometric studies analyze only one bone sample from each individual; so to what degree are the bones in the human body equal in trace element chemistry? Second, differences in the turnover time of the bone elements make the cortical tissues record the trace element concentrations in equilibrium with the blood stream over a longer time, and earlier in life, than the trabecular tissues. Therefore, any differences in trace element concentrations between the bone elements can yield what can be termed a chemical life history of the individual, revealing changes in diet, provenance, or medication throughout life. Thorough decontamination and strict exclusion of non-viable data have secured a dataset of high quality. The measurements were carried out using Inductively Coupled Plasma Mass Spectrometry (for Fe, Mn, Al, Ca, Mg, Na, Ba, Sr, Zn, Pb and As) and Cold Vapor Atomic Absorption Spectroscopy (for Hg) on ca. 20 mg samples. Twelve major and trace elements have been measured on 19 bone elements from 10 different individuals interred at five cemeteries widely distributed in medieval and renaissance Denmark. The ranges of the concentrations of elements were: Na (2240-5660 µg g⁻¹), Mg (440-2490 µg g⁻¹), Al (9-2030 µg g⁻¹), Ca (22-36 wt. %), Mn (5-11450 µg g⁻¹), Fe (32-41850 µg g⁻¹), Zn (69-2610 µg g⁻¹), As (0.4-120 µg g⁻¹), Sr (101-815 µg g⁻¹), Ba (8-880 µg g⁻¹), Hg (7-78730 ng g⁻¹), and Pb (0.8-426 µg g⁻¹). It is found that excess As is mainly of diagenetic origin. The results support the use of Ba and Sr concentrations as effective provenance or dietary indicators. Migrating

  2. Trace metal distributions in Posidonia oceanica and sediments from Taranto Gulf (Ionian Sea, Southern Italy)

    Directory of Open Access Journals (Sweden)

    A. DI LEO

    2013-03-01

    Full Text Available Distribution of metals (Hg, Pb, Sn, Cu, Cd and Zn) was determined in sediments and in different tissues of Posidonia oceanica collected from San Pietro Island, Taranto Gulf (Ionian Sea, Southern Italy). In the seagrass, the results, compared with metal concentrations in sediments, showed that the highest concentrations of Hg, Pb, Sn and Cu were found in the roots, while the highest levels of Cd and Zn were found in the green leaves. The lowest metal concentrations were found in the basal part of the leaf. Levels of metals in the leaves were similar to those found by other authors in uncontaminated areas of the Mediterranean Sea. Mercury levels in roots were correlated with levels in sediments, suggesting that the plant records sediment contamination. This study reinforces the usefulness and relevance of Posidonia oceanica as an indicator of spatial metal contamination and as an interesting tool for environmental quality evaluation.

  3. Trace element accumulation and distribution in sunflower plants at the stages of flower bud and maturity

    Directory of Open Access Journals (Sweden)

    Susanna De Maria

    2013-02-01

    Full Text Available The aim of this study was to analyze the accumulation and distribution of cadmium (Cd), zinc (Zn) and copper (Cu) in different portions of sunflower plants (Helianthus annuus L., cv. Oleko) grown in contaminated soil (5, 300 and 400 mg kg⁻¹ of Cd, Zn and Cu, respectively) and in untreated soil as a control, from the emergence of the cotyledon leaves up to two phenological stages: flower bud (R-1) and maturity (R-9). Sunflower accumulated considerable amounts of heavy metals at both phenological stages, showing slight reductions in dry matter production. At the R-1 stage, Cd, Zn and Cu accumulated mainly in the roots, with concentrations of up to 5.4, 233 and 160 mg kg⁻¹ of dry matter, respectively, and a low translocation from roots to the aerial part. Also at the R-1 stage, the bioconcentration factor (BCF) of Cd showed a significantly higher value in the Cd-Zn-Cu treatment (0.27) than in the untreated control (0.02); the opposite was observed for Cu, whereas no significant difference between treatments was observed for Zn (0.12 on average). Among the metals, however, Cd showed the highest BCF. Referring only to the epigeous portion, differences in the accumulation and distribution of the three metals in the treated plants were found at both phenological stages; indeed, passing from the flower bud to the maturity stage, Cd, Zn and Cu concentrations increased in the stems and leaves, particularly in the old ones, whereas they decreased in the heads. Metal accumulation in the achenes was very low and never exceeded the toxicity threshold value considered for livestock. The high storage of heavy metals in the roots and the probable re-translocation of the three metals along the plant during the growing cycle could be considered a strategy of sunflower to preserve young, metabolically active leaves and reproductive organs from toxic metal concentrations.
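
    The bioconcentration factor used above is the ratio of the metal concentration in the plant to that in the soil. A minimal illustration follows, with placeholder numbers chosen only to reproduce the order of magnitude reported for Cd.

```python
# Sketch: bioconcentration factor (BCF) as the ratio of the metal
# concentration in plant tissue to that in the soil.
# Numbers are illustrative placeholders, not data from the study.
def bcf(c_plant_mg_per_kg: float, c_soil_mg_per_kg: float) -> float:
    """BCF = C_plant / C_soil."""
    return c_plant_mg_per_kg / c_soil_mg_per_kg

# Hypothetical Cd example for soil spiked at 5 mg/kg
print(f"BCF(Cd) = {bcf(1.35, 5.0):.2f}")
```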

  4. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the course of the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments into which both local scripts and applications as well as web services can be integrated. Such workflows can be published and reused via specialized online libraries, so-called repositories. With the growing size of these repositories, similarity measures for scientific workfl...

  5. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  6. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  7. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  8. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  9. Distribution and temporal variation of trace metal enrichment in surface sediments of San Jorge Bay, Chile.

    Science.gov (United States)

    Valdés, Jorge; Román, Domingo; Guiñez, Marcos; Rivera, Lidia; Morales, Tatiana; Morales, Tomás; Avila, Juan; Cortés, Pedro

    2010-08-01

    Cu, Pb, and Hg concentrations were determined in surface sediment samples collected at three sites in San Jorge Bay, northern Chile. This study aims to evaluate differences in their spatial distribution and temporal variability. The highest metal concentrations were found at the site "Puerto", where minerals (Cu and Pb) have been loaded for more than 60 years. On the other hand, Hg does not pose a contamination problem in this bay. Cu and Pb concentrations showed significant variations from one year to another. These variations seem to be a consequence of the combination of several factors, including changes in the loading and/or storage of minerals in San Jorge Bay, the dredging of bottom sediments (especially at Puerto), and seasonal changes in physical-chemical properties of the water column that modify the exchange of metals at the sediment-water interface. Differences in the contamination factor and geoaccumulation index suggest that pre-industrial concentrations measured in marine sediments of this geographical zone were better suited than geological values (average shale, continental crust average) for evaluating the degree of contamination in this coastal system. Based on these two indices, San Jorge Bay has a serious problem of Cu and Pb pollution at the three sampling locations. However, only Cu exceeds the national maximum values used to evaluate ecological risk and the health of marine environments. It is suggested that Chilean environmental legislation for marine sediment quality, presently under technical discussion, is not an efficient tool for protecting the marine ecosystem.
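
    For reference, the two indices mentioned above are simple ratios against a background concentration; a minimal sketch with the standard formulas (CF = Cn/Bn; Igeo = log2(Cn/(1.5*Bn))) and placeholder values follows.

```python
# Sketch: contamination factor (CF) and geoaccumulation index (Igeo) using
# the standard formulas referred to above. Background and measured values
# are placeholders, not data from San Jorge Bay.
import math

def contamination_factor(c_measured: float, c_background: float) -> float:
    return c_measured / c_background

def igeo(c_measured: float, c_background: float) -> float:
    # The factor 1.5 compensates for natural lithogenic variability
    return math.log2(c_measured / (1.5 * c_background))

cu_measured, cu_background = 250.0, 45.0   # mg/kg, illustrative only
print(f"CF(Cu)   = {contamination_factor(cu_measured, cu_background):.1f}")
print(f"Igeo(Cu) = {igeo(cu_measured, cu_background):.2f}")
```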

  10. Spatial distribution of trace elements in topsoils adjacent to main avenues of Sao Paulo city, Brazil

    International Nuclear Information System (INIS)

    Ribeiro, Andreza P.; Figueiredo, Ana Maria G.; Nammoura-Neto, Georges M.; Silva, Natalia C.; Ticianelli, Regina B.; Camargo, Sonia P.; Enzweiler, Jacinta

    2009-01-01

    In the present study, the concentration and distribution of Ba, Cu, Mo, Pb, S, Zn and Zr in soils collected along two main avenues (Pinheiros River Highway and Tiete River Highway) with high traffic density in the metropolitan region of Sao Paulo, Brazil, are presented, and their possible sources are discussed. These elements are commonly regarded as contaminants originating from vehicular emissions. The analytical technique employed was XRF. The data set was evaluated by a t test for independent samples (group: avenues) at a 0.05 significance level. According to the t test, the average contents obtained for the Pinheiros River Highway are significantly different from those for the Tiete River Highway, except for Mo. Multivariate statistical approaches (Pearson correlation, cluster analysis and factor analysis - FA) were adopted for data treatment. FA identified two main factors which accounted for about 86% of the total variance. The behavior of Ba, Cu, Pb, S and Zn was explained by Factor 1. This indicates that the elements may have similar sources, probably related to gas emissions escaping from the vehicle fuel system. Factor 2 included Mo and Zr, suggesting that their origin in the sampled soils may be associated with the wear of components in the vehicle engine system or with the chemical composition of the urban soil analyzed. (author)
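
    A minimal sketch of the independent-samples t test at the 0.05 level used above to compare the two avenues; the two concentration vectors below are synthetic.

```python
# Sketch: independent-samples t test (alpha = 0.05) comparing element
# concentrations between two avenues, as in the study above.
# Both samples are synthetic; a Welch correction is used for safety.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
pinheiros = rng.normal(loc=120.0, scale=15.0, size=20)   # e.g. Ba in mg/kg
tiete = rng.normal(loc=95.0, scale=15.0, size=20)

t_stat, p_value = ttest_ind(pinheiros, tiete, equal_var=False)  # Welch's t test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant at 0.05: {p_value < 0.05}")
```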

  11. Trace Elements Distribution in Human Gallstones, Bile and Gallbladder Tissues Using Instrumental Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Abugassa, I. O.; Khrbish, Y. S.; Bshir, A. T.; Doubali, K.; Abugassa, S. O.

    2007-01-01

    This study focuses on the elemental distribution in different types of gallstones, bile and gallbladder tissues using the neutron activation analysis technique based on the k0-INAA method at the Tajura center. Samples were collected from patients who underwent open gallbladder surgery (cholecystectomy) at El-khadra University Hospital in Tripoli and were aged between 23 and 80 years. The samples were obtained from patients who did not suffer from any chronic diseases and were therefore not taking any medications that might elevate the concentration of certain elements in the body. Samples were prepared and lyophilized by different processes in a clean room. All samples were irradiated in the reactor and measured in the neutron activation laboratory. In order to obtain accurate results, Au and Zr flux monitors were irradiated with the samples for flux ratio (f) and α determinations and to account for any flux variations within the container. The irradiations of the samples were carried out in the reactor channels VCR11 and VCR12 for 8 hours under f values of 32 and 14 and α parameters of 0.0183 and 0.1678, respectively. More than 20 elements were determined in the above mentioned samples. Several SRMs were irradiated with the samples to ensure the reliability of the results.

  12. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling, resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim of minimizing the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model achieves better deadline success rates and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.
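
    The following toy sketch illustrates the general idea of deadline-constrained bag-of-tasks scheduling (group tasks into bags, give each bag a share of the deadline, pick the cheapest VM type that fits); it is an illustration only, not the DSB algorithm from the paper.

```python
# Toy sketch of deadline-constrained bag-of-tasks scheduling: partition tasks
# into bags, give each bag a share of the overall deadline, and pick the
# cheapest VM type that still meets that sub-deadline.
# This is an illustration only, not the DSB algorithm from the paper.
from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    speed: float            # work units per hour
    price_per_hour: float

VM_TYPES = [VmType("small", 1.0, 0.05), VmType("medium", 2.0, 0.12), VmType("large", 4.0, 0.30)]

def schedule_bag(work_units: float, sub_deadline_h: float) -> tuple[str, float]:
    """Return (VM type name, cost) for the cheapest type meeting the sub-deadline."""
    feasible = [(vm, work_units / vm.speed) for vm in VM_TYPES
                if work_units / vm.speed <= sub_deadline_h]
    if not feasible:
        raise ValueError("no single VM type meets the sub-deadline; scale out instead")
    vm, runtime_h = min(feasible, key=lambda vr: vr[1] * vr[0].price_per_hour)
    return vm.name, runtime_h * vm.price_per_hour

bags = [("preprocess", 6.0), ("simulate", 12.0), ("aggregate", 3.0)]   # (name, work units)
deadline_h = 12.0
share_h = deadline_h / len(bags)            # naive equal split of the deadline
for name, work in bags:
    vm_name, cost = schedule_bag(work, share_h)
    print(f"{name}: run on {vm_name}, estimated cost ${cost:.2f}")
```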

  13. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Directory of Open Access Journals (Sweden)

    Athanassios M. Kintsakis

    2017-01-01

    Full Text Available Hermes introduces a new “describe once, run anywhere” paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.
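
    A minimal illustration of the container-based step execution idea described above, launching a single workflow step through the Docker CLI from Python; the image, mounted paths and step script are hypothetical, and this is not the Hermes API.

```python
# Minimal illustration of running one workflow step inside a container, in
# the spirit of the container-based approach described above. The image,
# mounted paths and step script are hypothetical; this is not the Hermes API.
import subprocess
from pathlib import Path

def run_step(image: str, command: list[str], workdir: Path) -> None:
    """Run a single containerized step, sharing a work directory with the host."""
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{workdir}:/data",      # inputs/outputs exchanged via the mount
         image, *command],
        check=True,
    )

# Hypothetical step: a Python script placed in the shared work directory
run_step("python:3.11-slim", ["python", "/data/step.py"], Path("/tmp/wf"))
```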

  14. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Science.gov (United States)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  15. Distribution of trace gases and aerosols in the troposphere over West Siberia and Kara Sea

    Science.gov (United States)

    Belan, Boris D.; Arshinov, Mikhail Yu.; Paris, Jean-Daniel; Nédélec, Philippe; Ancellet, Gérard; Pelon, Jacques; Berchet, Antoine; Arzoumanian, Emmanuel; Belan, Sergey B.; Penner, Johannes E.; Balin, Yurii S.; Kokhanenko, Grigorii; Davydov, Denis K.; Ivlev, Georgii A.; Kozlov, Artem V.; Kozlov, Alexander S.; Chernov, Dmitrii G.; Fofonov, Alexader V.; Simonenkov, Denis V.; Tolmachev, Gennadii

    2015-04-01

    The Arctic is affected by climate change much more strongly than other regions of the globe. Permafrost thawing can lead to additional methane release, which enhances the greenhouse effect and warming, as well as to changes in Arctic tundra ecosystems. A great part of the Siberian Arctic is still unexplored. Ground-based investigations are difficult to carry out in this area because it is remote. So, in spite of the high cost, aircraft-based in-situ measurements provide a good opportunity to fill the gap in data on the atmospheric composition over this region. The ninth YAK-AEROSIB campaign was focused on the airborne survey of the Arctic regions of West Siberia. It was performed in October 2014. During the campaign, high-precision in-situ measurements of CO2, CH4, CO, O3, black carbon and aerosols, including aerosol lidar profiles, were carried out in the Siberian troposphere from Novosibirsk to the Kara Sea. Vertical distributions of the above atmospheric constituents will be presented. This work was supported by LIA YAK-AEROSIB, CNRS (France), the French Ministry of Foreign Affairs, CEA (France), the Branch of Geology, Geophysics and Mining Sciences of RAS (Program No. 5); State contracts of the Ministry of Education and Science of Russia No. 14.604.21.0100, (RFMTFIBBB210290) and No. 14.613.21.0013 (RFMEFI61314X0013); Interdisciplinary integration projects of the Siberian Branch of the Russian Academy of Science No. 35, No. 70 and No. 131; and Russian Foundation for Basic Research (grants No. 14-05-00526 and 14-05-00590).

  16. On the Support of Scientific Workflows over Pub/Sub Brokers

    Directory of Open Access Journals (Sweden)

    Edwin Cedeño

    2013-08-01

    Full Text Available The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.
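
    To make the mapping of workflow control events onto Pub/Sub topics concrete, here is a toy in-memory broker in which task-completion events trigger dependent tasks; real deployments would use an actual message broker, and this sketch does not follow the paper's interfaces.

```python
# Toy in-memory publish/subscribe broker illustrating how workflow control
# events (task completions) can trigger dependent tasks, in the spirit of
# the model above. Purely illustrative; not the interfaces from the paper.
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subs[topic]:
            handler(event)

broker = Broker()

def run_analysis(event: dict) -> None:
    print(f"analysis started with input {event['output']}")
    broker.publish("analysis.done", {"output": "results.csv"})

# Map workflow dependencies onto topics: analysis runs after preprocessing
broker.subscribe("preprocess.done", run_analysis)
broker.subscribe("analysis.done", lambda e: print(f"workflow finished: {e['output']}"))

broker.publish("preprocess.done", {"output": "clean_data.csv"})
```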

  17. On the support of scientific workflows over Pub/Sub brokers.

    Science.gov (United States)

    Morales, Augusto; Robles, Tomas; Alcarria, Ramon; Cedeño, Edwin

    2013-08-20

    The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.

  18. Distribution of Natural Radioactivity and some Trace Elements in the Aquatic Ecosystem of Manzala Lake, Egypt

    International Nuclear Information System (INIS)

    Aly, A. I. M.; Eweida, E. A.; Hamed, M. A.

    2007-01-01

    The hydrochemical composition of Manzala lake water (average TDS = 2550 ppm) reflects the effects of several factors and recharge sources: drainage water discharge, waste water load, sea water intrusion, bathymetry and evaporation rate. The concentrations of nitrate and phosphate reach considerably high values, indicating a high degree of pollution and eutrophication. The results of the lake water analyses obtained by ICP-MS and ICP-AES show that the concentrations of Sr, Al, Fe and P are higher than those of V, Cr, Mn, Co, Cu, Rb, Y, Zr, Mo, Ba, La, Ce, Eu, Ti, Pb, U and Zn (μg/L). In general, their concentrations increase toward the southeastern part of the lake according to the following descending order: Sr > Fe > P > Mn > Ba > V > Rb > Cu > Co > Pb > Mo. The chemical composition of the bottom sediments shows that Al, Mn, Sr, Ba and V (averages = 5.7 mg/g and 1124, 679, 290 and 121 ppm, respectively) have the highest concentrations, while U, Mo, Cs, Sb and Tl (averages = 4, 3, 0.79, 0.20 and 0.19 ppm, respectively) have the lowest concentrations. The concentrations of most determined elements increase toward the southeastern and northwestern parts of the lake. This may indicate the effect of industrial, agricultural and domestic waste disposal through the Bahr El Baqar drain in the SE and the Mohib drain in the NW, in addition to the adsorption effect of the clay-rich sediments of the lake. The average specific activities of 226Ra (238U series), 232Th and 40K in bottom sediments of the lake were 13.78, 12.53 and 217.74 Bq/kg, while the mean values of 40K in surface and bottom waters were 0.96 and 0.95 Bq/L, respectively. The average specific activity of 137Cs in bottom sediments was 4.39 Bq/kg. The obtained results show that the distribution coefficients (Kd) of most elements for the bottom sediments of the lake were in the range of the recommended IAEA values, with a few exceptions. This may be attributed to the influence of the different polluted waters

  19. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; van der Veer, Gerrit C.; Roos, M.; van Dijk, Elisabeth M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation levels.

  20. Distributions of trace gases and aerosols during the dry biomass burning season in southern Africa

    Science.gov (United States)

    Sinha, Parikhit; Hobbs, Peter V.; Yokelson, Robert J.; Blake, Donald R.; Gao, Song; Kirchstetter, Thomas W.

    2003-09-01

    Vertical profiles in the lower troposphere of temperature, relative humidity, sulfur dioxide (SO2), ozone (O3), condensation nuclei (CN), and carbon monoxide (CO), and horizontal distributions of twenty gaseous and particulate species, are presented for five regions of southern Africa during the dry biomass burning season of 2000. The regions are the semiarid savannas of northeast South Africa and northern Botswana, the savanna-forest mosaic of coastal Mozambique, the humid savanna of southern Zambia, and the desert of western Namibia. The highest average concentrations of carbon dioxide (CO2), CO, methane (CH4), O3, black particulate carbon, and total particulate carbon were in the Botswana and Zambia sectors (388 and 392 ppmv, 369 and 453 ppbv, 1753 and 1758 ppbv, 79 and 88 ppbv, 2.6 and 5.5 μg m-3, and 13.2 and 14.3 μg m-3). This was due to intense biomass burning in Zambia and surrounding regions. The South Africa sector had the highest average concentrations of SO2, sulfate particles, and CN (5.1 ppbv, 8.3 μg m-3, and 6400 cm-3, respectively), which derived from biomass burning and electric generation plants and mining operations within this sector. Air quality in the Mozambique sector was similar to the neighboring South Africa sector. Over the arid Namibia sector there were polluted layers aloft, in which average SO2, O3, and CO mixing ratios (1.2 ppbv, 76 ppbv, and 310 ppbv, respectively) were similar to those measured over the other more polluted sectors. This was due to transport of biomass smoke from regions of widespread savanna burning in southern Angola. Average concentrations over all sectors of CO2 (386 ± 8 ppmv), CO (261 ± 81 ppbv), SO2 (2.5 ± 1.6 ppbv), O3 (64 ± 13 ppbv), black particulate carbon (2.3 ± 1.9 μg m-3), organic particulate carbon (6.2 ± 5.2 μg m-3), total particle mass (26.0 ± 4.7 μg m-3), and potassium particles (0.4 ± 0.1 μg m-3) were comparable to those in polluted, urban air. Since the majority of the measurements

  1. Measuring trace gas emission from multi-distributed sources using vertical radial plume mapping (VRPM) and backward Lagrangian stochastic (bLS) techniques

    Science.gov (United States)

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The accuracy of the vertical radial plume mapping (VRPM) and the backward Lagrangian (bLS) techniques with an open-path optical spectrosco...

  2. Deadline-constrained workflow scheduling algorithms for Infrastructure as a Service Clouds

    NARCIS (Netherlands)

    Abrishami, S.; Naghibzadeh, M.; Epema, D.H.J.

    2013-01-01

    The advent of Cloud computing as a new model of service provisioning in distributed systems encourages researchers to investigate its benefits and drawbacks on executing scientific applications such as workflows. One of the most challenging problems in Clouds is workflow scheduling, i.e., the

  3. A method to build and analyze scientific workflows from provenance through process mining

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Li, Jiafei; Liu, Zheng; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large due to the large quantities of data used. As

  4. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  5. Collaborative e-Science Experiments and Scientific Workflows

    NARCIS (Netherlands)

    Belloum, A.; Inda, M.A.; Vasunin, D.; Korkhov, V.; Zhao, Z.; Rauwerda, H.; Breit, T.M.; Bubak, M.; Hertzberger, L.O.

    2011-01-01

    Recent advances in Internet and grid technologies have greatly enhanced scientific experiments' life cycle. In addition to compute- and data-intensive tasks, large-scale collaborations involving geographically distributed scientists and e-infrastructure are now possible. Scientific workflows, which

  6. SHIWA workflow interoperability solutions for neuroimaging data analysis

    NARCIS (Netherlands)

    Korkhov, Vladimir; Krefting, Dagmar; Montagnat, Johan; Truong Huu, Tram; Kukla, Tamas; Terstyanszky, Gabor; Manset, David; Caan, Matthan; Olabarriaga, Silvia

    2012-01-01

    Neuroimaging is a field that benefits from distributed computing infrastructures (DCIs) to perform data- and compute-intensive processing and analysis. Using grid workflow systems not only automates the processing pipelines, but also enables domain researchers to implement their expertise on how to

  7. Distribution and concentration evaluation of trace and rare earth elements in sediment samples of the Billings and Guarapiranga reservoir systems

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Larissa S.; Fávaro, Déborah I.T., E-mail: defavaro@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (LAN-CRPq/IPEN/CNEN-SP), São Paulo(Brazil). Lab. de Análise por Ativação Neutrônica; Ferreira, Francisco J. [Companhia Ambiental do Estado de São Paulo (ELAI/CETESB), Sao Paulo, SP (Brazil). Setor de Química Inorgânica

    2017-07-01

    Concentration and distribution of trace and rare earth elements in bottom sediment samples collected in the Billings System (including the Rio Grande and Guarapiranga Reservoirs) were assessed by using Instrumental Neutron Activation Analysis (INAA). To evaluate the sources of anthropogenic contamination, the enrichment factor (EF) and the geoaccumulation index (IGeo) were calculated using NASC and Guarapiranga Park soil as reference values. Results were compared to the concentration guideline values established by the CCME (Canadian Council of Ministers of the Environment) environmental agency for As, Cr and Zn, and to values in other published studies. Most points exceeded TEL values and, at some points, PEL values for these elements, indicating poor sediment quality in these reservoirs. In general terms, the elements As, Cr, Sb and Zn, according to the EF and IGeo calculations, show enrichment at all points analyzed in both collection campaigns, except for the Rio Grande Reservoir points. The region where the reservoirs are located receives untreated sewage as well as pollution from urban occupation, industrial and mining activities, making it difficult to accurately identify the pollution sources. This study found higher concentrations of the analyzed elements in the Billings Reservoir, indicating a greater contamination level in relation to the other reservoirs. (author)
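
    The enrichment factor referred to above double-normalizes a concentration to a conservative reference element in both the sample and the background; a minimal sketch with placeholder values (not data from the reservoirs) follows.

```python
# Sketch of the standard enrichment factor (EF) calculation referred to above:
# the element/reference-element ratio in the sample divided by the same ratio
# in the background material (e.g. NASC). All values are placeholders.
def enrichment_factor(c_sample: float, ref_sample: float,
                      c_background: float, ref_background: float) -> float:
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_background."""
    return (c_sample / ref_sample) / (c_background / ref_background)

# Hypothetical As normalized to Sc as the conservative reference element
ef_as = enrichment_factor(c_sample=35.0, ref_sample=12.0,          # mg/kg in sediment
                          c_background=8.0, ref_background=15.0)   # mg/kg in background
print(f"EF(As) = {ef_as:.1f}")   # values well above 1-2 are usually read as enrichment
```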

  8. Trace element distribution during the reproductive cycle of female and male spiny and Pacific scallops, with implications for biomonitoring

    International Nuclear Information System (INIS)

    Norum, Ulrik; Lai, Vivian W.-M.; Cullen, William R.

    2005-01-01

    Trace element concentrations and contents in gills, gonad, kidneys, mantle, muscle and remainder during the reproductive cycle of female and male spiny and Pacific scallops, from the Strait of Georgia, BC, Canada, were quantified by using ICPMS. The elements investigated were chromium, manganese, iron, cobalt, nickel, selenium, molybdenum, cadmium, tin and mercury. For all ten elements, the tissue distribution was to some extent influenced by species, sex and reproductive status. The implications of the present study in relation to the design of biomonitoring programmes are: (1) care should be taken to ensure an equal/constant sex composition when making interannual comparisons of pooled samples. Preferably the sexes should be monitored separately. (2) the practice of obtaining pooled samples in the interspawn phase is applicable only to monitoring long-term trends in contaminant levels, while the reproductive status should be heeded when studying short-term changes. (3) the present study confirms that direct temporal or spatial comparisons of absolute accumulated element concentrations are only valid intraspecifically

  9. Distribution and concentration evaluation of trace and rare earth elements in sediment samples of the Billings and Guarapiranga reservoir systems

    International Nuclear Information System (INIS)

    Silva, Larissa S.; Fávaro, Déborah I.T.; Ferreira, Francisco J.

    2017-01-01

    Concentration and distribution of trace and rare earth elements in bottom sediment samples collected in the Billings System (including the Rio Grande and Guarapiranga Reservoirs) were assessed by using Instrumental Neutron Activation Analysis (INAA). To evaluate the sources of anthropogenic contamination, the enrichment factor (EF) and the geoaccumulation index (IGeo) were calculated using NASC and Guarapiranga Park soil as reference values. Results were compared to the concentration guideline values established by the CCME (Canadian Council of Ministers of the Environment) environmental agency for As, Cr and Zn, and to values in other published studies. Most points exceeded TEL values and, at some points, PEL values for these elements, indicating poor sediment quality in these reservoirs. In general terms, the elements As, Cr, Sb and Zn, according to the EF and IGeo calculations, show enrichment at all points analyzed in both collection campaigns, except for the Rio Grande Reservoir points. The region where the reservoirs are located receives untreated sewage as well as pollution from urban occupation, industrial and mining activities, making it difficult to accurately identify the pollution sources. This study found higher concentrations of the analyzed elements in the Billings Reservoir, indicating a greater contamination level in relation to the other reservoirs. (author)

  10. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.

  11. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  12. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically
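
    For readers unfamiliar with the tool, a minimal Snakefile in Snakemake's Python-based rule syntax is sketched below; the file names and the trimming command are hypothetical and only illustrate the dependency-driven style.

```python
# Minimal Snakefile sketch (Snakemake's Python-based rule syntax) illustrating
# how outputs declare their dependencies on inputs. File names and the
# trimming command are hypothetical.
SAMPLES = ["a", "b"]

rule all:
    input:
        expand("results/{sample}.counts.txt", sample=SAMPLES)

rule trim:
    input:
        "data/{sample}.fastq"
    output:
        "trimmed/{sample}.fastq"
    shell:
        "seqtk trimfq {input} > {output}"    # hypothetical trimming step

rule count:
    input:
        "trimmed/{sample}.fastq"
    output:
        "results/{sample}.counts.txt"
    shell:
        "wc -l {input} > {output}"
```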

  13. Behavioral technique for workflow abstraction and matching

    NARCIS (Netherlands)

    Klai, K.; Ould Ahmed M'bareck, N.; Tata, S.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    This work is in line with the CoopFlow approach dedicated for workflow advertisement, interconnection, and cooperation in virtual organizations. In order to advertise workflows into a registry, we present in this paper a novel method to abstract behaviors of workflows into symbolic observation

  14. A performance study of grid workflow engines

    NARCIS (Netherlands)

    Stratan, C.; Iosup, A.; Epema, D.H.J.

    2008-01-01

    To benefit from grids, scientists require grid workflow engines that automatically manage the execution of inter-related jobs on the grid infrastructure. So far, the workflows community has focused on scheduling algorithms and on interface tools. Thus, while several grid workflow engines have been

  15. Concentration and subcellular distribution of trace elements in liver of small cetaceans incidentally caught along the Brazilian coast

    Energy Technology Data Exchange (ETDEWEB)

    Kunito, Takashi; Nakamura, Shinji; Ikemoto, Tokutaka; Anan, Yasumi; Kubota, Reiji; Tanabe, Shinsuke; Rosas, Fernando C.W.; Fillmann, Gilberto; Readman, James W

    2004-10-01

    Concentrations of trace elements (V, Cr, Mn, Fe, Co, Cu, Zn, Ga, As, Se, Rb, Sr, Mo, Ag, Cd, Sb, Cs, Ba, T-Hg, Org-Hg, Tl and Pb) were determined in liver samples of estuarine dolphin (Sotalia guianensis; n=20), Franciscana dolphin (Pontoporia blainvillei; n=23), Atlantic spotted dolphin (Stenella frontalis; n=2), common dolphin (Delphinus capensis; n=1) and striped dolphin (Stenella coeruleoalba; n=1) incidentally caught along the coast of Sao Paulo State and Parana State, Brazil, from 1997 to 1999. The hepatic concentrations of trace elements in the Brazilian cetaceans were comparable to the data available in literature on marine mammals from Northern Hemisphere. Concentrations of V, Se, Mo, Cd, T-Hg and Org-Hg increased with increasing age in liver of both estuarine and Franciscana dolphins. Very high concentrations of Cu (range, 262-1970 μg/g dry wt.) and Zn (range, 242-369 μg/g dry wt.) were observed in liver of sucklings of estuarine dolphin. Hepatic concentrations of V, Se, T-Hg, Org-Hg and Pb were significantly higher in estuarine dolphin, whereas Franciscana dolphin showed higher concentrations of Mn, Co, As and Rb. Ratio of Org-Hg to T-Hg in liver was significantly higher in Franciscana dolphin than estuarine dolphin, suggesting that demethylation ability of methyl Hg might be lower in liver of Franciscana than estuarine dolphins. High hepatic concentrations of Ag were found in some specimens of Franciscana dolphin (maximum, 20 μg/g dry wt.), and 17% of Franciscana showed higher concentrations of Ag than Hg. These samples with high Ag concentration also exhibited elevated hepatic Se concentration, implying that Ag might be detoxified by Se in the liver. Higher correlation coefficient between (Hg + 0.5 Ag) and Se than between Hg and Se and the large distribution of Ag in non-soluble fraction in nuclear and mitochondrial fraction of the liver also suggests that Ag might be detoxified by Se via formation of Ag₂Se in the liver of Franciscana

  16. Trace metal distribution and mobility in drill cuttings and produced waters from Marcellus Shale gas extraction: Uranium, arsenic, barium

    International Nuclear Information System (INIS)

    Phan, Thai T.; Capo, Rosemary C.; Stewart, Brian W.; Graney, Joseph R.; Johnson, Jason D.; Sharma, Shikha; Toro, Jaime

    2015-01-01

    Highlights: • Distributions of U, As, and Ba in Marcellus Shale were determined. • As is primarily associated with sulfide minerals, Ba with exchange sites. • Most U is in the silicate minerals, but up to 20% is partitioned into carbonate. • Low [U] and [As] in produced water are consistent with reducing downhole conditions. • Proper waste management should account for potential mobilization of U and As. - Abstract: Development of unconventional shale gas wells can generate significant quantities of drilling waste, including trace metal-rich black shale from the lateral portion of the drillhole. We carried out sequential extractions on 15 samples of dry-drilled cuttings and core material from the gas-producing Middle Devonian Marcellus Shale and surrounding units to identify the host phases and evaluate the mobility of selected trace elements during cuttings disposal. Maximum whole rock concentrations of uranium (U), arsenic (As), and barium (Ba) were 47, 90, and 3333 mg kg⁻¹, respectively. Sequential chemical extractions suggest that although silicate minerals are the primary host for U, as much as 20% can be present in carbonate minerals. Up to 74% of the Ba in shale was extracted from exchangeable sites in the shale, while As is primarily associated with organic matter and sulfide minerals that could be mobilized by oxidation. For comparison, U and As concentrations were also measured in 43 produced water samples returned from Marcellus Shale gas wells. Low U concentrations in produced water (<0.084–3.26 μg L⁻¹) are consistent with low-oxygen conditions in the wellbore, in which U would be in its reduced, immobile form. Arsenic was below detection in all produced water samples, which is also consistent with reducing conditions in the wellbore minimizing oxidation of As-bearing sulfide minerals. Geochemical modeling to determine mobility under surface storage and disposal conditions indicates that oxidation and/or dissolution of U

  17. Concentration and subcellular distribution of trace elements in liver of small cetaceans incidentally caught along the Brazilian coast

    International Nuclear Information System (INIS)

    Kunito, Takashi; Nakamura, Shinji; Ikemoto, Tokutaka; Anan, Yasumi; Kubota, Reiji; Tanabe, Shinsuke; Rosas, Fernando C.W.; Fillmann, Gilberto; Readman, James W.

    2004-01-01

    Concentrations of trace elements (V, Cr, Mn, Fe, Co, Cu, Zn, Ga, As, Se, Rb, Sr, Mo, Ag, Cd, Sb, Cs, Ba, T-Hg, Org-Hg, Tl and Pb) were determined in liver samples of estuarine dolphin (Sotalia guianensis; n=20), Franciscana dolphin (Pontoporia blainvillei; n=23), Atlantic spotted dolphin (Stenella frontalis; n=2), common dolphin (Delphinus capensis; n=1) and striped dolphin (Stenella coeruleoalba; n=1) incidentally caught along the coast of Sao Paulo State and Parana State, Brazil, from 1997 to 1999. The hepatic concentrations of trace elements in the Brazilian cetaceans were comparable to the data available in literature on marine mammals from Northern Hemisphere. Concentrations of V, Se, Mo, Cd, T-Hg and Org-Hg increased with increasing age in liver of both estuarine and Franciscana dolphins. Very high concentrations of Cu (range, 262-1970 μg/g dry wt.) and Zn (range, 242-369 μg/g dry wt.) were observed in liver of sucklings of estuarine dolphin. Hepatic concentrations of V, Se, T-Hg, Org-Hg and Pb were significantly higher in estuarine dolphin, whereas Franciscana dolphin showed higher concentrations of Mn, Co, As and Rb. Ratio of Org-Hg to T-Hg in liver was significantly higher in Franciscana dolphin than estuarine dolphin, suggesting that demethylation ability of methyl Hg might be lower in liver of Franciscana than estuarine dolphins. High hepatic concentrations of Ag were found in some specimens of Franciscana dolphin (maximum, 20 μg/g dry wt.), and 17% of Franciscana showed higher concentrations of Ag than Hg. These samples with high Ag concentration also exhibited elevated hepatic Se concentration, implying that Ag might be detoxified by Se in the liver. Higher correlation coefficient between (Hg + 0.5 Ag) and Se than between Hg and Se and the large distribution of Ag in non-soluble fraction in nuclear and mitochondrial fraction of the liver also suggests that Ag might be detoxified by Se via formation of Ag 2 Se in the liver of Franciscana dolphin

  18. Flexible Data-Aware Scheduling for Workflows over an In-Memory Object Store

    Energy Technology Data Exchange (ETDEWEB)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin; Wozniak, Justin M.; Carretero, Jesus; Ross, Rob

    2016-01-01

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality and load balance.
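
    A toy sketch of the locality-aware placement idea described above: intermediate data is kept in an in-memory key-value cache together with the node holding it, and the scheduler prefers running a consumer task on that node. This is an illustration only and does not use the Swift or Hercules APIs.

```python
# Toy sketch of locality-aware placement of intermediate workflow data in an
# in-memory key-value cache, in the spirit of the approach described above.
# Illustration only; not the Swift or Hercules APIs.
class ObjectStore:
    """Maps a dataset key to (payload, node that holds it in memory)."""
    def __init__(self) -> None:
        self._data: dict[str, tuple[bytes, str]] = {}

    def put(self, key: str, payload: bytes, node: str) -> None:
        self._data[key] = (payload, node)

    def location(self, key: str) -> str:
        return self._data[key][1]

    def get(self, key: str) -> bytes:
        return self._data[key][0]

def pick_node(store: ObjectStore, input_key: str, nodes: list[str]) -> str:
    """Prefer the node that already caches the task's input (data locality)."""
    holder = store.location(input_key)
    return holder if holder in nodes else nodes[0]

store = ObjectStore()
store.put("stage1/out.bin", b"intermediate bytes", node="node-3")
print(pick_node(store, "stage1/out.bin", ["node-1", "node-2", "node-3"]))  # -> node-3
```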

  19. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  20. Distributed Tracing of Intruders

    National Research Council Canada - National Science Library

    Staniford-Chen, Stuart G

    1995-01-01

    .... One of the things that facilitates this malfeasance is that computer networks provide the ability for a user to log into multiple computer systems in sequence, changing identity with each step...

  1. Distribution of trace metals in surface seawater and zooplankton of the Bay of Bengal, off Rushikulya estuary, East Coast of India

    International Nuclear Information System (INIS)

    Srichandan, Suchismita; Panigrahy, R.C.; Baliarsingh, S.K.; Srinivasa, Rao B.; Pati, Premalata; Sahu, Biraja K.; Sahu, K.C.

    2016-01-01

    Concentrations of trace metals such as iron (Fe), copper (Cu), zinc (Zn), cobalt (Co), nickel (Ni), manganese (Mn), lead (Pb), cadmium (Cd), chromium (Cr), arsenic (As), vanadium (V), and selenium (Se) were determined in seawater and zooplankton from the surface waters off Rushikulya estuary, north-western Bay of Bengal. During the study period, the concentration of trace metals in seawater and zooplankton showed significant spatio-temporal variation. Cu and Co levels in seawater mostly remained non-detectable. Other elements were found at higher concentrations and exhibited marked variations. The rank order distribution of trace metals in terms of their average concentration in seawater was observed as Fe > Ni > Mn > Pb > As > Zn > Cr > V > Se > Cd while in zooplankton it was Fe > Mn > Cd > As > Pb > Ni > Cr > Zn > V > Se. The bioaccumulation factor (BAF) of Fe was highest followed by Zn and the lowest value was observed with Ni. Results of correlation analysis discerned positive affinity and good relationship among the majority of the trace metals, both in seawater and zooplankton suggesting their strong affinity and coexistence. - Highlights: • First-hand report on trace metal concentration in zooplankton and seawater covering 2 years from this eco-sensitive region. • In seawater trace metals followed the rank order of Fe > Ni > Mn > Pb > As > Zn > Cr > V > Se > Cd. • In zooplankton the rank order was Fe > Mn > Cd > As > Pb > Ni > Cr > Zn > V > Se. • The bioaccumulation factor of Fe was highest followed by Zn. • Strong affinity, coexistence, and similar source of trace metals in the study area.

  2. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path toward a sustainable building block for Earth science big data analytics and knowledge sharing. By closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built a prototype of a data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using that metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  3. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    under high-volume conditions, or the searchability and manageability of the resulting data products is disappointingly low. The provenance of a data product is a record of its lineage, or trace of the execution history that resulted in the product. The provenance of a forecast model result, e.g., captures information about the executable version of the model, configuration parameters, input data products, execution environment, and owner. Provenance enables data to be properly attributed and captures critical parameters about the model run so the quality of the result can be ascertained. Proper provenance is essential to providing reproducible scientific computing results. Workflow languages used in science discovery are complete programming languages, and in theory can support any logic expressible by a programming language. The execution environments supporting the workflow engines, on the other hand, are subject to constraints on physical resources, and hence in practice the workflow task graphs used in science utilize relatively few of the cataloged workflow patterns. It is important to note that these workflows are executed on demand, and are executed once. Into this context is introduced the need for science discovery that is responsive to real time information. If we can use simple programming models and abstractions to make scientific discovery involving real-time data accessible to specialists who share and utilize data across scientific domains, we bring science one step closer to solving the largest of human problems.
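
    As a concrete illustration of the lineage information listed above (executable version, configuration parameters, input data products, execution environment, owner), here is a minimal, hypothetical provenance record; the field names and values are placeholders, not a standard schema.

```python
# Minimal sketch of a provenance record for a model run, carrying the kind of
# lineage information described above. Field names and values are illustrative.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class ProvenanceRecord:
    product_id: str
    executable_version: str
    configuration: dict
    input_products: list[str]
    execution_environment: str
    owner: str
    created_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = ProvenanceRecord(
    product_id="forecast/2024-06-01T00Z",
    executable_version="wrf-4.5.1",                 # hypothetical model version
    configuration={"grid_km": 3, "physics_suite": "tropical"},
    input_products=["obs/radar_scan_0554.nc", "gfs/boundary_0000.grib2"],
    execution_environment="cluster-a, 128 cores",
    owner="forecast-team",
)
print(json.dumps(asdict(record), indent=2))
```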

  4. Distribution and risk assessment of trace metals in sediments from Yangtze River estuary and Hangzhou Bay, China.

    Science.gov (United States)

    Li, Feipeng; Mao, Lingchen; Jia, Yubao; Gu, Zhujun; Shi, Weiling; Chen, Ling; Ye, Hua

    2018-01-01

    The Yangtze River estuary (YRE) and Hangzhou Bay (HZB) are of environmental significance because of the negative impact from industrial activities and the rapid development of aquaculture on the south bank of HZB (SHZB) in recent years. This study investigated the distribution and risk of trace metals (Cr, Cu, Zn, Hg, Pb, and Cd) accumulated in surface sediments sampled in the YRE and the outer and southern HZB. Copper and Zn concentrations (avg. 35.4 and 98.7 mg kg⁻¹, respectively) in surface sediments were generally higher than the background values, suggesting a widespread presence of Cu and Zn in the coastal area of the Yangtze River Delta. High concentrations of Cu (~42 mg kg⁻¹), Zn (~111 mg kg⁻¹), Cd (~0.27 mg kg⁻¹), and Hg (~0.047 mg kg⁻¹) were found in the inner estuary of the YRE and decreased offshore as a result of terrestrial input and the dilution of total metal contents by "cleaner" sediments from the adjacent sea. In the outer HZB, accumulation of terrestrially derived metals has taken place near the Zhoushan Islands. The increase in sediment metal concentration from the west (inner) to the east (outer) of SHZB was attributed to the input of fine-grained sediments contaminated with metals from the outer bay. According to the geoaccumulation index, nearly 75% of samples from the YRE were moderately polluted (1.0 < Igeo < 2.0) by Cd. Cadmium and Hg contributed 80-90% of the potential ecological risk index in the YRE and HZB, with ~72% of sites in HZB under moderate risk (150 ≤ RI < 300), especially near the Zhoushan Islands.

  5. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Full Text Available Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge of the Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data driven workflow execution is one of the ways to enable distributed workflow on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS). The RTS is a dataflow driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from a scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real-life use case.
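
    A toy illustration of the dataflow-driven execution idea: components chained as Python generators so that items stream from stage to stage as soon as they are produced. It is not the VLAM-G or RTS API.

```python
# Toy illustration of dataflow-driven execution: components are chained so
# that data items stream from one stage to the next as soon as they are
# produced, in the spirit of the engine described above (not the VLAM-G API).
from typing import Iterable, Iterator

def source(n: int) -> Iterator[int]:
    for i in range(n):
        yield i                      # emit items one at a time

def transform(items: Iterable[int]) -> Iterator[float]:
    for x in items:
        yield x * 0.5                # processed as soon as each item arrives

def sink(items: Iterable[float]) -> None:
    for y in items:
        print(f"received {y}")

# Wiring the components forms the dataflow pipeline; no intermediate files.
sink(transform(source(5)))
```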

  6. Selenium and Trace Element Distribution in Astragalus Plants: Developing a Differential Pulse Polarographic Method for Their Determination

    OpenAIRE

    SOMER, Güler; ÇALIŞKAN, A. Cengiz

    2007-01-01

    Astragalus plants have a wide range of applications in pharmaceuticals (gum tragacanth), as thickening agents in foods, and may have applications in controlling cancer cells. They are used as feed for animals and they are indicator plants for selenium. Because of their use in health-related areas it is very important to determine their selenium and trace element content with high accuracy. A new differential pulse polarographic method was established for trace element determination (...

  7. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  8. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, as used by the many text-processing applications that have appeared over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or Word Perfect, it is not a problem to save an object of this type in ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as a migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace humans with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system

  9. Distribution of trace metals in surface seawater and zooplankton of the Bay of Bengal, off Rushikulya estuary, East Coast of India.

    Science.gov (United States)

    Srichandan, Suchismita; Panigrahy, R C; Baliarsingh, S K; Rao B, Srinivasa; Pati, Premalata; Sahu, Biraja K; Sahu, K C

    2016-10-15

    Concentrations of trace metals such as iron (Fe), copper (Cu), zinc (Zn), cobalt (Co), nickel (Ni), manganese (Mn), lead (Pb), cadmium (Cd), chromium (Cr), arsenic (As), vanadium (V), and selenium (Se) were determined in seawater and zooplankton from the surface waters off Rushikulya estuary, north-western Bay of Bengal. During the study period, the concentrations of trace metals in seawater and zooplankton showed significant spatio-temporal variation. Cu and Co levels in seawater mostly remained non-detectable. Other elements were found at higher concentrations and exhibited marked variations. The rank order distribution of trace metals in terms of their average concentration in seawater was Fe>Ni>Mn>Pb>As>Zn>Cr>V>Se>Cd, while in zooplankton it was Fe>Mn>Cd>As>Pb>Ni>Cr>Zn>V>Se. The bioaccumulation factor (BAF) was highest for Fe, followed by Zn, and lowest for Ni. Correlation analysis showed positive and significant relationships among the majority of the trace metals in both seawater and zooplankton, suggesting their strong affinity and coexistence. Copyright © 2016 Elsevier Ltd. All rights reserved.
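    The abstract does not restate the definition, but the bioaccumulation factor used in studies of this kind is conventionally the ratio of the concentration in the organism to that in the surrounding water,

        \mathrm{BAF} = \frac{C_{\text{zooplankton}}}{C_{\text{seawater}}}

    so the high BAF reported for Fe and Zn indicates net enrichment of those metals in zooplankton relative to seawater, while the low value for Ni indicates little enrichment.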

  10. Vegetation and Cold Trapping Modulating Elevation-dependent Distribution of Trace Metals in Soils of a High Mountain in Eastern Tibetan Plateau.

    Science.gov (United States)

    Bing, Haijian; Wu, Yanhong; Zhou, Jun; Li, Rui; Luo, Ji; Yu, Dong

    2016-04-07

    Trace metals adsorbed onto fine particles can be transported long distances and ultimately deposited in Polar Regions via the cold condensation effect. This study indicated the possible sources of silver (Ag), cadmium (Cd), copper (Cu), lead (Pb), antimony (Sb) and zinc (Zn) in soils on the eastern slope of Mt. Gongga, eastern Tibetan Plateau, and deciphered the effects of vegetation and mountain cold condensation on their distributions with elevation. The metal concentrations in the soils were comparable to those of other mountains worldwide, except for the remarkably high concentrations of Cd. Trace metals that were highly enriched in the soils reflected anthropogenic contributions. Spatially, the concentrations of Cu and Zn in the surface horizons decreased from 2000 to 3700 m a.s.l., and then increased with elevation, whereas other metals were notably enriched in the mid-elevation area (approximately 3000 m a.s.l.). After normalization for soil organic carbon, high concentrations of Cd, Pb, Sb and Zn were observed above the timberline. Our results indicated the importance of vegetation in trace metal accumulation in an alpine ecosystem and highlighted the mountain cold trapping effect on trace metal deposition sourced from long-range atmospheric transport.

  11. Distributions, sources and pollution status of 17 trace metal/metalloids in the street dust of a heavily industrialized city of central China

    International Nuclear Information System (INIS)

    Li, Zhonggen; Feng, Xinbin; Li, Guanghui; Bi, Xiangyang; Zhu, Jianming; Qin, Haibo; Dai, Zhihui; Liu, Jinling; Li, Qiuhua; Sun, Guangyi

    2013-01-01

    A series of representative street dust samples were collected from a heavily industrialized city, Zhuzhou, in central China, with the aim of investigating the spatial distribution and pollution status of 17 trace metal/metalloid elements. Concentrations of twelve elements (Pb, Zn, Cu, Cd, Hg, As, Sb, In, Bi, Tl, Ag and Ga) were distinctly amplified by atmospheric deposition resulting from a large-scale Pb/Zn smelter located in the northwest fringe of the city, and followed a declining trend towards the city center. Three metals (W, Mo and Co) were enriched in samples very close to a hard alloy manufacturing plant, while Ni and Cr appeared to derive predominantly from natural sources. Other industries and traffic had negligible effects on the accumulation of the observed elements. Cd, In, Zn, Ag and Pb were the five metal/metalloids with the highest pollution levels, and the northwestern part of the city is especially affected by heavy metal pollution. -- Highlights: •Large-scale Pb/Zn smelters contributed to elevated trace elements in the street dust. •The hard alloy processing caused the enrichment of a few elements. •Cd, In, Zn, Ag and Pb were the most polluted elements. •Northwestern Zhuzhou suffered severe contamination for a range of trace elements. -- Pb/Zn smelting and hard alloy processing operations have caused serious contamination by trace metal/metalloids in the street dust

  12. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  13. Multidetector-row CT: economics and workflow

    International Nuclear Information System (INIS)

    Pottala, K.M.; Kalra, M.K.; Saini, S.; Ouellette, K.; Sahani, D.; Thrall, J.H.

    2005-01-01

    With the rapid evolution of multidetector-row CT (MDCT) technology and applications, several factors such as technology upgrades and turf battles over sharing cost and profitability affect MDCT workflow and economics. MDCT workflow optimization can enhance productivity and reduce unit costs as well as increase profitability, in spite of decreasing reimbursement rates. Strategies for workflow management include standardization, automation, and constant assessment of the various steps involved in MDCT operations. In this review article, we describe issues related to MDCT economics and workflow. (orig.)

  14. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
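    The service-to-service streaming idea described above can be pictured with an ordinary process pipeline: the intermediate data never lands on the client's disk. The Python sketch below is only a local analogue of that idea (it is not the Styx protocol itself) and assumes a Unix-like system where the seq and wc commands are available.

        import subprocess

        # Two command-line stages connected by a pipe. In a Styx Grid Services workflow each
        # stage would be a wrapped remote service and the intermediate data would be streamed
        # directly between services; the local pipe here only illustrates the streaming idea.
        producer = subprocess.Popen(["seq", "1", "1000000"], stdout=subprocess.PIPE)
        consumer = subprocess.Popen(["wc", "-l"], stdin=producer.stdout, stdout=subprocess.PIPE)
        producer.stdout.close()   # let the producer receive SIGPIPE if the consumer exits early
        lines = consumer.communicate()[0].decode().strip()
        print(lines, "records streamed through the pipeline without an intermediate file")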

  15. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, Cirano; Chiao, Carolina; Hess, Guillermo; Nascimento, Gleison; Thom, Lucinéia; Reichert, Manfred

    2007-01-01

    In order to build process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not

  16. The distribution and speciation of trace metals in surface sediments from the Pearl River Estuary and the Daya Bay, Southern China

    International Nuclear Information System (INIS)

    Yu Xiujuan; Yan Yan; Wang Wenxiong

    2010-01-01

    Surface sediments collected from the Pearl River Estuary (PRE) and the Daya Bay (DYB) were analyzed for total metal concentrations and chemical phase partitioning. The total concentrations of Cr, Cu, Ni, Pb, and Zn in the PRE were markedly higher than those in DYB. The maximum concentrations of trace metals in DYB occurred in the four sub-basins, especially in Dapeng Cove, while the concentrations of these metals on the western side of the PRE were higher than those on the east side. This distribution pattern was primarily due to the different hydraulic conditions and inputs of anthropogenic trace metals. The chemical partitioning of metals analyzed by the BCR sequential extraction method showed that Cr, Ni, and Zn in both areas were present dominantly in the residual fraction, while Pb was found mostly in the non-residual fractions. The partitioning of Cu showed a significant difference between the two areas.

  17. A STRUCTURAL MODEL OF AN EXCAVATOR WORKFLOW CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. Gurko

    2016-12-01

    Full Text Available Improving earthwork operations is closely connected with excavator automation. In this paper, on the basis of an analysis of the problems that a hydraulic excavator control system has to solve, a hierarchical structure for the control system is proposed. The control process was decomposed, which allowed the development of a structural model reflecting the characteristics of a multilevel, spatially distributed control system for the excavator workflow.

  18. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  19. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...
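    To make the notion of a structural complexity metric concrete, the sketch below computes a simple branching measure over a workflow net represented as a mapping from transitions to their output places. It is a generic illustration under that representation, not necessarily one of the three metrics the paper implements in ProM.

        # Toy structural metric over a workflow net (transitions -> output places).
        def branching_complexity(net: dict) -> int:
            """Sum, over all transitions, the number of alternative output places beyond the first."""
            return sum(max(len(outputs) - 1, 0) for outputs in net.values())

        net = {                      # hypothetical net: t2 is a split with two output places
            "t1": ["p2"],
            "t2": ["p3", "p4"],
            "t3": ["p5"],
        }
        print(branching_complexity(net))   # -> 1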

  20. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  1. Verifying generalized soundness for workflow nets

    NARCIS (Netherlands)

    Hee, van K.M.; Oanea, O.I.; Sidorova, N.; Voorhoeve, M.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    We improve the decision procedure from [10] for the problem of generalized soundness of workflow nets. A workflow net is generalized sound iff every marking reachable from an initial marking with k tokens on the initial place terminates properly, i.e. it can reach a marking with k tokens on the

  2. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of realworld process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  3. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    Because of their reuse advantages, workflow patterns (e.g., control-flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  4. Distribution of some trace elements in biosubstrates of workers occupied in the production of mineral nitrogenous phosphate fertilizers

    International Nuclear Information System (INIS)

    Gorbunov, A.V.; Lyapunov, S.M.; Okina, O.I.; Frontasyeva, M.V.; Pavlov, S.S.

    2005-01-01

    The data on the content of some trace elements typical for the production of nitrogenous phosphate fertilizers (F, Sr, rare-earth elements), as well as heavy and toxic metals in industrial products, occupational air, drinking water and biosubstrates (urine, hair) of the factory workers are presented. Correlations between the fluorine content in the urine and hair of workers, and between fluorine content, length of service, and age, are shown. The correlation between the F content in biosubstrates and a number of trace elements typical of this type of production has been evaluated. The morbidity and character of diseases of the factory workers have been compared with those of local residents not employed in the production

  5. Distribution and leaching characteristics of trace elements in ashes as a function of different waste fuels and incineration technologies.

    Science.gov (United States)

    Saqib, Naeem; Bäckström, Mattias

    2015-10-01

    The impact of waste fuels (virgin/waste wood, mixed biofuel (peat, bark, wood chips), industrial, household, and mixed waste fuel) and incineration technologies on the partitioning and leaching behavior of trace elements has been investigated. The study included 4 grate-fired and 9 fluidized-bed boilers. Results showed that mixed waste incineration mostly caused increased transfer of trace elements to fly ash, particularly Pb and Zn. Waste wood incineration showed higher transfer of Cr, As and Zn to fly ash compared to virgin wood. Possible reasons are a high input of trace elements in the waste fuel or a change in volatilization behavior due to the addition of certain waste fractions. The concentrations of Cd and Zn in fly ash increased with incineration temperature. Total concentrations in the ashes decreased in the order Zn>Cu>Pb>Cr>Sb>As>Mo. The concentration levels of trace elements were mostly higher in fluidized-bed boiler fly ashes than in grate boiler ashes (especially for biofuel incineration), which might be attributed to the high combustion efficiency resulting from pre-treatment of waste in fluidized-bed boilers. Leaching results indicated that water-soluble forms of the elements in the ashes were low, with few exceptions. Concentration levels in the ash and ash matrix properties (association of elements with ash particles) are crucial parameters affecting leaching. Leached amounts of Pb, Zn and Cr in >50% of fly ashes exceeded the regulatory limit for disposal. 87% of the chlorine in the fly ashes washed out with water at a liquid-to-solid ratio of 10, indicating an excessive presence of alkali metal chlorides/alkaline earths. Copyright © 2015. Published by Elsevier B.V.

  6. Vertical distribution of trace-element concentrations and occurrence of metallurgical slag particles in accumulated bed sediments of Lake Roosevelt, Washington, September 2002

    Science.gov (United States)

    Cox, S.E.; Bell, P.R.; Lowther, J.S.; Van Metre, P.C.

    2005-01-01

    Sediment cores were collected from six locations in Lake Roosevelt to determine the vertical distributions of trace-element concentrations in the accumulated sediments of Lake Roosevelt. Elevated concentrations of arsenic, cadmium, copper, lead, mercury, and zinc occurred throughout much of the accumulated sediments. Concentrations varied greatly within the sediment core profiles, often covering a range of 5 to 10 fold. Trace-element concentrations typically were largest below the surficial sediments in the lower one-half of each profile, with generally decreasing concentrations from the 1964 horizon to the surface of the core. The trace-element profiles reflect changes in historical discharges of trace elements to the Columbia River by an upstream smelter. All samples analyzed exceeded clean-up guidelines adopted by the Confederated Tribes of the Colville Reservation for cadmium, lead, and zinc and more than 70 percent of the samples exceeded cleanup guidelines for mercury, arsenic, and copper. Although 100 percent of the samples exceeded sediment guidelines for cadmium, lead, and zinc, surficial concentrations of arsenic, copper, and mercury in some cores were less than the sediment-quality guidelines. With the exception of copper, the trace-element profiles of the five cores collected along the pre-reservoir Columbia River channel typically showed trends of decreasing concentrations in sediments deposited after the 1964 time horizon. The decreasing concentrations of trace elements in the upper half of cores from along the pre-reservoir Columbia River showed a pattern of decreasing concentrations similar to reductions in trace-element loading in liquid effluent from an upstream smelter. Except for arsenic, trace-element concentrations typically were smaller at downstream reservoir locations along the pre-reservoir Columbia River. Trace-element concentration in sediments from the Spokane Arm of the reservoir showed distinct differences compared to the similarities

  7. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists in the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR, i.e. exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
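    The role of the input/output logic operators can be illustrated with a small evaluation rule: a task may fire once its input operator is satisfied by the completed incoming arcs. The Python sketch below is a simplification for illustration only; the paper's algebraic treatment and its logical-termination condition are more general.

        # Schematic semantics for the three operators named above: AND, OR and XOR.
        OPS = {
            "AND": lambda done: all(done),
            "OR":  lambda done: any(done),
            "XOR": lambda done: sum(done) == 1,
        }

        def task_enabled(op: str, incoming_done: list) -> bool:
            """A task may fire when its input operator is satisfied by the completed incoming arcs."""
            return OPS[op](incoming_done)

        print(task_enabled("XOR", [True, False]))   # True: exactly one incoming branch completed
        print(task_enabled("AND", [True, False]))   # False: an AND-join still waits for the other branch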

  8. Distributions of Polycyclic Aromatic Hydrocarbons, Aromatic Ketones, Carboxylic Acids, and Trace Metals in Arctic Aerosols: Long-Range Atmospheric Transport, Photochemical Degradation/Production at Polar Sunrise.

    Science.gov (United States)

    Singh, Dharmendra Kumar; Kawamura, Kimitaka; Yanase, Ayako; Barrie, Leonard A

    2017-08-15

    The distributions, correlations, and source apportionment of aromatic acids, aromatic ketones, polycyclic aromatic hydrocarbons (PAHs), and trace metals were studied in Canadian high Arctic aerosols. Nineteen PAHs, including the minor sulfur-containing heterocyclic PAH dibenzothiophene and the six major carcinogenic PAHs, were detected, with a high proportion of fluoranthene followed by benzo[k]fluoranthene, pyrene, and chrysene. However, in the sunlit period of spring, their concentrations declined significantly, likely due to photochemical decomposition. During the polar sunrise from mid-March to mid-April, benzo[a]pyrene to benzo[e]pyrene ratios dropped significantly, and the ratios diminished further from late April to May onward. These results suggest that PAHs transported over the Arctic are subjected to strong photochemical degradation at polar sunrise. Although aromatic ketones decreased in spring, concentrations of some aromatic acids such as benzoic and phthalic acids increased during the course of polar sunrise, suggesting that aromatic hydrocarbons are oxidized to aromatic acids. However, PAHs do not act as the major source for low molecular weight (LMW) diacids such as oxalic acid that are largely formed at polar sunrise in the Arctic atmosphere, because PAHs are 1 to 2 orders of magnitude less abundant than LMW diacids. Correlations of trace metals with organics, their sources, and the possible role of trace transition metals are explained.

  9. The distribution of mercury and other trace elements in the bones of two human individuals from medieval Denmark – the chemical life history hypothesis

    DEFF Research Database (Denmark)

    Rasmussen, Kaare Lund; Skytte, Lilian; Pilekær, Christian

    2013-01-01

    performed on a single sample from a tooth or a long bone. In this paper we investigate how a suite of elements (Mg, Al, Ca, Mn, Fe, Zn, As, Sr, Ba, Hg and Pb) are distributed in two medieval skeletons excavated at the laymen cemetery at the Franciscan Friary in Svendborg, Denmark.The analyses have been...... individuals can be clearly distinguished by Principal Component Analysis of all the measured trace elements.Our data support a previously published hypothesis that the elemental ratios Sr/Ca, Ba/Ca and Mg/Ca are indicative of provenance. Aluminium, Fe and Mn can be attributed to various forms of diagenesis...

  10. A validation of a ray-tracing tool used to generate bi-directional scattering distribution functions for complex fenestration systems

    DEFF Research Database (Denmark)

    McNeil, A.; Jonsson, C.J.; Appelfeld, David

    2013-01-01

    , or daylighting systems. However, such tools require users to provide bi-directional scattering distribution function (BSDF) data that describe the solar-optical performance of the CFS. A free, open-source Radiance tool genBSDF enables users to generate BSDF data for arbitrary CFS. Prior to genBSDF, BSDF data ... We explain the basis and use of the genBSDF tool and validate the tool by comparing results for four different cases to BSDF data produced via alternative methods. This validation demonstrates that BSDFs created with genBSDF are comparable to BSDFs generated analytically using TracePro and by measurement...

  11. The distribution of soluble radionuclide-relevant trace elements between salt minerals and saline solutions; Die Verteilung loeslicher Radionuklid-relevanter Spurenelemente zwischen Salzmineralen und salinaren Loesungen

    Energy Technology Data Exchange (ETDEWEB)

    Voss, Ina

    2015-07-16

    The research platform ENTRIA (Disposal options for radioactive residues: Interdisciplinary analyses and development of evaluation principles) includes the sub-project ''Final disposal in deep geological formations without any arrangements for retrieval''. This approach considers rock salt (besides clay and granite) as a host rock formation for the disposal of heat-producing, long-lived waste. Most rock salt formations contain Mg-rich brines, derived from highly evolved seawater evaporation processes, now enclosed in the rock salt mass. If such brines gain access to the metal canisters, corrosion will allow the release of soluble nuclides into the brine. In this scenario, it cannot be excluded that contaminated brines leave the deep-seated disposal area and move along geological or technical migration pathways towards the rock salt/cap rock contact. The temperature of the brine will drop from near 80 °C to 25 or 30 °C. The decreasing temperature of the brine causes precipitation of magnesian chloride and sulfate phases in equilibrium with the brine. In order to understand the salt precipitation and the retention mechanism of dissolved trace elements, experiments were set up which allow the formation of sylvite, carnallite, kainite, and hydrous Mg-sulphates under controlled conditions. The retention capacity of salt minerals crystallizing from magnesian brine solutions at decreasing temperature within a salt dome is best measured as the distribution coefficient D. This concept assumes incorporation of trace elements into the lattice of the salt minerals. The distribution coefficients of the trace elements Rb, Cs, Co, Ni, Zn, Li and B between sylvite, carnallite, kainite, and MgSO4 phases have been determined at experimental temperatures of 25, 35, 55 and 83 °C. The results clearly indicate the following ranges of distribution coefficients (D): sylvite, D > 1 for Rb and Br, D < 1 for Co, Ni, Zn, Li and B; carnallite, D > 1 for Rb and Cs, D < 1 for Co, Ni, Zn, Li and B; kainite, D
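    The abstract does not spell out the definition, but the distribution coefficient used in co-precipitation studies of this kind is commonly the homogeneous form, relating the trace-to-carrier ratio in the growing crystal to that in the coexisting solution:

        D = \frac{\left(c_{\text{trace}}/c_{\text{carrier}}\right)_{\text{crystal}}}{\left(c_{\text{trace}}/c_{\text{carrier}}\right)_{\text{solution}}}

    On this reading, D > 1 (as reported for Rb in sylvite and carnallite) means the trace element is enriched in the precipitating salt mineral, while D < 1 means it is largely rejected into the residual brine.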

  12. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB out of which 11TB of data was saved
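    The abstract-to-concrete planning step described above can be pictured as a catalog lookup that binds resource-agnostic jobs to executables and sites. The Python sketch below is only a schematic of that idea; it does not use Pegasus's actual DAX format, catalog formats, or planner API, and all names and paths are invented.

        # Abstract jobs name only a transformation and its files; a planner resolves
        # executables and execution sites from catalogs (contents invented for illustration).
        abstract_workflow = [
            {"id": "j1", "transformation": "preprocess", "inputs": ["raw.dat"],   "outputs": ["clean.dat"]},
            {"id": "j2", "transformation": "simulate",   "inputs": ["clean.dat"], "outputs": ["result.dat"]},
        ]
        transformation_catalog = {"preprocess": "/opt/bin/preprocess", "simulate": "/opt/bin/simulate"}
        site_catalog = {"preprocess": "local", "simulate": "cluster"}

        concrete_plan = [
            {"id": job["id"],
             "executable": transformation_catalog[job["transformation"]],
             "site": site_catalog[job["transformation"]],
             "inputs": job["inputs"], "outputs": job["outputs"]}
            for job in abstract_workflow
        ]
        for step in concrete_plan:
            print(step["id"], "->", step["site"], step["executable"])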

  13. ADVANCED APPROACH TO PRODUCTION WORKFLOW COMPOSITION ON ENGINEERING KNOWLEDGE PORTALS

    OpenAIRE

    Novogrudska, Rina; Kot, Tatyana; Globa, Larisa; Schill, Alexander

    2016-01-01

    Background. Engineering knowledge portals concentrate a great number of partial workflows. Such workflows are composed into a general workflow in order to perform real, complex production tasks. The characteristics of partial workflows and the structure of the general workflow have not been studied sufficiently, which makes dynamic composition of the general production workflow impossible. Objective. Creating an approach to the dynamic composition of the general production workflow based on the partial wor...

  14. Trace element distribution and 235U/238U ratios in Euphrates waters and in soils and tree barks of Dhi Qar province (southern Iraq)

    International Nuclear Information System (INIS)

    Riccobono, Francesco; Perra, Guido; Pisani, Anastasia; Protano, Giuseppe

    2011-01-01

    To assess the quality of the environment in southern Iraq after Gulf War II, a geochemical survey was carried out. The survey provided data on the chemistry of Euphrates waters, as well as the trace element contents, U and Pb isotopic composition, and PAH levels in soil and tree bark samples. The trace element concentrations and the 235U/238U ratio values in the Euphrates waters were within the usual natural range, except for the high contents of Sr due to a widespread presence of gypsum in soils of this area. The trace element contents in soils agreed with the common geochemistry of soils from floodplain sediments. Some exceptions were the high contents of Co, Cr and Ni, which had a natural origin related to ophiolitic outcrops in the upper sector of the Euphrates basin. The high concentrations of S and Sr were linked to the abundance of gypsum in soils. A marked geochemical homogeneity of soil samples was suggested by the similar distribution pattern of rare earth elements, while the 235U/238U ratio was also fairly homogeneous and within the natural range. The chemistry of the tree bark samples closely reflected that of the soils, with some notable exceptions. Unlike the soils, some tree bark samples had anomalous values of the 235U/238U ratio due to mixing of depleted uranium (DU) with the natural uranium pool. Moreover, the distribution of some trace elements (such as REEs, Th and Zr) and the isotopic composition of Pb in barks clearly differed from those of the nearby soils. The overall results suggested that significant external inputs occurred, implying that, once formed, the DU-enriched particles could travel over long distances. The polycyclic aromatic hydrocarbon concentrations in tree bark samples showed that phenanthrene, fluoranthene and pyrene were the most abundant components, indicating an important role of automotive traffic. - Highlights: → This is a contribution to the knowledge of the Iraqi environment after Gulf War II. → In

  15. Acute Effects of Moderate and Strenuous Running on Trace Element Distribution in the Brain, Liver, and Spleen of Trained Rats

    Directory of Open Access Journals (Sweden)

    Kıvanç Ergen

    2013-03-01

    Full Text Available Objective: Trace elements such as manganese (Mn), cobalt (Co) and chromium (Cr) play key roles in metabolic reactions and are important in many physiological enzymatic processes. In this study, we aimed to investigate the acute effects of moderate and strenuous running (treadmill exercise) on the levels of Mn, Co and Cr in the brain, liver, and spleen of trained rats. Study Design: Animal experiment. Material and Methods: Twenty-one Wistar-Albino adult male rats were used in the study. Rats were grouped as a control group (no mandated exercise; n=8), a moderate exercise group (30 min exercise duration; n=7), and a strenuous exercise group (60 min exercise duration; n=6). The levels of Mn, Co, and Cr in the frontal lobe, temporal lobe, brain stem, liver, and spleen were determined by atomic absorption spectrophotometry. Results: Cr levels in the liver of rats increased in parallel with the duration of running, supporting the effect of exercise training on the action of insulin. Compared to the control group, the level of Co significantly decreased in the brain stem of rats in the moderate exercise group (p=0.009) and in the frontal lobe of rats in the strenuous exercise group (p=0.004). In the strenuous exercise group, examination of the brain stem revealed that the level of Mn significantly decreased (p=0.001), and the levels of Co and Cr were apparently depleted to the extent that these elements were no longer detectable. Conclusion: A notable finding is that during or after a single bout of strenuous exercise, levels of Co decreased in the spleen and particularly in the brain stem of regularly trained rats. From this study, it can be inferred that sportsmen should be aware of trace element disturbances among body compartments, or the depletion of some trace elements, after a single bout of strenuous running exercise.

  16. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Full Text Available Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, which is a network of several computational resources and storage facilities distributed across different grid sites. Methods Workflows are very common in Bioinformatics because they allow large quantities of data to be processed by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results Grid technology allows efficient database management, a crucial problem for obtaining good results in Bioinformatics applications. The proposed workflow integrates huge amounts of data, and the results themselves are stored in a relational database, which constitutes the added value to the global knowledge. Conclusion A web interface has been developed to make this technology accessible to grid users. Once the workflow has started, by means of the simplified interface, it is possible to follow all the different steps throughout the data processing. Eventually, when the workflow has terminated, the different features of the protein, such as the amino acids exposed on the protein surface, can be compared with the data present in the output database.

  17. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    Science.gov (United States)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.

  18. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow that satisfies the users' requirements. However, because DOWs are often developed in open, distributed, and heterogeneous environments, different users can impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging. There is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph-based model of DOWs with cost constraints, called constrained data oriented workflow (CDW), which can express cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs, which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to seamlessly combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are conducted to show the effectiveness and efficiency of our approach.
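    One simple way to picture the fusion of semantic and structural information for ranking candidate workflows is a convex combination of the two similarity scores. The sketch below is only a stand-in to show the idea; the paper itself uses a matrix-theoretic distance measure, and the scores and weight here are hypothetical.

        def combined_similarity(sem: float, struct: float, alpha: float = 0.5) -> float:
            """Blend semantic and structural similarity, both assumed to lie in [0, 1]."""
            return alpha * sem + (1.0 - alpha) * struct

        candidates = {"wf_a": (0.9, 0.4), "wf_b": (0.6, 0.8)}   # hypothetical (semantic, structural) scores
        ranked = sorted(candidates, key=lambda k: combined_similarity(*candidates[k]), reverse=True)
        print(ranked)   # -> ['wf_b', 'wf_a'] for alpha = 0.5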

  19. Distribution of major, trace and rare-earth elements in surface sediments of the Wharton Basin, Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Pattan, J.N.; Rao, Ch.M.; Higgs, N.C.; Colley, S.; Parthiban, G.

    in Table 3 for comparison. 4.1. Major and trace elements: In deep-sea sediment, silica is derived mainly from lithogenous and biogenic sources. The siliceous oozes have a higher SiO2 content (60.3-72.5%) than red clays (53.9-65.8%). These con... a very low Ca content (Table 1). Average Sr content is low in siliceous ooze (85 ppm) and red clay (110 ppm) and highest in calcareous ooze (1017 ppm). Sr shows a strong positive correlation with Ca (Table 2), reflecting its well...

  20. Trace metal elements in the Ebrié Lagoon: distribution ...; Les éléments traces métalliques dans la lagune Ebrié : distribution ...

    African Journals Online (AJOL)

    This study concerns the assessment of PHE contamination of the sediments of the Ebrié Lagoon. Seasonal sampling of surface sediments was carried out in the estuarine bays. The fine fraction (< 63 µm) was used for the extraction and determination of the trace metal elements (Ni, Cu, Cd, ...

  1. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for the accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
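    The parallel-processing idea mentioned above amounts to fanning the per-file checks out over worker processes. The Python sketch below assumes nothing about the actual CCTV system; check_file is a placeholder where a real QC engine would decode the media and test for loudness, black frames, dropouts and similar defects, and the file names are hypothetical.

        from concurrent.futures import ProcessPoolExecutor

        def check_file(path: str) -> tuple:
            """Placeholder QC check; a real implementation would analyse the decoded essence."""
            issues = []
            if path.endswith(".bad"):          # stand-in for a real signal-level analysis
                issues.append("corrupt stream")
            return path, issues

        if __name__ == "__main__":
            files = ["news_0412.mxf", "promo_07.mxf", "archive_1999.bad"]   # hypothetical assets
            with ProcessPoolExecutor() as pool:
                for path, issues in pool.map(check_file, files):
                    print(path, "OK" if not issues else issues)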

  2. Examination of the regional distribution of minor and trace elements in normal human brain by PIXE and chemometric techniques

    International Nuclear Information System (INIS)

    Maenhaut, W.; Hebbrecht, G.; Reuck, J. de

    1993-01-01

    Particle-induced X-ray emission (PIXE) was used to measure two minor and six trace elements, i.e. K, Ca, Mn, Fe, Cu, Zn, Se, and Rb, in up to 50 different structures (regions) of brains from Belgian individuals without neurological disorders. The data matrix with the mean dry-weight elemental concentrations and mean wet-to-dry weight ratio (means over 18 brains) for the various structures was subjected to two chemometric techniques, i.e., VARIMAX rotated absolute principal component analysis (APCA) and hierarchical cluster analysis. Three components were identified by APCA: Components 1 and 3 represented aqueous fractions of the brain (respectively the intracellular and extracellular fluid), whereas component 2 apparently represented the solid brain fraction. The elements K, Cu, Zn, Se, and Rb were predominantly attributed to component 1, Ca to component 3, and Fe to component 2. In the hierarchical cluster analysis seven different agglomerative cluster strategies were compared. The dendrograms obtained from the furthest neighbor and Ward's error sum strategy were virtually identical, and they consisted of two large clusters with 30 and 16 structures, respectively. The first cluster included all gray matter structures, while the second comprised all white matter. Furthermore, structures involved in the same physiological function or morphologically similar regions often conglomerated in one subcluster. This strongly suggests that there is some relationship between the trace element profile of a brain structure and its function. (orig.)
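    The chemometric treatment described above (component analysis followed by agglomerative clustering) can be sketched in a few lines, assuming numpy, scikit-learn and scipy are available. Random numbers stand in for the element concentration matrix, and the VARIMAX rotation and absolute-PCA scaling used in the paper are omitted, so this is an analogue of the procedure rather than a reproduction of it.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        # Rows are brain structures, columns are element concentrations (synthetic data).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 8))                          # 50 structures x 8 elements

        scores = PCA(n_components=3).fit_transform(X)         # three components, as in the study
        tree = linkage(X, method="ward")                      # Ward's error-sum agglomeration
        clusters = fcluster(tree, t=2, criterion="maxclust")  # cut into two clusters (grey/white analogue)
        print(scores.shape, np.bincount(clusters)[1:])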

  3. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired...... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  4. The Diabetic Retinopathy Screening Workflow

    Science.gov (United States)

    Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew

    2015-01-01

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working-aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However, further multisite trialling of such systems' use within implementable screening workflows is required if an evidence base strong enough to effect policy change is to be established. If this were to occur, national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence are increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630

  5. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Kamper, Lars; Meyn, Hannes; Rybacki, Konrad; Nordmeyer, Simone; Kempkes, Udo; Piroth, Werner; Isenmann, Stefan; Haage, Patrick

    2012-01-01

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  6. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality was the motivation for developing a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To address the lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid mistaken identity of persons who wanted access to the system. Only an invariable electronic identification allowed legal liability for the final report, and only a secure data connection allowed the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called the Skymed™ Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  7. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios eLadoukakis

    2014-11-01

    Full Text Available The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e. Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management, or even the storage, critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is even more aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential, for each analytical task, in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, imposing as the sole realistic solution the utilization of cloud computing infrastructures. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various major sequencing technologies and applications.

  8. Retrieval of Vertical Aerosol and Trace Gas Distributions from Polarization Sensitive Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS)

    Science.gov (United States)

    Tirpitz, Jan-Lukas; Friess, Udo; Platt, Ulrich

    2017-04-01

    An accurate knowledge of the vertical distribution of trace gases and aerosols is crucial for our understanding of the chemical and dynamical processes in the lower troposphere. Their accurate determination is typically only possible by means of laborious and expensive airborne in-situ measurements, but in recent decades numerous promising ground-based remote sensing approaches have been developed. One of them is to infer vertical distributions from "Differential Optical Absorption Spectroscopy" (DOAS) measurements. DOAS is a technique to analyze UV and visible radiation spectra of direct or scattered sunlight, which delivers information on different atmospheric parameters, integrated over the light path from space to the instrument. An appropriate set of DOAS measurements, recorded under different viewing directions (Multi-Axis DOAS) and thus different light path geometries, provides information on the atmospheric state. The vertical profiles of aerosol properties and trace gas concentrations can be retrieved from such a set by numerical inversion techniques, incorporating radiative transfer models. The information content of measured data is rarely sufficient for a well-constrained retrieval, particularly for atmospheric layers above 1 km. We showed in first simulations that, apart from spectral properties, the polarization state of skylight is likely to provide a significant amount of additional information on the atmospheric state and thus to enhance retrieval quality. We present first simulations, expectations and ideas on how to implement and characterize a polarization-sensitive Multi-Axis DOAS instrument and a corresponding profile retrieval algorithm.

  9. Probing the distribution and contamination levels of 10 trace metal/metalloids in soils near a Pb/Zn smelter in Middle China.

    Science.gov (United States)

    Li, Zhonggen; Feng, Xinbin; Bi, Xiangyang; Li, Guanghui; Lin, Yan; Sun, Guangyi

    2014-03-01

    The horizontal and vertical distribution patterns and contamination status of ten trace metals/metalloids (Ag, Bi, Co, Cr, Ge, In, Ni, Sb, Sn, Tl) in soils around one of the largest Chinese Pb-Zn smelters, in Zhuzhou City, Central China, were investigated. Soil samples were collected from 11 areas, including ten agricultural areas and one city park, with a total of 83 surface soil samples and six soil cores obtained. Trace metals/metalloids were determined by inductively coupled plasma-mass spectrometry after digestion with an acid mixture of HF and HNO3. The results showed that Ag, Bi, In, Sb, Sn, and Tl contents decreased both with distance from the Pb-Zn smelter and with soil depth, indicating that these elements mainly originated from the Pb-Zn smelting operations and were introduced into soils through atmospheric deposition. Soil Ge was influenced by the smelter to a lesser extent, while the distributions of Co, Cr, and Ni were roughly even among most sampling sites and soil depths, suggesting that they were primarily derived from natural sources. The contamination status, as revealed by the geo-accumulation index (Igeo), indicated that In and Ag were the most enriched elements, followed by Sb, Bi, and Sn. In general, Cr, Tl, Co, Ni, and Ge were of an uncontaminated status.
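
    For reference, the geo-accumulation index used above is conventionally defined as

        I_{geo} = \log_2\!\left( \frac{C_n}{1.5\, B_n} \right)

    where C_n is the measured concentration of element n, B_n its geochemical background value, and the factor 1.5 compensates for natural variability of the background.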

  10. Trace metal distribution in the Arosa estuary (N.W. Spain): The application of a recently developed sequential extraction procedure for metal partitioning

    International Nuclear Information System (INIS)

    Santamaria-Fernandez, Rebeca; Cave, Mark R.; Hill, Steve J.

    2006-01-01

    A study of the trace metal distribution in sediment samples from the Galician coast (Spain) has been performed. A multielement extraction method optimised via experimental design has been employed. The method uses centrifugation to pass the extractant solution, at varying pH, through the sediment sample. The sequential leaches were collected and analysed by ICP-AES. Chemometric approaches were utilised to identify the composition of the physico-chemical components in order to characterise the sample. The samples collected at different sites could be classified according to their differences in metal bio-availability, and important information regarding element distribution within the physico-chemical components is given. The method has proved to be a quick and reliable way to evaluate sediment samples for environmental geochemistry analysis. In addition, this approach has potential as a fast screening method for the bio-availability of metals in the environment.

  11. The distribution of radionuclides and some trace metals in the water columns of the Japan and Bonin trenches; Repartition des nucleides radioactifs et de quelques metaux-traces dans les fosses du Japon et des iles Bonin

    Energy Technology Data Exchange (ETDEWEB)

    Nozari, Y.; Yamada, M. [Tokyo Univ. (Japan). Ocean Research Inst; Nakanishi, T. [Kanazawa Univ. (Japan). Dept. of Chemistry; Nagaya, Y.; Nakamura, K.; Yamada, M. [National Inst. of Radiological Sciences, Hitachinaka, Ibaraki (Japan); Shitashima, K.; Tsubota, H. [Hiroshima Univ. (Japan). Faculty of Integrated Arts and Sciences

    1998-05-01

    Presented here are the first geochemical data on the U/Th series Th, Pa, Ac, and Pb isotopes, artificial fallout radionuclides ({sup 90}Sr, {sup 137}Cs, and Pu isotopes), and some trace elements (V, Zn, Cd, Cu, Mn, and Ni) in two water columns of the Japan and Bonin trenches, down to bottom depths of 7585 m and 9750 m, respectively. Hydrographic properties such as temperature, salinity, dissolved oxygen, and nutrient content within the trench valley remain constant at the same levels as those in the bottom water of the Northwest Pacific basin (typically {approx}6000 m in depth). The radionuclide activities and most trace metal concentrations are also not very different from those in the overlying water at depths of around 5000-6000 m. This means that any chemical alteration which sea water undergoes during its residence within the trench was not obviously detected by the techniques used here. The suggestion follows that the trench water is communicating rather freely, by isopycnal mixing, with the bottom water overlying the Northwest Pacific abyssal plain. The trench waters contain high {sup 239,240}Pu activities throughout, indicating that Pu is actively regenerating from rapidly sinking, large particles at the bottom interface, probably due to a change in the oxidation state. On the other hand, the vertical profiles of {sup 210}Pb and {sup 231}Pa show lower activities within the trench than in the overlying deep waters, suggesting that the effect of boundary and bottom scavenging is significant in controlling their oceanic distributions. However, none of the trace metals studied here obviously follows the behaviour of the above nuclides. The {sup 228}Th data show scattering within the Bonin Trench that is largely ascribable to analytical errors. If, however, we accept that the scatter of the {sup 228}Th data is real and that the variation is caused solely by decay of its parent {sup 228}Ra, we can set an upper limit of {approx}5 years for the renewal time of the

  12. The Redox Dynamics of Iron in a Seasonally Waterlogged Forest Soil (Chaux Forest, Eastern France) Traced with Rare Earth Element Distribution Patterns

    Science.gov (United States)

    Steinmann, M.; Floch, A. L.; Lucot, E.; Badot, P. M.

    2014-12-01

    The oxyhydroxides of iron are common soil minerals and are known to control the availability of various major and trace elements essential for biogeochemical processes. We present a study from acidic natural forest soils, where reducing redox conditions due to seasonal waterlogging lead to the dissolution of Fe-oxyhydroxides and to the release of Fe to soil water. In order to study in detail the mechanism of redox cycling of Fe, we used Rare Earth Element (REE) distribution patterns, because an earlier study has shown that they are a suitable tool to identify trace metal sources during soil reduction in wetland soils (Davranche et al., 2011). The REE patterns of soil leachates obtained with the modified 3-step BCR extraction scheme of Rauret et al. (1999) were compared with those of natural soil water. The adsorbed fractions (F1 leach), the reducible fraction of the deepest soil horizon H4 (F2 leach, 50-120 cm), and the oxidizable fractions of horizons H2 to H4 (F3 leaches, 24-120 cm) yielded REE patterns almost identical to those of soil water (see figure), showing that the REE and trace metal content of soil water was mainly derived from the F1 pool, and from the F2 and F3 pools of the clay mineral-rich deep soil horizons. In contrast, the F2 leach mobilized mainly Fe-oxyhydroxides associated with organic matter of the surface soil and yielded REE patterns significantly different from those of soil water. These results suggest that the trace metal content of soil water in hydromorphic soils is primarily controlled by the clay fraction of the deeper soil horizons and not by organic matter and related Fe-oxyhydroxides of the surface soil. Additional analyses are in progress in order to verify whether the REE and trace metals of the deeper soil horizons were directly derived from clay minerals or from associated Fe-oxyhydroxide coatings. Refs cited: Davranche et al. (2011), Chem. Geol. 284; Rauret et al. (1999), J. Environ. Monit. 1.

  13. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.
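
    To give a flavour of what an executor does with a declarative tool description, here is a toy sketch loosely inspired by CWL-style descriptions; it is not a conformant CWL implementation and does not use the Rabix API, and all field names and the example tool are invented for illustration.

        def build_command(tool, job):
            """Assemble a command line from a declarative tool description and concrete inputs."""
            args = list(tool["base_command"])
            for inp in sorted(tool["inputs"], key=lambda i: i["position"]):
                args.append(str(job[inp["id"]]))
            return args

        # A toy "tool description": count the lines of an input file.
        line_counter = {
            "base_command": ["wc", "-l"],
            "inputs": [{"id": "infile", "position": 1}],
        }

        if __name__ == "__main__":
            cmd = build_command(line_counter, {"infile": "reads.fastq"})  # hypothetical input
            print("would execute:", " ".join(cmd))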

  14. TRACE ELEMENTS LATERAL DISTRIBUTION AND LIMITATIONS FOR REVEGETATION IN LEAD MINE SOILS: CASE OF LAKHOUAT MINE, TUNISIA

    Directory of Open Access Journals (Sweden)

    H. Sahraoui

    2016-01-01

    Full Text Available Anthropogenic activities such as mining have increased the prevalence and occurrence of trace element soil contamination. Abandoned mine tailings cause the contamination of adjacent agricultural soils. In the Lakhouat mining area (north-western Tunisia), the dispersion of particles containing Pb, Zn and Cd results in the contamination of the surrounding agricultural soils. These soils presented high concentrations of Pb (1272 mg kg-1), Zn (5543 mg kg-1) and Cd (25 mg kg-1). Furthermore, the tailings sample and the soil sample close to the tailings dam presented higher concentrations of Pb, Zn and Cd, and more limiting factors for revegetation, than the adjacent soils of the mining area. The main limiting factors of the mine soils are their low effective depth, low organic matter content, low phosphorus content and an imbalance between potassium and manganese exchangeable cations. These mine soils are strongly affected by high Pb, Zn and Cd levels which hinder revegetation.

  15. Trace element distribution in geochemical environment of the island Krk and its influence on the local population

    International Nuclear Information System (INIS)

    Kutle, A.; Orescanin, V.; Obhodas, J.; Valkovic, V.

    2004-01-01

    Samples of soil, plant material and water collected on the Croatian island of Krk in the northern Adriatic Sea were analyzed for a number of elements (Ca, Ti, Mn, Fe, Ni, Cu, Zn, As, Br, Rb, Sr, Zr, Pb) using energy dispersive X-ray fluorescence (EDXRF) as an analytical tool. Some of these data have previously been used to produce a geochemical map of the island. In addition, the trace element content of hair from children attending elementary and secondary schools has been investigated using the same analytical method for the elements Mg, S, Ca, Cr, Mn, Fe, Ni, Cu, Zn, As, Hg and Pb. Concentrations of the twelve elements and nine variables from the questionnaire, i.e., age, sex, living place, transportation, medication, hair colour, hair type, type of shampoo used and hair treatment, were considered in the statistical analysis. The observed differences among the seven island communes are also discussed. (author)

  16. Urban soil geochemistry in Athens, Greece: The importance of local geology in controlling the distribution of potentially harmful trace elements.

    Science.gov (United States)

    Argyraki, Ariadne; Kelepertzis, Efstratios

    2014-06-01

    Understanding urban soil geochemistry is a challenging task because of the complicated layering of the urban landscape and the profound impact of large cities on the chemical dispersion of harmful trace elements. A systematic geochemical soil survey was performed across Greater Athens and Piraeus, Greece. Surface soil samples (0-10 cm) were collected from 238 sampling sites on a regular 1 × 1 km grid and were digested by a HNO3-HCl-HClO4-HF mixture. A combination of multivariate statistics and Geographical Information System approaches was applied to discriminate natural from anthropogenic sources using 4 major elements, 9 trace metals, and 2 metalloids. These analyses demonstrated the lack of heavy industry in Athens and the influence of geology on the local soil chemistry, which accounts for 49% of the variability in the major elements, as well as Cr, Ni, Co, and possibly As (median values of 102, 141, 16 and 24 mg kg(-1), respectively). The contribution to soil chemistry of classical urban contaminants including Pb, Cu, Zn, Sn, Sb, and Cd (medians of 45, 39, 98, 3.6, 1.7 and 0.3 mg kg(-1), respectively) was also observed; significant correlations were identified between concentrations and urbanization indicators, including vehicular traffic, urban land use, population density, and timing of urbanization. Analysis of soil heterogeneity and spatial variability of soil composition in the Greater Athens and Piraeus area provided a representation of the extent of anthropogenic modifications of natural element loadings. The concentrations of Ni, Cr, and As were relatively high compared to those in other cities around the world, and further investigation should characterize and evaluate their geochemical reactivity. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Using mosses as biomonitors to study trace element emissions and their distribution in six different volcanic areas

    Science.gov (United States)

    Arndt, Julia; Calabrese, Sergio; D'Alessandro, Walter; Planer-Friedrich, Britta

    2017-09-01

    Volcanoes emit SO2, CO2, and H2S, but also trace elements as gases and particles, such as As, Cd, Cr, Cu, Hg, Ni, Pb, and Sb. Active moss bag biomonitoring, an easy-to-apply and low-budget method, was used to determine trace element release from volcanic areas of different geological context and climate. Variations in exposure height (0.7-1.6 m above ground), due to the different availability of natural tie points, did not affect the results. Accumulation was linear for exposure durations from three days to nine weeks, so values were made comparable by normalization to moss exposure time. Uncovered moss bags showed higher accumulation than co-exposed covered ones because of additional dust and wet deposition, while washout by rain was negligible. The selection of a specific moss significantly affected element accumulation, with mosses of lower shoot compactness accumulating more. For all volcanic areas, the highest accumulation was found for S (1-1000 μmol·(g·d)⁻¹), followed by Fe and Mg (0.1-10 μmol·(g·d)⁻¹), Sr, Ba, Pb, Cr, Li (10⁻⁴-10⁻¹ μmol·(g·d)⁻¹), then Co, Mo and the volatile elements As, Sb, Se, Tl, Bi (10⁻⁶-10⁻² μmol·(g·d)⁻¹). For most elements, open conduit volcanoes (Etna, Stromboli, Nyiragongo) showed higher moss accumulation rates than more quiescent hydrothermal areas (Vulcano > Nisyros > Yellowstone National Park), together with a correlation of S, Fe, and Pb from eruptive ash and lava emissions. For some volatile elements (S, As, Se), higher accumulation was observed within fumarolic fields compared to the crater rims of open conduit volcanoes, which is relevant information for the risk assessment of tourist exposure to volcanic gases.

  18. Workflow Based Software Development Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  19. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
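
    The core idea behind such workflow libraries is dependency-aware execution of a job graph. The sketch below shows that idea in generic Python (Kahn's topological sort); it is not the COSMOS API, and the job names and commands are invented placeholders.

        from collections import deque

        jobs = {
            "align":    {"cmd": "bwa mem ref.fa reads.fq > out.sam", "deps": []},         # hypothetical commands
            "sort":     {"cmd": "samtools sort out.sam -o out.bam",  "deps": ["align"]},
            "variants": {"cmd": "call_variants out.bam > out.vcf",   "deps": ["sort"]},
        }

        def topological_order(jobs):
            """Return job names in an order that respects dependencies (Kahn's algorithm)."""
            indegree = {name: len(spec["deps"]) for name, spec in jobs.items()}
            children = {name: [] for name in jobs}
            for name, spec in jobs.items():
                for dep in spec["deps"]:
                    children[dep].append(name)
            ready = deque(n for n, d in indegree.items() if d == 0)
            order = []
            while ready:
                n = ready.popleft()
                order.append(n)
                for child in children[n]:
                    indegree[child] -= 1
                    if indegree[child] == 0:
                        ready.append(child)
            if len(order) != len(jobs):
                raise ValueError("dependency cycle detected")
            return order

        for name in topological_order(jobs):
            print("submit:", name, "->", jobs[name]["cmd"])   # a real engine would hand this to a queue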

  20. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  1. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.; Das Sarma, Akash; Widom, J.

    2013-01-01

    for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We

  2. Workflow Based Software Development Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  3. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed, disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these include Kepler and Taverna, or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term provenir, "to come from", is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have arisen from different communities and operate independently, their mutual

  4. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  5. Resident Workflow and Psychiatric Emergency Consultation: Identifying Factors for Quality Improvement in a Training Environment.

    Science.gov (United States)

    Blair, Thomas; Wiener, Zev; Seroussi, Ariel; Tang, Lingqi; O'Hora, Jennifer; Cheung, Erick

    2017-06-01

    Quality improvement to optimize workflow has the potential to mitigate resident burnout and enhance patient care. This study applied mixed methods to identify factors that enhance or impede workflow for residents performing emergency psychiatric consultations. The study population consisted of all psychiatry program residents (55 eligible, 42 participating) at the Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles. The authors developed a survey through iterative piloting, surveyed all residents, and then conducted a focus group. The survey included elements hypothesized to enhance or impede workflow, and measures pertaining to self-rated efficiency and stress. Distributional and bivariate analyses were performed. Survey findings were clarified in focus group discussion. This study identified several factors subjectively associated with enhanced or impeded workflow, including difficulty with documentation, the value of personal organization systems, and struggles to communicate with patients' families. Implications for resident education are discussed.

  6. Spatial distribution of trace elements in the surface sediments of a major European estuary (Loire Estuary, France): Source identification and evaluation of anthropogenic contribution

    Science.gov (United States)

    Coynel, Alexandra; Gorse, Laureline; Curti, Cécile; Schafer, Jörg; Grosbois, Cécile; Morelli, Guia; Ducassou, Emmanuelle; Blanc, Gérard; Maillet, Grégoire M.; Mojtahid, Meryem

    2016-12-01

    Assessing the extent of metal contamination in estuarine surface sediments is hampered by the high heterogeneity of sediment characteristics, the spatial variability of trace element sources, sedimentary dynamics and geochemical processes, in addition to the need for accurate reference values to distinguish natural from anthropogenic contributions. Based on 285 surface sediment samples from the Loire Estuary, the first high-resolution spatial distributions are presented for grain-size, particulate organic carbon (POC) and the eight metals/metalloids identified as priority contaminants (Cd, Zn, Pb, Cu, As, Cr, Ni, Hg) plus Ag (an urban tracer). Grain-size and/or POC are major factors controlling the spatial distribution of trace element concentrations. The V-normalized trace metal concentrations divided by the V-normalized concentrations in the basin geochemical background showed the highest Enrichment Factors for Ag and Hg (EF; up to 34 and 140, respectively). These results suggest a severe contamination in the Loire Estuary for both elements. Intra-estuarine Ag and Hg anomalies were identified by comparison between the respective normalized concentrations in the Loire Estuary surface sediments and those measured in the surface sediments at the outlet of the Loire River System (watershed-derived). Anthropogenic intra-estuarine Ag and Hg stocks in the uppermost centimetre of the sediment, compared with rough annual fluvial flux estimates, suggest that the overall strong Enrichment Factors for Ag (EFAg) and Hg (EFHg) in the Loire Estuary sediments are mainly due to watershed-derived inputs, highlighting the need for hydro-geochemical monitoring at high temporal resolution to establish reliable incoming fluxes. Significant correlations obtained between EFCd and EFAg, EFCu and POC, and EFHg and POC revealed common behavior and/or sources. Comparison of trace element concentrations with ecotoxicological indices (Sediment Quality Guidelines) provides first standardized information on the
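
    For reference, the vanadium-normalized enrichment factor described above is conventionally computed as

        EF_x = \frac{\left( C_x / C_V \right)_{\mathrm{sediment}}}{\left( C_x / C_V \right)_{\mathrm{background}}}

    where C_x and C_V are the concentrations of the element of interest and of vanadium in the surface sediment and in the basin geochemical background, respectively; values close to 1 indicate purely natural levels.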

  7. Study on the distribution of radioactive trace elements in vitamin D-overloaded rats using the multitracer technique

    International Nuclear Information System (INIS)

    Hirunuma, Rieko; Enomoto, Shuichi; Ambe, Fumitoshi; Endo, Kazutoyo; Ambe, Shizuko

    1999-01-01

    The uptake and distribution of radioisotopes of beryllium, calcium, scandium, vanadium, chromium, manganese, iron, cobalt, nickel, zinc, gallium, arsenic, strontium and barium in vitamin D (VD)-overloaded rats were investigated and compared with those in control rats, using the multitracer technique. Each element revealed its characteristic distribution among various organs in control and VD-overloaded rats. For some elements, such as cobalt and chromium, the distribution patterns in them were significantly different. These results are discussed in terms of the metabolism of the elements in rats

  8. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján Antolík

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  9. An Adaptable Seismic Data Format for Modern Scientific Workflows

    Science.gov (United States)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis tool-kits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
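
    A minimal sketch of the underlying idea, storing waveforms of different lengths together with free-form provenance attributes in a self-describing HDF5 container via h5py; the group layout, attribute names and station codes below are invented and do not follow the actual ASDF definition.

        import numpy as np
        import h5py

        # Toy example: two synthetic traces of different lengths plus provenance attributes.
        traces = {
            "IU.ANMO.BHZ": np.random.randn(12000).astype("float32"),
            "IU.CCM.BHZ":  np.random.randn(9000).astype("float32"),
        }

        with h5py.File("example_waveforms.h5", "w") as f:
            wf = f.create_group("Waveforms")
            for name, data in traces.items():
                ds = wf.create_dataset(name, data=data)
                ds.attrs["sampling_rate_hz"] = 20.0
                ds.attrs["provenance"] = "detrend -> bandpass 0.01-0.1 Hz"   # free-form history string

        with h5py.File("example_waveforms.h5", "r") as f:
            for name, ds in f["Waveforms"].items():
                print(name, ds.shape, ds.attrs["provenance"])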

  10. Thermal Remote Sensing with Uav-Based Workflows

    Science.gov (United States)

    Boesch, R.

    2017-08-01

    Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but are still often used for single-image analysis, only in an oblique-view manner, or with visual evaluation of video sequences. Remote sensing using a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows allow thermal images to be processed similarly to RGB data, but the low spatial resolution of thermal cameras, significant optical distortion and typically low contrast require an adapted workflow. The temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie point matching and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  11. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  12. Distribution and health risk assessment of trace metals in freshwater tilapia from three different aquaculture sites in Jelebu Region (Malaysia).

    Science.gov (United States)

    Low, Kah Hin; Zain, Sharifuddin Md; Abas, Mhd Radzi; Md Salleh, Kaharudin; Teo, Yin Yin

    2015-06-15

    The trace metal concentrations in edible muscle of red tilapia (Oreochromis spp.) sampled from a former tin mining pool, concrete tank and earthen pond in Jelebu were analysed with microwave assisted digestion-inductively coupled plasma-mass spectrometry. Results were compared with established legal limits and the daily ingestion exposures simulated using the Monte Carlo algorithm for potential health risks. Among the metals investigated, arsenic was found to be the key contaminant, which may have arisen from the use of formulated feeding pellets. Although the risks of toxicity associated with consumption of red tilapia from the sites investigated were found to be within the tolerable range, the preliminary probabilistic estimation of As cancer risk shows that the 95th percentile risk level surpassed the benchmark level of 10(-5). In general, the probabilistic health risks associated with ingestion of red tilapia can be ranked as follows: former tin mining pool > concrete tank > earthen pond. Copyright © 2015 Elsevier Ltd. All rights reserved.
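
    The probabilistic exposure estimate described above generally combines an estimated daily intake, EDI = C × IR / BW (metal concentration × ingestion rate / body weight), with a cancer slope factor. The sketch below shows the Monte Carlo idea with entirely illustrative distributions and parameter values, not the study's data.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Placeholder distributions -- NOT the study's measured values.
        conc_as = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)   # mg As per kg fish (wet weight)
        intake  = rng.normal(loc=50, scale=15, size=n).clip(min=0)      # g fish per day
        bodywt  = rng.normal(loc=60, scale=10, size=n).clip(min=30)     # kg body weight

        edi = conc_as * (intake / 1000.0) / bodywt    # estimated daily intake, mg/kg body weight/day
        csf_as = 1.5                                  # illustrative cancer slope factor, (mg/kg/day)^-1
        risk = edi * csf_as

        print("95th percentile cancer risk:", np.percentile(risk, 95))
        print("fraction exceeding 1e-5:", (risk > 1e-5).mean())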

  13. Assessment of the trace element distribution in soils in the parks of the city of Zagreb (Croatia).

    Science.gov (United States)

    Roje, Vibor; Orešković, Marko; Rončević, Juraj; Bakšić, Darko; Pernar, Nikola; Perković, Ivan

    2018-02-07

    This paper presents the results of preliminary testing of selected trace elements in the soils of several parks in the city of Zagreb, Republic of Croatia. In each park, samples were taken from several points, at various distances from the roads, and at two different depths: 0-5 and 30-45 cm. Composite samples were prepared for each sampling point. Microwave-assisted wet digestion of the soil samples was performed and the elements were determined by the ICP-AES technique. The results obtained for Al, As, Ba, Mn, Ti, V, and K are in good agreement with results published in the scientific literature so far. The mass fraction values of Cd, Cr, Cu, Ni, Pb, and Zn are somewhat higher than the maximum values given in the Croatian Directive on agricultural land protection against pollution. Be, Mo, Sb, Se, and Tl were present in the samples at concentrations lower than the method detection limits.

  14. Spatial distributions, fractionation characteristics, and ecological risk assessment of trace elements in sediments of Chaohu Lake, a large eutrophic freshwater lake in eastern China.

    Science.gov (United States)

    Wu, Lei; Liu, Guijian; Zhou, Chuncai; Liu, Rongqiong; Xi, Shanshan; Da, Chunnian; Liu, Fei

    2018-01-01

    The concentrations, spatial distribution, fractionation characteristics, and potential ecological risks of trace elements (Cu, Pb, Zn, Cr, Ni, and Co) in surface sediment samples collected from 32 sites in Chaohu Lake were investigated. The improved BCR sequential extraction procedure was applied to analyze the chemical forms of the trace elements in the sediments. The enrichment factor (EF), sediment quality guidelines (SQGs), potential ecological risk index (PERI), and risk assessment code (RAC) were employed to evaluate the pollution levels and the potential ecological risks. The results showed that the concentrations of Cu, Pb, Zn, Cr, Ni, and Co in the surface sediments were 78.59, 36.91, 161.84, 98.87, 38.92, and 10.09 mg kg⁻¹, respectively. The lower concentrations of Cu, Pb, Zn, Cr, and Ni were mostly found in the middle part of the lake, while Co increased from the western toward the eastern parts of the lake. Cr, Ni, Co, and Zn existed predominantly in the residual fractions, with average values of 76.35, 59.22, 45.60, and 44.30%, respectively. Cu and Pb were mainly bound to Fe/Mn oxides in the reducible fraction, with average values of 66.4 and 69.1%, respectively. The pollution levels differed among the selected elements: Cu had the highest potential ecological risk, while Cr had the lowest.
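
    For reference, the potential ecological risk index used above is conventionally built from Hakanson's factors:

        C_f^i = \frac{C^i}{C_n^i}, \qquad E_r^i = T_r^i \, C_f^i, \qquad RI = \sum_i E_r^i

    where C^i is the measured concentration of element i, C_n^i its background value, T_r^i its toxic-response factor, E_r^i the potential ecological risk factor of the single element, and RI the combined index.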

  15. Spatial and Temporal Distribution of Trace Metals (Cd, Cu, Ni, Pb, and Zn) in Coastal Waters off the West Coast of Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Tung Jiann

    2014-01-01

    Full Text Available Surface water samples were collected along the west coast of Taiwan during two expedition cruises which represent periods of different regional climatic patterns. Information on hydrochemical parameters such as salinity, nutrients, suspended particulate matter (SPM), and Chlorophyll a concentrations was obtained, and dissolved and particulate trace metal (Cd, Cu, Ni, Pb, and Zn) concentrations were determined. Spatial variations were observed and the differences were attributed to (1) the influence of varying extents of terrestrial inputs from the mountainous rivers of Taiwan to the coast, and (2) urbanization and industrialization in different parts of the island. Geochemical processes such as desorption (Cd) and adsorption to sinking particles (Pb) also contributed to the variability of trace metal distributions in coastal waters. Results showed temporal variations in chemical characteristics in coastal waters as a consequence of prevailing monsoons. During the wet season, when river discharges were higher, the transport of particulate metals was elevated due to increased sediment loads. During the dry season, lower river discharges resulted in a lesser estuarine dilution effect for chemicals of anthropogenic sources, indicated by higher dissolved concentrations present in coastal waters associated with slightly higher salinity.

  16. Distribution of trace elements in organs of six species of cetaceans from the Ligurian Sea (Mediterranean), and the relationship with stable carbon and nitrogen ratios

    Energy Technology Data Exchange (ETDEWEB)

    Capelli, R. [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari - Universita degli Studi di Genova - Via Brigata Salerno, 13 I-16147 Genova (Italy); Das, K. [MARE center, Laboratory for Oceanology, University of Liege, B6 Sart-Tilman, B-4000 Liege (Belgium); Pellegrini, R. De; Drava, G. [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari - Universita degli Studi di Genova - Via Brigata Salerno, 13 I-16147 Genova (Italy); Lepoint, G. [MARE center, Laboratory for Oceanology, University of Liege, B6 Sart-Tilman, B-4000 Liege (Belgium); Miglio, C. [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari - Universita degli Studi di Genova - Via Brigata Salerno, 13 I-16147 Genova (Italy); Minganti, V. [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari - Universita degli Studi di Genova - Via Brigata Salerno, 13 I-16147 Genova (Italy)], E-mail: minganti@dictfa.unige.it; Poggi, R. [Museo Civico di Storia Naturale ' Giacomo Doria' - Via Brigata Liguria, 9 I-16121 Genova (Italy)

    2008-02-15

    Mercury (total and organic), cadmium, lead, copper, iron, manganese, selenium and zinc concentrations were measured in different organs of six cetacean species stranded in an area of extraordinary ecological interest (Cetaceans' Sanctuary of the Mediterranean Sea) along the coast of the Ligurian Sea (North-West Mediterranean). Stable-isotope ratios of carbon ({sup 13}C/{sup 12}C) and nitrogen ({sup 15}N/{sup 14}N) were also measured in the muscle. A significant relationship exists between {sup 15}N/{sup 14}N, mercury concentration and the trophic level. The distribution of essential and non-essential trace elements was studied in several organs, and a significant relationship between selenium and mercury, with a molar ratio close to 1, was found in the cetaceans' kidney, liver and spleen, regardless of their species. High selenium concentrations are generally associated with a low organic to total mercury ratio. While narrow ranges of concentrations were observed for essential elements in most organs, mercury and selenium concentrations are characterised by a wide range of variation. Bio-accumulation and bio-amplification processes in cetaceans can be better understood by comparing trace element concentrations with the stable-isotope data.

  17. Influence of tidal regime on the distribution of trace metals in a contaminated tidal freshwater marsh soil colonized with common reed (Phragmites australis)

    International Nuclear Information System (INIS)

    Teuchies, J.; Deckere, E. de; Bervoets, L.; Meynendonckx, J.; Regenmortel, S. van; Blust, R.; Meire, P.

    2008-01-01

    A historical input of trace metals into tidal marshes fringing the river Scheldt may be a cause for concern. Nevertheless, the specific physicochemical form, rather than the total concentration, determines the ecotoxicological risk of metals in the soil. In this study the effect of tidal regime on the distribution of trace metals in different compartments of the soil was investigated. As, Cd, Cu and Zn concentrations in sediment, pore water and roots were determined along a depth profile. Total sediment metal concentrations were similar at different sites, reflecting pollution history. Pore water metal concentrations were generally higher under less flooded conditions (means of (2.32 ± 0.08) × 10⁻³ mg Cd L⁻¹ and (1.53 ± 0.03) × 10⁻³ mg Cd L⁻¹). Metal concentrations associated with roots (means of 202.47 ± 2.83 mg Cd kg⁻¹ and 69.39 ± 0.99 mg Cd kg⁻¹) were up to 10 times higher than sediment metal concentrations (means of 20.48 ± 0.19 mg Cd kg⁻¹ and 20.42 ± 0.21 mg Cd kg⁻¹) and higher under drier conditions. Despite the high metal concentrations associated with roots, the major part of the metals in the marsh soil is still associated with the sediment, as the overall biomass of roots is small compared to the sediment. - Pore water metal concentrations and metal root plaque concentrations are influenced by the tidal regime

  18. Major, trace, and rare earth elements in the sediments of the Central Indian Ocean Basin: Their source and distribution

    Digital Repository Service at National Institute of Oceanography (India)

    Pattan, J.N.; Jauhari, P.

    The distribution maps of elements show that highest concentrations of Mn, Cu, Ni, Zn, Co, and biogenic opal in the surface sediment occurs between 10 degrees S and 16 degrees S latitude, where diagenetic ferromanganese nodules rich in Mn, Cu, Ni...

  19. Distribution and behavior of major and trace elements in Tokyo Bay, Mutsu Bay and Funka Bay marine sediments

    International Nuclear Information System (INIS)

    Honda, Teruyuki; Kimura, Ken-ichiro

    2003-01-01

    Fourteen major and trace elements in marine sediment core samples collected from the coasts along eastern Japan, i.e. Tokyo Bay (II) (the recess), Tokyo Bay (IV) (the mouth), Mutsu Bay and Funka Bay, and from the Northwest Pacific basin as a comparative subject, were determined by instrumental neutron activation analysis (INAA). The sedimentation rates and sedimentary ages were calculated for the coastal sediment cores by the {sup 210}Pb method. The results obtained in this study are summarized as follows: (1) Lanthanoid abundance patterns suggested that the major origin of the sediments was terrigenous material. The La*/Lu* and Ce*/La* ratios revealed that the sediments from Tokyo Bay (II) and Mutsu Bay reflected riverine contributions more directly than those of the other regions. In addition, the Th/Sc ratio indicated that the coastal sediments mainly originated from material of the volcanic island arcs (the Japanese islands), whereas those from the Northwest Pacific originated mainly from the continent. (2) The correlation between the Ce/U and Th/U ratios, with high correlation coefficients of 0.920 to 0.991, indicated that all the sediments from Tokyo Bay (II) and Funka Bay were in reducing conditions, while at least the upper sediments from Tokyo Bay (IV) and Mutsu Bay were in oxidizing conditions. (3) It became quite obvious that the sedimentation mechanism and the sedimentation environment at Tokyo Bay (II) were different from those at Tokyo Bay (IV), since the sedimentation rate at Tokyo Bay (II) was approximately twice as large as that at Tokyo Bay (IV). The sedimentary age of the 5th layer (8∼10 cm in depth) from Funka Bay was calculated at approximately 1940∼50, which agrees with the period 1943∼45, when Showa-shinzan was formed by the eruption of the Usu volcano. (author)

  20. Brain regional distributions of the minor and trace elements, Na, Mg, Cl, K, Mn, Zn, Rb and Br, in young and aged mice

    International Nuclear Information System (INIS)

    Amano, R.; Oishi, S.; Ishie, M.; Kimura, M.

    2001-01-01

    Regional cerebral concentrations of the minor and trace elements Na, Mg, Cl, K, Mn, Zn, Rb and Br were determined in young and aged mice by instrumental neutron activation analysis of small amounts of regional samples (corpus striatum, cerebellum, cerebral cortex, hippocampus, midbrain, pons and medulla, olfactory bulb). Significant age-related differences were found for the Mn concentration in all brain regions: the Mn concentration of the young brain was higher than that of the aged brain. In addition, Zn was distributed heterogeneously and was highly concentrated in the cerebral cortex and hippocampus regions in both young and aged mice. These results suggest that, in the aged brain, Mn is required less than in the young brain, whereas Zn is required equally in both young and aged brains. (author)

  1. SMITH: a LIMS for handling next-generation sequencing workflows.

    Science.gov (United States)

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and with all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available
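
    The attribute-value table mentioned above is essentially an entity-attribute-value design: arbitrary metadata can be attached to a sample without changing the schema. A generic sketch of the idea follows; it is not SMITH's actual schema, and all table, column and sample names are invented.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""
            CREATE TABLE sample_annotation (
                sample_id TEXT,
                attribute TEXT,
                value     TEXT
            )
        """)

        # Free-form metadata: any attribute can be attached to any sample without schema changes.
        rows = [
            ("S001", "organism", "Homo sapiens"),
            ("S001", "assay", "ChIP-Seq"),
            ("S001", "antibody", "H3K27ac"),
            ("S002", "organism", "Mus musculus"),
            ("S002", "assay", "RNA-Seq"),
        ]
        conn.executemany("INSERT INTO sample_annotation VALUES (?, ?, ?)", rows)

        # Search the metadata, e.g. find all ChIP-Seq samples.
        for (sample_id,) in conn.execute(
                "SELECT sample_id FROM sample_annotation WHERE attribute = 'assay' AND value = 'ChIP-Seq'"):
            print(sample_id)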

  2. SMITH: a LIMS for handling next-generation sequencing workflows

    Science.gov (United States)

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, and to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and with all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The

  3. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  4. Improved Screening Mammogram Workflow by Maximizing PACS Streamlining Capabilities in an Academic Breast Center.

    Science.gov (United States)

    Pham, Ramya; Forsberg, Daniel; Plecha, Donna

    2017-04-01

    The aim of this study was to perform an operational improvement project targeted at the breast imaging reading workflow for mammography examinations at an academic medical center with its associated breast centers and satellite sites. Through careful analysis of the current workflow, two major issues were identified: stockpiling of paperwork and multiple worklists. Both issues were considered to cause significant delays to the start of interpreting screening mammograms. Four workflow changes were suggested (scanning of paperwork, worklist consolidation, use of chat functionality, and tracking of case distribution among trainees) and implemented in July 2015. Timestamp data were collected 2 months before (May-Jun) and after (Aug-Sep) the implemented changes. Generalized linear models were used to analyze the data. The results showed significant improvements for the interpretation of screening mammograms: the average time elapsed to open a case was reduced from 70 to 28 min (a 60 % decrease), while the workflow for diagnostic mammograms remained largely unaltered even with an increased volume of mammography examinations (a 31 % increase, from 4344 examinations in May-Jun to 5678 examinations in Aug-Sep). In conclusion, targeted efforts to improve the breast imaging reading workflow for screening mammograms in a teaching environment provided significant performance improvements without affecting the workflow of diagnostic mammograms.

  5. Spatial distribution of mercury and other trace elements in recent lake sediments from central Alberta, Canada: An assessment of the regional impact of coal-fired power plants

    International Nuclear Information System (INIS)

    Sanei, H.; Goodarzi, F.; Outridge, P.M.

    2010-01-01

    There have been growing concerns over the environmental impacts of the coal-fired power plants in the western Canadian province of Alberta, which collectively comprise one of the largest point sources of Hg and other trace elements nationally. The overall cumulative impact of the power plants since the beginning of their activities several decades ago has been a critical question for industry, government agencies, and the research community. This paper aims to delineate the cumulative geographic extent of impact by investigating the spatial distribution of mercury and other trace elements of environmental concern in nine freshwater lakes, which cover the large area surrounding the coal-fired power plants in central Alberta, Canada. Lead-210 dating was used in conjunction with physical evidence of deposited fly ash to determine the sediments' age and hence the depths corresponding to the onset of coal-fired power generation in 1956. Total mean concentrations and fluxes of elements of environmental concern, with integrated values since 1956, were then determined. The concentration values do not reflect the catastrophic oil spill at Lake Wabamun in 2005. The post-1956 flux rates of As, Cd, Co, Cr, Cu, Hg, Mo, Ni, Pb, Sb, V, W, and Zn were generally highest in sediment cores obtained from two lakes adjacent to power plants. However, the variable prevailing wind directions played an important role in determining the aerial distribution of Hg and other trace elements to the southeast and to the west of the power plants. Post-1956 fluxes of most elements declined downwind (westward), consistent with strong easterly winds transporting metal pollution further to the west of the power plants. However, spatial interpolation of the data suggested a major southern extension to the area of maximum metal deposition, which has not been sampled by this or previous studies in the region. An atmospheric model estimate of total Hg flux in 2007 near the Genesee power plant was
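
    For reference, in its simplest (constant initial concentration) form the lead-210 dating referred to above relies on the decay of unsupported 210Pb with depth:

        A(z) = A_0\, e^{-\lambda t}, \qquad t = \frac{1}{\lambda}\,\ln\frac{A_0}{A(z)}, \qquad \lambda = \frac{\ln 2}{22.3\ \mathrm{yr}}

    where A(z) is the unsupported 210Pb activity at depth z, A_0 the activity at the sediment surface and 22.3 yr the 210Pb half-life; the study may equally have used the constant-rate-of-supply variant, which the abstract does not specify.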

  6. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.
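
    The core idea of documenting a workflow as a directed acyclic graph of steps with attached metadata can be sketched in a few lines. This is a generic illustration of the concept, not the MPO data model or API; the step names and metadata fields are hypothetical.

        from dataclasses import dataclass, field

        @dataclass
        class Step:
            name: str                                     # e.g. "calibrate"
            metadata: dict = field(default_factory=dict)  # free-form annotations
            inputs: list = field(default_factory=list)    # parent Step objects

        def ancestors(step):
            """Walk the provenance DAG upstream from one workflow step."""
            seen = []
            stack = list(step.inputs)
            while stack:
                parent = stack.pop()
                if parent not in seen:
                    seen.append(parent)
                    stack.extend(parent.inputs)
            return seen

        # Hypothetical three-step workflow: raw data -> calibration -> analysis.
        raw = Step("acquire_shot_data", {"shot": 12345})
        cal = Step("calibrate", {"version": "2.1"}, inputs=[raw])
        ana = Step("analyze_profiles", {"code": "transp"}, inputs=[cal])
        print([s.name for s in ancestors(ana)])   # ['calibrate', 'acquire_shot_data']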

  7. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    Nilsen, Dimitri; Weber, Pavel

    2014-01-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.
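
    The Puppet-Redmine coupling described above could, for example, be realized by having a workflow step post a journal note to the corresponding issue through Redmine's REST API. The snippet below is a generic illustration assuming the standard Redmine JSON endpoint; the host, issue number and API key are placeholders, not GridKa specifics.

        import requests

        REDMINE_URL = "https://redmine.example.org"   # placeholder host
        API_KEY = "secret-api-key"                    # placeholder credential
        ISSUE_ID = 1234                               # placeholder issue

        def add_workflow_note(issue_id, note):
            """Append a journal note to a Redmine issue after a workflow step."""
            resp = requests.put(
                f"{REDMINE_URL}/issues/{issue_id}.json",
                json={"issue": {"notes": note}},
                headers={"X-Redmine-API-Key": API_KEY},
                timeout=10,
            )
            resp.raise_for_status()

        add_workflow_note(ISSUE_ID, "Puppet run finished: host provisioning completed.")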

  8. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  9. Nuclear traces in glass

    International Nuclear Information System (INIS)

    Segovia A, M. de N.

    1978-01-01

    The charged particles produce, in dielectric materials, physical and chemical effects which make evident the damaged zone along the trajectory of the particle. This damaged zone is known as the latent trace. Latent traces can be enlarged by etching the detector material. This treatment preferentially attacks the zones of the material where the charged particles have penetrated, producing concavities which can be observed through a low-magnification optical microscope. These concavities are known as developed traces. In this work we describe the characteristics of glass as a detector of fission-fragment traces. In the first chapter we present a summary of the existing basic theories explaining the formation of traces in solids. In the second chapter we describe the etching method used for trace development. In the following chapters we determine some characteristics of the traces formed on the glass, such as: the optimum development time; the variation of trace diameter and density with the temperature of the detector; the response of the glass to radiation more penetrating than fission fragments; and the distribution of the developed traces and the relation between this distribution and the energies of the 252 Cf fission fragments. The method used is simple and cheap and can be employed in laboratories with limited resources. The commercial glass employed allows the registration of fission fragments and hence the realization of experiments involving both trace counting and particle identification. (author)

  10. Study on the contents of trace rare earth elements and their distribution in wheat and rice samples by RNAA

    International Nuclear Information System (INIS)

    Sun Jingxin; Zhao Hang; Wang Yuqi

    1994-01-01

    The concentrations of 8 REE (La, Ce, Nd, Sm, Eu, Tb, Yb and Lu) in wheat and rice samples have been determined by RNAA. The contents and distributions of REE in each part of the plants (i.e. root, leaf, stem, husk and seed) and in their host soils were studied, for samples to which rare earth elements had been applied in farming as well as for control samples. The effects of applying rare earths on the uptake of REE by the plants, and the implications of REE accumulation in the grains for human health, are also discussed. (author) 9 refs.; 4 figs.; 4 tabs

  11. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.
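
    Read as events per unit time per unit cost, the ETC figure of merit can be used to rank candidate configurations as in the toy comparison below. The configuration names, event counts, times and costs are made up for illustration; the reading of the metric is one possible interpretation, not taken from the paper.

        def etc(events, wall_time_hours, total_cost):
            """One reading of the ETC metric: events per unit time per unit cost."""
            return events / wall_time_hours / total_cost

        # Hypothetical configurations: (events processed, wall time [h], total cost [$]).
        configs = {
            "8 cores, no overcommit": (4000, 10.0, 4.0),
            "8 cores, 2x overcommit": (5200, 10.0, 4.0),
            "16 cores":               (7600, 10.0, 8.0),
        }

        for name, (events, hours, cost) in configs.items():
            print(f"{name:<24} ETC = {etc(events, hours, cost):7.1f}")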

  12. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.

  13. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2018-01-01

    From 2025 onwards, the ATLAS collaboration at the Large Hadron Collider (LHC) at CERN will experience a massive increase in data quantity as well as complexity. Including mitigating factors, the prevalent computing power by that time will only fulfil one tenth of the requirement. This contribution will focus on Cloud computing as an approach to help overcome this challenge by providing flexible hardware that can be configured to the specific needs of a workflow. Experience with Cloud computing exists, but there is large uncertainty as to whether and to what degree it will be able to reduce the burden by 2025. In order to understand and quantify the benefits of Cloud computing, the "Workflow and Infrastructure Model" was created. It estimates the viability of Cloud computing by combining different inputs from the workflow side with infrastructure specifications. The model delivers metrics that enable the comparison of different Cloud configurations as well as different Cloud offerings with each other. A wide range of r...

  14. Impact of CGNS on CFD Workflow

    Science.gov (United States)

    Poinot, M.; Rumsey, C. L.; Mani, M.

    2004-01-01

    CFD tools are an integral part of industrial and research processes, for which the amount of data is increasing at a high rate. These data are used in a multi-disciplinary fluid dynamics environment, including structural, thermal, chemical or even electrical topics. We show that the data specification is an important challenge that must be tackled to achieve an efficient workflow for use in this environment. We compare the process with other software techniques, such as network or database type, where past experiences showed how difficult it was to bridge the gap between completely general specifications and dedicated specific applications. We show two aspects of the use of CFD General Notation System (CGNS) that impact CFD workflow: as a data specification framework and as a data storage means. Then, we give examples of projects involving CFD workflows where the use of the CGNS standard leads to a useful method either for data specification, exchange, or storage.

  15. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
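
    The kind of quantitative question asked of such models (for example, the expected drug stock consumed by a probabilistic medical workflow) can be illustrated with a small hand-rolled expected-reward computation over a branching process. This is a toy sketch with invented probabilities and rewards, not the authors' BPMN-to-model-checker tool chain.

        # Each node is (reward_here, [(probability, next_node), ...]); an empty
        # successor list marks the end of the workflow.
        workflow = {
            "triage":        (0, [(0.7, "standard_dose"), (0.3, "high_dose")]),
            "standard_dose": (1, [(1.0, "discharge")]),
            "high_dose":     (3, [(0.9, "discharge"), (0.1, "standard_dose")]),
            "discharge":     (0, []),
        }

        def expected_reward(node, memo=None):
            """Expected total reward (e.g. drug units) accumulated from `node` onwards."""
            memo = {} if memo is None else memo
            if node in memo:
                return memo[node]
            reward, successors = workflow[node]
            memo[node] = reward + sum(p * expected_reward(nxt, memo) for p, nxt in successors)
            return memo[node]

        print(expected_reward("triage"))   # 0.7*1 + 0.3*(3 + 0.1*1) = 1.63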

  16. Distribution of persistent organic pollutants and trace metals in surface waters in the Seversky Donets River basin (Eastern Ukraine)

    Science.gov (United States)

    Diadin, Dmytro; Celle-Jeanton, Hélène; Steinmann, Marc; Loup, Christophe; Crini, Nadia; Vystavna, Yuliya; Vergeles, Yuri; Huneau, Frédéric

    2017-04-01

    The paper is focused on surface water of the Seversky Donets River Basin in Eastern Ukraine, which undergoes significant anthropogenic pressure due to municipal and industrial wastewater discharge, polluted runoff from both urban and agricultural areas, and leakages at oil-gas extraction sites located in the region. Under these conditions the Seversky Donets River is used for the drinking water supply of the city of Kharkiv, with 1.5 million inhabitants, as well as of other smaller settlements in the basin. The diversity of water pollution sources makes it reasonable to use complex indicators and assessment approaches, such as a combination of organic and inorganic pollutants. We have studied the distribution of major ions, metals and persistent organic compounds (PAHs and PCBs) in water of the Seversky Donets River and its tributaries. In total 20 sites have been sampled over a river catchment area of about 4.5 thousand km2. PAHs and PCBs were measured in surface water for the first time in the region. The most distinctive transformations of water composition have been found downstream of the wastewater treatment plants of the city of Kharkiv, where a treated mixture of municipal and industrial wastewater is discharged to the river. Metals such as Ni, Zn and Cr, in combination with phosphates and nitrates, have shown significant positive correlations, indicating a common source of input. Ten of the sixteen PAHs were detected in measurable concentrations in at least one sample of river water. The sum of PAHs ranged from 15.3 to 117.2 ng/L with a mean of 43.8 ng/L. The PAH ratios indicate pyrogenic rather than petrogenic inputs at all the studied sites. Elevated concentrations of low molecular weight PAHs in water were found close to a coal-burning power station and a coke chemical plant, also confirming their origin from coal combustion and subsequent atmospheric deposition. The PCB distribution appeared to be relatively uniform over the territory despite the vast area of the basin researched

  17. A virtual radiation therapy workflow training simulation

    International Nuclear Information System (INIS)

    Bridge, P.; Crowe, S.B.; Gibson, G.; Ellemor, N.J.; Hargrave, C.; Carmichael, M.

    2016-01-01

    Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment.

  18. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  19. A Prudent Approach to Fair Use Workflow

    Directory of Open Access Journals (Sweden)

    Karey Patterson

    2018-02-01

    This poster outlines a new, highly efficient workflow for the management of copyright materials that is prudent and accommodates generally and legally accepted Fair Use limits. The workflow gives library or copyright staff an easy means to keep on top of their copyright obligations, manage licenses, and review and adjust schedules, while remaining an efficient way to cope with large numbers of requests to use materials. The poster details speed and efficiency gains for professors and library staff while reducing legal exposure.

  20. Applying of factor analyses for determination of trace elements distribution in water from Vardar and its tributaries, Macedonia/Greece.

    Science.gov (United States)

    Popov, Stanko Ilić; Stafilov, Trajče; Sajn, Robert; Tănăselia, Claudiu; Bačeva, Katerina

    2014-01-01

    A systematic study was carried out to investigate the distribution of fifty-six elements in water samples from the river Vardar (Republic of Macedonia and Greece) and its major tributaries. The samples were collected from 27 sampling sites. Analyses were performed by inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma atomic emission spectrometry (ICP-AES). Cluster and R-mode factor analysis (FA) were used to identify and characterise element associations, and four associations of elements were determined by this multivariate method. Three factors represent associations of elements that occur in the river water naturally, while Factor 3 represents an anthropogenic association of elements (Cd, Ga, In, Pb, Re, Tl, Cu, and Zn) introduced into the river waters from the waste waters of the mining and metallurgical activities in the country.
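
    As an illustration of the kind of multivariate workflow used here (not the authors' code), a hypothetical element-concentration matrix can be decomposed with R-mode factor analysis as follows; the data, the number of factors and the log-standardisation step are assumptions for the sketch.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # Hypothetical data: 27 sampling sites x 6 elements; log-transformed
        # concentrations are commonly standardised before factor analysis.
        X = rng.lognormal(mean=0.0, sigma=1.0, size=(27, 6))
        X_std = StandardScaler().fit_transform(np.log10(X))

        fa = FactorAnalysis(n_components=4, random_state=0)
        scores = fa.fit_transform(X_std)        # factor scores per site
        loadings = fa.components_               # factor loadings per element

        print("loadings shape:", loadings.shape)   # (4, 6): 4 factors x 6 elements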

  1. Long-term Geochemical Evolution of Lithogenic Versus Anthropogenic Distribution of Macro and Trace Elements in Household Attic Dust.

    Science.gov (United States)

    Balabanova, Biljana; Stafilov, Trajče; Šajn, Robert; Tănăselia, Claudiu

    2017-01-01

    Attic dusts were examined as historical archives of anthropogenic emissions, with the goal of elucidating the enrichment pathways associated with hydrothermal exploitation of Cu, Pb, and Zn minerals in the Bregalnica River basin in the eastern part of the Republic of Macedonia. Dust samples were collected from 84 settlements. Atomic emission spectrometry and mass spectrometry with inductively coupled plasma were applied as analytical techniques for the determination of 69 element contents. Multivariate analysis was applied to extract the dominant geochemical markers. The lithogenic distribution was simplified to six dominant geochemical markers: F1: Ga-Nb-Ta-Y-(La-Gd)-(Eu-Lu); F2: Be-Cr-Li-Mg-Ni; F3: Ag-Bi-Cd-Cu-In-Mn-Pb-Sb-Te-W-Zn; F4: Ba-Cs-Hf-Pd-Rb-Sr-Tl-Zr; F5: As-Co-Ge-V; and F6: K-Na-Sc-Ti. The anthropogenic effect on air pollution was marked by the dominance of F3 and the secondary dominance of F5; the fifth factor was also identified as a lithogenic marker for the occurrence of the very old Riphean shales. The first factor, despite its heterogeneity, represents a natural association tracking deposition in areas of Proterozoic gneisses, while the distribution of fine particles was associated with carbonate-silicate volcanic rocks. Intensive poly-metallic dust deposition was recorded only in the surroundings of localities where hydrothermal extraction is implemented. Long-term deposition can therefore serve as a pollution index for these hot spots, mainly affecting Cd, Pb, and Zn, whose deposition reaches values as high as 25, 3900, and 3200 mg/kg, respectively.

  2. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets: it first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
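
    The core idea of such a model (roles assigned to users, tasks authorized per role, and authorizations that can change as the workflow instance progresses) can be sketched as follows. This is a generic illustration, not the WACM formalization; the users, roles and separation-of-duty rule are invented.

        # Static role assignment and per-task role requirements.
        user_roles = {"alice": {"clerk"}, "bob": {"manager"}}
        task_roles = {"enter_order": {"clerk"}, "approve_order": {"manager"}}

        # Dynamic constraint: whoever entered an order must not approve it
        # (separation of duty), recorded as the workflow instance progresses.
        history = {}   # task name -> user who performed it

        def may_perform(user, task):
            if not user_roles.get(user, set()) & task_roles.get(task, set()):
                return False                      # no authorizing role
            if task == "approve_order" and history.get("enter_order") == user:
                return False                      # dynamic separation-of-duty rule
            return True

        history["enter_order"] = "alice"
        print(may_perform("alice", "approve_order"))   # False: lacks 'manager' role
        print(may_perform("bob", "approve_order"))     # True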

  3. Contribution to investigations on trace elements transport in the Channel: spatial distribution of industrial tracers in mytilus edulis and fucus serratus

    International Nuclear Information System (INIS)

    Germain, P.; Masson, M.; Baron, Y.

    1990-01-01

    The distribution of artificial tracers (gamma emitters) has been studied in biological indicator species, mussels and fucus, along the French and English Channel shores in order to gain a better knowledge of trace element transport in Channel coastal areas. The main conclusions are supplied by 106 Ru-Rh and 60 Co. The extension of species labelling is larger eastwards than westwards, and the differences recorded between the French and English shores show weak exchanges between the southern and northern Channel; in the Norman-Breton gulf and in the Seine river bay, the distribution of radioactive tracers demonstrates complex current processes. The results are compared to hydrodynamical studies carried out through models and the follow-up of radioactive tracers in sea water. Particular processes have been observed, corresponding to areas where the decay gradient from the source term is not respected (western Cotentin shore, western Seine bay, Caux area). They are discussed in relation to fresh-water/sea-water mixing, currents and physico-chemical processes [fr

  4. The influence of precipitation kinetics on trace element partitioning between solid and liquid solutions: A coupled fluid dynamics/thermodynamics framework to predict distribution coefficients

    Science.gov (United States)

    Kavner, A.

    2017-12-01

    In a multicomponent multiphase geochemical system undergoing a chemical reaction such as precipitation and/or dissolution, the partitioning of species between phases is determined by a combination of thermodynamic properties and transport processes. The interpretation of the observed distribution of trace elements requires models integrating coupled chemistry and mechanical transport. Here, a framework is presented that predicts the kinetic effects on the distribution of species between two reacting phases. Based on a perturbation theory combining Navier-Stokes fluid flow and chemical reactivity, the framework predicts rate-dependent partition coefficients in a variety of different systems. We present the theoretical framework, with applications to two systems: 1. species- and isotope-dependent Soret diffusion of species in a multicomponent silicate melt subjected to a temperature gradient, and 2. elemental partitioning and isotope fractionation during precipitation of a multicomponent solid from a multicomponent liquid phase. Predictions will be compared with results from experimental studies. The approach has applications for understanding chemical exchange at boundary layers, such as in the Earth's surface magmatic systems and at the core/mantle boundary.
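
    For orientation, the classical boundary-layer treatment of growth-rate-dependent partitioning (Burton, Prim and Slichter) gives an effective distribution coefficient of the form below. It is quoted here only as the standard reference point for this class of kinetic models, not as the specific framework of this abstract.

        $k_{eff} = \dfrac{k_0}{k_0 + (1 - k_0)\, e^{-V\delta/D}}$

    where $k_0$ is the equilibrium partition coefficient, $V$ the growth (precipitation) velocity, $\delta$ the boundary-layer thickness, and $D$ the diffusivity of the trace element in the liquid; as $V \to 0$, $k_{eff} \to k_0$, and at fast growth $k_{eff} \to 1$.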

  5. Regional distribution and losses of end-of-life steel throughout multiple product life cycles-Insights from the global multiregional MaTrace model.

    Science.gov (United States)

    Pauliuk, Stefan; Kondo, Yasushi; Nakamura, Shinichiro; Nakajima, Kenichi

    2017-01-01

    Substantial amounts of post-consumer scrap are exported to other regions or lost during recovery and remelting, and both export and losses constrain the goal of regionally closed material cycles. To quantify the challenges and trade-offs associated with closed-loop metal recycling, we looked at the material cycles from the perspective of a single material unit and traced a unit of material through several product life cycles. Focusing on steel, we used current process parameters, loss rates, and trade patterns of the steel cycle to study how steel that was originally contained in high-quality applications with stringent purity requirements, such as machinery or vehicles, gets subsequently distributed across different regions and product groups with less stringent purity requirements, such as building and construction. We applied MaTrace Global, a supply-driven multiregional model of steel flows coupled to a dynamic stock model of steel use. We found that, depending on region and product group, up to 95% of the steel consumed today will leave the use phase of that region by 2100, and that up to 50% can get lost in obsolete stocks, landfills, or slag piles by 2100. The high losses resulting from business-as-usual scrap recovery and recycling can be reduced, both by diverting postconsumer scrap into long-lived applications such as buildings and by improving the recovery rates in the waste management and remelting industries. Because the lifetimes of high-quality (cold-rolled) steel applications are shorter and remelting occurs more often than for buildings and infrastructure, we found and quantified a tradeoff between low losses and high-quality applications in the steel cycle. Furthermore, we found that with current trade patterns, reduced overall losses will lead to higher fractions of secondary steel being exported to other regions. Current loss rates, product lifetimes, and trade patterns impede the closure of the steel cycle.
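
    The supply-driven tracing idea (follow one cohort of steel through repeated use phases, subtracting losses at each recovery and remelting step) can be illustrated with a deliberately simplified single-region sketch. The loss rates and number of cycles below are placeholders, not MaTrace parameters.

        def trace_steel(initial_tonnes, cycles, recovery_rate=0.90, remelt_yield=0.95):
            """Follow one cohort of steel through repeated product life cycles."""
            in_use, cumulative_losses = initial_tonnes, 0.0
            for cycle in range(1, cycles + 1):
                recovered = in_use * recovery_rate          # scrap collected at end of life
                remelted = recovered * remelt_yield         # survives remelting into new steel
                cumulative_losses += in_use - remelted      # obsolete stocks, landfill, slag
                in_use = remelted
                print(f"cycle {cycle}: {in_use:6.1f} t still in use, "
                      f"{cumulative_losses:6.1f} t lost so far")
            return in_use, cumulative_losses

        # 100 t of steel followed over five product life cycles.
        trace_steel(100.0, cycles=5)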

  6. Trace metal distributions in the sediments from river-reservoir systems: case of the Congo River and Lake Ma Vallée, Kinshasa (Democratic Republic of Congo).

    Science.gov (United States)

    Mwanamoki, Paola M; Devarajan, Naresh; Niane, Birane; Ngelinkoto, Patience; Thevenon, Florian; Nlandu, José W; Mpiana, Pius T; Prabakar, Kandasamy; Mubedi, Josué I; Kabele, Christophe G; Wildi, Walter; Poté, John

    2015-01-01

    The contamination of drinking water resources by toxic metals is a major problem in many parts of the world, particularly in densely populated areas of developing countries that lack wastewater treatment facilities. The present study characterizes the recent temporal evolution of some contaminants deposited in the Congo River and Lake Ma Vallée, both located in the vicinity of the large city of Kinshasa, capital of the Democratic Republic of Congo (DRC). Physicochemical parameters including grain size distribution, organic matter and trace element concentrations were measured in sediment cores sampled from the Congo River (n = 3) and Lake Ma Vallée (n = 2). The maximum concentrations of trace elements in sediment profiles were found in the samples from the sites of Pool Malebo, with values of 107.2, 111.7, 88.6, 39.3, 15.4, 6.1 and 4.7 mg kg(-1) for Cr, Ni, Zn, Cu, Pb, As and Hg, respectively. This site, which is characterized by intense human activities, is especially well known for the construction of numerous boats that are used for regular navigation on the Congo River. In Lake Ma Vallée, the concentrations of all metals are generally low, with maximum values of 26.3, 53.6, 16.1, 15.3, 6.5 and 1.8 mg kg(-1) for Cr, Ni, Zn, Cu, Pb and As, respectively. However, the comparison of the metal profiles retrieved from the different sampled cores also reveals specific variations. The results of this study point to sediment pollution by toxic metals in the Congo River Basin. This research presents useful tools for the evaluation of sediment contamination of river-reservoir systems.

  7. Distribution and risk assessment of trace metals in Leptodius exarata, surface water and sediments from Douglas Creek in the Qua Iboe Estuary

    Directory of Open Access Journals (Sweden)

    Nsikak U. Benson

    2017-05-01

    Five trace metals in Leptodius exarata, epipellic sediments and surface water from an intertidal ecosystem in the Niger Delta (Nigeria) were studied to evaluate their spatial distributions, degrees of contamination, and associated ecological and health risks. The results show that the Cd (cadmium), Cr (chromium), Ni (nickel), Pb (lead) and Zn (zinc) concentrations in sediment range from 0.550–1.142, 9.57–15.95, 9.15–13.96, 2.00–8.90 and 91.5–121.6 mg kg−1 dw, respectively, while the L. exarata tissue metal content varies from 0.162–0.931, 3.81–8.62, 4.45–17.15, 1.90–7.35, and 125.55–269.75 mg kg−1 dw, respectively. The bioconcentration factors of the trace metals were found to follow the sequence Zn > Ni > Pb > Cr > Cd. The high biota-to-sediment accumulation factor (BSAF) found for L. exarata marks it as a sentinel metal bioindicator. Sediments from most sites were found to be uncontaminated to moderately contaminated (geoaccumulation index, Igeo > 0), with Cd and Zn associated with anthropogenic inputs. Low mean-ERM (effect range-median) and mean-PEL (probable effect level) quotients were found for the sediments, indicating low to moderate degrees of contamination with 30% and 21% probabilities of toxicity. The multi-metal potential ecological risk index (RI) for the intertidal ecosystem denotes low to moderate risk. Health risks associated with crab (L. exarata) consumption are more significant for children than for adults.
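
    The indices referred to above follow standard definitions; as a reminder, they are quoted here in their usual textbook form rather than transcribed from this paper:

        $I_{geo} = \log_2\!\left(\dfrac{C_n}{1.5\, B_n}\right), \qquad \mathrm{BSAF} = \dfrac{C_{biota}}{C_{sediment}}$

    where $C_n$ is the measured concentration of metal $n$ in the sediment, $B_n$ its geochemical background value (the factor 1.5 compensates for natural lithogenic variability), and $C_{biota}$, $C_{sediment}$ are the tissue and sediment concentrations on a dry-weight basis.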

  8. Study on rice absorption and distribution of Cd in applying Zn fertilizer with 65Zn, 115Cdm tracing technique

    International Nuclear Information System (INIS)

    Tang Nianxin; Shen Jinxiong

    1994-01-01

    Results of a study using 65 Zn and 115 Cd m tracers show that, as the amount of Cd accompanying Zn fertilizer applied to soil increases, rice shows growth retardation and delayed tillering in the earlier growing stage. The inhibition weakens as rice growth progresses. A very small quantity of Cd might be helpful to the growth of rice, but serious inhibition of rice growth occurs when the amount of Cd reaches a certain limit (64 x 10 -6 ). The distribution of Cd in a rice plant follows the order root > stem and leaves > brown rice > ear stalk > rice shell. Cd is mainly accumulated in the rice root, accounting for 90% of the total amount of Cd contained in the whole plant. The amount of Cd absorbed by rice increases with the amount of Cd applied to soil, though the total absorption is extremely low; for example, only about 0.1% of the applied amount could be absorbed by two crops of rice, and most of the applied Cd is retained in the soil. Less and less Zn could be absorbed and utilized by rice as the amount of applied Cd increased. Application of Mn fertilizer negatively affects the absorption of Cd by rice, especially in brown rice

  9. Trace analysis

    International Nuclear Information System (INIS)

    Warner, M.

    1987-01-01

    What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques

  10. Tracing source, distribution and health risk of potentially harmful elements (PHEs) in street dust of Durgapur, India.

    Science.gov (United States)

    Gope, Manash; Masto, Reginald Ebhin; George, Joshy; Balachandran, Srinivasan

    2018-06-15

    Street dust samples from Durgapur, the steel city of eastern India, were collected from five different land use patterns, i.e., national highways, urban residential areas, sensitive areas, industrial areas and busy traffic zones, during summer, monsoon, and winter to analyze the pollution characteristics, chemical fractionation, source apportionment and health risk of heavy metals (HMs). The samples were fractionated to ≤ 53 µm and analyzed for potentially harmful elements (PHEs), viz. Cd, Cr, Cu, Fe, Mn, Ni, Pb, and Zn. The summer season showed higher concentrations of PHEs than the other two seasons. Mean enrichment factor (EF), geo-accumulation index (Igeo), and contamination factor (CF) were high for Cd, followed by Pb, during all three seasons in Durgapur. Chemical fractionation was carried out in order to obtain distribution patterns of PHEs and to evaluate their bioavailable fractions in street dust samples. Mn was found to be highly bioavailable, and the bioavailability of the PHEs followed the order Mn > Zn > Pb > Ni > Cd > Cu > Fe > Cr. Principal Component Analysis (PCA), cluster analysis and correlation analysis indicated that the main sources of PHEs could be industrial (especially the coal-powered thermal plant, iron and steel industries and cement industries) and vehicular. Multivariate analysis of variance (MANOVA) indicated that sites, seasons and their interaction significantly affected the different PHEs as a whole. The health risk was calculated with the total metal content as well as the mobile fraction of PHEs, which indicated that the actual non-carcinogenic risk due to bioavailable PHEs was lower (HI < 1) than that computed from total concentrations of PHEs. Carcinogenic risk was observed for total Cr in street dust (child: 4.6E-06; adult: 3.6E-06). Copyright © 2018 Elsevier Inc. All rights reserved.
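
    For reference, the screening indices used in street-dust studies of this type are conventionally defined as follows (standard forms, not transcribed from this paper):

        $\mathrm{EF} = \dfrac{(C_x/C_{ref})_{sample}}{(C_x/C_{ref})_{background}}, \qquad \mathrm{HI} = \sum_i \mathrm{HQ}_i = \sum_i \dfrac{\mathrm{ADD}_i}{\mathrm{RfD}_i}, \qquad \mathrm{CR} = \mathrm{ADD} \times \mathrm{SF}$

    where $C_{ref}$ is a conservative reference element (often Fe or Al), ADD the average daily dose for an exposure route, RfD the reference dose, and SF the carcinogenic slope factor; HI < 1 indicates negligible non-carcinogenic risk.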

  11. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    , we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL....

  12. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  13. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  14. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  15. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  16. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  17. Physical and chemical properties of the regional mixed layer of Mexico's Megapolis Part II: evaluation of measured and modeled trace gases and particle size distributions

    Directory of Open Access Journals (Sweden)

    C. Ochoa

    2012-11-01

    This study extends the work of Baumgardner et al. (2009), in which measurements of trace gases and particles at a remote, high-altitude mountain site 60 km from Mexico City were analyzed with respect to the origin of the air masses. In the current evaluation, the temperature, water vapor mixing ratio (WMR), ozone (O3), carbon monoxide (CO), sulfur dioxide (SO2) and acyl peroxy nitrate (APN) are simulated with the WRF-Chem chemical transport model and compared with the measurements at the mountain site. Comparisons between the model and measurements are also evaluated for particle size distributions (PSDs) of the mass concentrations of sulfate, nitrate, ammonium and organic mass (OM). The model predictions of the diurnal trends in temperature, WMR and trace gases were generally well correlated; 13 of the 18 correlations were significant at a confidence level of <0.01. Less satisfactory were the average hourly differences between model and measurements, which showed predicted values within expected, natural variation for only 10 of the 18 comparisons. The model performed best when compared with the measurements during periods when the air originated from the east. In that case all six of the parameters being compared had average differences between the model and measurements less than the expected standard deviation. For the cases when the air masses are from the southwest or west northwest, only two of the comparisons from each case showed differences less than the expected standard deviation. The differences appear to be a result of an overly rapid growth of the boundary layer predicted by the model and too much dilution. There is also more O3 being produced downwind of the emission sources, most likely by photochemical production, than is predicted by the model.

    The measured and modeled PSD compare very well with respect to their general shape and the diameter of the peak concentrations. The spectra are log

  18. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology care process of the Academic Medical Center (AMC) hospital is used as reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems...

  19. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous
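
    The generic, re-usable workflow operators discussed here (sequence, conditional branching, iteration over inputs) can be expressed compactly as higher-order functions. The sketch below is a language-level illustration of the idea, not GPIPE's XML pipeline format; the toy steps and data are invented.

        # Minimal composable workflow operators: each step is a function
        # taking one value and returning one value.

        def sequence(*steps):
            """Run steps one after another, feeding each output to the next."""
            def run(x):
                for step in steps:
                    x = step(x)
                return x
            return run

        def conditional(predicate, if_true, if_false):
            """Branch the workflow on a predicate over the intermediate result."""
            return lambda x: if_true(x) if predicate(x) else if_false(x)

        def iterate(step):
            """Map a step over a collection of inputs (e.g. a set of sequences)."""
            return lambda xs: [step(x) for x in xs]

        # Hypothetical toy pipeline over DNA strings.
        clean = lambda s: s.strip().upper()
        gc_content = lambda s: (s.count("G") + s.count("C")) / len(s)
        flag = conditional(lambda gc: gc > 0.5,
                           lambda gc: ("GC-rich", gc),
                           lambda gc: ("AT-rich", gc))

        pipeline = iterate(sequence(clean, gc_content, flag))
        print(pipeline([" acgtgc ", "attata"]))   # [('GC-rich', 0.666...), ('AT-rich', 0.0)]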

  20. BIFI: a Taverna plugin for a simplified and user-friendly workflow platform.

    Science.gov (United States)

    Yildiz, Ahmet; Dilaveroglu, Erkan; Visne, Ilhami; Günay, Bilal; Sefer, Emrah; Weinhausel, Andreas; Rattay, Frank; Goble, Carole A; Pandey, Ram Vinay; Kriegner, Albert

    2014-10-20

    Heterogeneity in the features, input-output behaviour and user interface of available bioinformatics tools and services is still a bottleneck for both expert and non-expert users. Advancements in providing common interfaces over such tools and services are gaining interest among researchers. However, the lack of (meta-)information about input-output data and parameters prevents automated and standardized solutions that could assist users in setting the appropriate parameters. These limitations must be resolved, especially in workflow-based solutions, in order to ease the integration of software. We report a Taverna Workbench plugin, the XworX BIFI (Beautiful Interfaces for Inputs), implemented as a solution to the aforementioned issues. BIFI provides a Graphical User Interface (GUI) definition language used to lay out the user interface and to define parameter options for Taverna workflows. BIFI is also able to submit GUI Definition Files (GDF) directly or discover appropriate instances from a configured repository. In the absence of a GDF, BIFI generates a default interface. The Taverna Workbench is open source software providing the ability to combine various services within a workflow. Nevertheless, users can supply input data to the workflow only via a simple user interface providing a text area to enter the input in text form. The workflow may contain meta-information in human readable form, such as description text for the port and an example value. However, not all workflow ports are documented so well or have all the required information. BIFI uses custom user interface components for ports, which give users feedback on the parameter data type or structure to be used for service execution and enable client-side data validation. Moreover, BIFI offers user interfaces that allow users to interactively construct workflow views and share them with the community, thus significantly increasing usability of heterogeneous, distributed service

  1. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    and can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency...... on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal...... synchronisation mechanisms on-the-fly, depending on the location and number of channel-ends connected. Thus it may start out as a simple local pipe and evolve into a distributed channel spanning multiple nodes. This channel is a necessary next step for PyCSP to allow for complete freedom in executing CSP...
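
    The CSP process/channel style that PyCSP builds on can be illustrated with a stdlib-only producer/consumer sketch. Plain threads and queues are used here deliberately rather than the PyCSP API, whose decorators and channel objects differ; the work items and the squaring step are invented.

        import threading
        import queue

        def producer(chan_out, n):
            """Emit n work items, then a sentinel to signal termination."""
            for i in range(n):
                chan_out.put(i)
            chan_out.put(None)

        def consumer(chan_in, results):
            """Read items until the sentinel, simulating a workflow stage."""
            while True:
                item = chan_in.get()
                if item is None:
                    break
                results.append(item * item)

        channel = queue.Queue()      # stands in for a CSP channel
        results = []
        threads = [
            threading.Thread(target=producer, args=(channel, 5)),
            threading.Thread(target=consumer, args=(channel, results)),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(results)   # [0, 1, 4, 9, 16]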

  2. Performing Workflows in Pervasive Environments Based on Context Specifications

    OpenAIRE

    Xiping Liu; Jianxin Chen

    2010-01-01

    The workflow performance consists of the performance of activities and transitions between activities. Along with the fast development of varied computing devices, activities in workflows and transitions between activities can be performed in pervasive ways, which causes the workflow performance to migrate from traditional computing environments to pervasive environments. Performing workflows in pervasive environments requires taking account of the context information which affects b...

  3. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continues to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context

  4. A workflow for sub-/seismic structure and deformation quantification of 3-D reflection seismic data sets across different scales

    Energy Technology Data Exchange (ETDEWEB)

    Krawczyk, C.M.; Lohr, T.; Oncken, O. [GFZ Potsdam (Germany); Tanner, D.C. [Goettingen Univ. (Germany). GZG; Endres, H. [RWTH Aachen (Germany)]|[TEEC, Isernhagen (Germany); Trappe, H.; Kukla, P. [TEEC, Isernhagen (Germany)

    2007-09-13

    The evolution of a sedimentary basin is mostly affected by deformation. Large-scale, subsurface deformation is typically identified by seismic data, sub-seismic small-scale fractures by well data. Between these two methods, we lack a deeper understanding of how deformation scales. We analysed a 3-D reflection seismic data set in the North German Basin, in order to determine the magnitude and distribution of deformation and its accumulation in space and time. A five-step approach is introduced for quantitative deformation and fracture prediction. An increased resolution of subtle tectonic lineaments is achieved by coherency processing, allowing the kinematics of the North German Basin to be unravelled from structural interpretation. Extensional events during basin initiation and later inversion are evident. 3-D retrodeformation shows major-strain magnitudes between 0 and 20% up to 1.3 km away from a fault trace, and variable deviations of associated extensional fractures. Good correlation of FMI data with the strain distribution from retro-deformation and from geostatistic tools (see also Trappe et al., this volume) allows the validation of the results and makes the prediction of small-scale faults/fractures possible. The temporal component will be gained in the future by analogue models. The suggested workflow is applicable to reflection seismic surveys and yields in great detail both the tectonic history of a region and predictions for hydrocarbon plays or deep groundwater or geothermal reservoirs. (orig.)

  5. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long lasting cross correlation analysis and high resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results, can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine grained provenance and the development of provenance aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user defined terms and annotations. The current implementation of the system is supported by the EU-Funded VERCE (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage
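
    At its simplest, runtime provenance collection means emitting, for every data transformation, a record linking outputs to inputs, the generating activity and its parameters. The W3C-PROV-flavoured sketch below uses plain dictionaries and is not the VERCE implementation or the prov library API; the activity names and parameters are invented seismology-style placeholders.

        import time
        import uuid

        provenance_log = []   # would be a NoSQL store in a real deployment

        def record_activity(name, used, parameters):
            """Log one workflow step and return the id of the entity it generated."""
            generated = f"entity:{uuid.uuid4()}"
            provenance_log.append({
                "activity": name,
                "startedAtTime": time.time(),
                "used": used,               # input entity ids
                "generated": generated,     # output entity id
                "parameters": parameters,   # community-specific metadata
            })
            return generated

        raw = record_activity("download_waveforms", used=[], parameters={"network": "IV"})
        filtered = record_activity("bandpass_filter", used=[raw],
                                   parameters={"freqmin": 0.01, "freqmax": 0.1})
        xcorr = record_activity("cross_correlate", used=[filtered], parameters={})
        print(len(provenance_log), "provenance records")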

  6. On Lifecycle Constraints of Artifact-Centric Workflows

    Science.gov (United States)

    Kucukoguz, Esra; Su, Jianwen

    Data plays a fundamental role in the modeling and management of business processes and workflows. Among the recent "data-aware" workflow models, artifact-centric models are particularly interesting. (Business) artifacts are the key data entities that are used in workflows and can reflect both the business logic and the execution states of a running workflow. The notion of artifacts succinctly captures the fluidity aspect of data during workflow executions. However, much of the technical dimension concerning artifacts in workflows is not well understood. In this paper, we study a key concept of an artifact "lifecycle". In particular, we allow declarative specifications/constraints of artifact lifecycles in the spirit of DecSerFlow, and formulate the notion of lifecycle as the set of all possible paths an artifact can navigate through. We investigate two technical problems: (Compliance) does a given workflow (schema) contain only lifecycles allowed by a constraint? And (automated construction) from a given lifecycle specification (constraint), is it possible to construct a "compliant" workflow? The study is based on a new formal variant of the artifact-centric workflow model called "ArtiNets" and two classes of lifecycle constraints named "regular" and "counting" constraints. We present a range of technical results concerning compliance and automated construction, including: (1) compliance is decidable when the workflow is atomic or the constraints are regular, (2) for each constraint, we can always construct a workflow that satisfies the constraint, and (3) sufficient conditions under which atomic workflows can be constructed.
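
    As a toy illustration of the compliance question raised above, the sketch below encodes a regular lifecycle constraint as a pattern over artifact states and checks whether every path a (small, explicitly enumerated) workflow allows is accepted. The state alphabet and the path enumeration are illustrative assumptions, not the ArtiNet formalism.

        # Toy compliance check: is every artifact path allowed by a regular constraint?
        # States and paths are illustrative; this is not the ArtiNet formalism.
        import re

        # Regular lifecycle constraint over single-letter states:
        # created (c), then one or more edits (e), then approved (a) or rejected (r).
        CONSTRAINT = re.compile(r"ce+(a|r)")

        # Paths an example workflow schema lets an artifact navigate through.
        workflow_paths = ["cea", "ceea", "cer", "ca"]  # "ca" skips editing entirely

        def violations(paths):
            """Return the paths that violate the lifecycle constraint (empty if compliant)."""
            return [p for p in paths if not CONSTRAINT.fullmatch(p)]

        bad = violations(workflow_paths)
        print("compliant" if not bad else f"violations: {bad}")  # -> violations: ['ca']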

  7. Optimal resource assignment in workflows for maximizing cooperation

    NARCIS (Netherlands)

    Kumar, Akhil; Dijkman, R.M.; Song, Minseok; Daniel, Fl.; Wang, J.; Weber, B.

    2013-01-01

    A workflow is a team process, since many actors work on various tasks to complete an instance. Resource management in such workflows deals with the assignment of tasks to workers or actors. In team formation, it is necessary to ensure that members of a team are compatible with each other. When a workflow
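
    To make the idea of compatibility-aware assignment concrete, here is a small greedy sketch that assigns each task to the actor whose addition keeps the emerging team most compatible. The compatibility scores and the greedy rule are illustrative assumptions, not the optimization model of the paper.

        # Greedy, illustrative assignment of workflow tasks to mutually compatible actors.
        # Compatibility scores and the heuristic are assumptions made for illustration.
        from itertools import combinations

        compat = {frozenset(p): s for p, s in [
            (("alice", "bob"), 0.9), (("alice", "carol"), 0.4), (("bob", "carol"), 0.7)]}

        def team_score(team):
            """Sum of pairwise compatibilities within a team."""
            return sum(compat.get(frozenset(p), 0.0) for p in combinations(set(team), 2))

        def assign(tasks, actors):
            team, assignment = [], {}
            for task in tasks:
                best = max(actors, key=lambda a: team_score(team + [a]))
                assignment[task] = best
                if best not in team:
                    team.append(best)
            return assignment

        print(assign(["review", "approve", "archive"], ["alice", "bob", "carol"]))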

  8. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B.; Sánchez, G.

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  9. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    Directory of Open Access Journals (Sweden)

    Cherian Mathew

    2014-12-01

    Full Text Available The compilation and cleaning of data needed for analyses and prediction of species distributions is a time-consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system that hides the complexity of the underlying service infrastructures. The workflow can be freely used both locally and through a web portal which does not require additional software installations by users.
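
    A minimal flavour of that retrieval-cleaning-selection chain, in plain Python standing in for the Taverna components, is sketched below; the record fields and cleaning rules are illustrative assumptions only.

        # Toy retrieval -> cleaning -> selection chain; fields and rules are illustrative,
        # not the actual components of the Taverna Data Refinement Workflow.
        records = [  # pretend these were returned by an occurrence-retrieval service
            {"name": "Puma concolor ", "lat": 9.93, "lon": -84.08},
            {"name": "", "lat": None, "lon": None},                   # incomplete record
            {"name": "Puma concolor", "lat": 200.0, "lon": -84.08},   # impossible latitude
        ]

        def clean(recs):
            """Trim names and drop records with missing or out-of-range coordinates."""
            out = []
            for r in recs:
                if not r["name"] or r["lat"] is None or not -90 <= r["lat"] <= 90:
                    continue
                out.append({**r, "name": r["name"].strip()})
            return out

        def select(recs, species):
            """Keep only records for the species of interest."""
            return [r for r in recs if r["name"] == species]

        print(select(clean(records), "Puma concolor"))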

  10. Measuring Trace Gas Emission from Multi-Distributed Sources Using Vertical Radial Plume Mapping (VRPM) and Backward Lagrangian Stochastic (bLS) Techniques

    Directory of Open Access Journals (Sweden)

    Thomas K. Flesch

    2011-09-01

    Full Text Available Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The vertical radial plume mapping (VRPM) and the backward Lagrangian stochastic (bLS) techniques with an open-path optical spectroscopic sensor were evaluated for relative accuracy for multiple emission-source and sensor configurations. The relative accuracy was calculated by dividing the measured emission rate by the actual emission rate; thus, a relative accuracy of 1.0 represents a perfect measurement. For a single area emission source, the VRPM technique yielded a somewhat high relative accuracy of 1.38 ± 0.28. The bLS technique resulted in a relative accuracy close to unity, 0.98 ± 0.24. Relative accuracies for dual-source emissions for the VRPM and bLS techniques were similar to those for single-source emissions, 1.23 ± 0.17 and 0.94 ± 0.24, respectively. When the bLS technique was used with vertical point concentrations, the relative accuracy was unacceptably low,
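
    Spelled out, the relative accuracy reported above is simply the ratio

        RA = \frac{Q_{\mathrm{measured}}}{Q_{\mathrm{actual}}}

    so RA = 1 means a technique recovers the synthetic release exactly; the quoted uncertainties are presumably the spread across the repeated source/sensor configurations, which the abstract does not state explicitly.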

  11. The distribution pattern of trace elements in Pedra do Fogo formation, permian of Maranhao Basin and it application as an environment indicator of sedimentation

    International Nuclear Information System (INIS)

    Oliveira, C.M. de.

    1982-01-01

    The present investigation consisted of a geochemical study of the medium and upper layers of the Pedra do Fogo Formation, aiming at correlating its elemental distribution pattern with the sedimentary environment in which that formation was deposited. Pelitic material with a carbonate content below 30% was sampled in three different outcrops of the Pedra do Fogo Formation for mineralogical and chemical analyses. Illite and dolomite, with subordinate amounts of smectite, calcite, quartz and K-feldspar, were determined by X-ray diffraction techniques as the constituents of the mineral assemblages. Trace element (B, Ba, Co, Cr, Cu, Ga, Li, Mn, Ni, P, V, Pb, Sr, Rb and Zn), CO2 and K concentrations were determined by emission spectrography, atomic absorption spectrometry and colorimetry. The interpretation of the chemical data, based on B-V, Ga-B, B-K2O and B-Ga-Rb diagrams and on B content, indicates a wide range of variation in the salinity of the medium during the deposition of the Pedra do Fogo rocks. Accordingly, the observed sequence is composed of intercalations of marine and fresh water sediments, the latter being dominant. The marine intercalations are more frequent at the bottom of the sampled stratigraphic section and become progressively rarer towards the top layers, which were deposited in a typically fresh water environment. (author)

  12. Distributions, sources and pollution status of 17 trace metal/metalloids in the street dust of a heavily industrialized city of central China.

    Science.gov (United States)

    Li, Zhonggen; Feng, Xinbin; Li, Guanghui; Bi, Xiangyang; Zhu, Jianming; Qin, Haibo; Dai, Zhihui; Liu, Jinling; Li, Qiuhua; Sun, Guangyi

    2013-11-01

    A series of representative street dust samples were collected from a heavily industrialized city, Zhuzhou, in central China, with the aim of investigating the spatial distribution and pollution status of 17 trace metal/metalloid elements. Concentrations of twelve elements (Pb, Zn, Cu, Cd, Hg, As, Sb, In, Bi, Tl, Ag and Ga) were distinctly amplified by atmospheric deposition resulting from a large-scale Pb/Zn smelter located on the northwest fringe of the city, and followed a declining trend towards the city center. Three metals (W, Mo and Co) were enriched in samples very close to a hard-alloy manufacturing plant, while Ni and Cr appeared to derive predominantly from natural sources. Other industries and traffic had negligible effects on the accumulation of the observed elements. Cd, In, Zn, Ag and Pb were the five metals/metalloids with the highest pollution levels, and the northwestern part of the city is especially affected by heavy metal pollution. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. The spatial distribution of fossil fuel CO2 traced by Δ(14)C in the leaves of gingko (Ginkgo biloba L.) in Beijing City, China.

    Science.gov (United States)

    Niu, Zhenchuan; Zhou, Weijian; Zhang, Xiaoshan; Wang, Sen; Zhang, Dongxia; Lu, Xuefeng; Cheng, Peng; Wu, Shugang; Xiong, Xiaohu; Du, Hua; Fu, Yunchong

    2016-01-01

    Atmospheric fossil fuel CO2 (CO2ff) information is an important reference for local governments formulating energy-saving and emission-reduction measures in China. The spatial distribution of CO2ff in Beijing City was traced by Δ(14)C in the leaves of gingko (Ginkgo biloba L.) from late March to September 2009. The Δ(14)C values ranged from -35.2 ± 2.8 ‰ to 15.5 ± 3.2 ‰ (average 3.4 ± 11.8 ‰), with high values found at suburban sites (average 12.8 ± 3.1 ‰) and low values at road sites (average -8.4 ± 18.1 ‰). The CO2ff concentrations varied from 11.6 ± 3.7 to 32.5 ± 9.0 ppm, with an average of 16.4 ± 4.9 ppm. The CO2ff distribution in Beijing City showed spatial heterogeneity. CO2ff hotspots were found at road sites, resulting from vehicle emissions, while low CO2ff concentrations were found at suburban sites because of the lower usage of fossil fuels. Additionally, CO2ff concentrations in the northwest area were generally higher than those in the southeast area due to the disadvantageous topography.
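
    For orientation, converting Δ(14)C to a fossil fuel CO2 concentration is conventionally done with a two-end-member mass balance of the form below; this is a generic sketch with Δ(14)Cff = -1000 ‰ for 14C-free fossil carbon, and the study's exact formulation and choice of background may differ.

        \mathrm{CO_2^{obs}} = \mathrm{CO_2^{bg}} + \mathrm{CO_2^{ff}}, \qquad
        \Delta^{14}\mathrm{C_{obs}}\,\mathrm{CO_2^{obs}}
            = \Delta^{14}\mathrm{C_{bg}}\,\mathrm{CO_2^{bg}}
            + \Delta^{14}\mathrm{C_{ff}}\,\mathrm{CO_2^{ff}}
        \;\Rightarrow\;
        \mathrm{CO_2^{ff}} = \mathrm{CO_2^{obs}}\,
            \frac{\Delta^{14}\mathrm{C_{bg}} - \Delta^{14}\mathrm{C_{obs}}}
                 {\Delta^{14}\mathrm{C_{bg}} - \Delta^{14}\mathrm{C_{ff}}}

    so leaf Δ(14)C values depleted relative to a clean-air background translate directly into a larger fossil fuel CO2 contribution.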

  14. Extension of specification language for soundness and completeness of service workflow

    Science.gov (United States)

    Viriyasitavat, Wattana; Xu, Li Da; Bi, Zhuming; Sapsomboon, Assadaporn

    2018-05-01

    A Service Workflow is an aggregation of distributed services that fulfills specific functionalities. With the ever-increasing number of available services, methodologies for selecting services against given requirements have become main research subjects in multiple disciplines. A few researchers have contributed formal specification languages and model-checking methods; however, existing methods have difficulty tackling the complexity of workflow composition. In this paper, we propose to formalize the specification language in order to reduce the complexity of workflow composition. To this end, we extend a specification language with formal logic, so that effective theorems can be derived for the verification of syntax, semantics, and inference rules in the workflow composition. The logic-based approach automates compliance checking effectively. The Service Workflow Specification (SWSpec) has been extended and formulated, and the soundness, completeness, and consistency of SWSpec applications have been verified; note that a logic-based SWSpec is mandatory for the development of model checking. The application of the proposed SWSpec is demonstrated by examples addressing soundness, completeness, and consistency.

  15. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    Science.gov (United States)

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  16. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

    Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOI) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products in an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with the publication of diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  17. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
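
    Since the abstract describes Pilot jobs as directed acyclic graphs of tasks submitted through an authenticated HTTPS REST API, a client call might look roughly like the Python sketch below. The endpoint URL, JSON schema, and certificate paths are purely hypothetical; the actual Pilot API is not documented here.

        # Hypothetical client sketch: submitting a DAG-shaped workflow job over HTTPS.
        # The endpoint, payload schema, and certificate paths are illustrative only.
        import requests

        job = {
            "name": "two-stage-simulation",
            "tasks": {
                "generate": {"executable": "/bin/generate", "resource": "any-ws-gram"},
                "analyse":  {"executable": "/bin/analyse",  "resource": "any-ws-gram"},
            },
            "edges": [["generate", "analyse"]],  # DAG edge: analyse runs after generate
        }

        resp = requests.post(
            "https://pilot.example.org/api/jobs",                     # hypothetical URL
            json=job,
            cert=("/path/to/usercert.pem", "/path/to/userkey.pem"),   # client certificate
            verify="/path/to/ca-bundle.pem",
            timeout=30,
        )
        resp.raise_for_status()
        print("submitted job:", resp.json())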

  18. Grid workflow job execution service 'Pilot'

    International Nuclear Information System (INIS)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-01-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.

  19. Workflow optimization beyond RIS and PACS

    International Nuclear Information System (INIS)

    Treitl, M.; Wirth, S.; Lucke, A.; Nissen-Meyer, S.; Trumm, C.; Rieger, J.; Pfeifer, K.-J.; Reiser, M.; Villain, S.

    2005-01-01

    Technological progress and the rising cost pressure on the healthcare system have led to a drastic change in the work environment of radiologists today. The pervasive demand for workflow optimization and increased efficiency raises the question of whether the potential of digital technology is sufficiently exploited by electronic systems such as RIS and PACS to fulfil this demand. This report describes tasks and structures in radiology departments that are substantial but so far only insufficiently supported by commercially available electronic systems. We developed and employed a web-based, integrated workplace system which simplifies many daily tasks of departmental organization and administration beyond the well-established tasks of documentation. Furthermore, we analyzed the effects exerted on departmental workflow by the employment of this system over 3 years. (orig.) [de

  20. Designing Flexible E-Business Workflow Systems

    OpenAIRE

    Cătălin Silvestru; Codrin Nisioiu; Marinela Mircea; Bogdan Ghilic-Micu; Marian Stoica

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt fast to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organization agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design o...

  1. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next-generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
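
    A minimal flavour of backward chaining over a knowledge base of tool capabilities can be sketched as follows; the rule base, datatype names, and planner are illustrative simplifications, not BETSY's actual knowledge base or inference engine.

        # Minimal backward-chaining sketch: derive a pipeline that produces a goal datatype.
        # The rules and datatype names are illustrative, not BETSY's knowledge base.
        RULES = [
            # (tool, input datatypes it consumes, output datatype it produces)
            ("align_reads",     ["fastq", "reference"], "bam"),
            ("call_variants",   ["bam", "reference"],   "vcf"),
            ("normalize_array", ["cel"],                "expression_matrix"),
        ]

        def plan(goal, available, steps=None):
            """Return an ordered list of tools producing `goal` from `available`, or None."""
            steps = steps if steps is not None else []
            if goal in available:
                return steps
            for tool, inputs, output in RULES:
                if output != goal:
                    continue
                sub, ok = steps, True
                for inp in inputs:               # recursively satisfy each required input
                    sub = plan(inp, available, sub)
                    if sub is None:
                        ok = False
                        break
                if ok:
                    return sub + [tool]
            return None

        print(plan("vcf", available={"fastq", "reference"}))
        # -> ['align_reads', 'call_variants']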

  2. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  3. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process by which image data are received at LLNL, processed, and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  4. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  5. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  6. Routine digital pathology workflow: The Catania experience

    Directory of Open Access Journals (Sweden)

    Filippo Fraggetta

    2017-01-01

    Full Text Available Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to scanner racks. Implementation required the introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  7. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  8. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and, ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn would identify potentially related resources, schedule processing tasks and assemble all parts into workflows that may satisfy the query.

  9. A review of the distribution coefficients of trace elements in soils: influence of sorption system, element characteristics, and soil colloidal properties.

    Science.gov (United States)

    Shaheen, Sabry M; Tsadilas, Christos D; Rinklebe, Jörg

    2013-12-01

    Knowledge about the behavior and reactions of separate soil components with trace elements (TEs) and their distribution coefficients (Kds) in soils is a key issue in assessing the mobility and retention of TEs. Thus, the fate of TEs and the toxic risk they pose depend crucially on their Kd in soil. This article reviews the Kd of TEs in soils as affected by the sorption system, element characteristics, and soil colloidal properties. The sorption mechanism, determining factors, favorable conditions, and competitive ions on the sorption and Kd of TEs are also discussed here. This review demonstrates that the Kd value of TEs does not only depend on inorganic and organic soil constituents, but also on the nature and characteristics of the elements involved as well as on their competition for sorption sites. The Kd value of TEs is mainly affected by individual or competitive sorption systems. Generally, the sorption in competitive systems is lower than in mono-metal sorption systems. More strongly sorbed elements, such as Pb and Cu, are less affected by competition than mobile elements, such as Cd, Ni, and Zn. The sorption preference exhibited by soils for some elements over others may be due to: (i) the hydrolysis constant, (ii) the atomic weight, (iii) the ionic radius, and subsequently the hydrated radius, and (iv) its Misono softness value. Moreover, element concentrations in the test solution mainly affect the Kd values. In most cases, Kd values decrease as the concentration of the cation in the test solution increases. Additionally, the Kd of TEs is controlled by the sorption characteristics of soils, such as pH, clay minerals, soil organic matter, Fe and Mn oxides, and calcium carbonate. However, more research is required to verify the practical utilization of studying Kd of TEs in soils as a reliable indicator for assessing the remediation process of toxic metals in soils and waters. © 2013 Elsevier B.V. All rights reserved.
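
    For reference, the distribution coefficient discussed throughout the review is the standard operational ratio of sorbed to dissolved concentration at equilibrium; in a batch sorption experiment it is commonly computed as shown below (a textbook definition, not a formula quoted from the article).

        K_d = \frac{C_{\mathrm{sorbed}}}{C_{\mathrm{solution}}}
            = \frac{(C_0 - C_{eq})\,V}{C_{eq}\,m} \quad [\mathrm{L\,kg^{-1}}]

    where C_0 and C_eq are the initial and equilibrium solution concentrations, V is the solution volume and m is the soil mass; a higher Kd means stronger retention and lower mobility.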

  10. Trace elements distribution and post-mortem intake in human bones from Middle Age by total reflection X-ray fluorescence

    International Nuclear Information System (INIS)

    Carvalho, M.L.; Marques, A.F.; Lima, M.T.; Reus, U.

    2004-01-01

    The purpose of the present work is to investigate the suitability of the TXRF technique for studying the distribution of trace elements along human bones of the 13th century, to draw conclusions about the environmental conditions and dietary habits of past populations, and to study the uptake of some elements from the surrounding soil. In this work, we used TXRF to quantify and to make profiles of the elements through long bones. Two femur bones, one from a man and another from a woman, buried in the same grave were cross-sectioned at four different points at a distance of 1 cm. Microsamples of each section were taken at a distance of 1 mm from each other. Quantitative analysis was performed for Ca, Mn, Fe, Cu, Zn, Sr, Ba and Pb. Very high concentrations of Mn and Fe were obtained in all analysed samples, reaching values higher than 2% in some samples of trabecular tissue, very similar to the concentrations in the burial soil. A sharp decrease for both elements was observed in cortical tissue. Zn and Sr present steady concentration levels in both kinds of bone tissue. Pb and Cu show very low concentrations in the inner tissue of cortical bone. However, these concentrations increase in the regions in contact with trabecular tissue and on the external surface in contact with the soil, where high levels of both elements were found. We suggest that contamination from the surrounding soil exists for Mn and Fe in the whole bone tissue. Pb can be of both post-mortem and ante-mortem origin. Inner compact tissue might represent in vivo accumulation, while trabecular tissue corresponds to uptake during burial. The steady levels of Sr and Zn, together with the lower soil concentrations of these elements, allow us to conclude that they originate from in vivo incorporation in the hydroxyapatite bone matrix

  11. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Nets...

  12. Distribution and origin of major and trace elements (particularly REE, U and Th) into labile and residual phases in an acid soil profile (Vosges Mountains, France)

    Science.gov (United States)

    Aubert, D.; Probst, A.; Stille, P.

    2003-04-01

    Physical and chemical weathering of rocks and minerals leads to soil formation and allows the removal of chemical elements from these systems to ground- or surface waters. But most of the time, the determination of element concentrations in soils is not sufficient to estimate whether they are being accumulated or how readily they can be released into the environment. Thus, the distribution and chemical binding of a given element are very important, because they determine its mobility and potential bioavailability throughout a soil profile. Heavy metals and REE (Rare Earth Elements) are of particular environmental concern because of their potential toxicity. For most of them, their chemical form strongly depends on the evolution of physico-chemical parameters such as pH or redox conditions, which induce adsorption-desorption, complexation or co-precipitation phenomena in the material. The purpose of this study is to determine the distribution of several major and trace elements (especially REE, Th and U) in an acidic forested podzolic soil profile from the Vosges Mountains (France). To achieve this goal, we use a seven-step sequential extraction procedure that allows the origin and behaviour of particular elements in the environment to be determined precisely (Leleyter et al., 1999). In addition, we performed leaching experiments using very dilute acetic and hydrochloric acid in order to establish the origin of REE in this soil. The results of the sequential extraction indicate that most of the metals, Th and U are mainly bound to Fe oxides. Organic matter also appears to be a major carrier of P, Ca, Fe and REE, even though its content is very low in the deep horizons of the soil. Moreover, we show that in each soil horizon, middle REE (MREE) to heavy REE (HREE) are more labile than light REE (LREE). Leaching experiments using dilute acid solution further suggest that in the shallowest horizons REE largely derive from atmospheric deposition whereas at greater depth, weathering

  13. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis.The system can be accessed either through a
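
    To give a flavour of how hierarchical workflows can be composed from re-usable patterns, here is a small data-structure sketch; the node types and run() semantics are assumptions for illustration, not Tavaxy's actual workflow language or engine.

        # Illustrative composition of hierarchical workflows from re-usable patterns.
        # Node types and run() semantics are assumptions, not Tavaxy's implementation.
        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Task:
            name: str
            func: Callable[[object], object]
            def run(self, data):
                return self.func(data)

        @dataclass
        class Sequence:                    # a simple "pipeline" workflow pattern
            steps: List[object] = field(default_factory=list)
            def run(self, data):
                for step in self.steps:    # each step may itself be a sub-workflow
                    data = step.run(data)
                return data

        # A sub-workflow imported from elsewhere (e.g. a Taverna or Galaxy fragment)
        clean = Sequence([Task("strip", lambda s: s.strip()),
                          Task("upper", lambda s: s.upper())])

        # The enclosing, hierarchical workflow re-uses `clean` as a single step
        main = Sequence([clean, Task("tag", lambda s: ">" + s)])
        print(main.run("  acgt  "))        # -> >ACGT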

  14. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  15. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Science.gov (United States)

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system

  16. SegMine workflows for semantic microarray data analysis in Orange4WS

    Directory of Open Access Journals (Sweden)

    Kulovesi Kimmo

    2011-10-01

    Full Text Available Abstract Background In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. Results We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Conclusions Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  17. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular, the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable, with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on a model-resident or so-called description-driven approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system, called CRISTAL, is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  18. A Component Based Approach to Scientific Workflow Management

    CERN Document Server

    Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  19. A component based approach to scientific workflow management

    International Nuclear Information System (INIS)

    Baker, N.; Brooks, P.; McClatchey, R.; Kovacs, Z.; LeGoff, J.-M.

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse

  20. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    Science.gov (United States)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice-flow model and a discrete-element-based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of the ice-flow and calving models through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks, including data management and transfer within the workflow model, become more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. In the course of application usage, as more scripts or modifications were introduced to meet user requirements, the debugging and validation of results became more cumbersome to achieve. To address these problems, we identified the need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high
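
    The iterative coupling described above can be pictured as a simple driver loop that alternates the continuum ice-flow step and the discrete-element calving step, with format conversion in between. The function names, toy physics, and data structures below are placeholders, not the actual Elmer/Ice, HiDEM, or UNICORE interfaces.

        # Driver-loop sketch of the coupled ice-flow / calving workflow.
        # run_ice_flow, run_calving and the converter are hypothetical placeholders
        # standing in for the Elmer/Ice and HiDEM jobs orchestrated via UNICORE.

        def run_ice_flow(geometry):
            """Continuum ice-flow step: advance the glacier, return updated geometry."""
            return {"front_position": geometry["front_position"] + 50.0}  # toy advance (m)

        def continuum_to_particles(geometry):
            """Format conversion: continuum mesh -> discrete-element input."""
            return {"front_position": geometry["front_position"]}

        def run_calving(particle_model):
            """Discrete-element calving step: return the retreat of the calving front."""
            return 30.0                                                    # toy retreat (m)

        def couple(geometry, cycles=3):
            for i in range(cycles):
                geometry = run_ice_flow(geometry)              # HPC job 1
                particles = continuum_to_particles(geometry)   # conversion task
                retreat = run_calving(particles)               # HPC job 2
                geometry["front_position"] -= retreat          # feed back into ice flow
                print(f"cycle {i}: calving front at {geometry['front_position']:.1f} m")
            return geometry

        couple({"front_position": 1000.0})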

  1. Workflow-Oriented Cyberinfrastructure for Sensor Data Analytics

    Science.gov (United States)

    Orcutt, J. A.; Rajasekar, A.; Moore, R. W.; Vernon, F.

    2015-12-01

    Sensor streams comprise an increasingly large part of Earth Science data. Analytics based on sensor data require an easy way to perform operations such as acquisition, conversion to physical units, metadata linking, sensor fusion, analysis and visualization on distributed sensor streams. Furthermore, embedding real-time sensor data into scientific workflows is of growing interest. We have implemented a scalable networked architecture that can be used to dynamically access packets of data in a stream from multiple sensors, and perform synthesis and analysis across a distributed network. Our system is based on the integrated Rule Oriented Data System (irods.org), which accesses sensor data from the Antelope Real Time Data System (brtt.com), and provides virtualized access to collections of data streams. We integrate real-time data streaming from different sources, collected for different purposes, on different time and spatial scales, and sensed by different methods. iRODS, noted for its policy-oriented data management, brings to sensor processing features and facilities such as single sign-on, third-party access control lists (ACLs), location transparency, logical resource naming, and server-side modeling capabilities while reducing the burden on sensor network operators. Rich integrated metadata support also makes it straightforward to discover data streams of interest and maintain data provenance. The workflow support in iRODS readily integrates sensor processing into any analytical pipeline. The system is developed as part of the NSF-funded Datanet Federation Consortium (datafed.org). APIs for selecting, opening, reaping and closing sensor streams are provided, along with other helper functions to associate metadata and convert sensor packets into NetCDF and JSON formats. Near real-time sensor data including seismic sensors, environmental sensors, LIDAR and video streams are available through this interface. A system for archiving sensor data and metadata in Net
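
    The stream-handling API mentioned at the end of the abstract might be used along the lines sketched below; every class, function name, and return format here is a hypothetical illustration of a select/open/read/close flow, not the actual iRODS/DFC interface.

        # Hypothetical usage sketch of a select/open/read/close sensor-stream API.
        # None of these names are real iRODS/DFC calls; they only illustrate the flow.
        import json

        class SensorStreamClient:             # stand-in for a real client library
            def select(self, pattern):
                return ["net.sta.chan"]       # stream identifiers matching the pattern
            def open(self, stream_id):
                return stream_id              # a handle for reading packets
            def read_packet(self, handle):
                return {"t": 0.0, "samples": [1, 2, 3]}
            def close(self, handle):
                pass

        client = SensorStreamClient()
        for stream_id in client.select("seismic/*"):       # discover streams of interest
            handle = client.open(stream_id)
            packet = client.read_packet(handle)            # one packet from the stream
            record = {"stream": stream_id, "metadata": {"units": "counts"}, **packet}
            print(json.dumps(record))                      # e.g. hand off as JSON
            client.close(handle)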

  2. Text mining for the biocuration workflow

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  3. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow-analyzing each step to reveal the gaps and problems-at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  4. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...... of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...

  5. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of part of the system was required. In this thesis we faced the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed around the XPDL file exported from the mo...

  6. Accumulation and distribution of trace metals and radionuclides in marine organisms (particularly Tapes decussatus L.) in the Izmir bay area, Turkey

    International Nuclear Information System (INIS)

    Geldiay, R.; Uysal, H.

    1976-01-01

    The shellfish Tapes decussatus has economic importance as a product from Izmir bay. The concentrations of trace metals (Cu, Mn, Zn, Fe, Pb, Co, Cr, Hg, Cd) in this organism have been determined in relation to different localities with polluted and non-polluted waters. Measurement of levels and trends of these trace elements is important in the context of public health. Seasonal as well as spatial variation of 65Zn and 115Cd in the organs and tissues of T. decussatus was determined. A comparison of the concentration of trace elements in natural conditions with that in laboratory conditions was made using radioisotopes. The concentrations of trace elements in T. decussatus varied according to the tissues and organs of the body, the size of the animal, the locality of sampling and the season of the year. Bio-concentrations of radioactive 65Zn and 115Cd were also observed to vary according to the tissues and organs of the animal. Pathways of the trace elements were also studied using radioisotopes (65Zn and 115Cd). The effects of toxicity of the stable elements on their uptake and loss were also determined. The toxic effects of different concentrations on the uptake and loss of 65Zn and 115Cd were studied. (author)

  7. Echoes of Semiotically-Based Design in the Development and Testing of a Workflow System

    Directory of Open Access Journals (Sweden)

    Clarisse Sieckenius de Souza

    2001-05-01

    Full Text Available Workflow systems are information-intensive task-oriented computer applications that typically involve a considerable number of users playing a wide variety of roles. Since communication, coordination and decision-making processes are essential for such systems, representing, interpreting and negotiating collective meanings is a crucial issue for software design and development processes. In this paper, we report and discuss our experience in implementing Qualitas, a web-based workflow system. Semiotic theory was extensively used to support design decisions and negotiations with users about technological signs. Taking scenarios as a type-sign exchanged throughout the whole process, we could trace the theoretic underpinnings of our experience and draw some revealing conclusions about the product and the process of technologically reified discourse. Although it is present in all information technology applications, this kind of discourse is seldom analyzed by software designers and developers. Our conjecture is that outside semiotic theory, professionals involved with human-computer interaction and software engineering practices have difficulty coalescing concepts derived from such different disciplines as psychology, anthropology, linguistics and sociology, to name a few. Semiotics, however, can by itself provide a unifying ontological basis for interdisciplinary knowledge, raising issues and proposing alternatives that may help professionals gain insights at lower learning costs. Keywords: semiotic engineering, workflow systems, information-intensive task-oriented systems, scenario-based design and development of computer systems, human-computer interaction

  8. Building and documenting workflows with python-based snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    textabstractSnakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to write human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area....
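
    As an illustration of the workflow definition language mentioned above, the following is a minimal Snakefile written in Snakemake's Python-derived syntax; the rule uses two named wildcards, {sample} and {region}, in its input and output filenames. The file layout and the plot_calls.py script are assumptions made for the example, not material from the paper.

        # Minimal Snakefile sketch with two named wildcards per rule.
        rule all:
            input:
                expand("plots/{sample}.{region}.png",
                       sample=["A", "B"], region=["chr1", "chr2"])

        rule plot:
            input:
                "calls/{sample}.{region}.vcf"
            output:
                "plots/{sample}.{region}.png"
            shell:
                "python plot_calls.py {input} {output}"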

  9. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers that are not necessarily experts for any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimizes mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  10. A Model of Workflow Composition for Emergency Management

    Science.gov (United States)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    The common-used workflow technology is not flexible enough in dealing with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rule. The software system of the business process resources construction and composition is implemented and integrated into Emergency Plan Management Application System.

  11. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  12. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
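
    The following is a minimal Python sketch of a blackbox-style estimate in the spirit described above, using only the workflow length, width and data size. The platform parameters (core counts, relative speeds, queue waits, prices and I/O rates) are invented illustration values, not figures from the paper.

        # Coarse cost-benefit estimate from blackbox workflow characteristics.
        PLATFORMS = {
            # name: (cores, relative core speed, queue wait [h], $ per core-hour, I/O MB/s)
            "desktop": (4,    1.0, 0.0, 0.00,  80),
            "cluster": (64,   1.2, 1.0, 0.00, 200),
            "hpc":     (1024, 1.5, 6.0, 0.00, 500),
            "cloud":   (256,  1.0, 0.1, 0.09, 100),
        }

        def estimate(length_tasks, width_tasks, task_hours, data_mb):
            """Return {platform: (estimated hours, estimated dollars)}."""
            results = {}
            for name, (cores, speed, queue_h, rate, io_mbps) in PLATFORMS.items():
                waves = -(-width_tasks // cores)              # ceil: parallel waves per workflow level
                compute_h = length_tasks * waves * task_hours / speed
                io_h = (data_mb / io_mbps) / 3600.0           # MB at MB/s, converted to hours
                hours = queue_h + compute_h + io_h
                cost = rate * min(width_tasks, cores) * compute_h
                results[name] = (round(hours, 2), round(cost, 2))
            return results

        print(estimate(length_tasks=3, width_tasks=500, task_hours=0.5, data_mb=20000))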

  13. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to ve...

  14. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  15. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image guided surgery or computer assisted surgery in general one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating for their specification. As the first result we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prosthesis in the context of surgical planning, image guided surgery, augmented reality, and simulation. By now, the models are stored and transferred in several file formats bare of contextual information. The standardization of data types including contextual information and specifications for handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  16. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  17. Workflow management for a cosmology collaboratory

    International Nuclear Information System (INIS)

    Loken, Stewart C.; McParland, Charles

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most ''explosive'' activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work

  18. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  19. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    Science.gov (United States)

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks are reoccurring frequently, e.g. geometry optimizations, benchmarking series etc. Here, workflows can help to reduce the time of manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant efforts and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward and it is almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows in the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
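
    A minimal Python sketch of how atomic workflows and meta-workflows could be represented as repository data structures is given below; the class and field names are illustrative assumptions, not the SHIWA Workflow Repository schema.

        # Sketch: atomic workflows as reusable units, composed into a meta-workflow.
        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class AtomicWorkflow:
            name: str                 # e.g. a geometry optimization
            inputs: List[str]         # named input ports
            outputs: List[str]        # named output ports
            engine: str = "generic"   # workflow engine the atomic workflow targets

        @dataclass
        class MetaWorkflow:
            name: str
            steps: List[AtomicWorkflow] = field(default_factory=list)
            edges: Dict[str, str] = field(default_factory=dict)   # "producer.port" -> "consumer.port"

            def add_step(self, wf: AtomicWorkflow) -> None:
                self.steps.append(wf)

            def connect(self, src: str, dst: str) -> None:
                self.edges[src] = dst

        # Example: a small benchmarking series built from two atomic workflows.
        opt = AtomicWorkflow("geometry_optimization", ["structure"], ["optimized"])
        bench = AtomicWorkflow("benchmark_energies", ["optimized"], ["report"])
        meta = MetaWorkflow("qc_benchmark_series")
        meta.add_step(opt)
        meta.add_step(bench)
        meta.connect("geometry_optimization.optimized", "benchmark_energies.optimized")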

  20. Linac particle tracing simulations

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1979-01-01

    A particle tracing code was developed to study space-charge effects in proton or heavy-ion linear accelerators. The purpose is to study space-charge phenomena as directly as possible without the complications of many accelerator details. Thus, the accelerator is represented simply by harmonic oscillator or impulse restoring forces. Variable parameters as well as mismatched phase-space distributions were studied. This study represents the initial search for those features of the accelerator or of the phase-space distribution that lead to emittance growth.
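
    A minimal Python sketch of this style of simulation is shown below: particles under a linear (harmonic) restoring force with a crude pairwise space-charge kick, using the rms emittance as a figure of merit for emittance growth. All parameter values are arbitrary illustration choices, not settings from the original code.

        # 1D particle tracing with a harmonic restoring force and a toy space-charge term.
        import numpy as np

        def trace(n=200, steps=2000, dt=1e-3, k=1.0, q=5e-4, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, 1.0, n)     # mismatched initial phase-space distribution
            v = rng.normal(0.0, 0.5, n)
            for _ in range(steps):
                dx = x[:, None] - x[None, :]
                space_charge = q * np.sum(np.sign(dx), axis=1)   # net repulsion from the other particles
                a = -k * x + space_charge
                v += a * dt                                      # symplectic (semi-implicit) Euler step
                x += v * dt
            # rms emittance: sqrt(<x^2><v^2> - <x v>^2)
            return np.sqrt(np.mean(x**2) * np.mean(v**2) - np.mean(x * v) ** 2)

        print("final rms emittance:", trace())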

  1. Tracing Clues

    DEFF Research Database (Denmark)

    Feldt, Liv Egholm

    The past is all messiness and blurred relations. However, we tend to sort the messiness out through rigorous analytical studies, leaving the messiness behind. Carlo Ginzburg's article Clues: Roots of an Evidential Paradigm from 1986 invigorates methodological elements of (historical) research, which...... central methodological elements will be further elaborated and discussed through a historical case study that traces how networks of philanthropic concepts and practices influenced the Danish welfare state in the period from the Danish constitution of 1849 until today. The overall aim of this paper...

  2. Toward Exascale Seismic Imaging: Taming Workflow and I/O Issues

    Science.gov (United States)

    Lefebvre, M. P.; Bozdag, E.; Lei, W.; Rusmanugroho, H.; Smith, J. A.; Tromp, J.; Yuan, Y.

    2013-12-01

    Providing a better understanding of the physics and chemistry of Earth's interior through numerical simulations has always required tremendous computational resources. Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns on how to obtain optimum performance. Several issues are currently being investigated by the HPC community. To name a few, we can list energy consumption, fault resilience, scalability of the current parallel paradigms, large workflow management, I/O performance and feature extraction with large datasets. For this presentation, we focus on the last three issues. In the context of seismic imaging, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, obtaining a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts composing it. The usual approach is to speedup the purely computational parts by code tuning in order to reach higher FLOPS and better memory usage. This still remains an important concern, but larger scale experiments show that the imaging workflow suffers from a severe I/O bottleneck. This limitation occurs both for purely computational data and seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). In both cases, a parallel I/O library, ORNL's ADIOS, is used to drastically lessen the weight of disk access. Moreover, parallel visualization tools, such as VisIt, are able to take advantage of the metadata included in our ADIOS outputs to extract features and
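
    As a rough illustration of bundling many seismic time series and their metadata in a single container file, the sketch below writes waveforms and attributes into HDF5 with h5py. It is only an illustration of the idea behind ASDF-style containers; it is not the ASDF or ADIOS API, and the group, attribute and station names are invented for the example.

        # Store several traces for one event, each with basic metadata attributes.
        import numpy as np
        import h5py

        def write_event(path, event_id, stations):
            """stations: dict mapping station code -> (sampling_rate_hz, waveform array)."""
            with h5py.File(path, "w") as f:
                ev = f.require_group("events").create_group(event_id)
                for code, (rate, data) in stations.items():
                    dset = ev.create_dataset(code, data=data, compression="gzip")
                    dset.attrs["sampling_rate_hz"] = rate
                    dset.attrs["units"] = "m/s"

        write_event(
            "demo_event.h5",
            "event_2013_001",
            {"STA1": (20.0, np.random.randn(12000)),
             "STA2": (20.0, np.random.randn(12000))},
        )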

  3. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
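
    The kind of thin command-line wrapping that such Galaxy tools perform can be sketched in Python as below, here around NCBI BLAST+ (blastp must be installed and the database prepared with makeblastdb). The function name, file names and defaults are assumptions for the example; the blastp options shown are standard BLAST+ flags.

        # Run blastp and write tabular (outfmt 6) results, the way a tool wrapper would.
        import subprocess

        def run_blastp(query_fasta, db, out_tsv, evalue=1e-5, threads=4):
            cmd = [
                "blastp",
                "-query", query_fasta,
                "-db", db,
                "-evalue", str(evalue),
                "-num_threads", str(threads),
                "-outfmt", "6",          # tab-separated hit table
                "-out", out_tsv,
            ]
            subprocess.run(cmd, check=True)

        # Example with placeholder paths:
        # run_blastp("candidate_effectors.fasta", "host_proteome", "hits.tsv")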

  4. Optimization of business processes in banks through flexible workflow

    Science.gov (United States)

    Postolache, V.

    2017-08-01

    This article describes an integrated business model of a commercial bank. Examples of the components that go into its composition include wooden models and business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that developing and implementing the integrated business model of the bank significantly increases operating efficiency and manageability and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author’s point of view, a business process is a set of activities of a commercial bank in which the “input” is one or more financial and material resources and the “output”, created as a result of this activity, is a banking product of value to the consumer. With workflow technology, managing business process efficiency becomes a matter of managing the integration of resources and the sequence of actions aimed at achieving this goal. In turn, this implies managing the interaction of jobs or functions, synchronizing assignment periods, reducing delays in the transmission of results, etc. Workflow technology is very important for managers at all levels, as they can use it to strengthen control over what is happening in a particular unit and in the bank as a whole. The manager is able to plan, implement rules and interact within the framework of the company’s procedures; tasks entrusted to the system are distributed and their execution controlled, with alerts on implementation and statistical reporting on the effectiveness of operating procedures. Development and active use of the integrated bank business model is one of the key success factors that contribute to long-term and stable development of the bank, increase the efficiency of employees and business processes, implement the

  5. Clay mineralogy, grain size distribution and their correlations with trace metals in the salt marsh sediments of the Skallingen barrier spit, Danish Wadden Sea

    DEFF Research Database (Denmark)

    He, Changling; Bartholdy, Jesper; Christiansen, Christian

    2012-01-01

    metals. The clay assembly of the sediment consists of illite, kaolinite and much less chlorite and smectite. The major clay minerals of illite, kaolinite as well as chlorite correlate very poorly with all the trace metals investigated, due probably to the weak competing strength of these clays compared...

  6. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS)

    Science.gov (United States)

    Abraham, Jerrold L.; Chandra, Subhash; Agrawal, Anoop

    2014-01-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease (CBD) from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water, and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry (SIMS) instrument, CAMECA IMS 3f SIMS ion microscope. The beryllium content of shrapnel (~100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (~25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y-and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (~0.01 ppb). These observations indicate that beryllium present in aluminum matrix was either present in an

  7. Quantification and micron-scale imaging of spatial distribution of trace beryllium in shrapnel fragments and metallurgic samples with correlative fluorescence detection method and secondary ion mass spectrometry (SIMS).

    Science.gov (United States)

    Abraham, J L; Chandra, S; Agrawal, A

    2014-11-01

    Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry instrument, CAMECA IMS 3f secondary ion mass spectrometry ion microscope. The beryllium content of shrapnel (∼100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (∼25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y- and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (∼0.01 ppb). These observations indicate that beryllium present in aluminum matrix was either

  8. Towards seamless workflows in agile data science

    Science.gov (United States)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and more recently cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the
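
    A minimal Python sketch of one way to keep such identifiers, pairing a content hash of a data file with the current git commit and branch, is given below; the manifest layout and function name are assumptions for the example (it must be run inside a git repository), not CSIRO's tooling.

        # Record which code version produced which data snapshot.
        import hashlib
        import json
        import subprocess
        from datetime import datetime, timezone

        def snapshot(data_path, manifest="snapshots.json"):
            def git(*args):
                return subprocess.run(["git", *args], capture_output=True,
                                      text=True, check=True).stdout.strip()
            with open(data_path, "rb") as fh:
                digest = hashlib.sha256(fh.read()).hexdigest()
            entry = {
                "data": data_path,
                "sha256": digest,
                "commit": git("rev-parse", "HEAD"),
                "branch": git("rev-parse", "--abbrev-ref", "HEAD"),
                "timestamp": datetime.now(timezone.utc).isoformat(),
            }
            try:
                with open(manifest) as fh:
                    records = json.load(fh)
            except FileNotFoundError:
                records = []
            records.append(entry)
            with open(manifest, "w") as fh:
                json.dump(records, fh, indent=2)
            return entry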

  9. Workflow efficiency of two 1.5 T MR scanners with and without an automated user interface for head examinations.

    Science.gov (United States)

    Moenninghoff, Christoph; Umutlu, Lale; Kloeters, Christian; Ringelstein, Adrian; Ladd, Mark E; Sombetzki, Antje; Lauenstein, Thomas C; Forsting, Michael; Schlamann, Marc

    2013-06-01

    Workflow efficiency and workload of radiological technologists (RTs) were compared in head examinations performed with two 1.5 T magnetic resonance (MR) scanners equipped with or without an automated user interface called the "day optimizing throughput" (Dot) workflow engine. Thirty-four patients with known intracranial pathology were examined with a 1.5 T MR scanner with the Dot workflow engine (Siemens MAGNETOM Aera) and with a 1.5 T MR scanner with a conventional user interface (Siemens MAGNETOM Avanto) using four standardized examination protocols. The elapsed time for all necessary work steps, which were performed by 11 RTs within the total examination time, was compared for each examination at both MR scanners. The RTs evaluated the user-friendliness of both scanners by a questionnaire. Normality of distribution was checked for all continuous variables by use of the Shapiro-Wilk test. Normally distributed variables were analyzed by Student's paired t-test; otherwise the Wilcoxon signed-rank test was used to compare means. The total examination time of MR examinations performed with the Dot engine was reduced from 24:53 to 20:01 minutes compared with the conventional user interface (P = .001). According to this preliminary study, the Dot workflow engine is time-saving user-assistance software that decreases the RTs' effort significantly and may help to automate neuroradiological examinations for higher workflow efficiency. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
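
    The statistical comparison described above can be sketched in Python with SciPy as below; the paired timing values are made-up illustration numbers, not data from the study.

        # Paired comparison: Shapiro-Wilk on the differences, then paired t-test or Wilcoxon.
        import numpy as np
        from scipy import stats

        dot          = np.array([19.8, 20.5, 21.0, 19.2, 20.3, 20.9, 19.7, 20.1])  # minutes
        conventional = np.array([24.1, 25.3, 26.0, 23.8, 24.9, 25.6, 24.3, 24.7])

        differences = conventional - dot
        if stats.shapiro(differences).pvalue > 0.05:            # differences look normal
            name, result = "paired t-test", stats.ttest_rel(conventional, dot)
        else:
            name, result = "Wilcoxon signed-rank", stats.wilcoxon(conventional, dot)

        print(f"{name}: statistic={result.statistic:.3f}, p={result.pvalue:.4f}")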

  10. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  11. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, have become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  12. The P2P approach to interorganizational workflows

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Weske, M.H.; Dittrich, K.R.; Geppert, A.; Norrie, M.C.

    2001-01-01

    This paper describes in an informal way the Public-To-Private (P2P) approach to interorganizational workflows, which is based on a notion of inheritance. The approach consists of three steps: (1) create a common understanding of the interorganizational workflow by specifying a shared public

  13. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability

  14. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of

  15. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  16. Building and documenting workflows with python-based snakemake

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    textabstractSnakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to

  17. Analyzing the Gap between Workflows and their Natural Language Descriptions

    NARCIS (Netherlands)

    Groth, P.T.; Gil, Y

    2009-01-01

    Scientists increasingly use workflows to represent and share their computational experiments. Because of their declarative nature, focus on pre-existing component composition and the availability of visual editors, workflows provide a valuable start for creating user-friendly environments for end

  18. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of cross-organisation workflow nets according to the idea of LSC and then discusses the standardisation of collaborating business processes between organisations in the context of LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined through combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verification approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method by a simple example. The purpose of this research is to establish a formal method of modelling and analysis of workflow systems for LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of LSC in real-world settings.
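
    A toy Python sketch of the basic LTPN ingredients, transitions that carry a label and a static firing interval over a marking of places, is given below. For simplicity, firing times are treated as absolute rather than relative to enabling, and the example net is invented, so this only illustrates the definitions, not the soundness-verification approach of the article.

        # Labelled time Petri net toy: places hold tokens; transitions have labels and [eft, lft].
        class LTPN:
            def __init__(self):
                self.marking = {}        # place -> token count
                self.transitions = {}    # name -> (inputs, outputs, label, (eft, lft))

            def add_place(self, place, tokens=0):
                self.marking[place] = tokens

            def add_transition(self, name, inputs, outputs, label, interval):
                self.transitions[name] = (inputs, outputs, label, interval)

            def enabled(self, name):
                inputs, _, _, _ = self.transitions[name]
                return all(self.marking.get(p, 0) >= 1 for p in inputs)

            def fire(self, name, at_time):
                inputs, outputs, label, (eft, lft) = self.transitions[name]
                if not self.enabled(name) or not (eft <= at_time <= lft):
                    raise ValueError(f"{name} cannot fire at t={at_time}")
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1
                return label

        # A two-organisation hand-off: the supplier ships, then the customer receives.
        net = LTPN()
        net.add_place("order_placed", tokens=1)
        net.add_place("in_transit")
        net.add_place("received")
        net.add_transition("ship", ["order_placed"], ["in_transit"], "supplier:ship", (0, 24))
        net.add_transition("receive", ["in_transit"], ["received"], "customer:receive", (24, 72))
        print(net.fire("ship", at_time=10), net.fire("receive", at_time=30))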

  19. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models has been proposed in the past. None of these models seems completely adequate, though, because workflow management

  20. Parametric Room Acoustic workflows with real-time acoustic simulation

    DEFF Research Database (Denmark)

    Parigi, Dario

    2017-01-01

    The paper investigates and assesses the opportunities that real-time acoustic simulation offers to engage in parametric acoustics workflows and to influence architectural designs from early design stages.

  1. Open source workflow : a viable direction for BPM?

    NARCIS (Netherlands)

    Wohed, P.; Russell, N.C.; Hofstede, ter A.H.M.; Andersson, B.; Aalst, van der W.M.P.; Bellahsène, Z.; Léonard, M.

    2008-01-01

    With the growing interest in open source software in general and business process management and workflow systems in particular, it is worthwhile investigating the state of open source workflow management. The plethora of these offerings (recent surveys such as [4,6], each contain more than 30 such

  2. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  3. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention is being paid to business processes during the past decades, the design of business processes and particularly workflow processes is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  4. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Joosten, Stef M.M.; Guareis de farias, Cléver

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the

  5. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  6. Trace spaces

    DEFF Research Database (Denmark)

    Fajstrup, Lisbeth; Goubault, Eric; Haucourt, Emmanuel

    2012-01-01

    in the interleaving semantics of a concurrent program, but rather some equivalence classes. The purpose of this paper is to describe a new algorithm to compute such equivalence classes, and a representative per class, which is based on ideas originating in algebraic topology. We introduce a geometric semantics...... of concurrent languages, where programs are interpreted as directed topological spaces, and study its properties in order to devise an algorithm for computing dihomotopy classes of execution paths. In particular, our algorithm is able to compute a control-flow graph for concurrent programs, possibly containing...... loops, which is “as reduced as possible” in the sense that it generates traces modulo equivalence. A preliminary implementation was achieved, showing promising results towards efficient methods to analyze concurrent programs, with very promising results compared to partial-order reduction techniques....

  7. A practical workflow for making anatomical atlases for biological research.

    Science.gov (United States)

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  8. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  9. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from
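
    Two simple actor-style steps of the kind listed above can be sketched in Python as below: reading features from a GeoJSON file, and converting a Shapefile to GeoJSON by shelling out to GDAL's ogr2ogr (which must be installed). The file names are placeholders and the sketch is not the geoKepler actor code.

        import json
        import subprocess

        def read_geojson_features(path):
            """Return the list of features from a GeoJSON FeatureCollection."""
            with open(path) as fh:
                return json.load(fh).get("features", [])

        def shapefile_to_geojson(src_shp, dst_geojson):
            """Convert a Shapefile to GeoJSON with the GDAL/OGR command-line tool."""
            subprocess.run(["ogr2ogr", "-f", "GeoJSON", dst_geojson, src_shp], check=True)

        # Example with placeholder file names:
        # shapefile_to_geojson("fire_perimeter.shp", "fire_perimeter.geojson")
        # print(len(read_geojson_features("fire_perimeter.geojson")), "features")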

  10. Integrate Data into Scientific Workflows for Terrestrial Biosphere Model Evaluation through Brokers

    Science.gov (United States)

    Wei, Y.; Cook, R. B.; Du, F.; Dasgupta, A.; Poco, J.; Huntzinger, D. N.; Schwalm, C. R.; Boldrini, E.; Santoro, M.; Pearlman, J.; Pearlman, F.; Nativi, S.; Khalsa, S.

    2013-12-01

    Terrestrial biosphere models (TBMs) have become integral tools for extrapolating local observations and process-level understanding of land-atmosphere carbon exchange to larger regions. Model-model and model-observation intercomparisons are critical to understand the uncertainties within model outputs, to improve model skill, and to improve our understanding of land-atmosphere carbon exchange. The DataONE Exploration, Visualization, and Analysis (EVA) working group is evaluating TBMs using scientific workflows in UV-CDAT/VisTrails. This workflow-based approach promotes collaboration and improved tracking of evaluation provenance. But challenges still remain. The multi-scale and multi-discipline nature of TBMs makes it necessary to include diverse and distributed data resources in model evaluation. These include, among others, remote sensing data from NASA, flux tower observations from various organizations including DOE, and inventory data from US Forest Service. A key challenge is to make heterogeneous data from different organizations and disciplines discoverable and readily integrated for use in scientific workflows. This presentation introduces the brokering approach taken by the DataONE EVA to fill the gap between TBMs' evaluation scientific workflows and cross-organization and cross-discipline data resources. The DataONE EVA started the development of an Integrated Model Intercomparison Framework (IMIF) that leverages standards-based discovery and access brokers to dynamically discover, access, and transform (e.g. subset and resampling) diverse data products from DataONE, Earth System Grid (ESG), and other data repositories into a format that can be readily used by scientific workflows in UV-CDAT/VisTrails. The discovery and access brokers serve as an independent middleware that bridge existing data repositories and TBMs evaluation scientific workflows but introduce little overhead to either component. In the initial work, an OpenSearch-based discovery broker
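
    The division of labour between a discovery broker and an access broker can be sketched in Python as below; the class and method names are illustrative assumptions, not the actual DataONE, ESG or broker-framework APIs.

        # Broker sketch: discover candidate datasets, then access and transform each one.
        from abc import ABC, abstractmethod
        from typing import Any, Dict, List

        class DiscoveryBroker(ABC):
            @abstractmethod
            def search(self, variable: str, bbox: List[float], start: str, end: str) -> List[Dict]:
                """Return metadata records for datasets matching the query."""

        class AccessBroker(ABC):
            @abstractmethod
            def fetch(self, record: Dict, target_grid: str) -> Any:
                """Download one dataset, then subset and resample it onto the target grid."""

        def gather_inputs(discovery: DiscoveryBroker, access: AccessBroker,
                          variable: str, bbox: List[float], start: str, end: str, grid: str):
            """What an evaluation workflow step would call before running the comparison."""
            return [access.fetch(rec, grid) for rec in discovery.search(variable, bbox, start, end)]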

  11. Instrumental neutron activation analysis of site-dependent uptake and distribution of trace elements in the saltmarsh plant Aster tripolium from marsh fields in the Schelde estuary, Netherlands

    International Nuclear Information System (INIS)

    Rossbach, M.

    1986-07-01

    As part of an environmental chemistry investigation of heavy metal uptake by the saltmarsh plant Aster tripolium from two differently polluted salt marsh sites on the North Sea, between 20 and 30 trace elements were determined in soil and plant organs. A sensitive gamma-ray counting system was installed and tested for instrumental neutron activation analysis (INAA). Installations to improve sensitivity, as well as the conditions necessary for reliable trace element analysis with the aid of anti-Compton spectrometers (ACS), are described. The accuracy and reproducibility of the method were determined by the analysis of reference and control materials of the German environmental specimen bank. In order to characterise the state of pollution of the salt marsh soils, pollution factors for single elements as well as inter-element correlations were evaluated. In addition, uptake and translocation factors for the biological samples were calculated. The many highly significant correlations between elements within the plant organs indicated that uptake appears to be physiologically controlled and not dependent on soil concentration. In order to detect further consequences of the differing pollution influences within these plants, biochemical separation techniques were applied and trace element levels in selected extracts were determined. For the speciation of heavy metals, gel permeation chromatography of ethanolic extracts proved to be the most promising method. Furthermore, proposals for the use of trace elements as a fingerprint of pollution status and for the characterisation of species for reference and specimen bank materials have been developed. Aster tripolium, as a cadmium-accumulating plant, can probably be used as an indicator in the monitoring of cadmium-polluted salt marsh areas. (orig.)
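    The abstract does not give the exact formulations used; for orientation, commonly encountered definitions of such factors, in terms of an element concentration C, are

        \[
          \mathrm{TF} = \frac{C_{\text{plant organ}}}{C_{\text{soil}}}, \qquad
          \mathrm{TLF} = \frac{C_{\text{shoot}}}{C_{\text{root}}}, \qquad
          \mathrm{PF}_i = \frac{C_{i,\ \text{polluted site}}}{C_{i,\ \text{reference site}}}
        \]

    i.e. a transfer (uptake) factor, a translocation factor, and a single-element pollution factor relative to a reference site; the report's own definitions may differ.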

  12. Abundance, distribution and bioavailability of major and trace elements in surface sediments from the Cai River estuary and Nha Trang Bay (South China Sea, Vietnam)

    Science.gov (United States)

    Koukina, S. E.; Lobus, N. V.; Peresypkin, V. I.; Dara, O. M.; Smurov, A. V.

    2017-11-01

    Major (Si, Al, Fe, Ti, Mg, Ca, Na, K, S, P), minor (Mn) and trace (Li, V, Cr, Co, Ni, Cu, Zn, As, Sr, Zr, Mo, Cd, Ag, Sn, Sb, Cs, Ba, Hg, Pb, Bi and U) elements, their chemical forms, and the mineral composition, organic matter (TOC) and carbonates (TIC) in surface sediments from the Cai River estuary and Nha Trang Bay were determined for the first time along the salinity gradient. The abundance and ratios of major and trace elements in the surface sediments are discussed in relation to mineralogy, grain size, depositional conditions, reference background values and SQG values. Most trace-element contents are at natural levels and are derived from the composition of rocks and soils in the watershed. A severe enrichment of Ag is most likely derived from metal-rich detrital heavy minerals such as Ag-sulfosalts. Along the salinity gradient, several zones of metal enrichment occur in the surface sediments because of the geochemical fractionation of the riverine material. The proportions of actually and potentially bioavailable forms (isolated by four single chemical reagent extractions) are highest for Mn and Pb (up to 36% and 32% of the total content, respectively). The possible anthropogenic input of Pb in the region requires further study. Overall, the largest bioavailable fractions of trace elements are associated with easily soluble amorphous Fe and Mn oxyhydroxides. The sediments are primarily enriched with bioavailable metal forms in the riverine part of the estuary. Natural pressures (such as turbidities) and human-generated pressures (such as urban and industrial activities) are shown to influence the abundance and speciation of potential contaminants and therefore to change their bioavailability in this estuarine system.
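    A commonly used index for judging enrichment of this kind (the abstract does not state which formulation the authors applied) is the Al-normalized enrichment factor

        \[
          \mathrm{EF}_M = \frac{\left( C_M / C_{\mathrm{Al}} \right)_{\text{sediment}}}{\left( C_M / C_{\mathrm{Al}} \right)_{\text{background}}}
        \]

    where values near 1 indicate a crustal (natural) origin and progressively larger values indicate enrichment from diagenetic or anthropogenic sources.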

  13. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology into healthcare is a promising way to address the problem that current healthcare information systems provide insufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method for developing a workflow-based information system, using the radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be treated as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to changing process needs, and that it enhanced process management in the department. It also provides a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, richer process management functionality and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology into healthcare. (orig.)
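    As a generic illustration of the loosely coupled component design described above (class and task names here are assumptions for the sketch, not the hospital system's code), each task implements a common interface and a simple engine assembles and runs the components in order:

        from abc import ABC, abstractmethod

        class TaskComponent(ABC):
            """One task in the radiology process; a legacy system can be wrapped
            behind the same interface as a 'special component'."""
            @abstractmethod
            def execute(self, case: dict) -> dict: ...

        class Registration(TaskComponent):
            def execute(self, case):
                case["registered"] = True
                return case

        class Examination(TaskComponent):
            def execute(self, case):
                case["images"] = ["series-001"]   # placeholder result
                return case

        class Reporting(TaskComponent):
            def execute(self, case):
                case["report"] = f"Findings for {case['patient_id']}"
                return case

        def run_workflow(case: dict, steps: list) -> dict:
            # In the real system the workflow management system decides the ordering
            # and assembles the components; here the order is fixed for illustration.
            for step in steps:
                case = step.execute(case)
            return case

        if __name__ == "__main__":
            print(run_workflow({"patient_id": "P001"},
                               [Registration(), Examination(), Reporting()]))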

  14. Assessment of trace element concentration distribution in human placenta by wavelength dispersive X-ray fluorescence: Effect of neonate weight and maternal age

    International Nuclear Information System (INIS)

    Ozdemir, Yueksel; Boerekci, Buenyamin; Levet, Aytac; Kurudirek, Murat

    2009-01-01

    Trace element status in the human placenta depends on maternal-neonatal characteristics. This work was undertaken to investigate the correlation between essential trace element concentrations in the placenta and maternal-neonatal characteristics. Placenta samples were collected from a total of 61 healthy mothers at between 37 and 41 weeks of gestation. The samples were investigated with the restriction that the mothers' ages were 20-40 years and the neonates' weights were 1-4 kg. Percent concentrations of trace elements were determined using wavelength dispersive X-ray fluorescence (WDXRF). The placenta samples were prepared and analyzed without exposure to any chemical treatment. Concentrations of Fe, Cu and Zn in placenta tissues were found to vary significantly with the age of the mother and the weight of the neonate. Concentrations of Fe and Cu were higher in heavier neonates (p<0.05), and the concentration of Zn increased with increasing maternal age (p<0.05). Consequently, Fe, Cu and Zn appear to have interactive connections in the human placenta.

  15. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers, including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  16. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    Science.gov (United States)

    Bistacchi, Andrea

    2015-04-01

    ). This dataset can be used as-is (PC-DOM), or a 3D triangulated surface can be interpolated from the point cloud and the images used to associate a texture with this surface (TS-DOM). In the DOMstudio workflow we use both PC-DOMs and TS-DOMs. In particular, the latter are obtained by projecting the original images onto the triangulated surface, without any downsampling, thus retaining the original resolution and quality of the images collected in the field. In the DOMstudio interpretation step, the PC-DOM is considered the best option for fracture analysis in outcrops where facets corresponding to fractures are present. This allows orientation statistics (e.g. stereoplots, Fisher statistics, etc.) to be obtained directly from a point cloud in which, for each point, the unit vector normal to the outcrop surface has been calculated. A recent development in this kind of processing is the possibility to automatically select (segment) subset point clouds representing single fracture surfaces, which can be used for studies of fracture length, spacing, etc., allowing parameters such as the frequency-length distribution, P21, etc., to be obtained. PC-DOM interpretation can be combined or complemented, depending on the outcrop morphology, with an interpretation carried out on a TS-DOM in terms of traces, which are the linear intersections of "geological" surfaces (fractures, faults, bedding, etc.) with the outcrop surface. This kind of interpretation is very well suited to outcrops with smooth surfaces, and can be performed either by manual picking or by applying image analysis techniques to the images associated with the DOM. In this case, a huge mass of data, with very high resolution, can be collected very effectively. If we consider applications like lithological or mineral mapping, TS-DOM datasets are the only suitable support. Finally, the DOMstudio workflow produces output in formats that are compatible with all common geomodelling packages (e.g. Gocad/Skua, Petrel, Move), allowing
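    The kind of computation behind such point-cloud fracture analysis can be sketched as follows (a generic illustration with synthetic data, not DOMstudio code): fit a plane to a segmented fracture patch, take the unit normal from the smallest singular vector, and convert it to dip direction / dip angle for orientation statistics.

        import numpy as np

        def plane_normal(points: np.ndarray) -> np.ndarray:
            """Least-squares plane normal of an (N, 3) fracture patch."""
            centered = points - points.mean(axis=0)
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            n = vt[-1]                      # singular vector of the smallest singular value
            return n if n[2] >= 0 else -n   # force an upward-pointing normal

        def dip_direction_and_dip(n: np.ndarray) -> tuple:
            nx, ny, nz = n / np.linalg.norm(n)
            dip = np.degrees(np.arccos(nz))                   # dip angle from horizontal
            dip_dir = np.degrees(np.arctan2(nx, ny)) % 360.0  # azimuth the plane dips toward
            return dip_dir, dip

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            # Synthetic, slightly noisy patch dipping about 30 degrees toward the east (x axis).
            xy = rng.uniform(-1.0, 1.0, size=(500, 2))
            z = -np.tan(np.radians(30.0)) * xy[:, 0] + rng.normal(0.0, 0.01, 500)
            pts = np.column_stack([xy, z])
            print(dip_direction_and_dip(plane_normal(pts)))   # roughly (90, 30)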

  17. Exploring Dental Providers' Workflow in an Electronic Dental Record Environment.

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N; Ye, Zhan; Acharya, Amit

    2016-01-01

    A workflow is defined as a predefined set of work steps, and a partial ordering of these steps, in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at the point of care in order to identify breakdowns in the workflow which could inform better technology designs. The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analyses were conducted on the observational data. Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). On average, a dentist spent far less time than dental assistants and dental hygienists recording data within the EDR.
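    The reported group comparisons are of the kind produced by a two-sample t-test on per-visit EDR interaction times; a minimal sketch with made-up placeholder numbers (not the study data) looks like this:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Placeholder per-visit EDR interaction times in minutes (not the study data).
        dentist_minutes = rng.normal(6.3, 1.5, 30)
        assistant_minutes = rng.normal(10.9, 2.0, 30)

        # Welch's two-sample t-test (unequal variances) comparing the two provider groups.
        t, p = stats.ttest_ind(dentist_minutes, assistant_minutes, equal_var=False)
        print(f"t = {t:.2f}, p = {p:.4f}")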

  18. Workflow Lexicons in Healthcare: Validation of the SWIM Lexicon.

    Science.gov (United States)

    Meenan, Chris; Erickson, Bradley; Knight, Nancy; Fossett, Jewel; Olsen, Elizabeth; Mohod, Prerna; Chen, Joseph; Langer, Steve G

    2017-06-01

    For clinical departments seeking to successfully navigate the challenges of modern health reform, obtaining access to operational and clinical data to establish and sustain goals for improving quality is essential. More broadly, health delivery organizations are also seeking to understand performance across multiple facilities, often across multiple electronic medical record (EMR) systems. Interpreting operational data across multiple vendor systems can be challenging, as various manufacturers may describe departmental workflow steps in different ways, sometimes even within a single vendor's installed customer base. In 2012, the Society for Imaging Informatics in Medicine (SIIM) recognized the need for better quality and performance data standards and formed SIIM's Workflow Initiative for Medicine (SWIM), an initiative designed to consistently describe workflow steps in radiology departments as well as to define operational quality metrics. The SWIM lexicon was published as a working model to describe operational workflow steps and quality measures. We measured the prevalence of the SWIM lexicon workflow steps in both academic and community radiology environments using real-world patient observations and correlated that information with automatically captured workflow steps from our clinical information systems. Our goal was to measure the frequency of occurrence of workflow steps identified by the SWIM lexicon in a real-world clinical setting, as well as to assess how accurately departmental information systems captured patient flow through our health facility.

  19. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis, supported by a novel visualization technique, for better understanding the daily routines of older adults and highlighting their patterns of daily activities and the normal variability in their physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for the daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variability. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and the timing of certain activities across individuals. Also, when the workflow approach was applied to the spatial information of activities, the analysis provided meaningful data on individuals' mobility across different levels of life space, from home to community. The results suggest that using workflows to characterize the daily activities of older adults will help clinicians and researchers understand their daily routines and prepare education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting the collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Rather than reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  1. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data are filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in the ocean sciences in which end-to-end workflows are tracked. The main tool deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share, and track changes to, the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. The use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow that describes how to provide access to online data sources in the NetCDF format and other model data, and how to make use of multicore processing to generate video animations from time series of gridded data. For each use case we show how complete workflows
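    A minimal sketch of the parallel rendering step in the third use case (the file name, variable name and plotting details are assumptions, not the authors' code): each worker reads one time step from a NetCDF dataset and writes a PNG frame, and the frames can later be stitched into a video, e.g. with ffmpeg.

        from multiprocessing import Pool
        import matplotlib
        matplotlib.use("Agg")               # headless rendering for batch use
        import matplotlib.pyplot as plt
        import netCDF4

        NC_PATH = "sst_forecast.nc"         # placeholder dataset
        VAR = "sst"                         # placeholder variable name

        def render_frame(t: int) -> str:
            with netCDF4.Dataset(NC_PATH) as ds:
                field = ds.variables[VAR][t, :, :]
            fig, ax = plt.subplots()
            ax.imshow(field, origin="lower")
            ax.set_title(f"{VAR}, time step {t}")
            out = f"frame_{t:04d}.png"
            fig.savefig(out, dpi=100)
            plt.close(fig)
            return out

        if __name__ == "__main__":
            with netCDF4.Dataset(NC_PATH) as ds:
                n_steps = ds.variables[VAR].shape[0]
            with Pool() as pool:            # one worker per core renders frames
                frames = pool.map(render_frame, range(n_steps))
            print(f"rendered {len(frames)} frames; stitch with ffmpeg or similar")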

  2. Tracers and tracing methods

    International Nuclear Information System (INIS)

    Leclerc, J.P.

    2001-01-01

    The first international congress on 'Tracers and tracing methods' took place in Nancy in May 2001. The objective of this second congress was to present the current status of and trends in tracing methods and their applications. It gave people from different fields the opportunity to exchange scientific information and knowledge about tracer methodologies and applications. The target participants were researchers, engineers and technologists from various industrial and research sectors: chemical engineering, environment, food engineering, bio-engineering, geology, hydrology, civil engineering, iron and steel production... Two sessions were planned to cover both fundamental and industrial aspects: 1) fundamental developments (tomography, tracer camera visualization and particle tracking; validation of computational fluid dynamics simulations by tracer experiments and numerical residence time distribution; new tracers and detectors or improvement and development of existing tracing methods; data treatment and modeling; reactive tracer experiments and interpretation); 2) industrial applications (geology, hydrogeology and oil field applications; civil engineering, mineral engineering and metallurgy applications; chemical engineering; environment; food engineering and bio-engineering). The program included 5 plenary lectures, 23 oral communications and around 50 posters. Only 9 presentations are of interest for the INIS database

  3. Design and evaluation of a multimedia electronic patient record "oncoflow" with clinical workflow assistance for head and neck tumor therapy.

    Science.gov (United States)

    Meier, Jens; Boehm, Andreas; Kielhorn, Anne; Dietz, Andreas; Bohn, Stefan; Neumuth, Thomas

    2014-11-01

    The management of patient-specific information is a challenging task for surgeons and physicians because existing clinical information systems are insufficiently integrated into the daily clinical routine and the information entities they contain are distributed across different proprietary databases. Thus, existing information is hardly usable for further electronic processing, workflow support or clinical studies. A Web-based clinical information system has been developed that automatically imports patient-specific information from different information systems. The system is tailored to the existing workflow for the treatment of patients with head and neck cancer. In this paper, the clinical assistance functions and a quantitative as well as a qualitative system evaluation are presented. The information system has been deployed at a clinical site and is in use in the daily clinical routine. Two evaluation studies show that the information integration, the structured information presentation in the Web browser and the assistance functions improve the physician's workflow. The studies also show that using the new information system does not increase the time physicians need for a process step compared with the existing information system. Information integration is crucial for efficient workflow support in the clinic. Central access to information within a modern and structured user interface saves valuable time for the physician. The comprehensive database allows the existing information to be used directly for clinical workflow support or for the conduct of trial studies.

  4. Trace Element Accumulation and Tissue Distribution in the Purpleback Flying Squid Sthenoteuthis oualaniensis from the Central and Southern South China Sea.

    Science.gov (United States)

    Wu, Yan Yan; Shen, Yu; Huang, Hui; Yang, Xian Qing; Zhao, Yong Qiang; Cen, Jian Wei; Qi, Bo

    2017-01-01

    Sthenoteuthis oualaniensis is a species of cephalopod that is becoming economically important in the South China Sea. As, Cd, Cr, Cu, Hg, Pb, and Zn concentrations were determined in the mantle, arms, and digestive gland of S. oualaniensis from 31 oceanographic survey stations in the central and southern South China Sea. Intraspecific and interspecific comparisons with previous studies were made. Mean concentrations of the trace elements analyzed in the arms and mantle followed the order Zn > Cu > Cd > Cr > As > Hg. In the digestive gland, the concentrations of Cd and Cu exceeded that of Zn. All Pb concentrations were below the detection limit.

  5. Statistical behavior and geological significance of the geochemical distribution of trace elements in the Cretaceous volcanics of Cordoba and San Luis, Argentina

    International Nuclear Information System (INIS)

    Daziano, C.

    2010-01-01

    Statistical analysis of trace elements in the volcanic rocks studied allowed two independent populations within the same geochemical environment to be distinguished. For each component the populations have a variable index of homogeneity, resulting in dissimilar average values that reveal intratelluric geochemical phenomena. On the other hand, the inhomogeneities observed in these rocks - as reflected in their petrochemical characters - could be accentuated by the remote and dispersed locations of their occurrences, their relations with the enclosing rocks, the ranges of compositional variation, and differences in relative ages.

  6. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run on, and are accessible only via, a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., the REST and SOAP protocols, and therefore they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, a generic scientific workflow construction platform.
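    Because the exported services follow open REST conventions, a client in any language can call them; the sketch below is purely illustrative, with a hypothetical endpoint and payload shape rather than the actual U-Compare service interface.

        import requests

        SERVICE_URL = "https://example.org/u-compare/services/ner"   # hypothetical endpoint

        def annotate(text: str) -> dict:
            # A plain HTTP POST is all a RESTful workflow service requires of its clients.
            resp = requests.post(SERVICE_URL, json={"text": text}, timeout=60)
            resp.raise_for_status()
            return resp.json()

        if __name__ == "__main__":
            print(annotate("BRCA1 mutations are associated with breast cancer."))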

  7. Enhanced reproducibility of SADI web service workflows with Galaxy and Docker.

    Science.gov (United States)

    Aranguren, Mikel Egaña; Wilkinson, Mark D

    2015-01-01

    Semantic Web technologies have been widely applied in the life sciences, for example by data providers such as OpenLifeData and through web services frameworks such as SADI. The recently reported OpenLifeData2SADI project offers access to the vast OpenLifeData data store through SADI services. This article describes how to merge data retrieved from OpenLifeData2SADI with other SADI services using the Galaxy bioinformatics analysis platform, thus making this semantic data more amenable to complex analyses. This is demonstrated using a working example, which is made distributable and reproducible through a Docker image that includes SADI tools, along with the data and workflows that constitute the demonstration. The combination of Galaxy and Docker offers a solution for faithfully reproducing and sharing complex data retrieval and analysis workflows based on the SADI Semantic web service design patterns.

  8. From Data to Knowledge to Discoveries: Artificial Intelligence and Scientific Workflows

    Directory of Open Access Journals (Sweden)

    Yolanda Gil

    2009-01-01

    Full Text Available Scientific computing has entered a new era of scale and sharing with the arrival of cyberinfrastructure facilities for computational experimentation. A key emerging concept is scientific workflows, which provide a declarative representation of complex scientific applications that can be automatically managed and executed in distributed shared resources. In the coming decades, computational experimentation will push the boundaries of current cyberinfrastructure in terms of inter-disciplinary scope and integrative models of scientific phenomena under study. This paper argues that knowledge-rich workflow environments will provide necessary capabilities for that vision by assisting scientists to validate and vet complex analysis processes and by automating important aspects of scientific exploration and discovery.

  9. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    AUTHOR|(CDS)2069172

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the different stages envisioned, from alignment using cosmic ray data to detector alignment and calibration using the first proton-proton collision data ( O(100 pb-1) ) and a larger dataset ( O(1 fb-1) ) to reach the target precision. The automation of the workflow and its integration into the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  10. What is needed for effective open access workflows?

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing open access forward with ever more guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specifications of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  11. EDMS based workflow for Printing Industry

    Directory of Open Access Journals (Sweden)

    Prathap Nayak

    2013-04-01

    Full Text Available Information is an indispensable factor in any enterprise. It can be a record or a document generated for every transaction that is made, kept either in paper-based or in electronic format for future reference. The printing industry is one such industry in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since every process, from the smallest piece of information to the printed product, depends on the others. Hence the information has to be harmonized artfully in order to avoid production downtime or employees pointing fingers at each other. This paper analyses how the implementation of an Electronic Document Management System (EDMS) could contribute to the printing industry by providing immediate access to stored documents within and across departments, irrespective of geographical boundaries. The paper opens with a brief history and a view of contemporary EDMS systems, and gives some illustrated examples from a study in which the library was chosen as a pilot area for evaluating EDMS. The paper ends with a proposal that maps several document-management-based activities for the implementation of EDMS in a printing industry.

  12. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    Workflow management systems have recently been the focus of much interest and of considerable research and deployment effort for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also underlined the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, the high-level modeling, monitoring and execution functionalities of these systems give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future.

  13. Resilient workflows for computational mechanics platforms

    Science.gov (United States)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of considerable research and deployment effort for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underlined the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, the high-level modeling, monitoring and execution functionalities of these systems give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come [28]. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future [23, 24, 29].

  14. Reproducible Bioconductor workflows using browser-based interactive notebooks and containers.

    Science.gov (United States)

    Almugbel, Reem; Hung, Ling-Hong; Hu, Jiaming; Almutairy, Abeer; Ortogero, Nicole; Tamta, Yashaswi; Yeung, Ka Yee

    2018-01-01

    Bioinformatics publications typically include complex software workflows that are difficult to describe in a manuscript. We describe and demonstrate the use of interactive software notebooks to document and distribute bioinformatics research. We provide a user-friendly tool, BiocImageBuilder, that allows users to easily distribute their bioinformatics protocols through interactive notebooks uploaded to either a GitHub repository or a private server. We present four different interactive Jupyter notebooks using R and Bioconductor workflows to infer differential gene expression, analyze cross-platform datasets, process RNA-seq data and KinomeScan data. These interactive notebooks are available on GitHub. The analytical results can be viewed in a browser. Most importantly, the software contents can be executed and modified. This is accomplished using Binder, which runs the notebook inside software containers, thus avoiding the need to install any software and ensuring reproducibility. All the notebooks were produced using custom files generated by BiocImageBuilder. BiocImageBuilder facilitates the publication of workflows with a point-and-click user interface. We demonstrate that interactive notebooks can be used to disseminate a wide range of bioinformatics analyses. The use of software containers to mirror the original software environment ensures reproducibility of results. Parameters and code can be dynamically modified, allowing for robust verification of published results and encouraging rapid adoption of new methods. Given the increasing complexity of bioinformatics workflows, we anticipate that these interactive software notebooks will become as necessary for documenting software methods as traditional laboratory notebooks have been for documenting bench protocols, and as ubiquitous. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools from a diversity of sources to be connected. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language (SBML) to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications.

  16. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.; Beal, Jacob; Gorochowski, Thomas E.; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-01-01

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select

  17. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  18. Job life cycle management libraries for CMS workflow management projects

    International Nuclear Information System (INIS)

    Lingen, Frank van; Wilkinson, Rick; Evans, Dave; Foulkes, Stephen; Afaq, Anzar; Vaandering, Eric; Ryu, Seangchan

    2010-01-01

    Scientific analysis and simulation require the processing and generation of millions of data samples. These tasks are often composed of multiple smaller tasks divided over multiple (computing) sites. This paper discusses the Compact Muon Solenoid (CMS) workflow infrastructure, and specifically the Python-based workflow library used for so-called task lifecycle management. The CMS workflow infrastructure consists of three layers: high-level specification of the various tasks based on input/output data sets, lifecycle management of task instances derived from the high-level specification, and execution management. The workflow library is the result of a convergence of three CMS sub-projects that respectively deal with scientific analysis, simulation and real-time data aggregation from the experiment. This convergence will reduce duplication and hence development and maintenance costs.
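    As a generic illustration of what task lifecycle management means (state names and transitions here are assumptions for the sketch, not the CMS library's actual states), task instances move through a small set of states and only whitelisted transitions are allowed:

        from enum import Enum, auto

        class State(Enum):
            NEW = auto()
            QUEUED = auto()
            RUNNING = auto()
            SUCCEEDED = auto()
            FAILED = auto()

        # Whitelisted transitions; a failed task may be queued again (resubmission).
        ALLOWED = {
            State.NEW: {State.QUEUED},
            State.QUEUED: {State.RUNNING},
            State.RUNNING: {State.SUCCEEDED, State.FAILED},
            State.FAILED: {State.QUEUED},
        }

        class TaskInstance:
            def __init__(self, name: str):
                self.name, self.state = name, State.NEW

            def advance(self, new_state: State) -> None:
                if new_state not in ALLOWED.get(self.state, set()):
                    raise ValueError(f"{self.name}: {self.state.name} -> {new_state.name} not allowed")
                self.state = new_state

        if __name__ == "__main__":
            task = TaskInstance("reco_block_0042")   # placeholder task name
            for s in (State.QUEUED, State.RUNNING, State.SUCCEEDED):
                task.advance(s)
            print(task.name, task.state.name)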

  19. A Community-Driven Workflow Recommendation and Reuse Infrastructure

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX user...

  20. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in the return on their investments. Unlike most approaches to scientific data handling and application integration, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.