WorldWideScience

Sample records for open workflow environment

  1. CDK-Taverna: an open workflow environment for cheminformatics

    Directory of Open Access Journals (Sweden)

    Zielesny Achim

    2010-03-01

    Background: Small molecules are of increasing interest for bioinformatics in areas such as metabolomics and drug discovery. The recent release of large open access chemistry databases generates a demand for flexible tools to process them and discover new knowledge. To freely support open science based on these data resources, it is desirable for the processing tools to be open source and available to everyone. Results: Here we describe a novel combination of the workflow engine Taverna and the cheminformatics library Chemistry Development Kit (CDK), resulting in an open source workflow solution for cheminformatics. We have implemented more than 160 different workers to handle specific cheminformatics tasks. We describe the applications of CDK-Taverna in various usage scenarios. Conclusions: The combination of the workflow engine Taverna and the Chemistry Development Kit provides the first open source cheminformatics workflow solution for the biosciences. With the Taverna community working towards a more powerful workflow engine and a more user-friendly user interface, CDK-Taverna has the potential to become a free alternative to existing proprietary workflow tools.

  2. New developments on the cheminformatics open workflow environment CDK-Taverna.

    Science.gov (United States)

    Truszkowski, Andreas; Jayaseelan, Kalai Vanii; Neumann, Stefan; Willighagen, Egon L; Zielesny, Achim; Steinbeck, Christoph

    2011-12-13

    The computational processing and analysis of small molecules is at the heart of cheminformatics and structural bioinformatics and their application in e.g. metabolomics or drug discovery. Pipelining or workflow tools allow for the Lego™-like, graphical assembly of I/O modules and algorithms into a complex workflow which can be easily deployed, modified and tested without the hassle of implementing it into a monolithic application. The CDK-Taverna project aims at building a free open-source cheminformatics pipelining solution through combination of different open-source projects such as Taverna, the Chemistry Development Kit (CDK) or the Waikato Environment for Knowledge Analysis (WEKA). A first integrated version 1.0 of CDK-Taverna was recently released to the public. The CDK-Taverna project was migrated to the most up-to-date versions of its foundational software libraries with a complete re-engineering of its workers' architecture (version 2.0). 64-bit computing and multi-core usage by parallelized threads are now supported to allow for fast in-memory processing and analysis of large sets of molecules. Earlier deficiencies like workarounds for iterative data reading are removed. The combinatorial chemistry related reaction enumeration features are considerably enhanced. Additional functionality for calculating a natural product likeness score for small molecules is implemented to identify possible drug candidates. Finally, the data analysis capabilities are extended with new workers that provide access to the open-source WEKA library for clustering and machine learning as well as training and test set partitioning. The new features are outlined with usage scenarios. CDK-Taverna 2.0 as an open-source cheminformatics workflow solution matured to become a freely available and increasingly powerful tool for the biosciences. The combination of the new CDK-Taverna worker family with the already available workflows developed by a lively Taverna community and published on ...

  3. New developments on the cheminformatics open workflow environment CDK-Taverna

    Directory of Open Access Journals (Sweden)

    Truszkowski Andreas

    2011-12-01

    Background: The computational processing and analysis of small molecules is at the heart of cheminformatics and structural bioinformatics and their application in e.g. metabolomics or drug discovery. Pipelining or workflow tools allow for the Lego™-like, graphical assembly of I/O modules and algorithms into a complex workflow which can be easily deployed, modified and tested without the hassle of implementing it into a monolithic application. The CDK-Taverna project aims at building a free open-source cheminformatics pipelining solution through combination of different open-source projects such as Taverna, the Chemistry Development Kit (CDK) or the Waikato Environment for Knowledge Analysis (WEKA). A first integrated version 1.0 of CDK-Taverna was recently released to the public. Results: The CDK-Taverna project was migrated to the most up-to-date versions of its foundational software libraries with a complete re-engineering of its workers' architecture (version 2.0). 64-bit computing and multi-core usage by parallelized threads are now supported to allow for fast in-memory processing and analysis of large sets of molecules. Earlier deficiencies like workarounds for iterative data reading are removed. The combinatorial chemistry related reaction enumeration features are considerably enhanced. Additional functionality for calculating a natural product likeness score for small molecules is implemented to identify possible drug candidates. Finally, the data analysis capabilities are extended with new workers that provide access to the open-source WEKA library for clustering and machine learning as well as training and test set partitioning. The new features are outlined with usage scenarios. Conclusions: CDK-Taverna 2.0 as an open-source cheminformatics workflow solution matured to become a freely available and increasingly powerful tool for the biosciences. The combination of the new CDK-Taverna worker family with the already available workflows ...
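
    As a rough illustration of the multi-core, in-memory molecule processing such workers perform, consider the sketch below. CDK itself is a Java library, so this hypothetical Python stand-in uses the RDKit toolkit instead; the SMILES strings and descriptor choice are illustrative only.

        # Illustrative stand-in: CDK-Taverna wraps the Java CDK library; this
        # Python sketch uses RDKit to show the same pattern of parallel,
        # in-memory descriptor calculation over a molecule set.
        from concurrent.futures import ProcessPoolExecutor
        from rdkit import Chem
        from rdkit.Chem import Descriptors

        def mol_weight(smiles):
            """One worker-style task: parse a SMILES string, compute a descriptor."""
            mol = Chem.MolFromSmiles(smiles)
            return None if mol is None else Descriptors.MolWt(mol)

        if __name__ == "__main__":
            smiles_set = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # ethanol, benzene, aspirin
            with ProcessPoolExecutor() as pool:  # multi-core processing of the set
                for smi, mw in zip(smiles_set, pool.map(mol_weight, smiles_set)):
                    print(smi, mw)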

  4. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    Science.gov (United States)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in the development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, including toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is the automated job launching and monitoring capability, which allows a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  5. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.
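
    The ontology itself is beyond a short example, but the kind of record OPM-WFPP is meant to enable, tying a workflow component to its execution environment and observed timing so that variability can later be attributed, can be sketched as plain provenance records. This is a hypothetical illustration, not the OPM-WFPP schema.

        # Hypothetical sketch of capturing performance provenance per task;
        # the real OPM-WFPP ontology is far richer than these flat records.
        import json, time

        def run_with_provenance(task_name, resource, fn, *args):
            """Execute one workflow task and emit a timing/provenance record."""
            start = time.time()
            result = fn(*args)
            record = {
                "task": task_name,            # workflow component
                "resource": resource,         # execution environment
                "start": start,
                "duration_s": time.time() - start,
            }
            print(json.dumps(record))         # in practice: append to a provenance store
            return result

        run_with_provenance("square", "login-node", lambda x: x * x, 42)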

  6. What is needed for effective open access workflows?

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing forward open access with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specified by researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What do researchers, and editors in the library, have to consider before releasing a green open access publication? Where and how can software support and improve existing workflows?

  7. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features that promote collaboration, education, and parameterization, and supports non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  8. Beyond Scientific Workflows: Networked Open Processes

    NARCIS (Netherlands)

    Cushing, R.; Bubak, M.; Belloum, A.; de Laat, C.

    2013-01-01

    The multitude of scientific services and processes being developed brings about challenges for future in silico distributed experiments. Choosing the correct service from an expanding body of processes means that the task of manually building workflows is becoming untenable. In this paper we pro...

  9. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).
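
    The enactment pattern described above, tasks released in sequence with business rules able to alter the flow, can be sketched minimally. The sketch below is a hypothetical Python illustration, not SDA's actual TieFlow engine; the task names and the rule are invented.

        # Hypothetical sketch of rule-driven process enactment (not SDA itself).
        tasks = [
            {"name": "design_review"},
            {"name": "code"},
            {"name": "unit_test"},
            {"name": "waiver_audit"},
        ]

        def business_rule(task, context):
            """A rule that alters process flow: run the audit only if a waiver exists."""
            return not (task["name"] == "waiver_audit" and not context["has_waiver"])

        def enact(tasks, context):
            for task in tasks:                      # tasks released in the correct sequence
                if business_rule(task, context):
                    print("notify assignee: begin", task["name"])
                else:
                    print("rule skipped:", task["name"])

        enact(tasks, {"has_waiver": False})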

  10. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  11. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    ... and simplification), data export and spatial overlay operations are commonly required. We find a relative lack of support for geospatial data, services and these functions within several Free and Open Source Software (FOSS) scientific workflow packages. Furthermore...

  12. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakeSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing ...

  13. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  14. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Background: Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results: We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to ...
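
    To give a flavor of the GFF file processing such pipelines automate, the generic sketch below parses the standard nine-column, tab-separated GFF format into records. It is an illustration in plain Python, not one of the Kepler actors themselves, and the example file name is hypothetical.

        # Generic illustration of GFF parsing (nine tab-separated columns);
        # the workflows described above implement such steps as Kepler actors.
        GFF_COLUMNS = ["seqid", "source", "type", "start", "end",
                       "score", "strand", "phase", "attributes"]

        def parse_gff(path):
            with open(path) as handle:
                for line in handle:
                    if line.startswith("#") or not line.strip():
                        continue                      # skip headers and blank lines
                    fields = line.rstrip("\n").split("\t")
                    record = dict(zip(GFF_COLUMNS, fields))
                    record["start"], record["end"] = int(record["start"]), int(record["end"])
                    yield record

        # Example use: count features per type in a (hypothetical) ChIP-chip file.
        counts = {}
        for rec in parse_gff("chip_chip.gff"):
            counts[rec["type"]] = counts.get(rec["type"], 0) + 1
        print(counts)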

  15. Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.

    Science.gov (United States)

    Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel

    2017-01-01

    OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines.
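
    OpenMOLE's DSL is Scala-based; the design point that matters here, a pipeline definition decoupled from its execution context so that one line switches from desktop to cluster, can be sketched generically in Python (a hypothetical illustration, not the OpenMOLE API).

        # Hypothetical sketch of decoupling pipeline from execution context:
        # the task is defined once; only the executor line changes at scale-up.
        from concurrent.futures import ThreadPoolExecutor

        def pipeline(sample):
            """The scientific task; unaware of where it will execute."""
            return sample ** 2

        samples = range(100)

        executor = ThreadPoolExecutor(max_workers=4)   # desktop prototype
        # Scaling up would change only the line above, e.g. to a cluster-backed
        # executor such as mpi4py.futures.MPIPoolExecutor.

        with executor:
            results = list(executor.map(pipeline, samples))
        print(sum(results))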

  16. Actor-driven workflow execution in distributed environments

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum; M. Bubak

    2010-01-01

    Currently, most workflow management systems (WfMS) in Grid environments provide push-oriented task distribution strategies, where tasks are directly bound to suitable resources. In those scenarios the dedicated resources execute the submitted tasks according to the request of a WfMS or sometimes by

  17. Actor-driven workflow execution in distributed environments

    NARCIS (Netherlands)

    Berretz, F.; Skorupa, S.; Sander, V.; Belloum, A.; Bubak, M.

    2011-01-01

    Currently, most workflow management systems (WfMS) in Grid environments provide push-oriented task distribution strategies, where tasks are directly bound to suitable resources. In those scenarios the dedicated resources execute the submitted tasks according to the request of a WfMS or sometimes by
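
    The contrast drawn in these two records, push-oriented distribution where the WfMS binds tasks to resources versus an actor-driven alternative where resources pull work when free, can be sketched as a pull-based task queue. This is a generic illustration, not the authors' system.

        # Generic sketch of pull-oriented task distribution: rather than the
        # WfMS pushing tasks to dedicated resources, each worker (actor)
        # pulls the next task whenever it becomes free.
        import queue
        import threading

        task_queue = queue.Queue()
        for task_id in range(10):
            task_queue.put(task_id)                  # WfMS publishes tasks, binds none

        def actor(name):
            while True:
                try:
                    task = task_queue.get_nowait()   # the resource pulls work itself
                except queue.Empty:
                    return
                print(f"{name} executes task {task}")
                task_queue.task_done()

        workers = [threading.Thread(target=actor, args=(f"worker-{i}",)) for i in range(3)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()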

  18. A High Throughput Workflow Environment for Cosmological Simulations

    CERN Document Server

    Erickson, Brandon M S; Evrard, August E; Becker, Matthew R; Busha, Michael T; Kravtsov, Andrey V; Marru, Suresh; Pierce, Marlon; Wechsler, Risa H

    2012-01-01

    The next generation of wide-area sky surveys offer the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent-level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale up our production to the level of fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared to manua...

  19. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream data processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  20. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large scale simulation and analysis work commonplace. These tools provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
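
    The task-parallel (MPI) execution mentioned above can be sketched with mpi4py: a list of input files is striped across ranks so each rank analyzes its share independently. This is a generic sketch of the pattern with hypothetical file names, not CASCADE's actual code.

        # Generic mpi4py sketch of task-parallel analysis over many files;
        # run with e.g. `mpirun -n 64 python analyze.py`.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        files = [f"simulation_{i:04d}.nc" for i in range(1000)]   # hypothetical inputs

        def analyze(path):
            """Placeholder for a statistical routine applied to one file."""
            return len(path)

        local_results = [analyze(f) for f in files[rank::size]]   # stripe tasks by rank
        all_results = comm.gather(local_results, root=0)          # collect at rank 0
        if rank == 0:
            print(sum(len(r) for r in all_results), "files analyzed")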

  1. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geoprocessing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geoprocessing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition has been successfully experimented with, based on ontologies and artificial ...
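
    The two-level modeling described above, conceptual workflows over ontology-defined data and service types that are instantiated into concrete, executable workflows on request, can be sketched as follows. This is a hypothetical Python illustration with made-up type names and endpoints; GeoBrain itself encodes both levels in BPEL.

        # Hypothetical sketch of the conceptual-to-concrete instantiation step.
        conceptual_workflow = [                    # type-level "geoprocessing model"
            ("subset",   {"input": "SatelliteImagery", "region": "BoundingBox"}),
            ("classify", {"input": "SatelliteImagery"}),
        ]

        bindings = {                               # chosen when a user requests a product
            "SatelliteImagery": "http://example.org/wcs?coverage=landsat8",  # hypothetical
            "BoundingBox": (-10.0, 35.0, 5.0, 45.0),
        }

        def instantiate(workflow, bindings):
            """Turn type-level steps into a concrete, executable plan."""
            return [(step, {k: bindings.get(v, v) for k, v in params.items()})
                    for step, params in workflow]

        for step in instantiate(conceptual_workflow, bindings):
            print(step)    # a real system would hand this plan to the BPEL engine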

  2. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    Geospatially enabled scientific workflows offer a promising paradigm to support researchers in the earth observation domain in many aspects of the scientific process. One such aspect is access to distributed earth observation data...

  3. EPUB as publication format in Open Access journals: Tools and workflow

    Directory of Open Access Journals (Sweden)

    Trude Eikebrokk

    2014-04-01

    In this article, we present a case study of how the main publishing format of an Open Access journal was changed from PDF to EPUB by designing a new workflow using JATS as the basic XML source format. We state the reasons and discuss advantages for doing this, how we did it, and the costs of changing an established Microsoft Word workflow. As an example, we use one typical sociology article with tables, illustrations and references. We then follow the article from JATS markup through different transformations resulting in XHTML, EPUB and MOBI versions. In the end, we put everything together in an automated XProc pipeline. The process has been developed on free and open source tools, and we describe and evaluate these tools in the article. The workflow is suitable for non-professional publishers, and all code is attached and free for reuse by others.
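
    The heart of such a workflow is a chain of XML transformations. As a minimal sketch of one link in that chain, the Python below uses lxml to apply an XSLT stylesheet to a JATS file; the file and stylesheet names are placeholders, and the published workflow chains several such steps in an XProc pipeline instead.

        # Minimal sketch of one transformation step (JATS -> XHTML) with lxml;
        # file and stylesheet names are placeholders.
        from lxml import etree

        jats_doc = etree.parse("article-jats.xml")            # JATS XML source
        to_xhtml = etree.XSLT(etree.parse("jats-to-xhtml.xsl"))
        xhtml_doc = to_xhtml(jats_doc)

        with open("article.xhtml", "wb") as out:
            out.write(etree.tostring(xhtml_doc, pretty_print=True))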

  4. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide the tactile feedback and sensory feedback on spatial orientation that tangible objects provide. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: The process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set. This data is turned into a volume representation which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the Free and Open Source geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and ...
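
    The volume-to-printout step of such a workflow is typically an isosurface extraction followed by writing a mesh format the printer toolchain accepts. The sketch below is a generic illustration rather than the GFZ software (which builds on GRASS GIS and Paraview): it uses scikit-image's marching cubes on a synthetic volume and writes an ASCII STL file.

        # Generic illustration of turning a gridded data volume into a
        # printable mesh; the volume here is synthetic, not science data.
        import numpy as np
        from skimage import measure

        x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
        volume = x**2 + y**2 + z**2                # a sphere as stand-in data

        verts, faces, _normals, _values = measure.marching_cubes(volume, level=0.5)

        with open("model.stl", "w") as stl:        # ASCII STL for the print toolchain
            stl.write("solid model\n")
            for tri in faces:
                stl.write(" facet normal 0 0 0\n  outer loop\n")
                for v in verts[tri]:
                    stl.write(f"   vertex {v[0]:.6f} {v[1]:.6f} {v[2]:.6f}\n")
                stl.write("  endloop\n endfacet\n")
            stl.write("endsolid model\n")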

  5. The Rutgers Workflow Management System: Migrating a Digital Object Management Utility to Open Source

    Directory of Open Access Journals (Sweden)

    Grace Agnew

    2007-12-01

    This article examines the development, architecture, and future plans for the Workflow Management System, software developed by Rutgers University Libraries (RUL) to create and catalog digital objects for repository ingest and access. The Workflow Management System (WMS) was created as a front-end utility for the Fedora open source repository platform and a vehicle for a flexible, extensible metadata architecture, to serve the information needs of a large university and its collaborators. The next phase of development for the WMS shifted to a re-engineering of the WMS as an open source application. This paper discusses the design and architecture of the WMS, its re-engineering for open source release, remaining issues to be addressed before application release, and future development plans for the WMS.

  6. Using SensorML to describe scientific workflows in distributed web service environments

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    ... for increased collaboration through workflow sharing. The Sensor Web is an open complex adaptive system that pervades the internet and provides access to sensor resources. One mechanism for describing sensor resources is through the use of SensorML. It is shown...

  7. Collaborations in Open Learning Environments

    NARCIS (Netherlands)

    Spoelstra, Howard

    2015-01-01

    This thesis researches automated services for professionals aiming at starting collaborative learning projects in open learning environments, such as MOOCs. It investigates the theoretical backgrounds of team formation for collaborative learning. Based on the outcomes, a model is developed ...

  8. Collaborations in Open Learning Environments

    NARCIS (Netherlands)

    Spoelstra, Howard

    2015-01-01

    This thesis researches automated services for professionals aiming at starting collaborative learning projects in open learning environments, such as MOOCs. It investigates the theoretical backgrounds of team formation for collaborative learning. Based on the outcomes, a model is developed

  9. 3D Printing of CT Dataset: Validation of an Open Source and Consumer-Available Workflow.

    Science.gov (United States)

    Bortolotto, Chandra; Eshja, Esmeralda; Peroni, Caterina; Orlandi, Matteo A; Bizzotto, Nicola; Poggi, Paolo

    2016-02-01

    The broad availability of cheap three-dimensional (3D) printing equipment has raised the need for a thorough analysis of its effects on clinical accuracy. Our aim is to determine whether the accuracy of the 3D printing process is affected by the use of a low-budget workflow based on open source software and consumer-available 3D printers. A group of test objects was scanned with a 64-slice computed tomography (CT) scanner in order to build their 3D copies. CT datasets were elaborated using a software chain based on three free and open source software packages. Objects were printed out with a commercially available 3D printer. Both the 3D copies and the test objects were measured using a digital professional caliper. Overall, the mean absolute difference between the test objects and their 3D copies is 0.23 mm, and the mean relative difference amounts to 0.55%. Our results demonstrate that the accuracy of the 3D printing process remains high despite the use of a low-budget workflow.
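
    The two reported figures correspond to straightforward per-measurement statistics. As a worked sketch with made-up caliper readings (not the study's data), they would be computed as:

        # Worked example of the two accuracy metrics, using hypothetical
        # caliper measurements in millimetres (not the study's data).
        import numpy as np

        reference = np.array([40.00, 25.50, 60.20])   # test objects
        printed   = np.array([40.21, 25.31, 60.45])   # their 3D copies

        abs_diff = np.abs(printed - reference)
        mean_absolute_difference = abs_diff.mean()                      # cf. 0.23 mm
        mean_relative_difference = (abs_diff / reference).mean() * 100  # cf. 0.55 %

        print(f"{mean_absolute_difference:.2f} mm, {mean_relative_difference:.2f} %")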

  10. Design and Evaluation of Data Annotation Workflows for CAVE-like Virtual Environments.

    Science.gov (United States)

    Pick, Sebastian; Weyers, Benjamin; Hentschel, Bernd; Kuhlen, Torsten W

    2016-04-01

    Data annotation finds increasing use in Virtual Reality applications with the goal of supporting the data analysis process, for example in architectural reviews. In this context, a variety of annotation systems for application to immersive virtual environments have been presented. While many interesting interaction designs for the data annotation workflow have emerged from them, important details and evaluations are often omitted. In particular, we observe that the process of handling metadata to interactively create and manage complex annotations is often not covered in detail. In this paper, we strive to improve this situation by focusing on the design of data annotation workflows and their evaluation. We propose a workflow design that facilitates the most important annotation operations, i.e., annotation creation, review, and modification. Our workflow design is easily extensible in terms of supported annotation and metadata types as well as interaction techniques, which makes it suitable for a variety of application scenarios. To evaluate it, we have conducted a user study in a CAVE-like virtual environment in which we compared our design to two alternatives in terms of a realistic annotation creation task. Our design obtained good results in terms of task performance and user experience.

  11. The Open Physiology workflow: modeling processes over physiology circuitboards of interoperable tissue units

    Science.gov (United States)

    de Bono, Bernard; Safaei, Soroush; Grenon, Pierre; Nickerson, David P.; Alexander, Samuel; Helvensteijn, Michiel; Kok, Joost N.; Kokash, Natallia; Wu, Alan; Yu, Tommy; Hunter, Peter; Baldock, Richard A.

    2015-01-01

    A key challenge for the physiology modeling community is to enable the searching, objective comparison and, ultimately, re-use of models and associated data that are interoperable in terms of their physiological meaning. In this work, we outline the development of a workflow to modularize the simulation of tissue-level processes in physiology. In particular, we show how, via this approach, we can systematically extract, parcellate and annotate tissue histology data to represent component units of tissue function. These functional units are semantically interoperable, in terms of their physiological meaning. In particular, they are interoperable with respect to [i] each other and with respect to [ii] a circuitboard representation of long-range advective routes of fluid flow over which to model long-range molecular exchange between these units. We exemplify this approach through the combination of models for physiology-based pharmacokinetics and pharmacodynamics to quantitatively depict biological mechanisms across multiple scales. Links to the data, models and software components that constitute this workflow are found at http://open-physiology.org/. PMID:25759670

  12. A virtual data language and system for scientific workflow management in data grid environments

    Science.gov (United States)

    Zhao, Yong

    With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data-intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
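
    The central idea, cleanly separating logical typing from physical representation, can be sketched with a small typed-dataset abstraction. This is a hypothetical Python illustration; the actual virtual data language has its own notation.

        # Hypothetical sketch: workflow steps are declared over logical types,
        # while messy physical details stay in the dataset description.
        from dataclasses import dataclass

        @dataclass
        class Dataset:
            logical_type: str     # e.g. "BrainImage" in the workflow's type system
            physical_path: str    # storage detail hidden from the workflow logic
            physical_format: str  # e.g. "dicom-directory", "nifti-file"

        def align(image: Dataset) -> Dataset:
            """A workflow step typed over logical datasets only."""
            assert image.logical_type == "BrainImage"
            return Dataset("AlignedBrainImage",
                           image.physical_path + ".aligned",
                           image.physical_format)

        raw = Dataset("BrainImage", "/archive/subj01/anat", "dicom-directory")
        print(align(raw))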

  13. First field demonstration of cloud datacenter workflow automation employing dynamic optical transport network resources under OpenStack and OpenFlow orchestration.

    Science.gov (United States)

    Szyrkowiec, Thomas; Autenrieth, Achim; Gunning, Paul; Wright, Paul; Lord, Andrew; Elbers, Jörg-Peter; Lumb, Alan

    2014-02-10

    For the first time, we demonstrate the orchestration of elastic datacenter and inter-datacenter transport network resources using a combination of OpenStack and OpenFlow. Programmatic control allows a datacenter operator to dynamically request optical lightpaths from a transport network operator to accommodate rapid changes of inter-datacenter workflows.

  14. Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands

    NARCIS (Netherlands)

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, José; Manderveld, Jocelyn

    2005-01-01

    Please refer to the original article in: Westera, W., Brouns, F., Pannekeet, K., Janssen, J., & Manderveld, J. (2005). Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands. Educational Technology & Society, 8 (3), 216-225. (URL: http://www.i

  15. Open-Source Enterprise Content Management using Workflows: An Implementation Case-Study for Higher Education Institutions

    Directory of Open Access Journals (Sweden)

    Maican C.

    2014-12-01

    Organizations are continuously challenged by the management of increasing amounts and varieties of digital information types and formats. In the form of a case study, this paper presents a small part of the implementation process of open-source Enterprise Content Management software and a workflow for routing documents within a university. The first part of the paper gives an introduction to the concepts surrounding Enterprise Content Management software, the second part briefly presents Business Process Management (BPM), and the third part focuses on the workflow process as part of BPM within an organization.

  16. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    Science.gov (United States)

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  17. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Science.gov (United States)

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  18. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Directory of Open Access Journals (Sweden)

    David K Brown

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
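
    The web-interface-to-resource-manager handoff that JMS manages can be sketched minimally with Flask and a batch submission command. This is a hypothetical sketch, not the JMS codebase: a Torque-style qsub call stands in for the resource manager integration, and the endpoint, paths, and script handling are invented.

        # Hypothetical mini-sketch of the web-to-cluster handoff JMS manages;
        # the real JMS (https://github.com/RUBi-ZA/JMS) is far more complete.
        import subprocess
        from flask import Flask, request

        app = Flask(__name__)

        @app.route("/submit", methods=["POST"])
        def submit():
            script = request.files["script"]     # uploaded batch script
            path = "/tmp/job.sh"
            script.save(path)
            # Hand the job to the resource manager; qsub prints the new
            # job's identifier on success.
            job_id = subprocess.run(["qsub", path], capture_output=True,
                                    text=True, check=True).stdout.strip()
            return {"job_id": job_id}

        # app.run() would expose the endpoint; status tracking, result
        # retrieval, and tool versioning are what JMS adds on top.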

  19. A framework for integration of scientific applications into the OpenTopography workflow

    Science.gov (United States)

    Nandigam, V.; Crosby, C.; Baru, C.

    2012-12-01

    The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service oriented architecture that is comprised of an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g., the portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines developed and maintained by the community to be integrated into the OpenTopography system, so that the wider earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions, with the help of a back-end database. It allows monitoring of job and system status by providing charting tools. All core components in OpenTopography have been developed, maintained and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most of the new scientific applications will have their own active development teams performing regular updates, maintenance and other improvements. It would be optimal to have the application co-located where its developers can continue to actively work on it while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system. This will be accomplished by ...

  20. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    Science.gov (United States)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. The backbone of the DIaaS model is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of the dispel4py and Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines the Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to running data-intensive workflows on distributed platforms. Three containers make up the DIaaS model: a Pegasus node, an MPI cluster, and an Apache Storm cluster. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py) and linked to Docker Hub for continuous integration (automated image builds) and image storage and sharing. In this model, all software required for running scientific applications (workflow systems and execution engines) is packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase 1) and cross-correlates (Phase 2) traces from several seismic stations. The application is submitted via Pegasus (Container 1), and Phase 1 and Phase 2 are executed in the MPI (Container 2) and Storm (Container 3) clusters respectively. Although both phases could be executed ...

  1. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    Science.gov (United States)

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website (http://timestudioproject.com) contains the latest releases of TimeStudio, together with documentation and user forums.
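
    TimeStudio itself is written in MATLAB; as a language-neutral illustration of its core mechanism, the dynamic pipelining of plugins over a shared data structure, consider this hypothetical Python sketch with invented plugin names.

        # Hypothetical sketch of plugin pipelining (TimeStudio is MATLAB;
        # this is a Python stand-in with invented plugins).
        def read_data(state):
            state["samples"] = [1.0, 2.0, 8.0, 2.0, 1.0]
            return state

        def remove_peaks(state, threshold=5.0):
            state["samples"] = [s for s in state["samples"] if s < threshold]
            return state

        def summarize(state):
            state["mean"] = sum(state["samples"]) / len(state["samples"])
            return state

        # A workflow is an ordered list of plugins; sharing the workflow
        # amounts to sharing this list plus each plugin's parameters.
        workflow = [read_data, remove_peaks, summarize]

        state = {}
        for plugin in workflow:
            state = plugin(state)
        print(state["mean"])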

  2. Proving Value in Radiology: Experience Developing and Implementing a Shareable Open Source Registry Platform Driven by Radiology Workflow.

    Science.gov (United States)

    Gichoya, Judy Wawira; Kohli, Marc D; Haste, Paul; Abigail, Elizabeth Mills; Johnson, Matthew S

    2017-06-16

    Numerous initiatives are in place to support value-based care in radiology, including decision support using appropriateness criteria, quality metrics like radiation dose monitoring, and efforts to improve the quality of the radiology report for consumption by referring providers. These initiatives are largely data driven. Organizations can choose to purchase proprietary registry systems, pay for a software-as-a-service solution, or deploy/build their own registry systems. Traditionally, registries are created for a single purpose, like radiation dose tracking or specific disease tracking such as a diabetes registry. This results in a fragmented view of the patient, and it increases the overhead of maintaining such single-purpose registry systems by requiring an alternative data entry workflow and additional infrastructure to host and maintain multiple registries for different clinical needs. This complexity is magnified in the health care enterprise, where radiology systems usually run parallel to other clinical systems due to the different clinical workflow for radiologists. In the new era of value-based care, where data needs are increasing along with the demand for a shorter turnaround time to provide data that can be used for information and decision making, there is a critical gap in developing registries that are better adapted to the radiology workflow with minimal setup and maintenance overhead. We share our experience of developing and implementing an open source registry system for quality improvement and research at our academic institution, driven by our radiology workflow.

  3. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elsethagen, Todd O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Guillen, Zoe C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dirks, James A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Skorski, Daniel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stephan, Eric G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorton, Ian [Carnegie Mellon Univ., Pittsburgh, PA (United States); Liu, Yan [Concordia Univ., Montreal, QC (Canada)

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However, as a result of these new capabilities, the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is not only a question of ease of use, but also supports fundamental functions in the correlated analysis of simulation input, execution details, and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system, and RHIPE, the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3 year study of building energy demands for the US Eastern

  4. Designing open learning environments for professional development

    NARCIS (Netherlands)

    Sloep, Peter

    2011-01-01

    Sloep, P. B. (2011). Designing open learning environments for professional development. Presentation at the FP7 Handover Project Meeting. April, 9, 2011, Amsterdam, The Netherlands: Open University in the Netherlands.

  6. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the materials science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  7. US and Dutch nurse experiences with fall prevention technology within nursing home environment and workflow: a qualitative study

    NARCIS (Netherlands)

    Vandenberg, Ann E.; van Beijnum, Bernhard J.F.; Overdevest, Vera G.P.; Capezuti, Elizabeth; Johnson II, Theodore M.

    2017-01-01

    Falls remain a major geriatric problem, and the search for new solutions continues. We investigated how existing fall prevention technology was experienced within nursing home nurses' environment and workflow. Our NIH-funded study in an American nursing home was followed by a cultural learning exchange with a Dutch nursing home.

  8. Interoperability Using Lightweight Metadata Standards: Service & Data Casting, OpenSearch, OPM Provenance, and Shared SciFlo Workflows

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.

    2011-12-01

    Under several NASA grants, we are generating multi-sensor merged atmospheric datasets to enable the detection of instrument biases and studies of climate trends over decades of data. For example, under a NASA MEASURES grant we are producing a water vapor climatology from the A-Train instruments, stratified by the Cloudsat cloud classification for each geophysical scene. The generation and proper use of such multi-sensor climate data records (CDRs) requires a high level of openness, transparency, and traceability. To make the datasets self-documenting and provide access to full metadata and traceability, we have implemented a set of capabilities and services using known, interoperable protocols. These protocols include OpenSearch, OPeNDAP, Open Provenance Model, service & data casting technologies using Atom feeds, and REST-callable analysis workflows implemented as SciFlo (XML) documents. We advocate that our approach can serve as a blueprint for how to openly "document and serve" complex, multi-sensor CDRs with full traceability. The capabilities and services provided include: - Discovery of the collections by keyword search, exposed using the OpenSearch protocol; - Space/time query across the granules of the CDRs and all of the input datasets via OpenSearch; - User-level configuration of the production workflows so that scientists can select additional physical variables from the A-Train to add to the next iteration of the merged datasets; - Efficient data merging using on-the-fly OPeNDAP variable slicing & spatial subsetting of data out of input netCDF and HDF files (without moving the entire files); - Self-documenting CDRs published in a highly usable netCDF4 format with groups used to organize the variables, CF-style attributes for each variable, numeric array compression, & links to OPM provenance; - Recording of processing provenance and data lineage into a query-able provenance trail in Open Provenance Model (OPM) format, auto-captured by the workflow engine
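
    The on-the-fly OPeNDAP slicing described above can be illustrated with the open-source pydap client. The sketch below is a hedged example with a hypothetical server URL and variable name; exact return types vary by pydap version.

    ```python
    # Hedged sketch of server-side OPeNDAP subsetting with the pydap client.
    # The URL and variable name are hypothetical placeholders, not the
    # abstract's actual services.

    from pydap.client import open_url

    # Opening a dataset reads only its metadata, not the data arrays.
    dataset = open_url("http://example.org/opendap/airs_h2o.nc")  # hypothetical

    # Indexing a variable sends a constrained request, so only the selected
    # slice travels over the network; the full file is never downloaded.
    h2o = dataset["h2o_vapor_mmr"]          # hypothetical variable name
    subset = h2o[0, 100:200, 100:200]       # time 0, a 100x100 spatial tile
    print(subset)
    ```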

  9. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool that is deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. Use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Realtime generation of web pages to be able to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows

  10. Using SensorML to describe scientific workflows in distributed web service environments

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    Full Text Available Scientific workflows provide a technology that supports researchers by allowing them to capture, in a machine-processable manner, the method relating to some research. This increases both the provenance and the repeatability of the research and allows...

  11. Multi-objective approach for energy-aware workflow scheduling in cloud computing environments.

    Science.gov (United States)

    Yassa, Sonia; Chelouah, Rachid; Kadima, Hubert; Granado, Bertrand

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply voltage levels by sacrificing clock frequency; operating at multiple voltages thus involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
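
    As a rough illustration of the DVFS trade-off described above, the following Python sketch scores one task at different voltage/frequency levels with a weighted sum of time and a simple CMOS dynamic-energy model. The numbers and the aggregation are illustrative assumptions, not the paper's actual PSO formulation.

    ```python
    # Illustrative sketch: a task placed on a processor running at a lower DVFS
    # level consumes less power but takes longer. Numbers and the weighted-sum
    # aggregation are illustrative only.

    # (frequency in GHz, supply voltage in V) pairs for one processor
    DVFS_LEVELS = [(2.4, 1.20), (1.8, 1.00), (1.2, 0.85)]

    def execution_time(workload_gcycles: float, freq_ghz: float) -> float:
        return workload_gcycles / freq_ghz            # seconds

    def dynamic_energy(time_s: float, freq_ghz: float, volt: float) -> float:
        # Classic CMOS model: dynamic power ~ C * V^2 * f (capacitance folded in).
        return volt ** 2 * freq_ghz * time_s

    def fitness(workload: float, level: int, w_time: float = 0.5) -> float:
        freq, volt = DVFS_LEVELS[level]
        t = execution_time(workload, freq)
        e = dynamic_energy(t, freq, volt)
        # Weighted sum of the two objectives; a real multi-objective PSO would
        # instead keep a Pareto archive of non-dominated schedules.
        return w_time * t + (1.0 - w_time) * e

    for lvl in range(len(DVFS_LEVELS)):
        print(lvl, round(fitness(12.0, lvl), 3))
    ```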

  12. Use of workflow technology to assist remote caretakers in a smart kitchen environment designed for elderly people suffering from dementia

    OpenAIRE

    Sarni, T. (Tomi)

    2013-01-01

    The purpose of this study was to determine the feasibility of an information system that enables remote assistance between caretakers and elderly people suffering from dementia in a smart kitchen environment. Such a system could alleviate the stress experienced by caretakers by enabling care giving to be shared among any combination of informal and formal caretakers, and by increasing the mobility of caretakers. The second research problem was to evaluate the benefits and drawbacks of using workflow technology to...

  13. Open Calculus: A Free Online Learning Environment

    Science.gov (United States)

    Korey, Jane; Rheinlander, Kim; Wallace, Dorothy

    2007-01-01

    Dartmouth College mathematicians have developed a free online calculus course called "Open Calculus." Open Calculus is an exportable distance-learning/self-study environment for learning calculus including written text, nearly 4000 online homework problems and instructional videos. The paper recounts the evaluation of course elements since 2000 in…

  14. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

    In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger or more complicated workflow. They provide only a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model to support composite workflows by expanding the patterns of the existing context-aware workflows, which support the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow, which consists of various flow patterns, such as simple, split, and parallel flows, and subflows. With the suggested model, existing workflows can easily be reused to compose a new workflow. As a result, it can save the development effort and time of context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.
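
    As an illustration of the composite patterns named above (simple, split, parallel flows, and subflow), the following Python sketch models workflow steps as plain functions. The modeling is illustrative and is not the authors' formal model.

    ```python
    # Tiny sketch of composite workflow patterns using functions as steps.

    from concurrent.futures import ThreadPoolExecutor

    def simple(step, ctx):
        # Simple pattern: one step transforms the context.
        return step(ctx)

    def split(condition, branch_a, branch_b, ctx):
        # Split pattern: route the context down one of two branches.
        return branch_a(ctx) if condition(ctx) else branch_b(ctx)

    def parallel(steps, ctx):
        # Parallel pattern: run independent steps concurrently, merge results.
        with ThreadPoolExecutor() as pool:
            results = list(pool.map(lambda s: s(dict(ctx)), steps))
        merged = {}
        for r in results:
            merged.update(r)
        return merged

    def subflow(inner_workflow, ctx):
        # Subflow pattern: a whole existing workflow reused as a single step.
        return inner_workflow(ctx)

    # Reusing an existing workflow as part of a new composite one:
    existing = lambda ctx: {**ctx, "located": True}
    composite = lambda ctx: parallel([existing, lambda c: {**c, "logged": True}], ctx)
    print(composite({"user": "u1"}))
    ```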

  15. Development of an open source laboratory information management system for 2-D gel electrophoresis-based proteomics workflow

    Directory of Open Access Journals (Sweden)

    Toda Tosifusa

    2006-10-01

    Full Text Available Abstract Background In the post-genome era, most research scientists working in the field of proteomics are confronted with difficulties in management of large volumes of data, which they are required to keep in formats suitable for subsequent data mining. Therefore, a well-developed open source laboratory information management system (LIMS) should be available for their proteomics research studies. Results We developed an open source LIMS appropriately customized for 2-D gel electrophoresis-based proteomics workflow. The main features of its design are compactness, flexibility and connectivity to public databases. It supports the handling of data imported from mass spectrometry software and 2-D gel image analysis software. The LIMS is equipped with the same input interface for 2-D gel information as a clickable map on public 2DPAGE databases. The LIMS allows researchers to follow their own experimental procedures by reviewing the illustrations of 2-D gel maps and well layouts on the digestion plates and MS sample plates. Conclusion Our new open source LIMS is now available as a basic model for proteome informatics, and is accessible for further improvement. We hope that many research scientists working in the field of proteomics will evaluate our LIMS and suggest ways in which it can be improved.

  16. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Sonia Yassa

    2013-01-01

    Full Text Available We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem which requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different supply voltage levels by sacrificing clock frequency; operating at multiple voltages thus involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.

  17. The Integration of Personal Learning Environments & Open Network Learning Environments

    Science.gov (United States)

    Tu, Chih-Hsiung; Sujo-Montes, Laura; Yen, Cherng-Jyh; Chan, Junn-Yih; Blocher, Michael

    2012-01-01

    Learning management systems traditionally provide structures to guide online learners to achieve their learning goals. Web 2.0 technology empowers learners to create, share, and organize their personal learning environments in open network environments; and allows learners to engage in social networking and collaborating activities. Advanced…

  19. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and

  20. Parallel Programming Environment for OpenMP

    Directory of Open Access Journals (Sweden)

    Insung Park

    2001-01-01

    Full Text Available We present our effort to provide a comprehensive parallel programming environment for the OpenMP parallel directive language. This environment includes a parallel programming methodology for the OpenMP programming model and a set of tools (Ursa Minor and InterPol that support this methodology. Our toolset provides automated and interactive assistance to parallel programmers in time-consuming tasks of the proposed methodology. The features provided by our tools include performance and program structure visualization, interactive optimization, support for performance modeling, and performance advising for finding and correcting performance problems. The presented evaluation demonstrates that our environment offers significant support in general parallel tuning efforts and that the toolset facilitates many common tasks in OpenMP parallel programming in an efficient manner.

  1. Open access in the critical care environment.

    Science.gov (United States)

    South, Tabitha; Adair, Brigette

    2014-12-01

    Open access has become an important topic in critical care over the last 3 years. In the past, critical care had restricted access and set visitation guidelines to protect patients. This article provides a review of the literature related to open access in the critical care environment, including the impact on patients, families, and health care providers. The ultimate goal is to provide care centered on patients and families and to create a healing environment to ensure safe passage of patients through their hospital stays. This outcome could lead to increased patient/family satisfaction.

  2. Web Server Security on Open Source Environments

    Science.gov (United States)

    Gkoutzelis, Dimitrios X.; Sardis, Manolis S.

    Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Until now this kind of defense was a privilege of the few; out-budgeted, low-cost solutions left the defender vulnerable to the rise of innovative attacking methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state and address the best-known problems in data handling and consequently to propose the most appealing techniques for facing these challenges through an open solution.

  3. Optimizing Workflow Data Footprint

    Directory of Open Access Journals (Sweden)

    Gurmeet Singh

    2007-01-01

    Full Text Available In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage capacity. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce their overall data footprint. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
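
    The dynamic data cleanup idea can be sketched with a reference count per file: each file is deleted as soon as the last task that reads it has run. The toy task graph below is illustrative, not the LIGO or Montage workflows.

    ```python
    # Sketch of runtime data cleanup: count the not-yet-run consumers of each
    # file and remove the file when the count reaches zero.

    import os, tempfile
    from collections import Counter

    tasks = [  # (task name, inputs, outputs); assumed topologically ordered
        ("t1", [], ["a.dat"]),
        ("t2", ["a.dat"], ["b.dat"]),
        ("t3", ["a.dat", "b.dat"], ["c.dat"]),
    ]

    consumers = Counter(f for _, inputs, _ in tasks for f in inputs)
    workdir = tempfile.mkdtemp()

    for name, inputs, outputs in tasks:
        for f in outputs:                # stand-in for real task execution
            open(os.path.join(workdir, f), "w").close()
        for f in inputs:
            consumers[f] -= 1
            if consumers[f] == 0:        # no later task needs this file
                os.remove(os.path.join(workdir, f))
                print("cleaned", f)
    ```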

  4. Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing

    Science.gov (United States)

    Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.

    2008-12-01

    The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents, as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output, workflows, as well as publications and documents related to the same watershed study through the tagging capability. Users can tag all relevant materials using the same watershed name and find all of them easily later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data is derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL, as a component of a cyberenvironment portal (the NCSA Cybercollaboratory), has the goal of efficiently managing information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is realized through mechanisms of sharing, reuse and collaboration - empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL successfully implements web 2.0 technologies and design patterns along with a semantic content management approach that enables use of multiple ontologies and dynamic evolution (e.g. folksonomies) of terminology. Scientific documents involved with

  5. Combining Cloud-based Workflow Management System with SOA and CEP to Create Agility in Collaborative Environment

    Directory of Open Access Journals (Sweden)

    Marian STOICA

    2017-01-01

    Full Text Available In the current economy, technological solutions like cloud computing, service-oriented architecture (SOA) and complex event processing (CEP) are recognized as modern approaches used for increasing business agility and achieving innovation. The complexity of the collaborative business environment raises more and more the need for performant workflow management systems (WfMS) that meet current requirements. Each approach has advantages, but also faces challenges. In this paper we propose a solution for the integration of cloud computing with WfMS, SOA and CEP that allows these technologies to complement each other and build on their benefits to increase agility and reduce the challenges/problems. The paper presents a short introduction to the subject, followed by an analysis of the combination of cloud computing and WfMS and the benefits of a cloud based workflow management system. The paper ends with a solution for combining cloud WfMS with SOA and CEP in order to gain business agility and real time collaboration, followed by conclusions and research directions.

  6. Geo-processing workflow driven wildfire hot pixel detection under sensor web environment

    Science.gov (United States)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya

    2010-03-01

    Integrating Sensor Web Enablement (SWE) services with Geo-Processing Workflows (GPW) has become a bottleneck for Sensor Web-based applications, especially remote-sensing observations. This paper presents a common GPW framework for Sensor Web data services as part of the NASA Sensor Web project. This abstract framework includes abstract GPW model construction, GPW chains from service combination, and data retrieval components. The concrete framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top level GPW model, a model instantiation service is used to generate the concrete Business Process Execution Language (BPEL), and the BPEL execution engine is adopted. This framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A prototype, including a model designer, a model instantiation service, and the GPW engine (BPELPower), is presented. A scenario for an EO-1 Sensor Web data service for wildfire hot pixel detection is used to test the feasibility of the proposed framework. The execution time and influencing factors of the EO-1 live Hyperion data wildfire classification service framework are evaluated. The benefits and high performance of the proposed framework are discussed. The experiments with the EO-1 live Hyperion data wildfire classification service show that this framework can improve the quality of services for sensor data retrieval and processing.

  7. An Open Environment for Cooperative Equational Solving

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We describe a system called CFLP which aims at the integration of the best features of functional logic programming (FLP), cooperative constraint solving (CCS), and distributed computing. FLP provides support for defining one's own abstractions over a constraint domain in an easy and comfortable way, whereas CCS is employed to solve systems of mixed constraints by iterating specialized constraint solving methods in accordance with a well defined strategy. The system is a distributed implementation of a cooperative constraint functional logic programming scheme that combines higher-order lazy narrowing with cooperative constraint solving. The model takes advantage of the existence of several constraint solving resources located in a distributed environment (e.g., a network of computers), which communicate asynchronously via message passing. To increase the openness of the system, we are redesigning CFLP based on CORBA. We discuss some design and implementation issues of the system.

  8. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

    Full Text Available Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be too heavyweight for actual bioinformatics practice. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment, are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method, with its iterative development phases driven by trial and error. Here, we show the application of the scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. The GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that, in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate the modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows
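
    Pwrake workflows are written as Ruby rakefiles; as a rough Python analogue of the idea, the sketch below declares a file task with prerequisites and keeps tunable parameters separate from the task graph, mirroring the two development phases identified above. All names and behavior are illustrative.

    ```python
    # Rough Python analogue of a Rake-style file task: rebuild a target only
    # when it is missing or older than its prerequisites. Illustrative only.

    import os

    PARAMS = {"min_quality": 20}   # parameter-adjustment phase lives here

    def needs_rebuild(target, sources):
        if not os.path.exists(target):
            return True
        return any(os.path.getmtime(s) > os.path.getmtime(target) for s in sources)

    def file_task(target, sources, action):
        # Workflow-definition phase: declare target, prerequisites, action.
        if needs_rebuild(target, sources):
            action(target, sources, PARAMS)

    def filter_reads(target, sources, params):
        with open(sources[0]) as src, open(target, "w") as out:
            for line in src:
                out.write(line)    # real code would apply params["min_quality"]

    open("reads.fastq", "w").write("@r1\nACGT\n")   # toy input file
    file_task("filtered.fastq", ["reads.fastq"], filter_reads)
    print(os.path.exists("filtered.fastq"))
    ```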

  9. US and Dutch nurse experiences with fall prevention technology within nursing home environment and workflow: A qualitative study.

    Science.gov (United States)

    Vandenberg, Ann E; van Beijnum, Bert-Jan; Overdevest, Vera G P; Capezuti, Elizabeth; Johnson, Theodore M

    Falls remain a major geriatric problem, and the search for new solutions continues. We investigated how existing fall prevention technology was experienced within nursing home nurses' environment and workflow. Our NIH-funded study in an American nursing home was followed by a cultural learning exchange with a Dutch nursing home. We constructed two case reports from interview and observational data and compared the magnitude of falls, safety cultures, and technology characteristics and effectiveness. Falls were a high-magnitude problem at the US site, with a collectively vigilant safety culture attending to non-directional audible alarms; falls were a low-magnitude problem at the NL site, which employed customizable infrared sensors that directed text alerts to assigned staff members' mobile devices in a patient-centered care culture. Across cases, 1) a coordinated communication system was essential in facilitating effective fall prevention alert response, and 2) nursing home safety culture is tightly associated with the chosen technological system. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Open Book Testing in Online Learning Environments

    Science.gov (United States)

    Rakes, Glenda C.

    2008-01-01

    One continuing concern associated with online courses is assessment of student performance. One option for online assessment is the use of open book tests. This study investigated the impact of training in open book test-taking strategies on student test performance in online, timed, unproctored, open book tests. When the tutorial was required…

  11. Security Technologies for Open Networking Environments (STONE)

    Energy Technology Data Exchange (ETDEWEB)

    Muftic, Sead

    2005-03-31

    Under this project SETECS performed research, created the design, and built the initial prototype of three groups of security technologies: (a) a middleware security platform, (b) Web services security, and (c) a group security system. The results of the project indicate that the three types of security technologies can be used either individually or in combination, which enables effective and rapid deployment of a number of secure applications in open networking environments. The middleware security platform is a set of object-oriented security components providing various functions to handle basic cryptography, X.509 certificates, S/MIME and PKCS No. 7 encapsulation formats, secure communication protocols, and smart cards. The platform has been designed in the form of security engines, including a Registration Engine, a Certification Engine, an Authorization Engine, and a Secure Group Applications Engine. By creating a middleware security platform consisting of multiple independent components, the following advantages have been achieved: object orientation, modularity, simplified development and testing, portability, and simplified extension. The middleware security platform has been fully designed, and a preliminary Java-based prototype has been created for the Microsoft Windows operating system. The Web services security system designed in the project consists of technologies and applications that provide authentication (i.e., single sign-on), authorization, and federation of identities in an open networking environment. The system is based on the OASIS SAML and XACML standards for secure Web services. Its topology comprises three major components: the Domain Security Server (DSS), which is the main building block of the system; the Secure Application Server (SAS); and the Secure Client. In addition to the SAML and XACML engines, the authorization system consists of two sets of components: an Authorization Administration System and an Authorization Enforcement System. Federation of identities in multi

  12. KNIME-CDK: Workflow-driven cheminformatics.

    Science.gov (United States)

    Beisken, Stephan; Meinl, Thorsten; Wiswedel, Bernd; de Figueiredo, Luis F; Berthold, Michael; Steinbeck, Christoph

    2013-08-22

    Cheminformaticians have to routinely process and analyse libraries of small molecules. Among other things, that includes the standardization of molecules, calculation of various descriptors, visualisation of molecular structures, and downstream analysis. For this purpose, scientific workflow platforms such as the Konstanz Information Miner can be used if provided with the right plug-in. A workflow-based cheminformatics tool provides the advantage of ease-of-use and interoperability between complementary cheminformatics packages within the same framework, hence facilitating the analysis process. KNIME-CDK comprises functions for molecule conversion to/from common formats, generation of signatures, fingerprints, and molecular properties. It is based on the Chemistry Development Kit and uses the Chemical Markup Language for persistence. A comparison with the cheminformatics plug-in RDKit shows that KNIME-CDK supports a similar range of chemical classes and adds new functionality to the framework. We describe the design and integration of the plug-in, and demonstrate the usage of the nodes on ChEBI, a library of small molecules of biological interest. KNIME-CDK is an open-source plug-in for the Konstanz Information Miner, a free workflow platform. KNIME-CDK is built on top of the open-source Chemistry Development Kit and allows for efficient cross-vendor structural cheminformatics. Its ease-of-use and modularity enables researchers to automate routine tasks and data analysis, bringing complementary cheminformatics functionality to the workflow environment.

  13. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, as used by the many text processing applications that have become available over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or Word Perfect, it is not a problem to save an object of this type in ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as a migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace humans with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow is now comprised not only of the migration tool itself, but of a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system

  14. Negotiation and Monitoring in Open Environments

    NARCIS (Netherlands)

    Clark, K.P.

    2014-01-01

    Large scale, distributed, digital environments offer vast potential. Within these environments, software systems will provide unprecedented support for daily life. Offering access to vast amounts of knowledge and resources, these systems will enable wider participation of society, at large. An examp

  15. Multimedia Services in Open Distributed Telecommunications Environments

    NARCIS (Netherlands)

    Leydekkers, Peter

    1997-01-01

    In the majority of European countries a twofold change is taking place in the telecommunications marketplace. Firstly, the traditional monopolistic state owned telecommunications provider is being privatised and secondly, the market in which this newly privatised provider operates is being opened to

  16. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment, which is based on a scripting approach, and it extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control flow patterns.
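
    The core idea of recovering a workflow from a script can be illustrated in Python (the tool itself analyzes Ruby): treat each assignment as a task node and each variable read as a dependency edge. The snippet below is a simplified sketch, not the GridSpace implementation.

    ```python
    # Sketch: build a workflow DAG from a script by tracing variable
    # definitions and uses with Python's ast module.

    import ast

    code = """
    raw = load("input")
    clean = preprocess(raw)
    stats = analyze(clean)
    report = render(stats, raw)
    """

    last_writer = {}   # variable name -> node that produced it
    edges = []         # (producer, consumer) pairs, i.e. the workflow DAG

    for node in ast.parse(code).body:
        if isinstance(node, ast.Assign):
            target = node.targets[0].id
            for ref in ast.walk(node.value):
                if isinstance(ref, ast.Name) and ref.id in last_writer:
                    edges.append((last_writer[ref.id], target))
            last_writer[target] = target

    # [('raw','clean'), ('clean','stats'), ('stats','report'), ('raw','report')]
    print(edges)
    ```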

  17. Bandwidth-Aware Scheduling of Workflow Application on Multiple Grid Sites

    Directory of Open Access Journals (Sweden)

    Harshadkumar B. Prajapati

    2014-01-01

    Full Text Available Bandwidth-aware workflow scheduling is required to improve the performance of a workflow application in a multisite Grid environment, as the data movement cost between two low-bandwidth sites can adversely affect the makespan of the application. Pegasus WMS, an open-source and freely available WMS, cannot fully utilize its workflow mapping capability because no bandwidth monitoring infrastructure is integrated into it. This paper develops an integration of the Network Weather Service (NWS) into Pegasus WMS to enable bandwidth-aware mapping of scientific workflows. Our work demonstrates the applicability of the NWS integration by making the existing Heft site-selector of Pegasus WMS bandwidth aware. Furthermore, this paper proposes and implements a new workflow scheduling algorithm, Level based Highest Input and Processing Weight First. The results of the performed experiments indicate that the bandwidth-aware workflow scheduling algorithms perform better than the bandwidth-unaware algorithms, Random and Heft, of Pegasus WMS. Moreover, our proposed workflow scheduling algorithm performs better than the bandwidth-aware Heft algorithms. Thus, the proposed bandwidth-aware workflow scheduling enhances the capability of Pegasus WMS and can increase the performance of workflow applications.
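
    The effect of bandwidth-aware site selection can be conveyed with a minimal sketch: estimate each candidate site's finish time as compute time plus input-transfer time over the measured link, then pick the minimum. This is illustrative only; the paper's algorithm and NWS integration are considerably more elaborate.

    ```python
    # Minimal sketch of bandwidth-aware site selection. All numbers are toy values.

    sites = {
        # site: (compute capacity in GFLOPS, bandwidth from data source in MB/s)
        "siteA": (100.0, 5.0),
        "siteB": (60.0, 40.0),
    }

    def finish_time(work_gflop, input_mb, gflops, mbps):
        # Compute time plus time to move the task's input over the link.
        return work_gflop / gflops + input_mb / mbps

    def pick_site(work_gflop, input_mb):
        return min(sites, key=lambda s: finish_time(work_gflop, input_mb, *sites[s]))

    # A data-heavy task lands on the slower but better-connected site:
    print(pick_site(work_gflop=500.0, input_mb=2000.0))   # -> siteB
    ```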

  18. Open source engineering and sustainability tools for the built environment

    NARCIS (Netherlands)

    Coenders, J.L.

    2013-01-01

    This paper presents two novel open source software developments for design and engineering in the built environment. The first development, called “sustainability-open” [1], aims at providing open source design, analysis and assessment software source code for (environmental) performance of building

  20. Institutional and pedagogical criteria for productive open source learning environments

    DEFF Research Database (Denmark)

    Svendsen, Brian Møller; Ryberg, Thomas; Semey, Ian Peter;

    2004-01-01

    In this article we present some institutional and pedagogical criteria for making an informed decision in relation to identifying and choosing a productive open source learning environment. We argue that three concepts (implementation, maintainability and further development) are important when...... considering the sustainability and cost efficiency of an open source system, and we outline a set of key points for evaluating an open source software in terms of cost of system adoption. Furthermore we identify a range of pedagogical concepts and criteria to emphasize the importance of considering...... the relation between the local pedagogical practice and the pedagogical design of the open source learning environment. This we illustrate through an analysis of an open source system and our own pedagogical practice at Aalborg University, Denmark (POPP)....

  2. Service-oriented workflow to efficiently and automatically fulfill products in a highly individualized web and mobile environment

    Science.gov (United States)

    Qiao, Mu

    2015-03-01

    Service Oriented Architecture (SOA) is widely used in building flexible and scalable web sites and services. In most of the web and mobile photo book and gifting business space, the products ordered are highly variable, without a standard template into which one can substitute text or images, as in commercial variable data printing. In this paper, the author describes an SOA workflow in a multi-site, multi-product-line fulfillment system in which three major challenges are addressed: utilization of hardware and equipment, high automation with fault recovery, and high scalability and flexibility under order volume fluctuation.

  3. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    Science.gov (United States)

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing tools used in microelectronics that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinational circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.
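
    As a hedged illustration of treating a gene regulatory element like a logic gate, the following Python sketch models a repressor-controlled promoter as a NOT gate using a Hill function integrated with forward Euler. Parameters are arbitrary, and the sketch does not reflect GeNeDA's actual Verilog/SPICE-based flow.

    ```python
    # A repressor input suppresses protein production, so the steady-state
    # output inverts the input level: a genetic NOT gate. Toy parameters.

    def simulate_inverter(input_level, steps=2000, dt=0.01):
        beta, K, n, gamma = 1.0, 0.5, 2.0, 1.0  # synthesis, threshold, Hill, decay
        out = 0.0
        for _ in range(steps):
            production = beta / (1.0 + (input_level / K) ** n)  # repression
            out += dt * (production - gamma * out)              # forward Euler
        return out

    # High input represses output; low input lets it rise: logical NOT.
    print(round(simulate_inverter(0.0), 2))   # ~1.0  (logic 1)
    print(round(simulate_inverter(2.0), 2))   # ~0.06 (logic 0)
    ```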

  4. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    Science.gov (United States)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, being the actual CAVE environment and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing for high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. The traditional methods of controlling navigation through virtual environments include glove, HUD, and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical input control device can be eliminated. Wireless devices can therefore be added, including PDAs, smartphones, Tablet PCs, portable gaming consoles, and Pocket PCs.
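
    A simplified view of HDR radiance recovery from bracketed exposures can be sketched as follows: divide each exposure by its exposure time and blend with a hat-shaped weight that distrusts under- and over-exposed pixels. This toy NumPy example stands in for, and is not, the CAVEPIPE tools.

    ```python
    # Toy HDR merge: weight mid-tone pixels highest, then average the
    # per-exposure radiance estimates (pixel value / exposure time).

    import numpy as np

    def merge_hdr(exposures, times):
        # exposures: list of float arrays in [0, 1]; times: exposure times (s)
        num = np.zeros_like(exposures[0])
        den = np.zeros_like(exposures[0])
        for img, t in zip(exposures, times):
            w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight, peak at mid-gray
            num += w * (img / t)                # estimated radiance per image
            den += w
        return num / np.maximum(den, 1e-6)

    short = np.array([[0.02, 0.30]])   # dark: trust the long exposure here
    long_ = np.array([[0.25, 0.98]])   # bright pixel saturates when exposed long
    print(merge_hdr([short, long_], [0.01, 0.10]))
    ```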

  5. Process Planning and Scheduling Integration in an Open Manufacturing Environment

    Institute of Scientific and Technical Information of China (English)

    LOU Ping; LIU Quan; ZHOU Zu-de; QUAN Shu-hai; FANG Ban-hang

    2009-01-01

    New open manufacturing environments have been proposed with the aim of realizing more flexible distributed manufacturing paradigms, which can deal not only with dynamic changes in the volume and variety of products, but also with changes of machining equipment, dispersal of processing locations, and unscheduled disruptions. This research develops an integrated process planning and scheduling system suited to this open, dynamic, distributed manufacturing environment. Multi-agent system (MAS) approaches are used to integrate manufacturing process planning and scheduling in an open distributed manufacturing environment, in which process planning can be adjusted dynamically and manufacturing resources can increase or decrease according to requirements. A multi-level dynamic negotiation approach to process planning and scheduling is presented for the integration of manufacturing process planning and scheduling.

  6. Emergency doctors' strategies to manage competing workload demands in an interruptive environment: An observational workflow time study.

    Science.gov (United States)

    Walter, Scott R; Raban, Magdalena Z; Dunsmuir, William T M; Douglas, Heather E; Westbrook, Johanna I

    2017-01-01

    An observational workflow time study was conducted involving doctors in the emergency department (ED) of a large Australian hospital. During 121.7 h across 58 sessions, we observed interruptive events, conceptualised as prompts, and doctors' strategies to handle those prompts (task-switching, multitasking, acknowledgement, deferral and deflection) to assess the role of multiple work system factors influencing doctors' work in the ED. Prompt rates varied vastly between work scenarios, being highest during non-verbal solo tasks. The propensity to use certain strategies also differed with task type, prompt type and location within the department, although task-switching was by far the most frequent. Communicative prompts were important in patient treatment and workload management. Clinicians appear to adjust their communication strategies in response to contextual factors in order to deliver patient care. Risk due to the interruptive nature of ED communication is potentially outweighed by the positive effects of timely information transfer and advice provision.

  7. Validating Procedural Knowledge in the Open Virtual Collaboration Environment

    OpenAIRE

    Wickler, Gerhard

    2013-01-01

    This paper describes the OpenVCE system, which is an open-source environment that integrates Web 2.0 technology and a 3D virtual world space to support collaborative work, specifically in large-scale emergency response scenarios, where the system has been evaluated. The support is achieved through procedural knowledge that is available to the system. OpenVCE supports the distributed knowledge engineering of procedural knowledge in a semi-formal framework based on a wiki. For the formal aspect...

  8. Canopy openness, understory light environments, and oak regeneration

    Science.gov (United States)

    Brian C. McCarthy; Scott A. Robison

    2003-01-01

    Understory light environments were evaluated in four mixed-oak forests in southern Ohio using hemispherical photography. Within each forest, plots were divided into nine treatment combinations based on three pretreatment fire categories and three Integrated Moisture Index (IMI) categories. For each of 108 photographs we determined the percentage of open sky, direct...

  9. Open 3D Environments for Competitive and Collaborative Educational Games

    NARCIS (Netherlands)

    Klemke, Roland; Kravcik, Milos

    2012-01-01

    Klemke, R., & Kravčík, M. (2012, 18 September). Open 3D Environments for Competitive and Collaborative Educational Games. Presentation at S. Bocconi, R. Klamma, & Y. Bachvarova, Proceedings of the 1st International Workshop on Pedagogically-driven Serious Games (PDSG 2012). In conjunction with the

  10. Open 3D Environments for Competitive and Collaborative Educational Games

    NARCIS (Netherlands)

    Klemke, Roland; Kravcik, Milos

    2012-01-01

    Klemke, R., & Kravčík, M. (2012). Open 3D Environments for Competitive and Collaborative Educational Games. In S. Bocconi, R. Klamma, & Y. Bachvarova (Eds.), Proceedings of the 1st International Workshop on Pedagogically-driven Serious Games (PDSG 2012). In conjunction with the Seventh European

  11. Model of Trust Management in Open Network Environment

    Institute of Scientific and Technical Information of China (English)

    曹元大; 宁宇鹏

    2003-01-01

    To keep open networks effective and secure, a sound trust model and trust management method must be developed. This paper explains why traditional trust models are incomplete in their ability to manage trust, and provides a general model based on a hybrid trust model and an introducer protocol. The hybrid model is more flexible and effective for managing trust than the hierarchy model and the Web model. The introducer protocol is a better solution for building, maintaining, and refreshing trust relationships in an open network environment.

  12. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and is therefore crucial for accelerating the automation of business processes. Workflows can be reused by measuring the similarity among candidate workflows and selecting the one that satisfies user requirements. However, because DOWs are often developed in open, distributed, and heterogeneous environments, different users can impose diverse cost constraints on them. This makes the reuse of DOWs challenging, and there is no clear solution for retrieving DOWs under cost constraints. In this paper, we present a novel graph-based model of DOWs with cost constraints, called the constrained data oriented workflow (CDW), which can express the cost constraints that users are typically concerned about. An approach is proposed for retrieving CDWs which seamlessly combines their semantic and structural information: a distance measure based on matrix theory combines semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments show the effectiveness and efficiency of our approach.
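
    The paper's matrix-based distance is not reproduced here; as a rough illustration of combining a semantic and a structural score, the sketch below uses invented Jaccard overlaps and an invented 0.5/0.5 weighting (a cost-constrained retrieval would additionally filter candidates by the user's budget first):

      # Toy similarity for data-oriented workflows; Jaccard overlaps stand in
      # for the paper's matrix-theory distance, and all names are invented.
      def semantic_similarity(tasks_a, tasks_b):
          """Overlap of task labels (a stand-in for semantic matching)."""
          a, b = set(tasks_a), set(tasks_b)
          return len(a & b) / len(a | b) if a | b else 1.0

      def structural_similarity(edges_a, edges_b):
          """Overlap of data-flow edges (a stand-in for structural matching)."""
          a, b = set(edges_a), set(edges_b)
          return len(a & b) / len(a | b) if a | b else 1.0

      def combined_similarity(wf_a, wf_b, w_sem=0.5, w_struct=0.5):
          return (w_sem * semantic_similarity(wf_a["tasks"], wf_b["tasks"])
                  + w_struct * structural_similarity(wf_a["edges"], wf_b["edges"]))

      wf1 = {"tasks": ["load", "clean", "aggregate", "report"],
             "edges": [("load", "clean"), ("clean", "aggregate"),
                       ("aggregate", "report")]}
      wf2 = {"tasks": ["load", "clean", "report"],
             "edges": [("load", "clean"), ("clean", "report")]}

      print(combined_similarity(wf1, wf2))  # 0.5 for these toy workflows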

  13. Constructing workflows from script applications

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.; Malawski, M.

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment,

  14. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  15. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customizing Bugzilla screens for use in a library environment; readers will be able to easily replicate the project in their own environments.

  16. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    Science.gov (United States)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative work, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network
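
    The "landscape as connected spatial units" idea at the heart of the platform can be pictured with a toy flux-routing step; the units, connectivity and attenuation factor below are invented, and OpenFLUID itself is C++, not Python:

      # Toy picture of flux propagation along a landscape connectivity graph.
      # Unit names and numbers are invented; this is not the OpenFLUID API.
      connectivity = {"plot1": "ditch", "plot2": "ditch",
                      "ditch": "reach", "reach": None}
      runoff = {"plot1": 3.0, "plot2": 1.5, "ditch": 0.0, "reach": 0.0}  # m3/step

      def route(runoff, connectivity, attenuation=0.9):
          """Push each unit's water to its downstream unit for one time step."""
          incoming = dict.fromkeys(runoff, 0.0)
          for unit, downstream in connectivity.items():
              if downstream is not None:
                  incoming[downstream] += attenuation * runoff[unit]
          return incoming

      print(route(runoff, connectivity))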

  17. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Workflow Management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is that they address the growing needs of organizations. The external and the internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  18. An OpenMP Programming Environment on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Tyng-Yeu Liang

    2016-01-01

    Recently, the computational speed and battery capacity of mobile devices have improved greatly. With an enormous number of apps, users can do many things on mobile devices as well as on computers. Consequently, more and more scientific researchers are encouraged to move their working environment from computers to mobile devices to increase their work efficiency, because they can analyze data and make decisions on their mobile devices anytime and anywhere. Accordingly, we propose a mobile OpenMP programming environment called MOMP in this paper. Using this app, users can directly write, compile, and execute OpenMP programs on their Android-based mobile devices to exploit the embedded CPU and GPU for resolving their problems without a network connection. Because of source compatibility, MOMP lets users easily port their OpenMP programs from computers to mobile devices without any modification. Moreover, MOMP provides users with an easy interface to choose the CPU or GPU for executing different parallel regions in the same program, based on the properties of those regions. Therefore, MOMP can effectively reduce the programming complexity of heterogeneous computing on mobile devices and exploit their computational power for the performance of user applications.

  19. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  20. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access now make it possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
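
    A minimal sketch of the catalog-search step described in these two records, assuming the OWSLib Python package (pip install owslib); the CSW endpoint URL and the search phrase are placeholders:

      # Catalog-driven search via OGC CSW, assuming OWSLib; the endpoint and
      # the search phrase are placeholders, not IOOS's actual catalog.
      from owslib.csw import CatalogueServiceWeb
      from owslib import fes

      csw = CatalogueServiceWeb("https://example.org/csw")

      # Filter Encoding query: free-text match on the variable of interest.
      query = fes.PropertyIsLike(propertyname="csw:AnyText",
                                 literal="*sea_water_temperature*",
                                 wildCard="*", singleChar="?")
      csw.getrecords2(constraints=[query], maxrecords=10)

      # Each metadata record lists service endpoints (e.g. SOS, OPeNDAP)
      # that later workflow steps would access.
      for rec in csw.records.values():
          print(rec.title)
          for ref in rec.references:
              print("  ", ref.get("scheme"), ref.get("url"))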

  1. Building integrated business environments: analysing open-source ESB

    Science.gov (United States)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, the enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy applications. However, the potential of these technologies remains largely untapped, and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  2. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dabo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  3. The Open Microscopy Environment: open image informatics for the biological sciences

    Science.gov (United States)

    Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.

    2016-07-01

    Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
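
    Programmatic access of the kind described here can be sketched with the omero-py bindings; the host, credentials and the assumption of a reachable OMERO server are placeholders:

      # Hypothetical connection to an OMERO server via omero-py
      # (pip install omero-py); host and credentials are placeholders.
      from omero.gateway import BlitzGateway

      conn = BlitzGateway("user", "password", host="omero.example.org", port=4064)
      conn.connect()

      # Walk the containers and list images regardless of original file format.
      for project in conn.getObjects("Project"):
          print(project.getName())
          for dataset in project.listChildren():
              for image in dataset.listChildren():
                  print("  ", image.getName(), image.getSizeX(), image.getSizeY())

      conn.close()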

  4. A framework for operations in the competitive open access environment

    Energy Technology Data Exchange (ETDEWEB)

    Ilic, M.D.; Graves, F.C.; Fink, L.H.; DiCaprio, A.M.

    1996-04-01

    A pragmatic framework is available for maintaining reliable system operations in the context of an unbundled open access environment, while fostering a competitive supply/demand market. The proposed framework shows how incentives will meld a decentralized, competitive profit-driven market and a centrally directed services market together into a reliable free market. The electric power industry is in the midst of developing responses to the Federal Energy Regulatory Commission's mandate to provide open access transmission service. The benefits and consequences of alternative financial, regulatory, and operational designs for unbundling and restructuring are being actively debated within the industry. Each alternative must meet three goals: (1) provide open access to transmission facilities; (2) create competition among generation resources; and (3) preserve system reliability. The differences among the alternatives are in the details of how to meet these goals. It is a question of where and how to draw the lines between the infrastructure and the ancillary services required to support that infrastructure. Of particular note is the debate between proponents of a Poolco and the proponents of Bilateral scheduling transactions. Both schemes incorporate the use of a third-party Independent System Operator (ISO). The ISO is required to maintain system security and reliability in a non-discriminatory fashion. Much of this debate has taken place on a philosophical level, more concerned with competitive paradigms and analogies to other industries than with an informed engineering discussion. The authors try to put some practical concerns into this discussion.

  5. The use of serious gaming for open learning environments

    Directory of Open Access Journals (Sweden)

    Janet Lunn

    2016-03-01

    The extensive growth of Open Learning has been facilitated through technological innovation and continuous examination of global Open Education development. With compulsory computing subjects incorporated into the UK school system in September 2014, the challenge of harnessing and integrating technological advances to aid children's learning is becoming increasingly important, with £1.1 million invested in training programs to make teachers knowledgeable and experienced in computing. From the age of 5, children will be taught detailed computing knowledge and skills such as algorithms, how to store digital content, and how to write and test simple programs. Simultaneously, as the Internet and technology improve, parents and teachers are looking at the incorporation of game-based learning to aid children's learning processes in more exciting and engaging ways. The purpose of game-based learning is to provide better engagement and, in turn, an anticipated improvement in learning ability. This paper presents research based on the investigation of properly combining the advantages of serious games and Open Learning to enhance the learning abilities of primary school children. The case study and its evaluation address a learning environment in support of a history subject matter.

  6. Planning for distributed workflows: constraint-based coscheduling of computational jobs and data placement in distributed environments

    Science.gov (United States)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2015-05-01

    When running data intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/walltime ratio through excessive I/O wait. Building on our previous research, we propose a constraint programming based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storages and CPUs) are oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles occurring when a job is waiting for I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of the batch and data management systems of the STAR experiment. The results of the evaluation and estimates of performance improvements are discussed in this paper.
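
    As a toy illustration of the co-scheduling idea (not the paper's constraint programming model), the sketch below prefers the site already holding a job's input data and otherwise charges an explicit transfer time; every name and number is invented:

      # Toy placement rule: run where the data is, or pay for the transfer.
      sites = {
          "siteA": {"free_cpus": 4, "inbound_gbps": 1.0},
          "siteB": {"free_cpus": 1, "inbound_gbps": 10.0},
      }
      data_location = {"run_42.root": "siteA"}  # hypothetical dataset placement

      def place_job(job):
          home = data_location.get(job["input"])
          # (a) data already local and a CPU is free: no transfer needed.
          if home and sites[home]["free_cpus"] > 0:
              sites[home]["free_cpus"] -= 1
              return home, 0.0
          # (b) otherwise pick a free site with the least saturated link.
          best = max((s for s in sites if sites[s]["free_cpus"] > 0),
                     key=lambda s: sites[s]["inbound_gbps"])
          sites[best]["free_cpus"] -= 1
          transfer_time = job["input_gb"] * 8 / sites[best]["inbound_gbps"]
          return best, transfer_time

      print(place_job({"input": "run_42.root", "input_gb": 50}))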

  7. Scheduling Algorithm for Data-Intensive Workflows in Storage-Constrained Environments

    Institute of Scientific and Technical Information of China (English)

    汤小春; 郝婷

    2009-01-01

    Scientific workflows in distributed computing environments involve large volumes of data transfer and storage. Addressing this problem for executing nodes with limited storage resources, this paper decomposes the tasks of a workflow into separate data jobs and computing jobs and proposes a workflow model that includes both job types. The method for generating the new workflow inserts data-transfer jobs where data must move from one executing node to another, data-clear jobs where data are no longer needed, and the dependencies among them. A scheduling algorithm for data-intensive scientific workflows is proposed that estimates the storage available on each node during scheduling. Experimental results indicate that the algorithm reduces response time and raises efficiency.
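
    The decomposition can be sketched as follows; the tuple format and the single-dataset-per-task model are simplifications invented for illustration, not the paper's algorithm:

      # Expand compute tasks into compute, transfer and cleanup jobs.
      def expand(tasks):
          """tasks: list of (task, node, dataset) tuples in execution order."""
          jobs, last_node, last_use = [], {}, {}
          for task, node, dataset in tasks:
              # Insert a transfer job if the dataset lives on a different node.
              if dataset in last_node and last_node[dataset] != node:
                  jobs.append(("transfer", dataset, last_node[dataset], node))
              jobs.append(("compute", task, node, dataset))
              last_node[dataset] = node
              last_use[dataset] = len(jobs)  # position just after the last use
          # Insert a cleanup job right after each dataset's final use.
          for dataset, idx in sorted(last_use.items(),
                                     key=lambda kv: kv[1], reverse=True):
              jobs.insert(idx, ("clear", dataset))
          return jobs

      for job in expand([("t1", "n1", "d1"), ("t2", "n2", "d1"),
                         ("t3", "n2", "d2")]):
          print(job)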

  8. IMPROVING RESOURCE UTILIZATION USING QoS BASED LOAD BALANCING ALGORITHM FOR MULTIPLE WORKFLOWS IN IAAS CLOUD COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    L. Shakkeera

    2013-06-01

    Cloud computing is the extension of parallel computing, distributed computing and grid computing. It provides secure, quick, convenient data storage and net computing services through the internet. The services are available to users in a pay-per-use, on-demand model. The main aim of using cloud resources is to reduce cost and to increase performance in terms of request response time. Thus, optimizing resource usage through an efficient load balancing strategy is crucial. The main aim of this paper is to develop and implement an optimized load balancing algorithm in an IaaS virtual cloud environment that utilizes the virtual cloud resources efficiently. It minimizes the cost of applications by effectively using cloud resources and identifies the virtual cloud resources suitable for all the applications. The web application is created with many modules. These modules are considered as tasks, and these tasks are submitted to the load balancing server. The server, which hosts our load balancing policies, redirects the tasks to the corresponding virtual machines created by the KVM virtual machine manager as per the load balancing algorithm. If the size of the database inside a machine exceeds its limit, the load balancing algorithm uses the other virtual machines for further incoming requests. The load balancing strategy is evaluated for various QoS performance metrics like cost, average execution time, throughput, CPU usage, disk space, memory usage, network transmission and reception rate, resource utilization rate and scheduling success rate for a number of virtual machines, and it improves scalability among resources using load balancing techniques.
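
    In its simplest form, a load balancer of this kind dispatches each task to the currently least-loaded virtual machine; the sketch below shows that baseline idea with invented task costs (the paper's QoS-aware policy is more elaborate):

      # Toy least-loaded dispatch across virtual machines.
      import heapq

      vms = [(0.0, f"vm{i}") for i in range(3)]   # (current load, VM id)
      heapq.heapify(vms)

      def dispatch(task_cost):
          load, vm = heapq.heappop(vms)           # least-loaded VM first
          heapq.heappush(vms, (load + task_cost, vm))
          return vm

      for cost in [5.0, 2.0, 7.0, 1.0, 3.0]:      # invented task costs
          print(dispatch(cost))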

  9. Fault tolerant workflow scheduling based on replication and resubmission of tasks in Cloud Computing

    OpenAIRE

    Jayadivya S K; Jaya Nirmala S; Mary Saira Bhanu S

    2012-01-01

    The aim of a workflow scheduling system is to schedule workflows within the user-given deadline to achieve a good success rate. A workflow is a set of tasks processed in a predefined order based on its data and control dependencies. Scheduling these workflows in a computing environment, like a cloud environment, is an NP-complete problem, and it becomes more challenging when failures of tasks are considered. To overcome these failures, the workflow scheduling system should be fault tolerant. In thi...

  10. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  11. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

    Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high quality cloud workflow applications. To address such an issue, this book presents

  12. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly avai

  13. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call this interworkflow, and the total support technologies necessary for its realization, the interworkflow management mechanism. Interworkflow is expected to be a supporting mechanism for business-to-business electronic commerce. We have proposed this management mechanism and confirmed its realization with a prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. So we are advancing the project of implementing an interworkflow management system for practical use and its experimental proof.

  14. Benchmarking ETL Workflows

    Science.gov (United States)

    Simitsis, Alkis; Vassiliadis, Panos; Dayal, Umeshwar; Karagiannis, Anastasios; Tziovara, Vasiliki

    Extraction-Transform-Load (ETL) processes comprise complex data workflows, which are responsible for the maintenance of a Data Warehouse. A plethora of ETL tools is currently available constituting a multi-million dollar market. Each ETL tool uses its own technique for the design and implementation of an ETL workflow, making the task of assessing ETL tools extremely difficult. In this paper, we identify common characteristics of ETL workflows in an effort of proposing a unified evaluation method for ETL. We also identify the main points of interest in designing, implementing, and maintaining ETL workflows. Finally, we propose a principled organization of test suites based on the TPC-H schema for the problem of experimenting with ETL workflows.
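
    To make the Extract-Transform-Load structure concrete, here is a deliberately tiny ETL flow with invented records and cleansing rules; real ETL tools add scheduling, recovery and lineage on top of this skeleton:

      # Minimal ETL skeleton: extract rows, cleanse them, load the survivors.
      def extract():
          # Stand-in for reading from an operational source system.
          yield from [{"id": 1, "price": "10.5"}, {"id": 2, "price": "n/a"}]

      def transform(rows):
          for row in rows:
              try:
                  row["price"] = float(row["price"])   # cleanse/convert step
              except ValueError:
                  continue                             # route bad rows out
              yield row

      def load(rows, warehouse):
          warehouse.extend(rows)                       # stand-in for DW insert

      warehouse = []
      load(transform(extract()), warehouse)
      print(warehouse)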

  15. Open and Distance Education in Global Environment: Opportunities for Collaboration

    Directory of Open Access Journals (Sweden)

    S. K. PULIST

    2007-01-01

    Distance education in India underwent many stages and phases of evolution before reaching the stage of what is called open education, ICT-enabled education and global education. During these phases, it assimilated different aspects of ICT to wide acclaim and was able to go hand-in-hand with it, transcending national and regional boundaries. Distance education institutions have now started giving serious thought to exploring the possibility of cross-border expansion. The educational needs of present society are changing very fast. Education is now seen as an enabling tool for empowerment and the all-round development of individuals. It is difficult for any single institution to meet all the educational requirements of society. It is, therefore, time to collaborate rather than compete. Quality becomes a serious concern in such a situation. Consequently, globalization, internationalization, collaboration and networking have become the buzzwords of the day in distance education. In furtherance of this journey, Indira Gandhi National Open University, INDIA organized an international conference on the theme “Open and Distance Education in Global Environment: Opportunities for Collaboration” under the aegis of the International Council for Distance Education. The articles presented at the conference by renowned educationists have found their place in the volume under review. The volume is a repository of their experiences in the making of distance education over the years. It is spread over 32 chapters summed up into four major streams: internationalization, collaboration and networking; ICT-enabled education; quality assurance; and distance education for development. The canvas of the volume covers the present scenario of open and distance education from a global perspective. The first part discusses how collaboration can be harnessed to develop joint curriculum and deliver

  16. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows remain a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes:
    1. SHIWA Repository: a database where workflows and metadata about workflows can be stored. The database is a central repository for discovering and sharing workflows within and among communities.
    2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms.
    3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, but runs on the users' desktops/laptops instead of a portal server.
    4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already

  17. Scientific workflows for bibliometrics.

    Science.gov (United States)

    Guler, Arzu Tugce; Waaijer, Cathelijn J F; Palmblad, Magnus

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available through the myExperiment community and then used in other workflows. We here illustrate how scientific workflows and the Taverna workbench in particular can be used in bibliometrics. We discuss the specific capabilities of Taverna that makes this software a powerful tool in this field, such as automated data import via Web services, data extraction from XML by XPaths, and statistical analysis and visualization with R. The support of the latter is particularly relevant, as it allows integration of a number of recently developed R packages specifically for bibliometrics. Examples are used to illustrate the possibilities of Taverna in the fields of bibliometrics and scientometrics.
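
    The XML-extraction step described above can be imitated with the Python standard library; the XML snippet below stands in for a Web service response, and the count stands in for the statistics the workflow delegates to R:

      # XPath-style extraction of one field per record, then a simple count.
      import xml.etree.ElementTree as ET
      from collections import Counter

      xml = """<records>
        <record><year>2015</year></record>
        <record><year>2016</year></record>
        <record><year>2016</year></record>
      </records>"""

      root = ET.fromstring(xml)
      years = [e.text for e in root.findall("./record/year")]
      print(Counter(years))  # publications per year, as a toy bibliometric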

  18. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface, and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake simulation: Pegasus WMS was used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure to generate a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced, totaling about 165TB, out of which 11TB of data was saved
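
    The abstract-to-concrete planning step can be pictured with a toy sketch; the dictionaries below mimic the roles of Pegasus's replica, site and transformation catalogs but are invented stand-ins, not Pegasus APIs:

      # Toy abstract-to-concrete planning: bind a logical job to a site,
      # an executable and physical file locations. All entries are invented.
      replica_catalog = {"input.dat": "gsiftp://storage.example.org/input.dat"}
      site_catalog = {"cluster1": {"free_slots": 128}}
      transformation_catalog = {"align": {"cluster1": "/opt/bin/align"}}

      def plan(abstract_job):
          """Map an abstract job (transform + logical files) to a concrete one."""
          exe_sites = transformation_catalog[abstract_job["transform"]]
          site = next(s for s in exe_sites if site_catalog[s]["free_slots"] > 0)
          return {
              "site": site,
              "executable": exe_sites[site],
              "inputs": [replica_catalog[f] for f in abstract_job["inputs"]],
          }

      print(plan({"transform": "align", "inputs": ["input.dat"]}))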

  19. Open Search Environments: The Free Alternative to Commercial Search Services

    Directory of Open Access Journals (Sweden)

    Adrian O'Riordan

    2014-06-01

    Open search systems present a free and less restricted alternative to commercial search services. This paper explores the space of open search technology, looking in particular at the issue of interoperability. A description of current protocols and formats for engineering open search applications is presented. The suitability of these technologies and issues around their adoption and operation are discussed. This open search approach is proving an especially fitting choice in applications involving the harvesting of resources and information integration. Principal among the technological solutions are OpenSearch and SRU. OpenSearch and SRU implement a federated model to enable existing and new search engines and search clients to communicate. Applications and instances where OpenSearch and SRU can be combined are presented. Other relevant technologies such as OpenURL, Apache Solr, and OAI-PMH are also discussed. The deployment of these freely licensed open standards in digital library applications is now a genuine alternative to commercial or proprietary systems.

  20. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt fast to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  1. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  2. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis.The system can be accessed either through a

  3. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Background: Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results: In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions: Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  4. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    Science.gov (United States)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    Experts and decision-makers use or develop models to monitor global and local changes of the environment. Their activities require the combination of data and processing services in a flow of operations and spatial data computations: a geospatial scientific workflow. The seamless ability to generate, re-use and modify a geospatial scientific workflow is an important requirement, but the quality of the outcomes is equally important [1]. Metadata information attached to the data and processes, and particularly their quality, is essential to assess the reliability of the scientific model that a workflow represents [2]. Tools for managing the qualitative and quantitative metadata measures of quality associated with a workflow are therefore required by modellers. To ensure interoperability, ISO and OGC standards [3] are to be adopted, allowing for example one to define metadata profiles and to retrieve them via web service interfaces. However, these standards need a few extensions when looking at workflows, particularly in the context of geoprocess metadata. We propose to fill this gap (i) through the provision of a metadata profile for the quality of processes, and (ii) through a framework, based on XPDL [4], to manage the quality information. Web Processing Services are used to implement a range of metadata analyses on the workflow in order to evaluate and present quality information at different levels of the workflow. This generates the metadata quality, stored in the XPDL file. The focus is (a) on the visual representations of the quality, summarizing the retrieved quality information either from the standardized metadata profiles of the components or from non-standard quality information, e.g., Web 2.0 information, and (b) on the estimated qualities of the outputs derived from meta-propagation of uncertainties (a principle that we have introduced [5]). An a priori validation of the future decision-making supported by the
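
    One way to picture meta-propagation of quality through a workflow graph is a recursive score, as in the toy sketch below; the min/product rule and the node scores are invented stand-ins, not the profile defined in the paper:

      # Toy propagation: each output's quality is its process score times the
      # weakest input score. Nodes and values are invented.
      from functools import lru_cache

      workflow = {                      # node -> (own quality, input nodes)
          "dem": (0.9, []),
          "rainfall": (0.8, []),
          "runoff_model": (0.95, ["dem", "rainfall"]),
          "flood_map": (0.9, ["runoff_model"]),
      }

      @lru_cache(maxsize=None)
      def quality(node):
          own, inputs = workflow[node]
          return own * min((quality(i) for i in inputs), default=1.0)

      print(round(quality("flood_map"), 3))  # 0.684 for these toy scores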

  5. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Workflow management systems help to execute, monitor and manage work process flow and execution. As they execute, these systems keep a record of who does what and when (e.g., a log of events). The activity of using computer software to examine these records and derive various structural data results is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as other perspectives, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper particularly focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models by incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment event log.
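
    A minimal control-flow mining step builds a directly-follows graph from enactment logs, as sketched below with invented workcases; the paper's ICN-based fragment amalgamation is considerably richer:

      # Mine a directly-follows graph from per-case event logs.
      from collections import defaultdict

      logs = {  # case id -> ordered activities (invented enactment events)
          "wc1": ["register", "review", "approve", "archive"],
          "wc2": ["register", "review", "reject", "archive"],
      }

      follows = defaultdict(int)
      for trace in logs.values():
          for a, b in zip(trace, trace[1:]):
              follows[(a, b)] += 1

      for (a, b), n in sorted(follows.items()):
          print(f"{a} -> {b} (x{n})")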

  6. Integrating configuration workflows with project management system

    Science.gov (United States)

    Nilsen, Dimitri; Weber, Pavel

    2014-06-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  7. Enhancing User Support in Open Problem Solving Environments through Bayesian Network Inference Techniques

    Science.gov (United States)

    Tselios, Nikolaos; Stoica, Adrian; Maragoudakis, Manolis; Avouris, Nikolaos; Komis, Vassilis

    2006-01-01

    During the last years, development of open learning environments that support effectively their users has been a challenge for the research community of educational technologies. The open interactive nature of these environments results in users experiencing difficulties in coping with the plethora of available functions, especially during their…

  8. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
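
    A back-of-the-envelope version of such a blackbox comparison needs only the three characteristics named above; every rate, price and platform figure in the sketch is invented:

      # Estimate makespan and cost from only workflow length, width, data size.
      platforms = {
          "local_cluster": {"slots": 32,  "usd_per_cpu_h": 0.0,
                            "mbps": 1000, "queue_h": 4.0},
          "cloud":         {"slots": 256, "usd_per_cpu_h": 0.1,
                            "mbps": 250,  "queue_h": 0.0},
      }

      def estimate(length_h, width, data_gb, p):
          waves = -(-width // p["slots"])               # ceil: parallel batches
          runtime = p["queue_h"] + waves * length_h
          transfer = data_gb * 8000 / p["mbps"] / 3600  # GB -> hours of transfer
          cost = runtime * min(width, p["slots"]) * p["usd_per_cpu_h"]
          return runtime + transfer, cost

      for name, p in platforms.items():
          hours, dollars = estimate(length_h=2, width=100, data_gb=50, p=p)
          print(f"{name}: ~{hours:.1f} h, ~${dollars:.0f}")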

  9. An Efficient Workflow Environment to Support the Collaborative Development of Actionable Climate Information Using the NCAR Climate Risk Management Engine (CRMe)

    Science.gov (United States)

    Ammann, C. M.; Vigh, J. L.; Lee, J. A.

    2016-12-01

    Society's growing needs for robust and relevant climate information have fostered an explosion in tools and frameworks for processing climate projections. Many top-down workflows might be employed to generate sets of pre-computed data and plots, frequently served in a "loading-dock style" through a metadata-enabled search and discovery engine. Despite these increasing resources, the diverse needs of applications-driven projects often result in data processing workflow requirements that cannot be fully satisfied using past approaches. In parallel to the data processing challenges, the provision of climate information to users in a form that is also usable represents a formidable challenge of its own. Finally, many users have neither the time nor the desire to synthesize and distill massive volumes of climate information to find the relevant information for their particular application. All of these considerations call for new approaches to developing actionable climate information. CRMe seeks to bridge the gap between the diversity and richness of practitioners' bottom-up needs and the discrete, structured top-down workflows typically implemented for rapid delivery. Additionally, CRMe has implemented web-based data services capable of providing focused climate information in usable form for a given location, or as spatially aggregated information for entire regions or countries, following the needs of users and sectors. Making climate data actionable also involves summarizing and presenting it in concise and approachable ways. CRMe is developing the concept of dashboards, co-developed with the users, to condense the key information into a quick summary of the most relevant, curated climate data for a given discipline, application, or location, while still enabling users to efficiently conduct deeper discovery into rich datasets on an as-needed basis.

  10. Towards an actor-driven workflow management system for Grids

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum

    2010-01-01

    Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid wide scheduler

  11. Multimedia Courseware in an Open Systems Environment: A Federal Strategy.

    Science.gov (United States)

    Moline, Judi; And Others

    The Portable Courseware Project (PORTCO) of the U.S. Department of Defense (DoD) is typical of projects worldwide that require standard software interfaces. This paper articulates the strategy whereby the federal multimedia courseware initiative leverages the open systems movement and the new realities of information technology. The federal…

  12. Positioning Your Library in an Open-Access Environment

    Science.gov (United States)

    Bhatt, Anjana H.

    2010-01-01

    This paper is a summary of the project that the author completed at Florida Gulf Coast University (FGCU) library for providing online access to 80 open access E-journals and digital collections. Although FGCU uses SerialsSolutions products to establish online access, any one can provide access to these collections as they are free for all. Paper…

  16. Learning Features in an Open, Flexible, and Distributed Environment

    Science.gov (United States)

    Khan, Badrul

    2005-01-01

    The Internet, supported by various digital technologies, is well-suited for open, flexible and distributed e-learning. Designing and delivering instruction and training on the Internet requires thoughtful analysis and investigation, combined with an understanding of both the Internet's capabilities and resources and the ways in which instructional…

  17. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
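
    As a loose illustration of the kind of declarative provenance query such a repository enables, the sketch below runs Cypher against a Neo4j graph with the official neo4j Python driver. The node labels, relationship names and properties are assumptions made for the sketch, not PBase's actual ProvONE schema.

      # A minimal sketch: querying workflow provenance held in Neo4j.
      # Labels, relationships and properties are illustrative assumptions.
      from neo4j import GraphDatabase

      driver = GraphDatabase.driver("bolt://localhost:7687",
                                    auth=("neo4j", "secret"))
      query = """
      MATCH (d:Data {id: $id})<-[:USED]-(e:Execution)-[:GENERATED]->(out:Data)
      RETURN e.name AS step, out.id AS derived
      """
      with driver.session() as session:
          # Find every execution that consumed a given input and what it produced.
          for record in session.run(query, id="input-42"):
              print(record["step"], "->", record["derived"])
      driver.close()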

  18. Towards a Collaborative Open Environment of Project-Centred Learning

    DEFF Research Database (Denmark)

    Bongio, Aldo; van Bruggen, Jan; Ceri, Stefano

    …problems. Such an environment will become increasingly relevant in multinational universities and companies, and it has brought a number of challenges to existing e-learning technologies. COOPER is an ongoing project that focuses on developing and testing such a collaborative and project-centred learning environment. This paper proposes a COOPER framework and shows its approaches to address the various research challenges. This work is partially supported by EU/IST FP6 STREP project COOPER (contract number IST-2005-027073).

  19. A Reference Architecture for Workflow Management Systems

    NARCIS (Netherlands)

    Grefen, Paul; Remmerts de Vries, Remmert

    1998-01-01

    In the workflow management field, fast developments are taking place. A growing number of systems is currently under development, both in academic and commercial environments. Consequently, a wide variety of ad hoc architectures has come into existence. Reference models are necessary, however, to al

  20. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects in such systems, we follow a security-driven approach, and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, contents, participants and their roles in the exchange in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols, and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  2. Seamless online science workflow development and collaboration using IDL and the ENVI Services Engine

    Science.gov (United States)

    Harris, A. T.; Ramachandran, R.; Maskey, M.

    2013-12-01

    The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA-funded project, Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL workbench. Such additional features within the IDL workbench are possible because the IDL workbench is built using the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionalities. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). Using the collaborative IDL

  3. Analysis of Cisco Open Network Environment (ONE) OpenFlow Controller Implementation

    Science.gov (United States)

    2014-08-01

    …or match criteria to ports that are not connected) as well as the ability to install and uninstall a flow from switches without deleting it entirely… [Flattened feature-comparison table; recoverable rows: port recognition (not explicit / yes); install/uninstall flows (NA / yes); set dynamic flows (requires code manipulation / yes).] OpenFlow control channel setup: the initial handshake between the controller and switch was monitored using packet analysis tools capable of understanding the OpenFlow protocol.

  4. Network Business Environment for Open Innovation in SMEs

    OpenAIRE

    Ţoniş BuceaManea, Rocsana; Catană, Mădălin Gabriel; Tonoiu, Sergiu

    2014-01-01

    The SMEs represent an important factor of growth in both developed and developing countries, in which, however, they face various obstacles in the process of innovation. This paper analyses how open communication and collaboration can help SMEs in their struggle for sustainable innovation and profitable market competition. Based on a literature review, a number of obstacles that SMEs have to overcome in their current activity and possible support to be competitive are revea...

  5. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business process, and software. In recent years, workflows are increasingly used for orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to the high-level understanding of scientific workflows.
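
    To make the notion of a multi-dimensional classification concrete, here is a toy Python sketch; the dimension names and values are invented for illustration and are not the authors' published model.

      # A toy sketch of multi-dimensional workflow classification: each
      # workflow is described along independent dimensions (all assumed).
      from dataclasses import dataclass

      @dataclass
      class WorkflowProfile:
          domain: str          # e.g. "bioinformatics", "ocean modeling"
          data_volume: str     # "small", "medium", "large"
          coupling: str        # "loosely-coupled" or "tightly-coupled"
          resource_model: str  # "grid", "cloud", "cluster"

      profiles = [
          WorkflowProfile("bioinformatics", "large", "loosely-coupled", "cloud"),
          WorkflowProfile("ocean modeling", "large", "tightly-coupled", "grid"),
      ]

      # Group workflows sharing a dimension value to compare management needs.
      cloud_ready = [p for p in profiles if p.resource_model == "cloud"]
      print(len(cloud_ready), "workflow(s) suited to cloud resources")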

  6. Towards an Open Framework Leveraging a Trusted Execution Environment

    DEFF Research Database (Denmark)

    Gonzalez, Javier; Bonnet, Philippe

    2013-01-01

    …should have full control over. Likewise, companies face a trade-off as the benefits of innovative services must be weighed against the risk of exposing data that reveal core internal processes. How to design a data platform that enables innovative data services and yet enforces access and usage control? The solutions proposed in the literature to this trade-off all involve some form of trusted execution environment, where data and processing is trusted and safe from corruption by users or attackers. The hardware that could support such trusted execution environments is however closed…

  9. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continues to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment that accelerates the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context.

  10. Workflow Tools for Digital Curation

    Directory of Open Access Journals (Sweden)

    Andrew James Weidner

    2013-04-01

    Full Text Available Maintaining usable and sustainable digital collections requires a complex set of actions that address the many challenges at various stages of the digital object lifecycle. Digital curation activities enhance access and retrieval, maintain quality, add value, and facilitate use and re-use over time. Digital resource lifecycle management is becoming an increasingly important topic as digital curators actively explore software tools that perform metadata curation and file management tasks. Accordingly, the University of North Texas (UNT) Libraries develops tools and workflows that streamline production and quality assurance activities. This article demonstrates two open source software tools, AutoHotkey and Selenium IDE, which the UNT Digital Libraries Division has adopted for use during the pre-ingest and post-ingest stages of the digital resource lifecycle.
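
    The article demonstrates record-and-playback tests with Selenium IDE; as a hedged sketch of the same post-ingest quality-assurance idea, Selenium's Python bindings can script a check that an ingested record page renders its title. The URL and element locator below are hypothetical.

      # A minimal post-ingest QA sketch using the Selenium Python bindings
      # (an analogue of the Selenium IDE tests described in the article).
      from selenium import webdriver
      from selenium.webdriver.common.by import By

      driver = webdriver.Firefox()
      try:
          driver.get("https://digital.example.edu/record/12345")  # hypothetical
          title = driver.find_element(By.CSS_SELECTOR, "h1.record-title").text
          assert title.strip(), "record is missing its title"
          print("QA check passed:", title)
      finally:
          driver.quit()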

  11. Optimal power transaction matrix rescheduling under multilateral open access environment

    Energy Technology Data Exchange (ETDEWEB)

    Moghaddam, M.P.; Raoofat, M.; Haghifam, M.R. [Tarbiat Modarres University, Tehran (Iran). Department of Electrical Engineering

    2004-09-01

    This paper addresses a new concept for determining optimal transactions between different entities in a multilateral environment, while the benefits of both buyer and seller entities are taken into account with respect to the rules of the system. At the same time, constraints of the network are met, which leads to an optimal power flow problem. A modified power transaction matrix is proposed for modeling the environment. The optimization method in this paper is the continuation method, which is suited for complex situations in power system studies. This complexity becomes more serious when the dual interaction between the financial and electrical subsystems of a competitive power system is taken into account. The proposed approach is tested on a typical network with satisfactory results. (author)

  12. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  13. Workflow Automation with Lotus Notes for the Governmental Administrative Information System

    OpenAIRE

    Maskeliunas, Saulius

    1999-01-01

    The paper presents an introductory overview of the workflow automation area, outlining the main types, basic technologies, the essential features of workflow applications. Two sorts of process models for the definition of workflows (according to the conversation-based and activity-based methodologies) are sketched. Later on, the nature of Lotus Notes and its capabilities (as an environment for workflow management systems development) are indicated. Concluding, the experience of automating adm...

  14. The Richness of Open-ended Play - Rules, feedback and adaptation mechanisms in intelligent play environments

    Directory of Open Access Journals (Sweden)

    Pepijn Rijnbout

    2015-11-01

    Full Text Available How can we design intelligent play environments for open-ended play that support richness in play? Rich play can be described as ongoing play that changes over time in character, form and nature. This paper elaborates on our initial insights into how rules and goals develop from the interaction opportunities of the system, based on two pilot studies with an interactive play environment for open-ended play. Furthermore, we discuss the roles of feedback and adaptation mechanisms in the environment. These system properties change the interaction opportunities to match the current situation in the play environment and to support richness in play.

  15. Toward Project-based Learning and Team Formation in Open Learning Environments

    NARCIS (Netherlands)

    Spoelstra, Howard; Van Rosmalen, Peter; Sloep, Peter

    2014-01-01

    Open Learning Environments, MOOCs, as well as Social Learning Networks, embody a new approach to learning. Although both emphasise interactive participation, somewhat surprisingly, they do not readily support bond creating and motivating collaborative learning opportunities. Providing project-based

  17. Kronos: a workflow assembler for genome analytics and informatics.

    Science.gov (United States)

    Taghiyar, M Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C; Morin, Ryan D; Bashashati, Ali; Shah, Sohrab P

    2017-07-01

    The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into "best practices" for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos.
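
    As a rough sketch of the compile-a-configuration-into-a-pipeline idea (the format below is invented for illustration and is not Kronos's actual configuration syntax), a declarative task list can be turned into an executed sequence:

      # Toy config-to-pipeline compiler: tasks declared as data, then run
      # in order. A real run manager would also log events and parallelize.
      import shlex, subprocess

      config = """
      align:  echo aligning reads
      count:  echo counting features
      report: echo writing report
      """

      def compile_pipeline(text):
          # Turn each "name: command" line into a runnable step.
          steps = []
          for line in text.strip().splitlines():
              name, cmd = line.split(":", 1)
              steps.append((name.strip(), shlex.split(cmd.strip())))
          return steps

      for name, cmd in compile_pipeline(config):
          print("running step:", name)
          subprocess.run(cmd, check=True)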

  18. Team formation instruments to enhance learner interactions in open learning environments

    NARCIS (Netherlands)

    Spoelstra, Howard; Van Rosmalen, Peter; Houtmans, Tilly; Sloep, Peter

    2015-01-01

    Open learning environments, such as Massive Open Online Courses (MOOCs), often lack adequate learner collaboration possibilities; they are also plagued by high levels of drop-out. Introducing project-based learning (PBL) can enhance learner collaboration and motivation, but PBL does not easily scale

  19. Towards Adaptive Open Learning Environments: Evaluating the Precision of Identifying Learning Styles by Tracking Learners' Behaviours

    Science.gov (United States)

    Fasihuddin, Heba; Skinner, Geoff; Athauda, Rukshan

    2017-01-01

    Open learning represents a new form of online learning where courses are provided freely online for large numbers of learners. MOOCs are examples of this form of learning. The authors see an opportunity for personalising open learning environments by adapting to learners' learning styles and providing adaptive support to meet individual learner…

  1. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  2. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    This thesis presents CSP as a means of orchestrating the execution of tasks in a scientific workflow. Scientific workflow systems are popular in a wide range of scientific areas, where tasks are organised in directed graphs. Execution of such graphs is handled by the scientific workflow systems and can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency and the readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific… is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to scale well on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal…
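
    PyCSP's own API is not reproduced here; the following standard-library sketch only illustrates the CSP idea the thesis builds on: processes that share nothing and communicate over channels, modelled with threads and a bounded queue.

      # CSP-style sketch with the standard library (not PyCSP's actual API):
      # two processes communicate solely over a channel.
      import threading, queue

      def producer(chan):
          for i in range(5):
              chan.put(i)        # send on the channel
          chan.put(None)         # poison pill ends the network

      def consumer(chan):
          while (item := chan.get()) is not None:
              print("received", item)

      chan = queue.Queue(maxsize=1)  # bounded channel approximating CSP rendezvous
      threads = [threading.Thread(target=producer, args=(chan,)),
                 threading.Thread(target=consumer, args=(chan,))]
      for t in threads: t.start()
      for t in threads: t.join()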

  3. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) an open and flexible architecture for the workflow log analysis system; and 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
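
    A minimal sketch of the underlying idea, using sqlite3 in place of a full data warehouse (the table and column names are invented): load workflow events and aggregate a cycle-time measure per process instance, the kind of measure an OLAP cube would then slice by time or department.

      # Toy workflow-log analysis: events loaded into a relational store,
      # then aggregated per process instance (case).
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE log (case_id TEXT, task TEXT, ts REAL)")
      con.executemany("INSERT INTO log VALUES (?, ?, ?)", [
          ("c1", "start", 0.0), ("c1", "approve", 4.0), ("c1", "end", 9.0),
          ("c2", "start", 1.0), ("c2", "approve", 2.5), ("c2", "end", 3.0),
      ])

      # Cycle time per case: one example measure from the measure set.
      for case_id, cycle in con.execute(
              "SELECT case_id, MAX(ts) - MIN(ts) FROM log GROUP BY case_id"):
          print(case_id, "cycle time:", cycle)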

  4. An Approach to Design Reusable Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Developers still need to design workflow systems according to users' specific needs, even though the Workflow Management Coalition standardized five kinds of abstract interfaces in its workflow reference model. Specific business process characteristics are still supported by specific workflow systems. A set of common workflow engine functionalities is abstracted from business components, so the reusability of business components is extended into the workflow engine, and a composition method is proposed. The needs of different business requirements and characteristics are met by reusing the workflow engine.

  5. Contextual variables of open innovation paradigm in the business environment of Slovenian companies

    Directory of Open Access Journals (Sweden)

    Jana Krapež

    2012-01-01

    Full Text Available This article addresses the current condition of the Slovenian business environment and its support for open innovation. By carrying out qualitative empirical research, we investigate to what extent determinants from the internal, narrower and broader external business environment influence open innovation in Slovenian companies. Several support mechanisms were established to create a friendlier environment for open innovation. Our study indicates that if Slovenia wants to be successful in the long run, the supportive environment cannot and should not be based solely on government financial support, but must also contain other elements that affect technological development, namely: (1) organizational culture, values and reward systems; (2) legislation; (3) tax and social contributions; (4) bureaucratic barriers; (5) human resources; and (6) favorable bank loans, bank guarantees, venture capital, etc. The paper concludes with implications for managers and policy makers, outlining several promising areas for future research.

  6. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
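
    Because the exported services speak plain REST, invoking one needs nothing more than an HTTP client. In this hedged sketch the endpoint, payload field and response shape are hypothetical placeholders, since real URLs depend on where a workflow is deployed.

      # Invoking a text mining workflow exported as a REST web service.
      # The URL and JSON schema below are assumptions for the sketch.
      import requests

      resp = requests.post(
          "https://services.example.org/u-compare/ner-workflow",
          json={"text": "BRCA1 mutations are associated with breast cancer."},
          timeout=30,
      )
      resp.raise_for_status()
      print(resp.json())  # e.g. annotations returned by the workflow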

  7. Open access: changing global science publishing.

    Science.gov (United States)

    Gasparyan, Armen Yuri; Ayvazyan, Lilit; Kitas, George D

    2013-08-01

    The article reflects on open access as a strategy of changing the quality of science communication globally. Successful examples of open-access journals are presented to highlight implications of archiving in open digital repositories for the quality and citability of research output. Advantages and downsides of gold, green, and hybrid models of open access operating in diverse scientific environments are described. It is assumed that open access is a global trend which influences the workflow in scholarly journals, changing their quality, credibility, and indexability.

  8. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across the grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow composition, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.
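
    A hedged sketch of the proposed pattern: the engine passes only a small reference between services, and the consuming service streams the payload directly from the producer's store, avoiding the central relay. The URLs are hypothetical.

      # Pass-by-reference data exchange between workflow services.
      import requests

      # The engine only forwards this small reference along the workflow.
      data_ref = {"url": "https://storage.example.org/datasets/run42.nc"}

      # The consuming service dereferences it and streams the bytes directly.
      with requests.get(data_ref["url"], stream=True, timeout=60) as resp:
          resp.raise_for_status()
          for chunk in resp.iter_content(chunk_size=1 << 20):
              pass  # process each chunk without routing it through the engine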

  9. Aboveground Biomass of Glossy Buckthorn is Similar in Open and Understory Environments but Architectural Strategy Differs

    Directory of Open Access Journals (Sweden)

    Caroline Hamelin

    2015-04-01

    Full Text Available The exotic shrub glossy buckthorn (Frangula alnus) is a great concern among forest managers because it invades both open and shaded environments. To evaluate if buckthorn grows similarly across light environments, and if adopting different shapes contributes to an efficient use of light, we compared buckthorns growing in an open field and in the understory of a mature hybrid poplar plantation. For a given age, the relationships describing aboveground biomass of buckthorns in the open field and in the plantation were not significantly different. However, we observed a significant difference between the diameter-height relationships in the two environments. These results suggest a change in buckthorn’s architecture, depending on the light environment in which it grows. Buckthorn adopts either an arborescent shape under a tree canopy, or a shrubby shape in an open field, to optimally capture the light available. This architectural plasticity helps explain a similar invasion success for glossy buckthorn growing in both open and shaded environments, at least up to the canopy closure level of the plantation used for this study.

  10. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  11. Tailored business solutions by workflow technologies

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available VISP (Virtual Internet Service Provider) is an IST-STREP project conducting research in the field of these new technologies, targeted at telecom/ISP companies. One of the first tasks of the VISP project is to identify the most appropriate technologies in order to construct the VISP platform. This paper presents the most significant results in the field of choreography and orchestration, two key domains that must accompany process modeling in the construction of a workflow environment.

  12. Build and Execute Environment

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-21

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate
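
    As a minimal sketch of the container side of such a build-and-execute environment (assuming a local Docker installation; the image and command are placeholders), one workflow step can be launched in an isolated runtime from Python:

      # Launch a single workflow step inside Docker so its software stack
      # is isolated from the host operating system.
      import subprocess

      result = subprocess.run(
          ["docker", "run", "--rm", "python:3.11-slim",
           "python", "-c", "print('analysis step ran in an isolated container')"],
          capture_output=True, text=True, check=True,
      )
      print(result.stdout.strip())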

  13. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful

  15. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Full Text Available Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  16. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  17. Automating Workflow using Dialectical Argumentation

    NARCIS (Netherlands)

    Urovi, Visara; Bromuri, Stefano; McGinnis, Jarred; Stathis, Kostas; Omicini, Andrea

    2008-01-01

    This paper presents a multi-agent framework based on argumentative agent technology for the automation of the workflow selection and execution. In this framework, workflow selection is coordinated by agent interactions governed by the rules of a dialogue game whose purpose is to evaluate the workflo

  18. A workflow for digitalization projects

    OpenAIRE

    De Mulder, Tom

    2005-01-01

    More and more institutions want to convert their traditional content to digital formats. In such projects the digitalization and metadata stages often happen asynchronously. This paper identifies the importance of frequent cross-verification of both. We suggest a workflow to formalise this process, and a possible technical implementation to automate this workflow.

  19. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications.
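
    As a small, hedged illustration of the data-standard side of such a workflow, assuming the pySBOL2 library (import name sbol2), a genetic part can be described in SBOL and serialized for the next tool in the chain:

      # Describe a genetic part in SBOL and write it to a file that other
      # tools in the workflow can consume (sketch assumes pySBOL2).
      import sbol2

      sbol2.setHomespace('http://example.org/my_designs')
      doc = sbol2.Document()

      promoter = sbol2.ComponentDefinition('pExample', sbol2.BIOPAX_DNA)
      promoter.roles = sbol2.SO_PROMOTER   # annotate the part's role
      doc.addComponentDefinition(promoter)

      doc.write('design.xml')              # SBOL file exchangeable across tools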

  20. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in parametric acoustics workflows and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed and interoperability with Grasshopper 3d. The focus will be placed on the benchmarking of three different acoustic analysis tools based on ray tracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining at different design stages the most suitable acoustic tool. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used…

  1. An Investigation of an Open-Source Software Development Environment in a Software Engineering Graduate Course

    OpenAIRE

    Ge, Xun; Huang, Kun; Dong, Yifei

    2010-01-01

    A semester-long ethnography study was carried out to investigate project-based learning in a graduate software engineering course through the implementation of an Open-Source Software Development (OSSD) learning environment, which featured authentic projects, learning community, cognitive apprenticeship, and technology affordances. The study revealed that while the OSSD learning environment motivated students to engage in real-world projects, tensions were found between the students’ self-pro...

  2. A social survey on the noise impact in open-plan working environments in China.

    Science.gov (United States)

    Zhang, Mei; Kang, Jian; Jiao, Fenglei

    2012-11-01

    The aim of this study is to reveal noise impact in open-plan working environments in China, through a series of questionnaire surveys and acoustic measurements in typical open-plan working environments. It has been found that compared to other physical environmental factors in open-plan working environments, people are much less satisfied with the acoustic environment. The noise impact in the surveyed working environments is rather significant, in terms of sound level inside the office, understanding of colleagues' conversation, and the use of background music such as music players. About 30-50% of the interviewees think that various noise sources inside and outside offices are 'very disturbing' and 'disturbing', and the most annoying sounds include noises from outside, ventilation systems, office equipment, and keyboard typing. Using higher panels to separate work space, or working in enclosed offices, are regarded as effective improvement measures, whereas introducing natural sounds to mask unwanted sounds seems to be not preferable. There are significant correlations between the evaluation of acoustic environment and office symptoms, including hypersensitivity to loud sounds, easily getting tired and depression. There are also significant correlations between evaluation of various acoustics-related factors and certain statements relating to job satisfaction, including sensitivity to noise, as well as whether conversations could be heard by colleagues. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-10-06

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.
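
    The structure-extraction idea can be sketched with the standard library alone: parse a legacy script with the ast module (Python 3.9+ for ast.unparse) and list the calls it makes, including calls out to external tools. This is only a toy version of what WISE does, not its actual implementation.

      # Recover a crude call skeleton from a legacy Python workflow script.
      import ast

      legacy_script = """
      import subprocess
      data = load_sequences('input.fasta')
      subprocess.run(['blastp', '-query', 'input.fasta'])
      report(data)
      """

      tree = ast.parse(legacy_script)
      for node in ast.walk(tree):
          if isinstance(node, ast.Call):
              # Each call is a candidate step in the workflow skeleton.
              print("call found:", ast.unparse(node.func))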

  4. Work environment perceptions following relocation to open-plan offices: A twelve-month longitudinal study.

    Science.gov (United States)

    Bergström, Jessica; Miller, Michael; Horneij, Eva

    2015-01-01

    A workplace's design can have various positive or negative effects on the employees, and since the 1970s the advantages and disadvantages of open-plan offices have been discussed. The aim of this study was to investigate perceived health, work environment and self-estimated productivity one month before and at three, six and twelve months after relocation from individual offices to an open-plan office environment. Employees from three departments within the same company group, who worked with relatively similar tasks and who were planned to be relocated from private offices to open-plan offices, were invited to participate. Questionnaires comprising items from The Salutogenic Health Indicator Scale, The Work Experience Measurement Scale, the questionnaire by Brennan et al. about perceived performance and one question from the Work Ability Index were sent to participants one month before relocation (baseline) to open-plan offices and then at three, six and twelve months after relocation. At baseline, 82 questionnaires were sent out. The response rate was 85%. At the follow-ups 77-79 questionnaires were sent out and the response rate was 70%-81%. At follow-ups, perceived health, job satisfaction and performance had generally deteriorated. The results of the study indicate that employees' perception of health, work environment and performance decreased during a 12 month period following relocation from individual offices to open-plan offices.

  5. Open and Anonymous Peer Review in a Digital Online Environment Compared in Academic Writing Context

    Science.gov (United States)

    Razi, Salim

    2016-01-01

    This study compares the impact of "open" and "anonymous" peer feedback as an adjunct to teacher-mediated feedback in a digital online environment utilising data gathered on an academic writing course at a Turkish university. Students were divided into two groups with similar writing proficiencies. Students peer reviewed papers…

  8. Transfer from Structured to Open-Ended Problem Solving in a Computerized Metacognitive Environment

    Science.gov (United States)

    Kapa, Esther

    2007-01-01

    A new computerized environment introducing a variety of metacognitive support mechanisms (MSMs) in different phases of the problem-solving process was designed to influence students' transfer from solving structured problems (near transfer) to solving open-ended problems (far transfer). Two hundred and thirty one students (aged 13-14 years) were…

  9. Openings for Researching Environment and Place in Children's Literature: Ecologies, Potentials, Realities and Challenges

    Science.gov (United States)

    Reid, Alan; Payne, Phillip G.; Cutter-Mackenzie, Amy

    2010-01-01

    This not quite "final" ending of this special issue of "Environmental Education Research" traces a series of hopeful, if somewhat difficult and at times challenging, openings for researching experiences of environment and place through children's literature. In the first instance, we draw inspiration from the contributors who…

  10. Indoor climate, psychosocial work environment and symptoms in open-plan offices

    DEFF Research Database (Denmark)

    Pejtersen, J; Allermann, L; Kristensen, T S

    2006-01-01

    To study the indoor climate, the psychosocial work environment and occupants' symptoms in offices a cross-sectional questionnaire survey was made in 11 naturally and 11 mechanically ventilated office buildings. Nine of the buildings had mainly cellular offices; five of the buildings had mainly open-plan offices, whereas eight buildings had a mixture of cellular, multi-person and open-plan offices. A total of 2301 occupants, corresponding to a response rate of 72%, completed a retrospective questionnaire. The questionnaire comprised questions concerning environmental perceptions, mucous membrane irritation, skin irritation, central nervous system (CNS) symptoms and psychosocial factors. Occupants in open-plan offices are more likely to perceive thermal discomfort, poor air quality and noise, and they more frequently complain about CNS and mucous membrane symptoms than occupants in multi-person and cellular offices. The association between psychosocial factors and office size was weak. Open-plan offices may not be suited for all job types. PRACTICAL IMPLICATION: Open-plan offices may be a risk factor for adverse environmental perceptions and symptoms.

  11. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  12. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  13. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent...... of concurrency in DCR Graphs admits asynchronous execution of declarative workflows both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation is supported by an extended example; moreover, the theoretical...

  14. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.
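
    A minimal sketch in the spirit of such a backend, assuming a Django app; the model fields and view below are invented for illustration, not the actual AV Workflow code.

      # Inside a Django app: a video record and a tiny publish endpoint.
      from django.db import models
      from django.http import JsonResponse

      class Video(models.Model):
          title = models.CharField(max_length=200)
          cds_record_id = models.CharField(max_length=64)
          published = models.BooleanField(default=False)

      def publish_video(request, pk):
          # Mark a video as published; a real backend would also push it to CDS.
          video = Video.objects.get(pk=pk)
          video.published = True
          video.save()
          return JsonResponse({"id": video.pk, "published": video.published})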

  17. Standards for business analytics and departmental workflow.

    Science.gov (United States)

    Erickson, Bradley J; Meenan, Christopher; Langer, Steve

    2013-02-01

    Efficient workflow is essential for a successful business. However, there is relatively little literature on analytical tools and standards for defining workflow and measuring workflow efficiency. Here, we describe an effort to define a workflow lexicon for medical imaging departments, including the rationale, the process, and the resulting lexicon.

  18. Higher Education Reform for Computer Major Students in Open and Research Environments

    Institute of Scientific and Technical Information of China (English)

    LI Xin; XU Xin-shun; JIA Zhi-ping; MENG Xiang-xu

    2012-01-01

    This paper analyzes the requirement for professional computer talent in Chinese universities and introduces the innovative educational methods practised by Shandong University in an open, research-oriented environment. In order to improve educational quality, we have carried out a series of reforms, including the "Four Experiences" initiative, which aims to diversify study environments, foster students' adaptability and extend their vision. Students are encouraged to join the "Research Assistant" program and to participate in scientific projects to improve their ability in research and innovation. They also undertake "Engineering Practice" to learn the latest modeling and programming skills. Compound talents characterized by a solid foundation, high quality and strong practical ability are shaped through these initiatives.

  19. PERFORMANCE OF PRE-WEANED FEMALE CALVES CONFINED IN HOUSING AND OPEN ENVIRONMENT HUTCHES IN KUWAIT

    Directory of Open Access Journals (Sweden)

    M. A. RAZZAQUE, S. ABBAS, T. AL-MUTAWA AND M. BEDAIR

    2009-02-01

    The objective of the present study was to compare the responses of Holstein Friesian pre-weaned female calves confined in elevated metallic crates in closed-type housing with those kept in polyvinyl hutches in the open environment of Kuwait. A total of 176 newborn Holstein Friesian female calves were randomly distributed between conventional confinement in closed-type calf houses (control) and individual calf hutches in an open environment (treatment). These calves were monitored up to the weaning age of 90 days. The average daily live weight gain was significantly higher in calves housed in hutches than in the conventional housing system (413 versus 113 g/h/d; P≤0.0001). Mean risk rates (RR) for mortality in hutch and conventional housing were 0.017 and 0.23, respectively. The results showed a significant positive impact of hutch housing with respect to growth, mortality and incidence of diseases in Kuwait's intensive dairy farming system.

  20. An ultrasound image-guided surgical workflow model

    Science.gov (United States)

    Guo, Bing; Lemke, Heinz; Liu, Brent; Huang, H. K.; Grant, Edward G.

    2006-03-01

    A 2003 report in the Annals of Surgery predicted an increase in demand for surgical services of as much as 14 to 47% across all surgical fields by 2020. Difficulties that are already apparent in the surgical OR (operating room) will be amplified in the near future, and it is necessary to address this problem and develop strategies to handle the workload. Workflow issues are central to the efficiency of the OR, particularly in response to today's continuing workforce shortages and escalating costs. These issues include inefficient and redundant processes, system inflexibility, ergonomic deficiencies, scattered data, and a lack of guidelines, standards and organization. The objective of this research is to validate the hypothesis that a workflow model improves the efficiency and quality of surgical procedures. We chose to study the image-guided surgical workflow for ultrasound as a first proof of concept, minimizing the OR workflow issues. We developed and implemented deformable workflow models using existing and projected future clinical environment data, as well as a customized ICT system with seamless integration and real-time availability. An ultrasound (US) image-guided surgical workflow (IG SWF) for a specific surgical procedure, the US IG liver biopsy, was researched to identify inefficient and redundant processes and data scattered across clinical systems, and to improve the overall quality of surgical procedures for the patient.

  1. HermitFS: A Secure Storage System in the Open Environment

    Institute of Scientific and Technical Information of China (English)

    LONG Qin; ZENG Feng-ping; WU Si-lian; PAN Ai-min

    2005-01-01

    We present a secure storage system named HermitFS that defends against many types of attacks. HermitFS uses strong cryptographic algorithms and a secure protocol to protect data from the time it is written to the time an authorized user accesses it. Our experimental results and security analysis show that HermitFS can protect information from unauthorized access in any open environment with little data overhead and acceptable performance.
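    The record does not specify HermitFS's algorithms or protocol, so the sketch below only illustrates the general encrypt-at-write, decrypt-on-authorized-read idea using AES-GCM from the third-party cryptography package; key management is deliberately simplified and all file names are invented.

```python
# Encrypt-before-write / authenticate-on-read sketch. This is NOT the
# HermitFS design (which the record does not detail), just the generic
# idea of never storing plaintext. Requires the `cryptography` package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def write_secure(path: str, plaintext: bytes, key: bytes) -> None:
    nonce = os.urandom(12)                       # unique per write
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    with open(path, "wb") as f:
        f.write(nonce + ciphertext)              # store nonce alongside

def read_secure(path: str, key: bytes) -> bytes:
    with open(path, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]
    # decrypt() raises InvalidTag if the stored data was tampered with
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)
write_secure("note.bin", b"secret data", key)
assert read_secure("note.bin", key) == b"secret data"
```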

  2. Guest Editorial ~ Issues, Challenges and Possibilities for Academics and Tutors at Open and Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Heather Kanuka

    2006-09-01

    Institutions of open and distance learning present a number of special challenges for academics. Development loads and demands on teaching effectiveness are increasing, while traditional expectations of research productivity have become a new and/or increased pressure. The size, complexity, and structure of the networked learning environment at most institutions of open and distance learning have been known to contribute to feelings of isolation and loneliness, leading to the disengagement experienced by many new and not-so-new academics. If we do not address the disconnectedness experienced by many open and distance academics and tutors, detachment from our institutions may occur, resulting in increased migration to either collaborate with, or work in, other institutions. Retaining faculty members is not only important for the stability and health of open and distance organizations; retention – and recruitment – are also issues that institutions of open and distance learning need to be concerned about. The large numbers of senior faculty appointed in the mid-1970s are moving into retirement and/or later-life careers. It has been estimated that 40 percent of university faculty will retire within the next 10 years. Recruitment and retention of academics is a pressing concern for all universities – but particularly for open and distance universities. The current detached environment may result in a serious employment problem down the road as traditional universities begin an intensive competition for the best academics. And while these problems exist to some extent at all universities, there is probably no other type of university where building a sense of community is needed more.

  3. 利用日志文件进行工作流恢复的一种策略%A Strategy for Workflow Recovery via Log File

    Institute of Scientific and Technical Information of China (English)

    高军; 王海洋

    2000-01-01

    Traditionally, when the execution of a workflow is interrupted, compensating work tasks are used for workflow recovery in order to keep the database consistent. In this paper, we open a new, feasible way to a system-supported recovery mechanism via the database log file and the workflow log file.
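    The record does not give the paper's log formats or recovery protocol, so the following is a hypothetical sketch of the log-file idea: the workflow log records each task's status, and recovery replays the log to decide where to resume. All names are invented.

```python
# Hypothetical sketch of recovering a workflow's position from its log
# after an interruption. The paper's concrete log formats and recovery
# protocol are not given in the record; the JSON-lines format and all
# names below are invented for illustration.
import json

WORKFLOW = ["extract", "transform", "load"]   # illustrative task chain

def pending_tasks(log_path: str) -> list[str]:
    """Replay the log and return the tasks that still need to (re)run."""
    done = set()
    try:
        with open(log_path) as f:
            for line in f:
                entry = json.loads(line)
                if entry["status"] == "done":
                    done.add(entry["task"])
    except FileNotFoundError:
        pass                                   # no log yet: run everything
    return [t for t in WORKFLOW if t not in done]

def run(log_path: str = "wf.log") -> None:
    todo = pending_tasks(log_path)             # resume point after a crash
    with open(log_path, "a") as log:
        for task in todo:
            log.write(json.dumps({"task": task, "status": "started"}) + "\n")
            log.flush()                        # make the record crash-safe
            # ... execute the real task here ...
            log.write(json.dumps({"task": task, "status": "done"}) + "\n")
            log.flush()

run()   # a second call after a crash skips tasks already marked "done"
```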

  4. Social Radar Workflows, Dashboards, and Environments

    Science.gov (United States)

    2012-01-01

    shows a mockup of such an interface. [Figure 3: Social Radar Interface Mock-Up [21]] A major lesson learned is that each data source needs to be...for Social Radar is the Ozone Widget Framework (Ozone). This is a lightweight framework that wraps web applications and exposes them to the analyst as small applications, or widgets, inside a web browser. Additionally, Ozone allows the widgets to communicate with one another via provided...

  5. An Improvement on Algorithm of Grid-Workflow Based on QoS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yun-feng; GE Wei

    2004-01-01

    With the emergence of grid computing, new challenges have arisen in workflow task scheduling. The goal of grid-workflow task scheduling is to achieve high system throughput and to match the application's needs with the available computing resources. This matching of resources in a non-deterministically shared heterogeneous environment leads to concerns over quality of service (QoS). The grid concept is presented in this paper and, coupled with the QoS requirements of workflow tasks, an improved algorithm, the ILGSS algorithm, is brought forward. The complexity of the improved scheduling algorithm is analyzed. The experimental results show that the improved algorithm can lead to significant performance gains in various applications. An important research domain, adaptive workflow transactions in grid computing environments, has been explored, and a new solution for the scheduling of distributed workflows in grid environments is put forward.
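    The ILGSS algorithm itself is not described in the record, so the sketch below only illustrates the shape of the underlying problem: matching each task, in deadline order, to the resource that can finish it earliest without violating its QoS constraint. The deadline-based QoS notion and all names and numbers are assumptions.

```python
# Greedy QoS-aware matching sketch. This is not ILGSS; it only shows the
# problem shape: assign each task, earliest deadline first, to the
# resource that finishes it soonest without missing the deadline.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    speed: float                 # work units per second
    busy_until: float = 0.0      # when the resource is next free

def schedule(tasks, resources):
    """tasks: (name, work_units, deadline) triples.
    Returns (task, resource, finish_time) assignments."""
    plan = []
    for name, work, deadline in sorted(tasks, key=lambda t: t[2]):
        best, best_finish = None, float("inf")
        for r in resources:
            finish = r.busy_until + work / r.speed
            if finish <= deadline and finish < best_finish:
                best, best_finish = r, finish
        if best is None:
            raise RuntimeError(f"QoS violation: no resource can meet {name}")
        best.busy_until = best_finish
        plan.append((name, best.name, best_finish))
    return plan

print(schedule([("t1", 4, 5.0), ("t2", 2, 3.0)],
               [Resource("fast", 2.0), Resource("slow", 1.0)]))
```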

  6. A Trust-Based Approach to Estimating the Confidence of the Software System in Open Environments

    Institute of Scientific and Technical Information of China (English)

    Feng Xu; Jing Pan; Wen Lu

    2009-01-01

    Emerging in open environments, software paradigms such as open resource coalition and Internetware present several novel characteristics, including user-centricity, non-central control, and continual evolution. The goal of obtaining high confidence in such systems is more difficult to achieve. The general developer-oriented metrics and testing-based methods adopted in the traditional measurement of high-confidence software seem infeasible in the new situation. Firstly, software development has changed from developer-centric to user-centric, while users' opinions are usually subjective and cannot be generalized into one objective metric. Secondly, there is no central control to guarantee testing of the components that form the software system, and continual evolution makes it impossible to test the whole software system. Therefore, this paper proposes a trust-based approach that consists of three sequential sub-stages: 1) describing metrics for confidence estimation from users; 2) estimating the confidence of the components based on quantitative information from trusted recommenders; 3) estimating the confidence of the whole software system based on the component confidences and their interactions. It thus attempts to make a step toward a reasonable and effective method for confidence estimation of software systems in open environments.
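    The paper's metrics and formulas are not reproduced in the record; the toy sketch below shows only the shape of the three stages: trust-weighted user ratings give per-component confidence, and component confidences are combined along interactions to score the whole system. The aggregation rules here are illustrative assumptions.

```python
# Toy three-stage confidence estimation: (1) user ratings weighted by
# recommender trust, (2) per-component confidence, (3) system score
# combined along interactions. Formulas are invented for illustration.

def component_confidence(ratings: list[tuple[float, float]]) -> float:
    """ratings: (recommender_trust in [0,1], rating in [0,1]) pairs."""
    total_trust = sum(t for t, _ in ratings)
    return sum(t * r for t, r in ratings) / total_trust

def system_confidence(components: dict[str, float],
                      interactions: list[tuple[str, str]]) -> float:
    # A chain is only as confident as its links: each interaction
    # contributes the product of its two endpoint confidences.
    scores = [components[a] * components[b] for a, b in interactions]
    return sum(scores) / len(scores)

comps = {
    "parser": component_confidence([(0.9, 0.8), (0.5, 0.6)]),
    "engine": component_confidence([(0.8, 0.9)]),
}
print(system_confidence(comps, [("parser", "engine")]))  # about 0.66
```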

  7. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects, as well as the emulation of their original runtime environment in order to ensure long-term access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised to create complex preservation workflows.

  8. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business process handled by a workflow model, and plays a critical role in analyzing, executing and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult issue in the workflow field, and we study it intensively in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer some constraint rules that help avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in the workflow prototype system e-ScopeWork.
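    The paper's reduction and spread rules are not reproduced in the record, so the sketch below shows only the general spirit of verification by graph reduction: repeatedly remove activities whose predecessors have all been resolved, and treat any unreduced residue (here, a cyclic wait) as a schema error.

```python
# Minimal graph-reduction sketch: reduce fully resolved activities until
# nothing changes; whatever remains signals a structural schema error.
# The paper's actual rule set is richer and is not reproduced here.

def verify(schema: dict[str, set[str]]) -> list[str]:
    """schema maps each activity to its predecessor set.
    Returns the activities left unreduced (empty list = schema OK)."""
    deps = {a: set(p) for a, p in schema.items()}
    changed = True
    while changed:
        changed = False
        for act, preds in list(deps.items()):
            if not preds:                        # fully resolved: reduce it
                del deps[act]
                for other in deps.values():
                    other.discard(act)
                changed = True
    return sorted(deps)                          # residue signals an error

ok = {"A": set(), "B": {"A"}, "C": {"A", "B"}}
bad = {"A": set(), "B": {"C"}, "C": {"B"}}       # B and C wait on each other
print(verify(ok))    # []
print(verify(bad))   # ['B', 'C']
```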

  9. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.

    2010-01-01

    We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (dataflow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.

  10. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  11. An Overview of Workflow Management on Mobile Agent Technology

    Directory of Open Access Journals (Sweden)

    Anup Patnaik

    2014-07-01

    Mobile agent workflow management/plugins are quite appropriate for handling control flows in open distributed systems; this emerging technology can bring process-oriented tasks from diverse frameworks together to run as a single unit. Workflow technology offers organizations the opportunity to reshape business processes beyond the boundaries of their own organizations, so that instead of static models the modern era gets dynamic workflows which can respond to changes during execution, provide the necessary security measures and a great degree of adaptivity, troubleshoot running processes, and recover lost states through fault tolerance. The prototype we are planning to design is intended to preserve reliability, security, robustness and scalability without being forced to trade off performance. This paper is concerned with the design, implementation and performance evaluation of the improved methods of the proposed prototype models, based on current research in this domain.

  12. Procesos workflow en la nube

    OpenAIRE

    Peralta, Mario; Salgado, Carlos Humberto; Baigorria, Lorena; Montejano, Germán Antonio; Riesco, Daniel Eduardo

    2014-01-01

    Given the globalization of information, organizations tend to virtualize their businesses: to move their business into the Cloud. From the perspective of the complexity of business processes, one of the most significant technologies for supporting their automation is Workflow Management Systems, which provide computational support to define, synchronize and execute process activities using workflows. To make such systems more convenient and flexible, it is essential to have tool...

  13. Use and Mastery of Virtual Learning Environment in Brazilian Open University

    Directory of Open Access Journals (Sweden)

    Margarita Victoria Gomez

    2014-07-01

    This paper describes and analyses the dynamics of the use and/or mastery of Virtual Learning Environments (VLEs) by educators and students of the Open University, an important part of the Brazilian Educational System. A questionnaire with 32 items was answered by 174 students, instructors and coordinators of the Media in Education and Physics courses of two federal universities between 2011 and early 2012. An interview with a coordinator was transcribed and related to the data systematised in tables and graphs. Interpretative analysis, in an open dialogue with the references and with the data from the Universidade Aberta do Brasil (UAB - Open University of Brazil) site, resulted in the final considerations. These suggest that the use and/or mastery of VLEs by students is important, and that the specificities of these uses inform studies and publications, still small in number in the literature of this area of knowledge. The work reflects the development of the Open Distance Education System, conducted with strong popular participation, as a response to the challenge posed to educational policies of expanding the public provision of higher education, also using VLEs for this purpose.

  14. An Open Source Software Platform for Visualizing and Teaching Conservation Tasks in Architectural Heritage Environments

    Science.gov (United States)

    San Jose, I. Ignacio; Martinez, J.; Alvarez, N.; Fernandez, J. J.; Delgado, F.; Martinez, R.; Puche, J. C.; Finat, J.

    2013-07-01

    In this work we present a new software platform for the interactive volumetric visualization of complex architectural objects and its applications to teaching and training conservation interventions in Architectural Cultural Heritage. Photogrammetric surveying is performed by processing the information arising from image- and range-based devices. Our visualization application is based on an adaptation of the WebGL open standard; this adaptation allows importing open standards and interactively navigating 3D models in ordinary web browsers with good performance. The visualization platform is scalable and can be applied to urban environments, provided open-source files are used; CityGML is an open standard based on a geometry-driven ontology which is compatible with this approach. We illustrate our results with examples concerning very damaged churches and an urban district of Segovia (World Cultural Heritage). Their connection with an appropriate database eases the tracking of building evolution and interventions. We have incorporated some preliminary examples to illustrate the Advanced Visualization Tools and an architectural e-Learning software platform, which have been created for assessing conservation and restoration tasks in very damaged buildings. The first version of the Advanced Visualization application was developed in the framework of the ADISPA Spanish Project. Our results are illustrated with the application of these software tools to several very damaged cultural heritage buildings in rural zones of Castilla y León (Spain).

  15. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  17. Bats coordinate sonar and flight behavior as they forage in open and cluttered environments.

    Science.gov (United States)

    Falk, Benjamin; Jakobsen, Lasse; Surlykke, Annemarie; Moss, Cynthia F

    2014-12-15

    Echolocating bats use active sensing as they emit sounds and listen to the returning echoes to probe their environment for navigation, obstacle avoidance and pursuit of prey. The sensing behavior of bats includes the planning of 3D spatial trajectory paths, which are guided by echo information. In this study, we examined the relationship between active sonar sampling and flight motor output as bats changed environments from open space to an artificial forest in a laboratory flight room. Using high-speed video and audio recordings, we reconstructed and analyzed 3D flight trajectories, sonar beam aim and acoustic sonar emission patterns as the bats captured prey. We found that big brown bats adjusted their sonar call structure, temporal patterning and flight speed in response to environmental change. The sonar beam aim of the bats predicted the flight turn rate in both the open room and the forest. However, the relationship between sonar beam aim and turn rate changed in the forest during the final stage of prey pursuit, during which the bat made shallower turns. We found flight stereotypy developed over multiple days in the forest, but did not find evidence for a reduction in active sonar sampling with experience. The temporal patterning of sonar sound groups was related to path planning around obstacles in the forest. Together, these results contribute to our understanding of how bats coordinate echolocation and flight behavior to represent and navigate their environment.

  18. Bats coordinate sonar and flight behavior as they forage in open and cluttered environments

    DEFF Research Database (Denmark)

    Falk, Benjamin; Jakobsen, Lasse; Surlykke, Annemarie;

    2014-01-01

    . In this study, we examined the relationship between active sonar sampling and flight motor output as bats changed environments from open space to an artificial forest in a laboratory flight room. Using high-speed video and audio recordings, we reconstructed and analyzed 3D flight trajectories, sonar beam aim...... and acoustic sonar emission patterns as the bats captured prey. We found that big brown bats adjusted their sonar call structure, temporal patterning and flight speed in response to environmental change. The sonar beam aim of the bats predicted the flight turn rate in both the open room and the forest. However......, the relationship between sonar beam aim and turn rate changed in the forest during the final stage of prey pursuit, during which the bat made shallower turns. We found flight stereotypy developed over multiple days in the forest, but did not find evidence for a reduction in active sonar sampling with experience...

  19. Income-environment relationship in Sub-Saharan African countries: Further evidence with trade openness.

    Science.gov (United States)

    Zerbo, Eléazar

    2017-07-01

    This paper examines the dynamic relationship between energy consumption, income growth, carbon emissions and trade openness in fourteen Sub-Saharan African (SSA) countries. The autoregressive distributed lag (ARDL) approach to cointegration and the Toda-Yamamoto causality test were used to investigate the long-run and short-run properties, respectively. The long-run estimations give evidence against the environmental Kuznets curve (EKC) hypothesis in SSA countries. In contrast, the results highlight the significant and monotonic contribution of income growth and energy consumption in explaining carbon emissions in the long run and short run in several countries. Furthermore, the results show that trade openness enhances economic growth and is not found to cause carbon emissions in these countries. Hence, a trade incentive policy may be implemented without harmful effects on the quality of the environment.

  20. Investigating the Contextual Interference Effect Using Combination Sports Skills in Open and Closed Skill Environments

    Directory of Open Access Journals (Sweden)

    Jadeera P.G. Cheong, Brendan Lay, Rizal Razman

    2016-03-01

    This study attempted to present conditions that were closer to the real-world setting of team sports. The primary purpose was to examine the effects of blocked, random and game-based training practice schedules on the learning of the field hockey trap, close dribble and push pass, which were practiced in combination. The secondary purpose was to investigate the effects of the predictability of the environment on the learning of field hockey skills under different practice schedules. A game-based training protocol represented a form of random practice in an unstable environment and was compared against a blocked and a traditional random practice schedule. In general, all groups improved dribble and push accuracy during the acquisition phase when assessed in a closed environment. In the retention phase, there were no differences between the three groups. When assessed in an open skills environment, all groups improved their percentage of successful executions for trapping and passing, and improved the total number of attempts and of successful executions for both dribbling and shooting. Between-group differences were detected for dribbling execution, with the game-based group scoring a higher number of dribbling successes. The CI effect did not emerge when practicing and assessing multiple sport skills in a closed skill environment, even when the skills were practiced in combination. However, when skill assessment was conducted in a real-world situation, there appeared to be some support for the CI effect.

  1. Degrees of secrecy in an open environment. The case of electronic theses and dissertations

    Directory of Open Access Journals (Sweden)

    Joachim SCHÖPFEL

    2013-12-01

    The open access (OA) principle requires that scientific information be made widely and readily available to society. Defined in 2003 as a "comprehensive source of human knowledge and cultural heritage that has been approved by the scientific community", open access implies that content be openly accessible, and this needs the active commitment of each and every individual producer of scientific knowledge. Today, the success of the open access initiative cannot be denied. Yet, in spite of this growing success, a significant part of scientific and technical information remains unavailable on the web or circulates with restrictions. Even in institutional repositories (IR), created to provide access to the scientific output of an academic institution and a central vector of the so-called green road to open access, more or less important sectors of the scientific production are missing. This is because of lack of awareness, embargoes, deposit of metadata without full text, confidential content, etc. This problem particularly concerns electronic theses and dissertations (ETDs), which are disseminated with different statuses – some are freely available, others are under embargo, confidential, restricted to campus access (encrypted or not) or not available at all. While other papers may be available through alternative channels (journals, monographs, etc.), ETDs most often are not. Our paper describes a new and unexpected effect of the development of digital libraries and open access: a paradoxical practice of hiding information from the scientific community and society while partly sharing it with a restricted population (campus). We try to explain these different shades of grey literature in terms of different degrees of secrecy related to intellectual property, legitimate interests, expected exploitation and trade secrets, and suggest some ways of increasing the availability of ETDs in an open environment (inter-lending loan and

  2. THE SYSTEM OF INTEGRATED LESSONS IN AN OPEN A SPORTS AND EDUCATIONAL ENVIRONMENT OF PEDAGOGICAL UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Михаил Александрович Правдов

    2014-05-01

    The urgency of developing effective ways to modernize the education system is associated with the transition to the new educational standards (FSES) and the need to respect the principle of continuity and integration across all levels of education. This problem is particularly relevant to the system of training future teachers. Creating a model of an open sports and educational environment for educational institutions of various types on the basis of a pedagogical university can provide conditions for the successful implementation of the goals and objectives of physical education, both for pre-school institutions and schools and for the training of future teachers of physical culture in higher education. The article considers a system of integrated lessons as an additional form of the educational process for pre-school children. The system is realized by students of the Faculty of Physical Culture in an open sports and educational environment of a pedagogical university. Physical education classes, which are realized through the integration of educational content areas (FSES pre-school education) and conducted on the basis of a pedagogical university, can be used in the practice of pre-school institutions. Purpose: justification of a system of integrated classes with children of pre-school age in an open sports and educational environment of a pedagogical university. Methodology: methodology of pedagogical research. Results: a system of classes based on the integration of pre-school educational content and of the educational activities of university teachers and educational organizations. Practical implications: the results of this research can be applied in the preparation of future teachers and in the organization of sports and recreation activities for children of pre-school age. DOI: http://dx.doi.org/10.12731/2218-7405-2013-10-46

  3. The EDRN knowledge environment: an open source, scalable informatics platform for biological sciences research

    Science.gov (United States)

    Crichton, Daniel; Mahabal, Ashish; Anton, Kristen; Cinquini, Luca; Colbert, Maureen; Djorgovski, S. George; Kincaid, Heather; Kelly, Sean; Liu, David

    2017-05-01

    We describe here the Early Detection Research Network (EDRN) for Cancer's knowledge environment. It is an open source platform built by NASA's Jet Propulsion Laboratory with contributions from the California Institute of Technology and the Geisel School of Medicine at Dartmouth. It uses tools like Apache OODT, Plone, and Solr, and borrows heavily from JPL's Planetary Data System's ontological infrastructure. It has accumulated data on hundreds of thousands of biospecimens and serves over 1,300 registered users across the National Cancer Institute (NCI). The scalable computing infrastructure is built such that we are able to reach out to other agencies, provide homogeneous access, and provide seamless analytics support and bioinformatics tools through community engagement.

  4. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components has been demonstrated in situ on HPC systems.
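    Decaf's actual Python API is not shown in the record, so the sketch below only illustrates the message-driven firing rule it describes: a task executes as soon as all of its input messages have arrived, and its outputs are forwarded along dataflow links. All class, task and port names are invented.

```python
# Single-process illustration of a message-driven dataflow runtime:
# a task fires once every expected input port has received a message.
# This is NOT Decaf's real API.
from collections import defaultdict

class Dataflow:
    def __init__(self):
        self.inputs = defaultdict(dict)   # task -> {port: payload}
        self.needs = {}                   # task -> expected input ports
        self.funcs = {}                   # task -> callable
        self.links = defaultdict(list)    # (task, out_port) -> [(task, in_port)]

    def add_task(self, name, func, in_ports):
        self.funcs[name] = func
        self.needs[name] = set(in_ports)

    def link(self, src, out_port, dst, in_port):
        self.links[(src, out_port)].append((dst, in_port))

    def put(self, task, port, payload):
        self.inputs[task][port] = payload
        if set(self.inputs[task]) == self.needs[task]:   # all messages in:
            outputs = self.funcs[task](**self.inputs.pop(task))  # fire
            for out_port, value in (outputs or {}).items():
                for dst, in_port in self.links[(task, out_port)]:
                    self.put(dst, in_port, value)

flow = Dataflow()
flow.add_task("sim", lambda seed: {"field": [seed * i for i in range(4)]}, ["seed"])
flow.add_task("viz", lambda field: print("render", field), ["field"])
flow.link("sim", "field", "viz", "field")
flow.put("sim", "seed", 3)        # -> render [0, 3, 6, 9]
```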

  5. Integrated Sensitivity Analysis Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Friedman-Hill, Ernest J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoffman, Edward L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gibson, Marcus J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Clay, Robert L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  6. Herschel Interactive Processing Environment (HIPE): Open to the World and the Future

    Science.gov (United States)

    Balm, P.

    2012-09-01

    Herschel is ESA's space-based infrared observatory. It was launched on May 14, 2009 and is in routine science operations. The Herschel Interactive Processing Environment, HIPE, is Herschel's interactive analysis package. HIPE has a user base of approximately 1,000 users, and a major new version is released twice a year. HIPE is the first open-source astronomy data analysis package written entirely in Java and Jython, which allows it to provide a modern GUI with command echoing, sophisticated interoperability and extensibility, and access to the vast collection of Java libraries. HIPE includes the official data reduction scripts and allows executing and modifying them as needed. These aspects may make HIPE the seed for the astronomy working environment of the future.

  7. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent se...... development has been verified correct in the Isabelle-HOL interactive theorem prover....

  9. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; Vet, van der P.E.; Veer, van der G.C.; Roos, M.; Dijk, van E.M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation levels.

  11. Modeling workflow using XML and Petri net

    Institute of Scientific and Technical Information of China (English)

    杨东; 温泉; 张申生

    2004-01-01

    Nowadays an increasing number of workflow products and research prototypes adopt XML for representing workflow models, owing to its ease of use and good comprehensibility for both people and machines. However, most workflow products and research prototypes provide little support for the verification of XML-based workflow models, such as deadlock-freedom properties, which is essential to the successful application of workflow technology. In this paper, we tackle this problem by mapping the XML-based workflow model into a Petri net, a well-known formalism for modeling, analyzing and verifying systems. As a result, the XML-based workflow model can be automatically verified with the help of general Petri-net tools, such as DANAMICS. The presented approach not only enables end users to represent workflow models with an XML-based modeling language, but also ensures the correctness of the model, thus satisfying the needs of business processes.
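    The paper's mapping rules are not reproduced in the record, so the toy sketch below shows the idea on the simplest case only: a sequential XML task list becomes a place/transition chain, which is then token-played to confirm the end place is reachable. The XML element names are invented, and real verification would use a Petri-net tool such as DANAMICS.

```python
# Toy XML-to-Petri-net mapping: each task becomes a transition that
# consumes place p_i and produces p_{i+1}; token play checks liveness.
import xml.etree.ElementTree as ET

XML = """<workflow>
  <task name="receive"/><task name="approve"/><task name="archive"/>
</workflow>"""

def to_petri(xml_text):
    tasks = [t.get("name") for t in ET.fromstring(xml_text).findall("task")]
    places = [f"p{i}" for i in range(len(tasks) + 1)]
    transitions = {t: ({places[i]}, {places[i + 1]})
                   for i, t in enumerate(tasks)}
    return places, transitions

def reaches_end(places, transitions):
    marking = {places[0]}                     # one token at the start place
    fired = True
    while fired:
        fired = False
        for _name, (pre, post) in transitions.items():
            if pre <= marking:                # transition is enabled
                marking = (marking - pre) | post
                fired = True
    return places[-1] in marking

places, transitions = to_petri(XML)
print(reaches_end(places, transitions))       # True: the chain cannot deadlock
```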

  12. Adobe Photoshop Lightroom and Photoshop workflow bible

    CERN Document Server

    Fitzgerald, Mark

    2013-01-01

    The digital photographer's workflow is divided into two distinct parts - the Production Workflow and the Creative Workflow. The Production workflow is used to import and organize large numbers of images, and prepare them for presentation via proof printing, Web, or slideshow. Increasingly, photographers are turning to Adobe's acclaimed new Lightroom software to manage this part of the workflow. After the best images are identified, photographers move to the second part of the workflow, the Creative Workflow, to fine-tune special images using a variety of advanced digital tools so that the creative vision is realized. An overwhelming majority of digital photographers use Photoshop for this advanced editing. Adobe Photoshop Lightroom & Photoshop Workflow Bible effectively guides digital photographers through both parts of this process. Author Mark Fitzgerald, an Adobe Certified Expert and Adobe Certified Instructor in Photoshop CS3 offers readers a clear path to using both Lightroom 2 and Photoshop CS3 to c...

  13. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  14. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A great deal of benefit can be achieved if information and processes are integrated within a building design project. This paper aims to establish a prototype workflow management system for construction design projects through the application of workflow technology. The composition and function of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can deal with design information-flow modeling, emulation and optimization, task planning and distribution, and automatic tracking and monitoring, as well as network services. In this way, the collaborative design environment for building design projects is brought into being.

  15. [Integration of the radiotherapy irradiation planning in the digital workflow].

    Science.gov (United States)

    Röhner, F; Schmucker, M; Henne, K; Momm, F; Bruggmoser, G; Grosu, A-L; Frommhold, H; Heinemann, F E

    2013-02-01

    At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge, it requires interdisciplinary expertise and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority.

  16. Optimizing high performance computing workflow for protein functional annotation.

    Science.gov (United States)

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low-complexity classification algorithm to assign proteins to existing clusters of orthologous groups of proteins. Based on the Position-Specific Iterative Basic Local Alignment Search Tool (PSI-BLAST), the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
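    The production pipeline wraps PSI-BLAST on supercomputer nodes; as a shape-only sketch under that assumption, the snippet below splits a protein set into chunks and classifies them in a local worker pool, with an invented stand-in classifier in place of the real alignment-based assignment.

```python
# Embarrassingly parallel chunked classification sketch; classify_chunk()
# is a toy stand-in for a PSI-BLAST-based orthologous-group assignment.
from multiprocessing import Pool

def classify_chunk(chunk: list[str]) -> list[tuple[str, str]]:
    # Invented toy rule standing in for the real classifier.
    return [(seq, f"COG{len(seq) % 5:04d}") for seq in chunk]

def annotate(proteins: list[str], chunk_size: int = 2) -> list[tuple[str, str]]:
    chunks = [proteins[i:i + chunk_size]
              for i in range(0, len(proteins), chunk_size)]
    with Pool() as pool:                  # chunks are classified in parallel
        per_chunk = pool.map(classify_chunk, chunks)
    return [pair for chunk in per_chunk for pair in chunk]

if __name__ == "__main__":               # guard required by multiprocessing
    print(annotate(["MKT", "MGLSDGEWQ", "MVLSPADKT", "MSE"]))
```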

  17. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, security-relevant aspects are related to the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and teams are introduced, which improves the security of conventional role-based access control. Furthermore, policies are provided to define static and dynamic constraints such as separation of duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as virtual enterprises.
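    As an illustration of the kind of constraint such policies express, here is a minimal static separation-of-duty (SoD) check; the paper's actual policy language and its team and activity-decomposition machinery are not reproduced, and all role and activity names are invented.

```python
# Minimal RBAC assignment with a static separation-of-duty constraint:
# an agent may not hold both activities of a declared SoD pair.

ROLE_OF = {"alice": "engineer", "bob": "reviewer", "carol": "engineer"}
CAN_DO = {"design": {"engineer"}, "review": {"reviewer", "engineer"}}
SOD = [("design", "review")]     # one agent must not do both activities

def assign(assignments: dict[str, str], activity: str, agent: str) -> None:
    if ROLE_OF[agent] not in CAN_DO[activity]:
        raise PermissionError(f"{agent} lacks a role for {activity}")
    for a, b in SOD:
        other = b if activity == a else a if activity == b else None
        if other is not None and assignments.get(other) == agent:
            raise PermissionError(f"SoD: {agent} already holds {other}")
    assignments[activity] = agent

plan: dict[str, str] = {}
assign(plan, "design", "alice")
assign(plan, "review", "bob")     # OK: different agent
# assign(plan, "review", "alice") # would raise: SoD violation
```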

  18. Hybrid Workflow Policy Management for Heart Disease Identification

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Kim

    2009-12-01

    As science and technology advance, medical applications are becoming more complex in order to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution for solving sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in Grid environments, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  19. Hybrid Workflow Policy Management for Heart Disease Identification

    CERN Document Server

    Kim, Dong-Hyun; Youn, Chan-Hyun

    2010-01-01

    As science and technology advance, medical applications are becoming more complex in order to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution for solving sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in Grid environments, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  20. Execution Time Estimation for Workflow Scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A..S.Z.; Kovalchuk, S.V.; Makkes, M.X.

    2014-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the randomness of the workflow components and th
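    The record is truncated before the proposed estimator is described; a common deterministic baseline for the problem it poses is to take the critical path over per-task time estimates, sketched below. The paper additionally accounts for the randomness of workflow components, which this sketch omits; the task graph and numbers are invented.

```python
# Deterministic baseline: a workflow's makespan estimated as the
# critical path over per-task time estimates.
from functools import lru_cache

TASKS = {"fetch": 2.0, "clean": 3.0, "train": 7.0, "report": 1.0}
PREDS = {"fetch": [], "clean": ["fetch"],
         "train": ["clean"], "report": ["train", "clean"]}

@lru_cache(maxsize=None)
def finish_time(task: str) -> float:
    # A task can start once its latest predecessor has finished.
    start = max((finish_time(p) for p in PREDS[task]), default=0.0)
    return start + TASKS[task]

makespan = max(finish_time(t) for t in TASKS)
print(makespan)   # 13.0: fetch -> clean -> train -> report
```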

  1. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    National Research Council Canada - National Science Library

    Kang, Sanggoo; Lee, Kiwon

    2016-01-01

    ... under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments...

  2. Information and Communication Technologies in Schools. A Handbook for Teachers or How ICT Can Create New, Open Learning Environments

    OpenAIRE

    Mariana Patru

    2006-01-01

    Information and Communication Technologies in Schools: A Handbook for Teachers, or How ICT Can Create New, Open Learning Environments (author Alexey Semenov, edited by Jonathan Anderson, published by UNESCO, Division of Higher Education, Paris, 2005, 240 pp.).

  3. Standardizing clinical trials workflow representation in UML for international site comparison.

    Science.gov (United States)

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials

  4. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  5. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment

    Directory of Open Access Journals (Sweden)

    Stålring Jonna C

    2011-07-01

    Background: Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure-activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds, and the size of proprietary, as well as public, data sets is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source, state-of-the-art, high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results: This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models, providing the full workflow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated workflow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient, data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge, as flexible applications can be created not only at a scripting level but also in a graphical programming environment. Conclusions: AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the

  6. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    Science.gov (United States)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition, however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera

  7. Integration of the radiotherapy irradiation planning in the digital workflow

    Energy Technology Data Exchange (ETDEWEB)

    Roehner, F.; Schmucker, M.; Henne, K.; Bruggmoser, G.; Grosu, A.L.; Frommhold, H.; Heinemann, F.E. [Universitaetsklinikum Freiburg (Germany). Klinik fuer Strahlenheilkunde; Momm, F. [Ortenau Klinikum, Offenburg-Gengenbach (Germany). Radio-Onkologie

    2013-02-15

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority. (orig.)

  8. Maternal environment alters social interactive traits but not open-field behavior in Fischer 344 rats.

    Science.gov (United States)

    Yamamuro, Yutaka

    2008-10-01

    Although it is recognized that the genetic background governs behavioral phenotypes, environmental factors also play a critical role in the development of various behavioral processes. The maternal environment has a major impact on pups, and the cross-fostering procedure is used to determine the influence of early life experiences. The present study examined the influence of maternal environment on behavioral traits in inbred Fischer 344 (F344) rats. F344/DuCrlCrlj and Wistar (Crlj:WI) pups were fostered from postnatal day 1 as follows: Wistar pups raised by Wistar dams, F344 raised by Wistar, Wistar raised by F344, and F344 raised by F344. At 10 weeks of age, rats were randomly assigned to an open-field test and social interaction test. In the open-field test, irrespective of the rearing conditions, the activity during the first 1 min was significantly lower in F344 rats than in Wistar rats. Latency to the onset of movement showed no difference between groups. In the social interaction test, the recognition performance during the first 1 min in F344 raised by F344 was significantly shorter than that in the other groups. The onset of recognition to a novel social partner in F344 raised by F344 was significantly delayed, and the delay disappeared upon cross-fostering by Wistar dams. These results raise the possibility that the behavioral phenotype of F344 rats results from the interplay of genetic factors and maternal environment during early life, and that F344 rats are a strain with high susceptibility to rearing conditions for the formation of their emotionality.

  9. UAV Photogrammetric Workflows: A Best Practice Guideline

    Science.gov (United States)

    Federman, A.; Santana Quintero, M.; Kretz, S.; Gregg, J.; Lengies, M.; Ouimet, C.; Laliberte, J.

    2017-08-01

    The increasing commercialization of unmanned aerial vehicles (UAVs) has opened the possibility of performing low-cost aerial image acquisition for the documentation of cultural heritage sites through UAV photogrammetry. The flying of UAVs in Canada is regulated through Transport Canada and requires a Special Flight Operations Certificate (SFOC) in order to fly. Various image acquisition techniques have been explored in this review, as well as the software used to register the data. A general workflow procedure has been formulated based on the literature reviewed. A case study example of using UAV photogrammetry at Prince of Wales Fort is discussed, specifically in relation to the data acquisition and processing. Some gaps in the literature reviewed highlight the need for streamlining the SFOC application process, and incorporating UAVs into cultural heritage documentation courses.

  10. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
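
    A minimal sketch of the duplicate-suppression idea, assuming fixed-size blocks and SHA-256 content hashes; Delta's actual chunking, encoding and transport are not specified here.

    # Fixed-size blocks are hashed, repeated blocks are replaced by short
    # references during "transfer", and the receiver restores the original
    # byte stream, mirroring Delta's remove-then-restore scheme.
    import hashlib

    BLOCK = 4096  # assumed block size

    def deduplicate(data: bytes):
        """Sender side: store each distinct block once, keep the order as refs."""
        store, recipe = {}, []
        for i in range(0, len(data), BLOCK):
            block = data[i:i + BLOCK]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)  # each block is shipped only once
            recipe.append(digest)            # a tiny reference replaces the copy
        return store, recipe

    def restore(store, recipe):
        """Receiver side: rebuild the stream from the references."""
        return b"".join(store[d] for d in recipe)

    payload = b"timestep-0001;" * 10000  # highly repetitive simulation output
    store, recipe = deduplicate(payload)
    assert restore(store, recipe) == payload
    print(sum(map(len, store.values())), "unique bytes for", len(payload))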

  11. Scientific Workflows + Provenance = Better (Meta-)Data Management

    Science.gov (United States)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (e.g., timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata
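
    As a toy illustration of capturing retrospective provenance alongside prospective step names, the following sketch logs timestamps, inputs and outputs for each executed step. The record layout is invented for illustration and is not the D-PROV schema.

    # Each executed step appends one provenance record combining the
    # prospective view (which pipeline step) with the retrospective view
    # (when it ran, on which inputs, producing which output).
    import json
    import time

    trace = []  # retrospective provenance records

    def run_step(name, func, *inputs):
        """Execute one workflow step and record its provenance."""
        started = time.time()
        output = func(*inputs)
        trace.append({
            "step": name,
            "started": started,
            "ended": time.time(),
            "inputs": [repr(i) for i in inputs],
            "output": repr(output),
        })
        return output

    # A toy two-step pipeline whose result the trace can later "explain".
    raw = run_step("acquire", lambda: [1, 2, 3])
    mean = run_step("mean", lambda xs: sum(xs) / len(xs), raw)
    print(json.dumps(trace, indent=2))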

  12. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in the changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of the real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing the workflow model by means of the UML activity diagram. A translation from the UML activity diagram to π-calculus is established. Using π-calculus, the deadlock property of the workflow is analyzed.
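
    The paper's analysis goes through a π-calculus translation; purely to illustrate the property being checked, the sketch below detects deadlock as a circular wait between workflow activities. The graph encoding is an assumption for illustration, not the paper's formalism.

    # Deadlock as a cycle in the activity wait-for graph, found by DFS coloring.
    def has_deadlock(waits_on):
        nodes = set(waits_on) | {d for deps in waits_on.values() for d in deps}
        WHITE, GRAY, BLACK = 0, 1, 2
        color = dict.fromkeys(nodes, WHITE)

        def visit(node):
            color[node] = GRAY
            for dep in waits_on.get(node, ()):
                # A gray dependency means we are back on the current path:
                # a circular wait, i.e. deadlock.
                if color[dep] == GRAY or (color[dep] == WHITE and visit(dep)):
                    return True
            color[node] = BLACK
            return False

        return any(color[n] == WHITE and visit(n) for n in nodes)

    # "review" waits on "approve" and vice versa: a classic deadlock.
    print(has_deadlock({"review": ["approve"], "approve": ["review"]}))  # True
    print(has_deadlock({"review": ["approve"], "approve": []}))          # False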

  13. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  14. Tuning the cognitive environment: Sound masking with 'natural' sounds in open-plan offices

    Science.gov (United States)

    DeLoach, Alana

    Despite the gain in popularity of open-plan office design and the engineering efforts to achieve acoustical comfort for building occupants, a majority of workers still report dissatisfaction with their workplace environment. Office acoustics influence organizational effectiveness, efficiency, and satisfaction by meeting appropriate requirements for speech privacy and ambient sound levels. Implementing a sound masking system is one tried-and-true method of achieving privacy goals. Although each sound masking system is tuned for its specific environment, the signal, random steady-state electronic noise, has remained the same for decades. This research work explores how 'natural' sounds may be used as an alternative to the standard masking signal employed so ubiquitously in sound masking systems in the contemporary office environment. As an unobtrusive background sound possessing the appropriate spectral characteristics, this proposed use of 'natural' sounds for masking challenges the convention that masking sounds should be as meaningless as possible. Through the pilot study presented in this work, we hypothesize that 'natural' sounds used as sound maskers will be as effective at masking distracting background noise as the conventional masking sound, will enhance cognitive functioning, and will increase participant (worker) satisfaction.

  15. An Open Environment to Support the Development of Computational Chemistry Solutions

    Science.gov (United States)

    Bejarano, Bernardo Palacios; Ruiz, Irene Luque; Gómez-Nieto, Miguel Ángel

    2009-08-01

    In this paper we present an open software environment devoted to supporting investigations in computational chemistry. The software, named CoChiSE (Computational Chimica Software Environment), is fully developed in Java using Eclipse as the IDE; the system is organized into different perspectives, each oriented to solving a different aspect of computational chemistry research. CoChiSE is able to manage large chemical databases, maintaining information about molecules and properties; this information can be exported and imported to/from the most popular standard file formats. The system also allows the user to calculate different types of isomorphism and molecular similarity. In addition, CoChiSE incorporates a perspective in charge of the calculation of molecular descriptors, considering more than four hundred descriptors of different categories. All the information and system perspectives are integrated in the same environment, so a huge amount of information can be managed by the user. The characteristics of the developed system permit the easy integration of user, proprietary, and free software.
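
    CoChiSE itself is a Java system; to make its descriptor-calculation and molecular-similarity operations concrete, this sketch performs comparable computations with the open-source RDKit toolkit instead. The molecules are arbitrary examples.

    # Descriptor calculation and fingerprint-based similarity with RDKit.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem, Descriptors

    aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
    caffeine = Chem.MolFromSmiles("Cn1cnc2c1c(=O)n(C)c(=O)n2C")

    # A few of the hundreds of descriptors such systems typically compute.
    for name, mol in [("aspirin", aspirin), ("caffeine", caffeine)]:
        print(name, Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
              Descriptors.TPSA(mol))

    # Tanimoto similarity on Morgan fingerprints, a common screening stand-in
    # for exact (sub)graph isomorphism when comparing large databases.
    fp1 = AllChem.GetMorganFingerprintAsBitVect(aspirin, 2, nBits=2048)
    fp2 = AllChem.GetMorganFingerprintAsBitVect(caffeine, 2, nBits=2048)
    print("Tanimoto:", DataStructs.TanimotoSimilarity(fp1, fp2))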

  16. A novel compact mass detection platform for the open access (OA) environment in drug discovery and early development.

    Science.gov (United States)

    Gao, Junling; Ceglia, Scott S; Jones, Michael D; Simeone, Jennifer; Antwerp, John Van; Zhang, Li-Kang; Ross, Charles W; Helmy, Roy

    2016-04-15

    A new 'compact mass detector' co-developed with an instrument manufacturer (Waters Corporation) as an interface for liquid chromatography (LC), specifically Ultra-high performance LC® (UPLC® or UHPLC) analysis, was evaluated as a potential new Open Access (OA) LC-MS platform in the Drug Discovery and Early Development space. This new compact mass detector based platform was envisioned to provide increased reliability and speed while exhibiting significant cost, noise, and footprint reductions. The new detector was evaluated in batch mode (typically 1-3 samples per run) to monitor reactions and check purity, as well as in High Throughput Screening (HTS) mode to run 24, 48, and 96 well plates. The latter workflows focused on screening catalysis conditions, process optimization, and library work. The objective of this investigation was to assess the performance, reliability, and flexibility of the compact mass detector in the OA setting for a variety of applications. The compact mass detector results were compared to those obtained by current OA LC-MS systems, and the capabilities and benefits of the compact mass detector in the open access setting for chemists in the drug discovery and development space are demonstrated.

  17. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path to create a sustainable building block toward Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built a prototype of a data-centric provenance repository, and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web-service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of the model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to catch richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  18. Diffusion dynamics in external noise-activated non-equilibrium open system-reservoir coupling environment

    Institute of Scientific and Technical Information of China (English)

    Wang Chun-Yang

    2013-01-01

    The diffusion process in an external noise-activated non-equilibrium open system-reservoir coupling environment is studied by analytically solving the generalized Langevin equation. The dynamical property of the system near the barrier top is investigated in detail by numerically calculating quantities such as the mean diffusion path, invariance, and barrier passing probability. It is found that, compared with the unfavorable effect of internal fluctuations, the external noise activation is sometimes beneficial to the diffusion process. An optimal strength of external activation or correlation time of the internal fluctuation is expected for the diffusing particle to have a maximal probability of escaping from the potential well.
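
    The generalized Langevin equation referred to here has the following standard textbook form (the paper's own notation, memory kernel and external-noise coupling may differ):

    m\ddot{x}(t) = -\frac{\partial U(x)}{\partial x}
                   - m\int_{0}^{t} \beta(t-t')\,\dot{x}(t')\,\mathrm{d}t'
                   + \xi(t) + F_{\mathrm{ext}}(t),
    \qquad
    \langle \xi(t)\,\xi(t') \rangle = m k_{B} T\, \beta(|t-t'|).

    The second relation is the fluctuation-dissipation theorem tying the internal noise \xi(t) to the memory friction kernel \beta(t); an external activation F_{\mathrm{ext}}(t) need not obey it, which is why, as the abstract notes, it can assist rather than hinder escape from the potential well.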

  19. GrayQb single-faced version 2 (SF2) open environment test report

    Energy Technology Data Exchange (ETDEWEB)

    Plummer, J. [Savannah River Site (SRS), Aiken, SC (United States); Immel, D. [Savannah River Site (SRS), Aiken, SC (United States); Bobbitt, J. [Savannah River Site (SRS), Aiken, SC (United States); Negron, M. [Savannah River Site (SRS), Aiken, SC (United States)

    2015-02-16

    This report details the design upgrades incorporated into the new version of the GrayQb™ SF2 device and the characterization testing of this upgraded device. Results from controlled characterization testing in the Savannah River National Laboratory (SRNL) R&D Engineering Imaging and Radiation Lab (IRL) and the Savannah River Site (SRS) Health Physics Instrument Calibration Laboratory (HPICL) are presented, as well as results from the open environment field testing performed in the E-Area Low Level Waste Storage Area. Resultant images presented in this report were generated using the SRNL-developed Radiation Analyzer (RAzer™) software program, which overlays the radiation contour images onto the visual image of the location being surveyed.

  20. Employees' satisfaction as influenced by acoustic and visual privacy in the open office environment

    Science.gov (United States)

    Soules, Maureen Jeanette

    The purpose of this study was to examine the relationship between employees' acoustic and visual privacy issues and their perceived satisfaction in their open office work environments while in focus work mode. The study examined the Science Teaching Student Services Building at the University of Minnesota Minneapolis. The building houses instructional classrooms and administrative offices that service UMN students. The Sustainable Post-Occupancy Evaluation Survey was used to collect data on overall privacy conditions, acoustic and visual privacy conditions, and employees' perceived privacy conditions while in their primary workplace. Paired T-tests were used to analyze the relationships between privacy conditions and employees' perceptions of privacy. All hypotheses are supported indicating that the privacy variables are correlated to the employees' perception of satisfaction within the primary workplace. The findings are important because they can be used to inform business leaders, designers, educators and future research in the field of office design.

  1. Suppression of Strong Background Interference on E-Nose Sensors in an Open Country Environment.

    Science.gov (United States)

    Tian, Fengchun; Zhang, Jian; Yang, Simon X; Zhao, Zhenzhen; Liang, Zhifang; Liu, Yan; Wang, Di

    2016-02-16

    The feature extraction technique for an electronic nose (e-nose) applied to tobacco smell detection in an open country/outdoor environment with strong periodic background interference is studied in this paper. Principal component analysis (PCA), independent component analysis (ICA), re-filtering and a priori knowledge are combined to separate and suppress background interference on the e-nose. Using the coefficient of multiple correlation (CMC), it can be verified that a better separation of background interference factors related to variations in environmental temperature, humidity, and atmospheric pressure can be obtained with ICA. By re-filtering according to the on-site interference characteristics, a composite smell curve was obtained which, based on the tobacco curer's experience, is more closely related to the true smell information.
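
    A minimal sketch of the ICA separation step on synthetic multi-sensor data, using scikit-learn's FastICA; the signal shapes, mixing matrix and correlation check are illustrative stand-ins for the paper's on-site procedure.

    # Separate a transient target response from slow periodic environmental
    # interference mixed into a six-sensor e-nose array.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 100, 2000)
    smell = np.exp(-((t - 40.0) / 8.0) ** 2)           # transient target signal
    interference = 0.8 * np.sin(2 * np.pi * t / 24.0)  # periodic drift
    sources = np.c_[smell, interference]

    mixing = rng.uniform(0.5, 1.5, size=(6, 2))        # six sensors, two sources
    observed = sources @ mixing.T + 0.01 * rng.standard_normal((2000, 6))

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(observed)            # shape (2000, 2)

    # Identify which recovered component tracks the known interference, so it
    # can be suppressed before re-filtering the composite smell curve.
    for k in range(2):
        c = np.corrcoef(recovered[:, k], interference)[0, 1]
        print(f"component {k}: |corr with interference| = {abs(c):.2f}")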

  2. ICT usage as a key prerequisite for open knowledge environment creation

    Directory of Open Access Journals (Sweden)

    Marcin Gryczka

    2014-12-01

    Full Text Available Development of information and telecommunication technologies (ICT) and the growing popularity of the Internet as a communication medium have been the most important incentives influencing the global knowledge-based economy for at least two decades. Although the degree of ICT infrastructure development is still often considered an information society measure, it seems that nowadays, when so many people have broadband and mobile access to the Web, more appropriate measures are those reflecting ICT skills and efficient Internet usage qualifications. One of the new concepts is "open innovation", which in a wider sense could be understood as an exemplification of a networked knowledge and innovation exchange method made possible by the contemporary Internet revolution. The main purpose of this paper is to evaluate the general concept of an open innovation environment, which could be created in Poland to facilitate and foster scientific and innovation-oriented cooperation among different stakeholders, such as companies (especially SMEs), universities, public institutions and the mass of individual Internet users. To achieve this goal, the latter part of the paper is dedicated to the analysis and discussion of electronic survey results. This survey has been conducted via social media and other electronic communication channels in the course of the author's research concerning new models of knowledge diffusion and technology transfer in electronic networks, especially on the Web.

  3. Moving in extreme environments: open water swimming in cold and warm water.

    Science.gov (United States)

    Tipton, Michael; Bradford, Carl

    2014-01-01

    Open water swimming (OWS), either 'wild' such as river swimming or competitive, is a fast growing pastime as well as a part of events such as triathlons. Little evidence is available on which to base high and low water temperature limits. Also, due to factors such as acclimatisation, which disassociates thermal sensation and comfort from thermal state, individuals cannot be left to monitor their own physical condition during swims. Deaths have occurred during OWS; these have been due to not only thermal responses but also cardiac problems. This paper, which is part of a series on 'Moving in Extreme Environments', briefly reviews current understanding in pertinent topics associated with OWS. Guidelines are presented for the organisation of open water events to minimise risk, and it is concluded that more information on the responses to immersion in cold and warm water, the causes of the individual variation in these responses and the precursors to the cardiac events that appear to be the primary cause of death in OWS events will help make this enjoyable sport even safer.

  4. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaptation creates a framework that opens up new possibilities for flexible and adaptable workflows, especially for use in early warning and crisis management systems.
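
    A minimal sketch of a decision table interpreted at runtime, in which a domain expert edits the rows rather than program code; the conditions, thresholds and actions below are invented for illustration.

    # Each row pairs named conditions with an action; the first row whose
    # conditions all hold fires, and an empty row acts as the default.
    RULES = [
        ({"water_level": lambda v: v > 7.0, "rising": lambda v: v},
         "issue_evacuation_warning"),
        ({"water_level": lambda v: v > 5.0},
         "alert_crisis_team"),
        ({}, "keep_monitoring"),
    ]

    def decide(context):
        """Evaluate the table top-down against the current sensor context."""
        for conditions, action in RULES:
            if all(test(context[key]) for key, test in conditions.items()):
                return action
        raise RuntimeError("decision table is not exhaustive")

    print(decide({"water_level": 7.4, "rising": True}))   # evacuation warning
    print(decide({"water_level": 3.1, "rising": False}))  # keep monitoring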

  5. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    Science.gov (United States)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, finally, to create and share interactive maps for further geosciences' investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software, written in Python, a high-level, cross-platform programming language. GeolOkit software is accessible through a graphical user interface, designed to run in parallel with Google Earth. It is a highly user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot them into the Google Earth environment using KML code. This workflow requires no third-party software, except Google Earth itself. GeolOkit comes with a large number of geosciences' labels, symbols, colours and placemarks and may process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D supporting a large range of structure input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, (vii) field pictures using either geo-tracking metadata from a camera built-in GPS module, or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As this project is under development, we welcome discussions regarding your needs, your ideas and your contributions to the GeolOkit project.
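
    To illustrate the kind of KML output GeolOkit produces for Google Earth, here is a standard-library-only sketch that writes point placemarks; the sample measurements are invented and the code is not GeolOkit's own.

    # Turn field measurements into a KML file that Google Earth opens directly.
    from xml.sax.saxutils import escape

    def placemark(name, lon, lat):
        return (f"<Placemark><name>{escape(name)}</name>"
                f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
                f"</Placemark>")

    measurements = [  # (label, longitude, latitude), invented sample data
        ("ST-01 bedding 120/35", 4.35, 50.41),
        ("ST-02 fault plane 080/70", 4.37, 50.42),
    ]

    kml = ('<?xml version="1.0" encoding="UTF-8"?>'
           '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
           + "".join(placemark(*m) for m in measurements)
           + "</Document></kml>")

    with open("structural_data.kml", "w", encoding="utf-8") as f:
        f.write(kml)  # double-click the file to view it in Google Earth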

  6. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    Lin Huiping (林慧苹); Fan Yushun (范玉顺); Wu Cheng (吴澄)

    2001-01-01

    Many enterprise modeling methods have been proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  7. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  8. Escript: Open Source Environment For Solving Large-Scale Geophysical Joint Inversion Problems in Python

    Science.gov (United States)

    Gross, Lutz; Altinay, Cihan; Fenwick, Joel; Smith, Troy

    2014-05-01

    The program package escript has been designed for solving mathematical modeling problems using Python, see Gross et al. (2013). Its development and maintenance have been funded by the Australian Commonwealth to provide open source software infrastructure for the Australian Earth Science community (recent funding by the Australian Geophysical Observing System EIF (AGOS) and the AuScope Collaborative Research Infrastructure Scheme (CRIS)). The key concepts of escript are based on the terminology of spatial functions and partial differential equations (PDEs), an approach providing abstraction from the underlying spatial discretization method (i.e. the finite element method (FEM)). This feature presents a programming environment to the user which is easy to use even for complex models. Because implementations are independent of data structures, simulations are easily portable across desktop computers and scalable compute clusters without modifications to the program code. escript has been successfully applied in a variety of applications including modeling mantle convection, melting processes, volcanic flow, earthquakes, faulting, multi-phase flow, block caving and mineralization (see Poulet et al. 2013). The recent escript release (see Gross et al. (2013)) provides an open framework for solving joint inversion problems for geophysical data sets (potential field, seismic and electro-magnetic). The strategy is based on the idea of formulating the inversion problem as an optimization problem with PDE constraints, where the cost function is defined by the data defect and the regularization term for the rock properties, see Gross & Kemp (2013). This first-optimize-then-discretize approach avoids the assembly of the, in general, dense sensitivity matrix used in conventional approaches, where discrete programming techniques are applied to the discretized problem (first-discretize-then-optimize). In this paper we will discuss the mathematical framework for
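
    The PDE-constrained formulation sketched in the abstract can be written generically as follows (the symbols are generic placeholders, not escript's API):

    \min_{m}\; J(m) \;=\; \sum_{j} \frac{\omega_{j}}{2}\,
        \bigl\lVert F_{j}(m) - d_{j} \bigr\rVert^{2} \;+\; \alpha\, R(m)
    \qquad \text{subject to} \qquad L(m)\, u_{j} = q_{j},

    where each forward operator F_j maps the rock-property model m to predicted potential-field, seismic or electromagnetic data d_j through the governing PDE. The weighted data defects and the regularization R(m) form the cost function, and, in the first-optimize-then-discretize spirit, the gradient of J is derived at the continuum level via adjoint PDEs before any mesh is introduced, so the dense sensitivity matrix never has to be assembled.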

  9. OPEN

    DEFF Research Database (Denmark)

    Nickelsen, Anders; Paterno, Fabio; Grasselli, Agnese;

    2010-01-01

    One important aspect of ubiquitous environments is to provide users with the possibility to freely move about and continue to interact with the available applications through a variety of interactive devices such as cell phones, PDAs, desktop computers, intelligent watches or digital television...... and be controlled by the platform to enrich the user experience with the application. We describe the challenges following the centralisation of a migration platform that can support different types of applications, both games and business applications, implemented with either web-technologies or as component...

  10. ScreenMasker: An Open-source Gaze-contingent Screen Masking Environment.

    Science.gov (United States)

    Orlov, Pavel A; Bednarik, Roman

    2016-09-01

    The moving-window paradigm, based on the gaze-contingent technique, is traditionally used in studies of the visual perceptual span. There is a strong demand for new environments that can be employed by non-technical researchers. We have developed an easy-to-use tool with a graphical user interface (GUI) allowing both execution and control of visual gaze-contingency studies. This work describes ScreenMasker, an environment that allows creating gaze-contingent textured displays used together with stimulus presentation software. ScreenMasker has an architecture that meets the requirements of low-latency real-time eye-movement experiments. It also provides a variety of settings and functions. Effective rendering times and performance are ensured by means of GPU processing under CUDA technology. Performance tests show ScreenMasker's latency to be 67-74 ms on a typical office computer, and high-end 144-Hz screen latencies of about 25-28 ms. ScreenMasker is an open-source system distributed under the GNU Lesser General Public License and is available at https://github.com/PaulOrlov/ScreenMasker .

  11. Controlling cooperativity of a metastable open system coupled weakly to a noisy environment

    Institute of Scientific and Technical Information of China (English)

    Zhao Yang (赵阳)

    2015-01-01

    The notion of cooperativity comprises a specific characteristic of a multipartite system concerning its ability to demonstrate a sigmoidal-type response of varying sensitivities to input stimuli in transitions between states under controlled conditions. From a statistical physics viewpoint, in this work we attempt to describe the cooperativity by the stability of a metastable open system with respect to irreversibility. To treat the evolution of a system weakly coupled to the environment in a kinetic framework, we consider two fluctuating energy levels of different dimensionalities, initial population of one level, reversible transitions of population between the levels, and irreversible depopulation of another level. An average is made over level fluctuations and environment vibrations so that inter-level transition rate can be obtained accounting for the influences of external control on level position and dimensionality. It is found that the cooperativity of the two-level system is bounded approximately between 0.736 and unity, with the lower bound indicating worsening system stability.

  12. Controlling cooperativity of a metastable open system coupled weakly to a noisy environment

    Science.gov (United States)

    Victor, I. Teslenko; Oleksiy, L. Kapitanchuk; Zhao, Yang

    2015-02-01

    The notion of cooperativity comprises a specific characteristic of a multipartite system concerning its ability to demonstrate a sigmoidal-type response of varying sensitivities to input stimuli in transitions between states under controlled conditions. From a statistical physics viewpoint, in this work we attempt to describe the cooperativity by the stability of a metastable open system with respect to irreversibility. To treat the evolution of a system weakly coupled to the environment in a kinetic framework, we consider two fluctuating energy levels of different dimensionalities, initial population of one level, reversible transitions of population between the levels, and irreversible depopulation of another level. An average is made over level fluctuations and environment vibrations so that an inter-level transition rate can be obtained accounting for the influences of external control on level position and dimensionality. It is found that the cooperativity of the two-level system is bounded approximately between 0.736 and unity, with the lower bound indicating worsening system stability. Project supported by the National Academy of Sciences of Ukraine (Grant No. 0110U007542) and the National Research Foundation of Singapore through the Competitive Research Programme (Grant No. NRF-CRP5-2009-04).

  13. Book Review: Institutional Repositories: Content and Culture in an Open Access Environment

    Directory of Open Access Journals (Sweden)

    Isabel Galina

    2007-12-01

    Full Text Available As repository technology matures, the cultural and organizational aspects of setting up and running an institutional repository have come to the forefront of the discussion surrounding their deployment. The book deliberately does not discuss any software in particular but focuses more on identifying key stakeholders in the changing information environment and their role in the institutional repository scenario with regard to strategic and policy issues. Key aspects such as advocacy, user engagement, content policy, preservation and curation are covered in a clear and practical fashion, drawing on the author's experience of running an institutional repository. Although the book covers important and relevant issues, it is occasionally uneven in its depth and coverage, dealing with some aspects in great detail and only briefly mentioning others. A short introductory chapter creates the framework for the book by providing a definition of institutional repositories, followed by a very broad second chapter entitled The Changing Information Environment. In this chapter key stakeholders are identified and described, followed by a general section describing the Open Access movement and finishing by describing certain online information tools such as Flickr and Wikipedia in quite some detail. Although it is clear that the intention is to place institutional repositories within the wider information context, it would have been interesting if the author had mentioned, for example, cyberinfrastructure or eScience projects, which are important frameworks for future digital networks and academic communication and publishing.

  14. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  15. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
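
    As a toy illustration of static analysis before execution, the sketch below checks a declarative workflow description for unconnected inputs and port-type mismatches; the description format is invented for illustration and is not Kepler's.

    # Actors declare typed input/output ports; wires connect "actor.port"
    # pairs. The analysis reports design problems before anything executes.
    ACTORS = {
        "read_csv":    {"in": {}, "out": {"records": "table"}},
        "clean_names": {"in": {"records": "table"}, "out": {"records": "table"}},
        "plot":        {"in": {"records": "image"}, "out": {}},
    }
    WIRES = [("read_csv.records", "clean_names.records")]

    def analyze(actors, wires):
        problems, connected = [], set()
        for src, dst in wires:
            src_actor, src_port = src.split(".")
            dst_actor, dst_port = dst.split(".")
            connected.add(dst)
            if actors[src_actor]["out"][src_port] != actors[dst_actor]["in"][dst_port]:
                problems.append(f"type mismatch on {src} -> {dst}")
        for name, spec in actors.items():
            for port in spec["in"]:
                if f"{name}.{port}" not in connected:
                    problems.append(f"unconnected input {name}.{port}")
        return problems

    for problem in analyze(ACTORS, WIRES):
        print("WARN:", problem)  # flags plot's unwired 'records' input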

  16. A Novel Open Service Framework Mining (OSFM) for Executing Data Mining tasks

    Directory of Open Access Journals (Sweden)

    Asif Ali

    2011-09-01

    Full Text Available Data mining services on grids are a pressing need today. Workflow environments are widely used in data mining systems to manage data and execution flows associated with complex applications. Weka, one of the most used open-source data mining systems, includes the Knowledge-Flow environment, which provides a drag-and-drop interface to compose and execute data mining workflows. For the sake of simplicity, it allows users to execute a whole workflow only on a single computer. However, most data mining workflows include several independent branches that could be run in parallel on a set of distributed machines to reduce the overall execution time. In this paper we propose a novel Open Service Framework Mining (OSFM) for executing data mining tasks. Our algorithm comprises five phases: (1) authentication, (2) reading the database, (3) defining the minimum support, (4) subset finding, and (5) pruning. Simulation results show that our algorithm achieves better performance.
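
    A minimal sketch of the subset-find and prune phases on toy transactions, in the Apriori style; the authentication and database-reading phases are omitted, and OSFM's actual algorithm may differ in detail.

    # Candidate itemsets are generated level by level (subset find) and kept
    # only if they meet the minimum support (prune).
    from itertools import combinations

    transactions = [{"milk", "bread"}, {"milk", "eggs"},
                    {"milk", "bread", "eggs"}, {"bread", "eggs"}]
    MIN_SUPPORT = 2  # phase 3: define the minimum support (absolute count)

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    items = sorted({i for t in transactions for i in t})
    frequent = [frozenset([i]) for i in items if support({i}) >= MIN_SUPPORT]
    level = frequent[:]
    while level:
        k = len(next(iter(level))) + 1
        # phase 4 (subset find): join level-k sets into (k+1)-candidates
        candidates = {a | b for a, b in combinations(level, 2) if len(a | b) == k}
        # phase 5 (prune): discard candidates below the support threshold
        level = [c for c in candidates if support(c) >= MIN_SUPPORT]
        frequent.extend(level)

    for itemset in frequent:
        print(sorted(itemset), support(itemset))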

  17. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Full Text Available Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge of Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data-driven workflow execution is one of the ways to enable distributed workflow on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS). The RTS is a dataflow-driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from a scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real-life use case.
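
    As an illustration of dataflow-driven execution with direct streaming between components, here is a sketch using Python generators: each downstream worker consumes items the moment the upstream worker yields them, rather than waiting for a complete intermediate file. The components and data are invented.

    def acquire():
        """Source component: emits raw measurements one by one."""
        for value in [3.2, 5.1, 4.8, 9.9, 5.0]:
            yield value

    def calibrate(stream, offset=-0.5):
        """Filter component: transforms items while the source still runs."""
        for value in stream:
            yield value + offset

    def detect(stream, threshold=5.0):
        """Sink component: reacts to items as soon as they arrive."""
        for value in stream:
            if value > threshold:
                print(f"event detected: {value:.1f}")

    # Wiring the dataflow graph: acquire -> calibrate -> detect.
    detect(calibrate(acquire()))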

  18. Workflow modeling in the graphic arts and printing industry

    Science.gov (United States)

    Tuijn, Chris

    2003-12-01

    Over the last few years, a lot of effort has been spent on the standardization of the workflow in the graphic arts and printing industry. The main reasons for this standardization are two-fold: first of all, the need to represent all aspects of products, processes and resources in a uniform, digital framework and, secondly, the need to have different systems communicate with each other without having to implement dedicated drivers or protocols. Since many years, a number of organizations in the IT sector have been quite busy developing models and languages on the topic of workflow modeling. In addition to the more formal methods (such as, e.g., extended finite state machines, Petri Nets, Markov Chains etc.) introduced a number of decades ago, more pragmatic methods have been proposed quite recently. We hereby think in particular of the activities of the Workflow Management Coalition that resulted in an XML-based Process Definition Language. Although one might be tempted to use the already established standards in the graphic environment, one should be well aware of the complexity and uniqueness of the graphic arts workflow. In this paper, we will show that it is quite hard, though not impossible, to model the graphic arts workflow using the already established workflow systems. After a brief summary of the graphic arts workflow requirements, we will show why the traditional models are less suitable to use. It will turn out that one of the main reasons for the incompatibility is that the graphic arts workflow is primarily resource driven; this means that the activation of processes depends on the status of different incoming resources. The fact that processes can start running with a partial availability of the input resources is a further complication that asks for additional knowledge on process level. In the second part of this paper, we will discuss in more detail the different software components that are available in any graphic enterprise. In the last part, we will

  19. Camera Calibration Based on OpenCV in VS2010 Environment

    Institute of Scientific and Technical Information of China (English)

    Mei Xianghui (梅向辉); Yang Jie (杨洁)

    2015-01-01

    In this paper, the accuracy of camera calibration in the scope of computer vision is improved by taking lens distortion into consideration, based on an analysis of the OpenCV camera model. In the VS2010 environment, a camera calibration algorithm based on OpenCV is obtained by making full use of the OpenCV library functions. The algorithm achieves accurate calibration results, simple operation, high running efficiency and good scalability, and thus meets real-time requirements.
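
    A minimal sketch of chessboard-based calibration with OpenCV's Python bindings, recovering the camera matrix together with the lens-distortion coefficients the paper accounts for; the board size and image files are assumptions.

    # Detect inner chessboard corners in several views, then estimate the
    # intrinsics and the distortion coefficients (k1, k2, p1, p2, k3).
    import glob
    import cv2
    import numpy as np

    CORNERS = (9, 6)  # inner corners per row and column (assumed board)
    objp = np.zeros((CORNERS[0] * CORNERS[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:CORNERS[0], 0:CORNERS[1]].T.reshape(-1, 2)

    objpoints, imgpoints, size = [], [], None
    for path in glob.glob("calib_*.png"):  # assumed calibration images
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, CORNERS)
        if found:
            objpoints.append(objp)
            imgpoints.append(corners)
            size = gray.shape[::-1]

    rms, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
        objpoints, imgpoints, size, None, None)
    print("RMS reprojection error:", rms)
    print("distortion coefficients:", dist.ravel())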

  20. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. This paper describes the procedure of workflow models executing in a workflow engine using operational semantics. The formalized description of process instances and activity instances leads to a very clear structure of the workflow engine, eases the cooperation of heterogeneous workflow engines, and guides the realization of workflow engine functions. Meanwhile, the workflow engine software has been completed by means of the formalized description.

  1. Designing a road map for geoscience workflows

    Science.gov (United States)

    Duffy, Christopher; Gil, Yolanda; Deelman, Ewa; Marru, Suresh; Pierce, Marlon; Demir, Ibrahim; Wiener, Gerry

    2012-06-01

    Advances in geoscience research and discovery are fundamentally tied to data and computation, but formal strategies for managing the diversity of models and data resources in the Earth sciences have not yet been resolved or fully appreciated. The U.S. National Science Foundation (NSF) EarthCube initiative (http://earthcube.ning.com), which aims to support community-guided cyberinfrastructure to integrate data and information across the geosciences, recently funded four community development activities: Geoscience Workflows; Semantics and Ontologies; Data Discovery, Mining, and Integration; and Governance. The Geoscience Workflows working group, with broad participation from the geosciences, cyberinfrastructure, and other relevant communities, is formulating a workflows road map (http://sites.google.com/site/earthcubeworkflow/). The Geoscience Workflows team coordinates with each of the other community development groups given their direct relevance to workflows. Semantics and ontologies are mechanisms for describing workflows and the data they process.

  2. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  3. Usage of OpenGL in the Visual Basic Environment

    Institute of Scientific and Technical Information of China (English)

    Ma Jidong (马继东); Wang Lihai (王立海)

    2007-01-01

    As a de facto industry standard, OpenGL is widely used in 2D and 3D graphics programming, but it mostly works in C-language environments, and little reference material exists on how to use it from Visual Basic. Addressing this problem, this paper describes how to carry out 3D graphics programming in the Visual Basic environment using the third-party function library VBOpenGL type library.

  4. Agile parallel bioinformatics workflow management using Pwrake.

    OpenAIRE

    2011-01-01

    Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environm...

  5. Effects of acoustic environment on work in private office rooms and open-plan offices - longitudinal study during relocation.

    Science.gov (United States)

    Kaarlela-Tuomaala, A; Helenius, R; Keskinen, E; Hongisto, V

    2009-11-01

    The aim was to determine how the perceived work environment, especially the acoustic environment, and its effects differed in private office rooms and in open-plan offices. The subjects consisted of 31 workers who moved from private office rooms to open-plan offices and who answered the questionnaire before and after the relocation. Private office rooms were occupied by only one person, while open-plan offices were occupied by more than 20 persons. Room acoustical descriptors showed a significant reduction in speech privacy after relocation. The noise level averaged over the whole work day did not change, but the variability of the noise level was reduced significantly. Negative effects of the acoustic environment increased significantly, including increased distraction, reduced privacy, increased concentration difficulties and increased use of coping strategies. Self-rated loss of work performance because of noise doubled. Cognitively demanding work and phone conversations were most distracted by noise. The benefits that are often associated with open-plan offices did not appear: cooperation became less pleasant and direct, and information flow did not change. Nowadays, most office workers, independent of job type, are located in open-plan offices without the individual needs of privacy, concentration and interaction being analysed. This intervention study consisted of professional workers. Their work tasks mainly required individual efforts, and interaction between other workers was not of primary concern, although necessary. The results suggest that the open-plan office is not recommended for professional workers. Similar intervention studies should also be made for other job types.

  6. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  7. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs.The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  8. Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis

    Science.gov (United States)

    Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.

    2016-08-01

    This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The Paleomagnetism.org application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include a foldtest based on an eigenvector approach, two reversal tests, including a Monte Carlo simulation on mean directions, and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in coordinates of major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regressive techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be
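
    To make the statistics portal's core computation concrete, here is a sketch of the classical Fisher (1953) statistics (mean direction, precision parameter k, and the alpha-95 confidence cone) for a set of directions; the sample data are invented and the code is not Paleomagnetism.org's own.

    # Fisher statistics for paleomagnetic directions given as declination and
    # inclination in degrees (x north, y east, z down).
    import numpy as np

    def fisher_mean(decs, incs):
        d, i = np.radians(decs), np.radians(incs)
        xyz = np.c_[np.cos(i) * np.cos(d), np.cos(i) * np.sin(d), np.sin(i)]
        r = xyz.sum(axis=0)
        R, N = np.linalg.norm(r), len(decs)
        mean_dec = np.degrees(np.arctan2(r[1], r[0])) % 360.0
        mean_inc = np.degrees(np.arcsin(r[2] / R))
        k = (N - 1) / (N - R)  # precision parameter
        a95 = np.degrees(np.arccos(
            1 - (N - R) / R * ((1 / 0.05) ** (1 / (N - 1)) - 1)))
        return mean_dec, mean_inc, k, a95

    decs = [351.0, 3.0, 358.0, 12.0, 341.0, 5.0]
    incs = [47.0, 52.0, 44.0, 50.0, 49.0, 43.0]
    print("D=%.1f I=%.1f k=%.1f a95=%.1f" % fisher_mean(decs, incs))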

  9. Iterative Workflows for Numerical Simulations in Subsurface Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Chase, Jared M.; Schuchardt, Karen L.; Chin, George; Daily, Jeffrey A.; Scheibe, Timothy D.

    2008-07-08

    Numerical simulators are frequently used to assess future risks, support remediation and monitoring program decisions, and assist in the design of specific remedial actions with respect to groundwater contaminants. Due to the complexity of the subsurface environment and uncertainty in the models, many alternative simulations must be performed, each producing data that is typically post-processed and analyzed before deciding on the next set of simulations. Though parts of the process are readily amenable to automation through scientific workflow tools, the larger "research workflow" is not supported by current tools. We present a detailed use case for subsurface modeling, describe the use case in terms of workflow structure, briefly summarize a prototype that seeks to facilitate the overall modeling process, and discuss the many challenges for building such a comprehensive environment.

  10. Rapid speciation in a newly opened postglacial marine environment, the Baltic Sea.

    Science.gov (United States)

    Pereyra, Ricardo T; Bergström, Lena; Kautsky, Lena; Johannesson, Kerstin

    2009-03-31

    Theory predicts that speciation can be quite rapid. Previous examples comprise a wide range of organisms such as sockeye salmon, polyploid hybrid plants, fruit flies and cichlid fishes. However, few studies have shown natural examples of rapid evolution giving rise to new species in marine environments. Using microsatellite markers, we show the evolution of a new species of brown macroalga (Fucus radicans) in the Baltic Sea in the last 400 years, well after the formation of this brackish water body ~8-10 thousand years ago. Sympatric individuals of F. radicans and F. vesiculosus (bladder wrack) show significant reproductive isolation. Fucus radicans, which is endemic to the Baltic, is most closely related to Baltic Sea F. vesiculosus among north Atlantic populations, supporting the hypothesis of a recent divergence. Fucus radicans exhibits considerable clonal reproduction, probably induced by the extreme conditions of the Baltic. This reproductive mode is likely to have facilitated the rapid foundation of the new taxon. This study represents an unparalleled example of rapid speciation in a species-poor open marine ecosystem and highlights the importance of increasing our understanding of the role of these habitats in species formation. This observation also challenges presumptions that rapid speciation takes place only in hybrid plants or in relatively confined geographical places such as postglacial or crater lakes, oceanic islands or rivers.

  11. Rapid speciation in a newly opened postglacial marine environment, the Baltic Sea

    Directory of Open Access Journals (Sweden)

    Kautsky Lena

    2009-03-01

    Full Text Available Abstract Background Theory predicts that speciation can be quite rapid. Previous examples comprise a wide range of organisms such as sockeye salmon, polyploid hybrid plants, fruit flies and cichlid fishes. However, few studies have shown natural examples of rapid evolution giving rise to new species in marine environments. Results Using microsatellite markers, we show the evolution of a new species of brown macroalga (Fucus radicans) in the Baltic Sea in the last 400 years, well after the formation of this brackish water body ~8–10 thousand years ago. Sympatric individuals of F. radicans and F. vesiculosus (bladder wrack) show significant reproductive isolation. Fucus radicans, which is endemic to the Baltic, is most closely related to Baltic Sea F. vesiculosus among north Atlantic populations, supporting the hypothesis of a recent divergence. Fucus radicans exhibits considerable clonal reproduction, probably induced by the extreme conditions of the Baltic. This reproductive mode is likely to have facilitated the rapid foundation of the new taxon. Conclusion This study represents an unparalleled example of rapid speciation in a species-poor open marine ecosystem and highlights the importance of increasing our understanding of the role of these habitats in species formation. This observation also challenges presumptions that rapid speciation takes place only in hybrid plants or in relatively confined geographical places such as postglacial or crater lakes, oceanic islands or rivers.

  12. Visual Overlay on OpenStreetMap Data to Support Spatial Exploration of Urban Environments

    Directory of Open Access Journals (Sweden)

    Chandan Kumar

    2015-01-01

    Full Text Available Increasing volumes of spatial data about urban areas are captured and made available via volunteered geographic information (VGI) sources, such as OpenStreetMap (OSM). Hence, new opportunities arise for regional exploration that can lead to improvements in the lives of citizens through spatial decision support. We believe that the VGI data of the urban environment could be used to present a constructive overview of the regional infrastructure with the advent of web technologies. Current location-based services provide general map-based information for the end users with conventional local search functionality, and hence, the presentation of the rich urban information is limited. In this work, we analyze the OSM data to classify the geo entities into consequential categories with facilities, landscape and land use distribution. We employ a visual overlay of heat map and interactive visualizations to present the regional characterization on OSM data classification. In the proposed interface, users are allowed to express a variety of spatial queries to exemplify their geographic interests. They can compare the characterization of urban areas with respect to multiple spatial dimensions of interest and can search for the most suitable region. The search experience is further enhanced via efficient optimization and interaction methods to support the decision making of end users. We report the end user acceptability and efficiency of the proposed system via usability studies and performance analysis comparison.
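
    The classification step described above can be pictured as mapping raw OSM tags onto a small set of facility categories that feed the heat-map layers. The Python sketch below shows one minimal way to do this; the tag-to-category table and the entity dictionaries are invented for illustration and are far simpler than the classification the authors describe.

        # Hypothetical tag-to-category mapping; real OSM keys are much richer.
        CATEGORY_BY_AMENITY = {
            "school": "education", "university": "education",
            "hospital": "health", "pharmacy": "health",
            "restaurant": "food", "cafe": "food",
        }

        def classify(osm_entities):
            """Group OSM point entities by facility category for a heat-map layer."""
            layers = {}
            for ent in osm_entities:  # ent: {"lat": .., "lon": .., "tags": {...}}
                category = CATEGORY_BY_AMENITY.get(ent["tags"].get("amenity"))
                if category:
                    layers.setdefault(category, []).append((ent["lat"], ent["lon"]))
            return layers

        points = [{"lat": 53.1, "lon": 8.2, "tags": {"amenity": "school"}},
                  {"lat": 53.2, "lon": 8.1, "tags": {"amenity": "cafe"}}]
        print(classify(points))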

  13. Rapid speciation in a newly opened postglacial marine environment, the Baltic Sea

    Science.gov (United States)

    Pereyra, Ricardo T; Bergström, Lena; Kautsky, Lena; Johannesson, Kerstin

    2009-01-01

    Background Theory predicts that speciation can be quite rapid. Previous examples comprise a wide range of organisms such as sockeye salmon, polyploid hybrid plants, fruit flies and cichlid fishes. However, few studies have shown natural examples of rapid evolution giving rise to new species in marine environments. Results Using microsatellite markers, we show the evolution of a new species of brown macroalga (Fucus radicans) in the Baltic Sea in the last 400 years, well after the formation of this brackish water body ~8–10 thousand years ago. Sympatric individuals of F. radicans and F. vesiculosus (bladder wrack) show significant reproductive isolation. Fucus radicans, which is endemic to the Baltic, is most closely related to Baltic Sea F. vesiculosus among north Atlantic populations, supporting the hypothesis of a recent divergence. Fucus radicans exhibits considerable clonal reproduction, probably induced by the extreme conditions of the Baltic. This reproductive mode is likely to have facilitated the rapid foundation of the new taxon. Conclusion This study represents an unparalleled example of rapid speciation in a species-poor open marine ecosystem and highlights the importance of increasing our understanding of the role of these habitats in species formation. This observation also challenges presumptions that rapid speciation takes place only in hybrid plants or in relatively confined geographical places such as postglacial or crater lakes, oceanic islands or rivers. PMID:19335884

  14. Gas-phase generation of photoacoustic sound in an open environment.

    Science.gov (United States)

    Yönak, Serdar H; Dowling, David R

    2003-12-01

    The photoacoustic effect is commonly exploited for molecular spectroscopy, nondestructive evaluation, and trace gas detection. Photoacoustic sound is produced when a photoactive material absorbs electromagnetic radiation and converts it to acoustic waves. This article focuses on the generation of photoacoustic sound from thermal expansion of photoactive gases due to unsteady heating from a laser light source, and extends the work of prior studies on photoacoustic sound generation in an open environment. Starting with the forced free-space wave equation, a simple model is constructed for photoacoustic sounds produced by both acoustically distributed and compact gas clouds. The model accounts for laser absorption through the Lambert-Beer law and includes the effects of photoactive gas cloud characteristics (shape, size, and concentration distribution), but does not include molecular diffusion, thermal conduction, convection, or the effects of acoustic propagation through sound-absorbing inhomogeneous media. This model is compared to experimentally measured photoacoustic sounds generated by scanning a 10.6-micron carbon dioxide (CO2) laser beam through small clouds of a photoactive gas, sulfur hexafluoride (SF6). For the current investigation, the photoactive gas clouds are formed either by low flow-rate calibrated leak sources or by a laminar jet emerging from a 1.6-mm-diam tube. Model-measurement comparisons are presented over a 3- to 160-kHz bandwidth. Signal pulse shapes from simple gas cloud geometries are found to match calculated results when unmeasured gas cloud characteristics within the model are adjusted.
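
    The Lambert-Beer absorption step in the model can be illustrated with a few lines of Python: the transmitted power decays exponentially with absorption coefficient, concentration, and path length, and the difference is the power available to heat the gas. The coefficient and cloud dimensions below are invented placeholders, not values from the study.

        import math

        def absorbed_power(p0_w, alpha, mole_fraction, path_m):
            """Lambert-Beer law: P(L) = P0 * exp(-alpha * c * L).

            alpha is an effective absorption coefficient per unit mole fraction
            and path length; returns the power deposited in the gas cloud.
            """
            transmitted = p0_w * math.exp(-alpha * mole_fraction * path_m)
            return p0_w - transmitted

        # 10 W CO2 laser beam crossing a 5 cm SF6 cloud (illustrative numbers only)
        print(f"{absorbed_power(10.0, 50.0, 0.1, 0.05):.3f} W absorbed")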

  15. AtomPy: An Open Atomic Data Curation Environment for Astrophysical Applications

    Directory of Open Access Journals (Sweden)

    Claudio Mendoza

    2014-05-01

    Full Text Available We present a cloud-computing environment, referred to as AtomPy, based on Google-Drive Sheets and Pandas (Python Data Analysis Library) DataFrames to promote community-driven curation of atomic data for astrophysical applications, a stage beyond database development. The atomic model for each ionic species is contained in a multi-sheet workbook, tabulating representative sets of energy levels, A-values and electron impact effective collision strengths from different sources. The relevant issues that AtomPy intends to address are: (i) data quality by allowing open access to both data producers and users; (ii) comparisons of different datasets to facilitate accuracy assessments; (iii) downloading to local data structures (i.e., Pandas DataFrames) for further manipulation and analysis by prospective users; and (iv) data preservation by avoiding the discard of outdated sets. Data processing workflows are implemented by means of IPython Notebooks, and collaborative software developments are encouraged and managed within the GitHub social network. The facilities of AtomPy are illustrated with the critical assessment of the transition probabilities for ions in the hydrogen and helium isoelectronic sequences with atomic number Z ≤ 10.
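
    Point (ii), comparing datasets, is easy to picture with Pandas, the library the environment builds on. The sketch below merges A-values for the same transitions from two sources and computes their percentage disagreement; the workbook layout and column names are invented, not AtomPy's actual schema.

        import pandas as pd

        # Hypothetical columns; a real AtomPy workbook sheet tabulates levels and
        # A-values from several sources side by side.
        src1 = pd.DataFrame({"transition": ["1s2-1s2p", "1s2-1s3p"], "A": [8.85e12, 2.46e12]})
        src2 = pd.DataFrame({"transition": ["1s2-1s2p", "1s2-1s3p"], "A": [8.90e12, 2.41e12]})

        merged = src1.merge(src2, on="transition", suffixes=("_src1", "_src2"))
        merged["pct_diff"] = 100 * (merged["A_src1"] - merged["A_src2"]).abs() / merged["A_src2"]
        print(merged)  # flag transitions where the datasets disagree beyond a tolerance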

  16. Towards a framework for standardized semantic workflow modeling and management in the surgical domain

    Directory of Open Access Journals (Sweden)

    Neumann Juliane

    2015-09-01

    Full Text Available An essential aspect of workflow management support in operating room environments is the description and visualization of the underlying processes and activities in a machine-readable format as Surgical Process Models (SPM). However, the process models often vary in terms of granularity, naming and representation of process elements and their modeling structure. The aim of this paper is to present a new methodology for standardized semantic workflow modeling and a framework for semantic workflow execution and management in the surgical domain.

  17. Using sentence openers to foster student interaction in computer-mediated learning environments.

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; Wilhelm, P.; Ootes, S.A.W.

    2003-01-01

    This paper reports two studies into the efficacy of sentence openers to foster online peer-to-peer interaction. Sentence openers are pre-defined ways to start an utterance that are implemented in communication facilities as menus or buttons. In the first study, typical opening phrases were derived

  18. PREDICTION OF AEROSOL HAZARDS ARISING FROM THE OPENING OF AN ANTHRAX-TAINTED LETTER IN AN OPEN OFFICE ENVIRONMENT USING COMPUTATIONAL FLUID DYNAMICS

    Directory of Open Access Journals (Sweden)

    FUE-SANG LIEN

    2010-09-01

    Full Text Available Early experimental work, conducted at Defence R&D Canada–Suffield, measured and characterized the personal and environmental contamination associated with simulated anthrax-tainted letters under a number of different scenarios in order to obtain a better understanding of the physical and biological processes for detecting, assessing, and formulating potential mitigation strategies for managing the risks associated with opening an anthrax-tainted letter. These experimental investigations have been extended in the present study to simulate numerically the contamination from the opening of anthrax-tainted letters in an open office environment using computational fluid dynamics (CFD). A quantity of 0.1 g of Bacillus atrophaeus (formerly referred to as Bacillus subtilis var. globigii) (BG) spores in dry powder form, which was used here as a surrogate species for Bacillus anthracis (anthrax), was released from an opened letter in the experiment. The accuracy of the model for prediction of the spatial distribution of BG spores in the office from the opened letter is assessed qualitatively (and, to the extent possible, quantitatively) by detailed comparison with measured BG concentrations obtained under a number of different scenarios, some involving people moving within the office. The observed discrepancy between the numerical predictions and experimental measurements of concentration was probably the result of a number of physical processes which were not accounted for in the numerical simulation. These include air flow leakage from cracks and crevices of the building shell; the dispersion of BG spores in the Heating, Ventilation, and Air Conditioning (HVAC) system; and the effect of deposition and re-suspension of BG spores from various surfaces in the office environment.

  19. A framework for streamlining research workflow in neuroscience and psychology

    Directory of Open Access Journals (Sweden)

    Jonas eKubilius

    2014-01-01

    Full Text Available Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers.

  20. Response to Comment on "Open-ocean fish reveal an omnidirectional solution to camouflage in polarized environments".

    Science.gov (United States)

    Brady, Parrish; Gilerson, Alex; Kattawar, George; Sullivan, Jim; Twardowski, Mike; Dierssen, Heidi; Cummings, Molly

    2016-08-01

    Cronin et al. take issue with our evidence for polarocryptic carangid fish based on concerns of pseudoreplication, our contrast metric, and habitat. We clarify (i) the importance of camouflage in near-surface open ocean environments and (ii) the use of a Stokes contrast metric, and further (iii) conduct individual-based statistics on our data set to confirm the reported polarocrypsis patterns.

  1. Experimental check of possibility of research of repeated inclusion of the open gas generator in the water environment

    Directory of Open Access Journals (Sweden)

    Goldaev Sergey

    2017-01-01

    Full Text Available An experimental study of repeated combustion interruption, with subsequent re-ignition in the water environment, is presented for double-base solid fuel in a model open solid-propellant gas generator. Video recordings are given of the mobile combustion-zone localizer that enables the repeated start-up of the gas generator. Some parameters of the processes occurring under these conditions are determined.

  2. Creating a Sustainable City through a System of Citizen-Based Learning: ESD at Nagoya Open University of the Environment

    Science.gov (United States)

    Chikami, Satoshi; Sobue, Kirstie

    2008-01-01

    In Japan, environmental education partnerships among citizens, businesses and local government have increased since new legislation was introduced in 2003, but there was little evidence of cross-sector collaboration until recently. Nagoya Open University of the Environment is a highly innovative, multi-sectoral citizen learning system founded in 2005…

  3. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these could include Kepler and Taverna or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term Provenir “to come from”, is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual
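
    The core of an Open Provenance Model-style record is a bipartite graph of artifacts and processes joined by "used" and "wasGeneratedBy" edges. The Python sketch below records such assertions for a two-step analysis and walks the lineage of a result; it is a minimal illustration of the idea, not an implementation of the OPM specification, and the step and file names are invented.

        from dataclasses import dataclass, field

        @dataclass
        class ProvenanceGraph:
            """Minimal OPM-style record: processes use artifacts and generate new ones."""
            used: list = field(default_factory=list)               # (process, artifact)
            was_generated_by: list = field(default_factory=list)   # (artifact, process)

            def record_step(self, process, inputs, outputs):
                self.used += [(process, a) for a in inputs]
                self.was_generated_by += [(a, process) for a in outputs]

            def lineage(self, artifact):
                """Walk back through generating processes and their inputs."""
                for art, proc in self.was_generated_by:
                    if art == artifact:
                        inputs = [a for p, a in self.used if p == proc]
                        return {artifact: {proc: [self.lineage(a) or a for a in inputs]}}

        g = ProvenanceGraph()
        g.record_step("align", ["reads.fq", "ref.fa"], ["aln.bam"])
        g.record_step("call_variants", ["aln.bam"], ["vars.vcf"])
        print(g.lineage("vars.vcf"))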

  4. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  5. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  6. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.; Grefen, P.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  7. Graphical overview and navigation of electronic health records in a prototyping environment using Google Earth and openEHR archetypes.

    Science.gov (United States)

    Sundvall, Erik; Nyström, Mikael; Forss, Mattias; Chen, Rong; Petersson, Håkan; Ahlfeldt, Hans

    2007-01-01

    This paper describes selected earlier approaches to graphically relating events to each other and to time; some new combinations are also suggested. These are then combined into a unified prototyping environment for visualization and navigation of electronic health records. Google Earth (GE) is used for handling display and interaction of clinical information stored using openEHR data structures and 'archetypes'. The strength of the approach comes from GE's sophisticated handling of detail levels, from coarse overviews to fine-grained details, combined with linear, polar and region-based views of clinical events related to time. The system should be easy to learn since all the visualization styles can use the same navigation. The structured and multifaceted approach to handling time that is possible with archetyped openEHR data lends itself well to visualization, and integration with openEHR components is provided in the environment.

  8. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA. They

  9. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired output, given the input resource at hand. Still, in such cases it may be possible to reach the set goal by chaining a number of tools. The approach presented here frees the user of having to meddle with tools and the construction of workflows. Instead, the user only needs to supply the workflow manager with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user's intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  10. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  11. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    Analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate
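
    The abstract does not spell out the three metrics, but simple structural indicators over a workflow net give the flavour of what such metrics measure. The Python sketch below computes generic size and connectivity figures for a net given as place, transition, and arc sets; these are assumed stand-ins for illustration, not necessarily the metrics implemented in ProM.

        def connectivity_metrics(places, transitions, arcs):
            """Simple structural complexity indicators for a workflow net.

            These are generic graph metrics, not necessarily the three metrics
            the paper defines.
            """
            nodes = len(places) + len(transitions)
            return {
                "size": nodes,
                "coefficient_of_connectivity": len(arcs) / nodes,   # arcs per node
                "density": len(arcs) / (nodes * (nodes - 1)),
            }

        # tiny sequential net: start -> t1 -> p -> t2 -> end
        print(connectivity_metrics(
            places={"start", "p", "end"},
            transitions={"t1", "t2"},
            arcs={("start", "t1"), ("t1", "p"), ("p", "t2"), ("t2", "end")},
        ))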

  12. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  13. DEVELOPING CONCEPTUAL FRAMEWORK FOR REVISING SELF-LEARNING MATERIALS (SLMS) OF THE OPEN SCHOOL (OS) OF BANGLADESH OPEN UNIVERSITY (BOU) AT A DIGITAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Sabina YEASMIN

    2012-07-01

    Full Text Available Bangladesh Open University (BOU) has run school programs as part of its academic activities through open schooling since its inception. As of today, the Open School uses first-generation self-learning materials (SLMs) written an era ago following an in-house style and template. The concerned faculty members correct the texts every year before reprinting, but this is limited to spelling mistakes, factual errors and page make-up only. The University has adopted a policy and taken steps to revise the texts as a whole, though this still follows the previous process. But the current government is implementing the agenda of a digital Bangladesh, which will certainly influence the texts' template, learner instructions, gender-sensitiveness, context and content. In addition, education theory has shifted from instructivism to constructivism, which is being tested and implemented with new texts by the Ministerial project entitled Teaching Quality Improvement (TQI) in partnership with the BOU School of Education. Times change and new things are adopted; the Open School likewise needs to revise its texts in line with the government's current agenda of implementing a digital Bangladesh. This study collects data from tutors, distance educators, writers and reviewers and finally develops a framework for revising the OS SLMs in a digital environment.

  14. Application of the organic and environment evolving principle on the land reclamation in the open coal mines

    Institute of Scientific and Technical Information of China (English)

    FAN Jun-fu

    2005-01-01

    According to the principle of organism-environment co-evolution, legume species were first chosen as pioneer plants, in light of the land and soil conditions, to stabilize the ground, improve soil structure and increase soil fertility. As soil fertility increases, more resistant shrubs and trees can be cultivated, gradually forming a layered arbor-shrub-herb landscape and further enhancing the co-evolution of organisms and their environment. Taking the land reclamation of the refuse dump at the Heidaigou open coal mine as a practical example, the paper explains the application of the organism-environment co-evolution principle to land reclamation in open coal mines.

  15. myExperiment: a repository and social network for the sharing of bioinformatics workflows.

    Science.gov (United States)

    Goble, Carole A; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-07-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment has attracted over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment, including its REST web service, is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org.
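
    The REST web service mentioned above makes the repository scriptable. Assuming the documented workflow-listing endpoint (see http://wiki.myexperiment.org for the authoritative reference; the query parameter and element attributes here are assumptions), a sketch for retrieving a few public workflows might look like this:

        import requests
        import xml.etree.ElementTree as ET

        resp = requests.get("http://www.myexperiment.org/workflows.xml",
                            params={"num": 5}, timeout=30)
        resp.raise_for_status()
        for wf in ET.fromstring(resp.content).findall("workflow"):
            # each element is assumed to carry a uri attribute and a title as text
            print(wf.get("uri"), (wf.text or "").strip())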

  16. A workflow for the 3D visualization of meteorological data

    Science.gov (United States)

    Helbig, Carolin; Rink, Karsten

    2014-05-01

    In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed and big data sets are produced as a result of simulations. The combination of various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the Weather Research and Forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and detect areas of interest such as regions of convection or wind turbulences. Then, subsets of data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether the visualizations are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. In addition, they have

  17. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: The top-level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the
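
    The frame-and-template idea separates a stable placeholder signature from interchangeable control-flow-intensive behaviour. The Python sketch below imitates this with a Frame class whose behaviour is supplied by embedding a template, here a retrying transfer; the class names and retry policy are invented for illustration and do not reproduce the authors' actor-oriented system.

        class Frame:
            """Placeholder with a fixed signature; concrete behaviour is plugged in later."""
            def __init__(self, name, signature):
                self.name, self.signature, self.impl = name, signature, None

            def embed(self, template):
                assert template.signature == self.signature, "template must match frame"
                self.impl = template

            def run(self, *args):
                if self.impl is None:
                    raise RuntimeError(f"frame '{self.name}' has no embedded template")
                return self.impl.run(*args)

        class RetryingTransfer:
            """A control-flow-intensive template: retry a transfer a few times."""
            signature = ("src", "dst")
            def run(self, src, dst):
                for attempt in range(3):
                    try:
                        print(f"copy {src} -> {dst} (attempt {attempt + 1})")
                        return True   # a real template would invoke the transfer here
                    except OSError:
                        continue
                return False

        gdt = Frame("GenericDataTransfer", ("src", "dst"))
        gdt.embed(RetryingTransfer())    # specialize the frame with a template
        gdt.run("host_a:/data/run1", "host_b:/scratch")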

  18. A Critical Look at the Policy Environment for Opening up Public Higher Education in Rwanda

    Science.gov (United States)

    Nkuyubwatsi, Bernard

    2016-01-01

    Policies play a critical role in the implementation of open, distance education and opening up higher education. To encourage participation of different stakeholders in related practices, policies may need to embody values and benefits for those stakeholders. It is in this perspective that this study was conducted to investigate the policy…

  19. 4onse: four times open & non-conventional technology for sensing the environment

    Science.gov (United States)

    Cannata, Massimiliano; Ratnayake, Rangageewa; Antonovic, Milan; Strigaro, Daniele; Cardoso, Mirko; Hoffmann, Marcus

    2017-04-01

    The availability of complete, quality and dense monitoring hydro-meteorological data is essential to address a number of practical issues including, but not limited to, flood-water and urban drainage management, climate change impact assessment, early warning and risk management, now-casting and weather predictions. Thanks to recent technological advances such as the Internet of Things, Big Data and ubiquitous Internet, non-conventional monitoring systems based on open technologies and low-cost sensors may represent a great opportunity, either as a complement to authoritative monitoring networks or as a vital source of information wherever existing monitoring networks are in decline or completely missing. Nevertheless, the scientific literature on such open and non-conventional monitoring systems is still limited and often relates to prototype engineering and testing in rather limited case studies. For this reason the 4onse project aims at integrating existing open technologies in the fields of Free & Open Source Software, Open Hardware, Open Data, and Open Standards, and at evaluating this kind of system in a real case (about 30 stations) over a medium period of 2 years, in order to better understand, scientifically, its strengths, criticalities and applicability in terms of data quality, system durability, management costs, performance and sustainability. The ultimate objective is to contribute to the adoption of non-conventional monitoring systems based on the four open technologies.

  20. Jflow: a workflow management system for web applications.

    Science.gov (United States)

    Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe

    2016-02-01

    Biologists produce large data sets and demand rich and simple web portals in which they can upload and analyze their files. Providing such tools requires masking the complexity of the underlying High Performance Computing (HPC) environment. The connection between interface and computing infrastructure is usually specific to each portal. With Jflow, we introduce a Workflow Management System (WMS), composed of jQuery plug-ins which can easily be embedded in any web application, and a Python library providing all requested features to set up, run and monitor workflows. Jflow is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/jflow. The package comes with full documentation, a quick start guide and a running test portal. Jerome.Mariette@toulouse.inra.fr. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  1. Precise Nanoscale Surface Modification and Coating of Macroscale Objects: Open-Environment in Loco Atomic Layer Deposition on an Automobile.

    Science.gov (United States)

    Mousa, Moataz Bellah M; Oldham, Christopher J; Parsons, Gregory N

    2015-09-09

    The fundamental chemical reaction conditions that define atomic layer deposition (ALD) can be achieved in an open environment on a macroscale surface too large and complex for typical laboratory reactor-based ALD. We describe the concept of in loco ALD using conventional modulated reactant flow through a surface-mounted "ALD delivery head" to form a precise nanoscale Al2O3 film on the window of a parked automobile. Analysis confirms that the processes eliminated ambient water contamination and met other conditions that define ALD growth. Using this tool, we demonstrate open-ambient patterned deposition, metal corrosion protection, and polymer surface modification.
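
    Because ALD grows a fixed increment of film per reactant cycle, thickness control reduces to counting cycles. The one-liner below estimates the cycle count for a target Al2O3 thickness; the growth-per-cycle of ~0.1 nm is a typical literature value assumed for illustration, not a figure reported for the delivery head experiment.

        import math

        def ald_cycles(target_nm, growth_per_cycle_nm=0.1):
            """Cycles needed for a target thickness; the Al2O3 growth-per-cycle
            of ~0.1 nm/cycle is an assumed typical value, not data from the paper."""
            return math.ceil(target_nm / growth_per_cycle_nm)

        print(ald_cycles(20.0))   # roughly 200 cycles for a 20 nm film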

  2. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We
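
    A Common Tool Descriptor captures a command-line tool's inputs, outputs, and parameters in a platform-free way so that any engine can render an invocation from it. The sketch below mimics the idea with a plain Python dictionary and a tiny renderer; the real CTD format is an XML schema, and the tool and field names here are illustrative only.

        # Illustrative, simplified stand-in for a Common Tool Descriptor: only the
        # idea of a platform-free tool description is shown, not the XML schema.
        tool = {
            "name": "PeptideIndexer",
            "version": "2.0",
            "inputs":  [{"name": "in",  "type": "file", "format": "idXML"}],
            "outputs": [{"name": "out", "type": "file", "format": "idXML"}],
            "parameters": [{"name": "decoy_string", "type": "string", "default": "DECOY_"}],
        }

        def to_cli(tool, values):
            """Render a command line from the descriptor plus user-supplied values."""
            args = [tool["name"]]
            for p in tool["inputs"] + tool["outputs"] + tool["parameters"]:
                if p["name"] in values:
                    args += [f"-{p['name']}", str(values[p["name"]])]
            return " ".join(args)

        print(to_cli(tool, {"in": "hits.idXML", "out": "indexed.idXML"}))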

  3. Design, implementation, and assessment of a radiology workflow management system.

    Science.gov (United States)

    Halsted, Mark J; Froehle, Craig M

    2008-08-01

    The objective of this article is to describe the development, launch, and outcomes studies of a paperless workflow management system (WMS) that improves radiology workflow in a filmless and speech-recognition environment. The WMS prioritizes cases automatically on the basis of medical and operational acuity factors, automatically facilitates communication of critical radiology results, and provides permanent documentation of these results and communications. It runs in parallel with an integrated radiology information system (RIS)-PACS and speech-recognition system. Its effects on operations, staff stress and satisfaction, and patient satisfaction were studied. Despite an increase in caseload volume after the launch of the WMS, case turnaround times, defined as the time between case availability on PACS and signing of the final radiology staff interpretation, decreased for all case types. Median case turnaround time decreased by 33 minutes (22%) for emergency department, 47 minutes (37%) for inpatient, and 22 minutes (38%) for outpatient radiology cases. All reductions were statistically significant, and patient satisfaction improved after the WMS was implemented. Staff satisfaction showed no significant change. There is room for improvement in radiology workflow even in departments with integrated RIS-PACS and speech-recognition systems. This study has shown that software tools that coordinate decentralized workflow and dynamically balance workloads can increase the efficiency and efficacy of radiologists. Operational benefits, such as reduced reading times, improvements in the timeliness of care (both actual and as perceived by patients), and reduced interruptions to radiologists, further reinforce the benefits of such a system. Secondary benefits, such as documenting communication about a case and facilitating review of results, can also promote more timely and effective care. Although use of the system did not result in a substantial improvement in staff perceptions, neither did it reduce their

  4. PALEOMAGNETISM.ORG - AN Online Multi-Platform and Open Source Environment for Paleomagnetic Analysis

    Science.gov (United States)

    Koymans, M. R.; Langereis, C. G.; Pastor-Galán, D.; Van Hinsbergen, D. J. J.

    2015-12-01

    This contribution provides an overview of the features of Paleomagnetism.org, a new open-source online environment for paleomagnetic analysis that is supported by all modern browsers on multiple platforms. The core functionality of Paleomagnetism.org is written in JavaScript and maintains an interactive website in which paleomagnetic data can be interpreted, evaluated, visualized, and exported. Although it is an online platform, the data processing is performed client-side within the browser to respect the integrity of the data and users. In the interpretation portal, principal component analysis (Kirschvink et al., 1981) can be applied to visualized demagnetization data (Zijderveld, 1967). The interpreted directions and great circles can be combined using the iterative procedure of McFadden and McElhinny (1988). The resulting magnetic directions can be used in the statistics portal or exported as raw tabulated data and figures. The available tools in the statistics portal cover standard Fisher statistics for directional data and virtual geomagnetic poles (Fisher, 1953; Butler, 1992; Deenen et al., 2011). Other tools include the eigenvector approach foldtest (Tauxe and Watson, 1994), a bootstrapped reversal test (Tauxe et al., 2009), and the classical reversal test of McFadden and McElhinny (1990). An implementation exists for the detection and correction of inclination shallowing in sediments (Tauxe and Kent, 2004; Tauxe et al., 2008) and a module to visualize custom or default APWP reference frames (Torsvik et al., 2012; Kent and Irving, 2010; Besse and Courtillot, 2002) for continent-bearing plates. Paleomagnetism.org provides an integrated approach for researchers to export tabulated and visualized (e.g. equal area projections) paleomagnetic data. The portals construct a custom exportable file that can be shared with other researchers and included in public databases. With a publication, this custom file can be appended and would contain all data used in the

  5. Implementing bioinformatic workflows within the bioextract server.

    Science.gov (United States)

    Lushbough, Carol M; Bergman, Michael K; Lawrence, Carolyn J; Jennewein, Doug; Brendel, Volker

    2008-01-01

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed service designed to provide researchers with the ability, via the web, to query multiple data sources, save results as searchable data sets, and execute analytic tools. As the researcher works with the system, their tasks are saved in the background. At any time these steps can be saved as a workflow that can then be executed again and/or modified later.

  6. Learning Competences in Open Mobile Environments: A Comparative Analysis Between Formal and Non-Formal Spaces

    Directory of Open Access Journals (Sweden)

    Daniel Dominguez

    2014-07-01

    Full Text Available As a result of the increasing use of mobile devices in education, new approaches to defining learning competences in the field of digitally mediated learning have emerged. This paper examines these approaches, using data obtained from empirical research with a group of Spanish university students. The analysis focuses on the experiences of students in the use of mobile devices in both formal and open-informal educational contexts. The theoretical framework of the study is based on an ecological approach to explanatory models of digital literacy. The data make it possible to examine this framework in depth, taking into account theories that defend an open view of digital literacy. The study may be of interest to instructional designers and researchers in the fields of open educational resources and technologies applied to education in open contexts.

  7. Air pollution abatement performances of green infrastructure in open road and built-up street canyon environments - A review

    Science.gov (United States)

    Abhijith, K. V.; Kumar, Prashant; Gallagher, John; McNabola, Aonghus; Baldauf, Richard; Pilla, Francesco; Broderick, Brian; Di Sabatino, Silvana; Pulvirenti, Beatrice

    2017-08-01

    Intensifying the proportion of urban green infrastructure has been considered as one of the remedies for air pollution levels in cities, yet the impact of numerous vegetation types deployed in different built environments has to be fully synthesised and quantified. This review examined published literature on neighbourhood air quality modifications by green interventions. Studies were evaluated that discussed personal exposure to local sources of air pollution under the presence of vegetation in open road and built-up street canyon environments. Further, we critically evaluated the available literature to provide a better understanding of the interactions between vegetation and surrounding built-up environments and ascertain means of reducing local air pollution exposure using green infrastructure. The net effects of vegetation in each built-up environment are also summarised and possible recommendations for the future design of green infrastructure are proposed. In a street canyon environment, high-level vegetation canopies (trees) led to a deterioration in air quality, while low-level green infrastructure (hedges) improved air quality conditions. For open road conditions, wide, low porosity and tall vegetation leads to downwind pollutant reductions while gaps and high porosity vegetation could lead to no improvement or even deteriorated air quality. The review considers that generic recommendations can be provided for vegetation barriers in open road conditions. Green walls and roofs on building envelopes can also be used as effective air pollution abatement measures. The critical evaluation of the fundamental concepts and the amalgamation of key technical features of past studies by this review could assist urban planners to design and implement green infrastructures in the built environment.

  8. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal requirements of privacy, security and confidentiality motivated the attempt to develop a secure teleradiology workflow between the telepartners: the radiologist and the referring physician. To ensure data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, the integrity of the data packages and the confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity among persons who wanted access to the system. Only an invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called the SkymedTM Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  9. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...

  10. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  11. KNIME for Open-Source Bioimage Analysis: A Tutorial.

    Science.gov (United States)

    Dietz, Christian; Berthold, Michael R

    2016-01-01

    The open analytics platform KNIME is a modular environment that enables easy visual assembly and interactive execution of workflows. KNIME is already widely used in various areas of research, for instance in cheminformatics or classical data analysis. In this tutorial the KNIME Image Processing Extension is introduced, which adds the capabilities to process and analyse huge amounts of images. In combination with other KNIME extensions, KNIME Image Processing opens up new possibilities for inter-domain analysis of image data in an understandable and reproducible way.

  12. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism... of the system being modelled. From these calculations, a comprehensive fault tree is generated. Further, we show that annotating the model with rewards (data) allows the expected mean values of reward structures to be calculated at points of failure.
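
    The probability and reward annotations can be pictured with a far simpler sequential model than the paper's model-checking machinery: each task either succeeds or fails with a given probability, and a cost (reward) accrues where the workflow first fails. The Python sketch below propagates these quantities; the task list and numbers are invented, and this is an illustrative stand-in rather than the authors' method.

        def expected_failure_profile(tasks):
            """tasks: list of (name, p_fail, cost). Returns the probability that
            the workflow first fails at each task, the expected cost incurred at
            failure, and the overall success probability. Generic sequential
            model, not the paper's model-checking approach."""
            reached = 1.0
            profile, expected_cost = [], 0.0
            for name, p_fail, cost in tasks:
                p_first_failure = reached * p_fail
                profile.append((name, p_first_failure))
                expected_cost += p_first_failure * cost
                reached *= (1.0 - p_fail)
            return profile, expected_cost, reached

        profile, cost, ok = expected_failure_profile(
            [("validate", 0.01, 5.0), ("process", 0.05, 20.0), ("archive", 0.02, 8.0)])
        print(profile, cost, ok)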

  13. E-BioFlow: Different perspectives on scientific workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.; Breit, T.; Nijholt, A.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  14. E-BioFlow: Different Perspectives on Scientific Workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; Vet, van der P.E.; Breit, T.; Nijholt, A.; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R.F.; Schneider, K.; Toma, C.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  15. OpenRS-Cloud:A remote sensing image processing platform based on cloud computing environment

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper explores the use of cloud computing for remote sensing image processing. The main contribution of our work is the development of a remote sensing image processing platform based on cloud computing technology (OpenRS-Cloud). The paper focuses on enabling methodical investigation of the development pattern, computational model, data management and service model of this novel distributed computing model. An experimental InSAR processing flow is implemented to verify the efficiency and feasibility of the OpenRS-Cloud platform. The results show that cloud computing is well suited for computationally intensive and data-intensive remote sensing services.
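
    As a minimal illustration of the computational model such a platform distributes, the sketch below shows the generic map/reduce pattern over image tiles in pure Python. It is a stand-in for the idea only, not OpenRS-Cloud's actual API, and the tile data are invented.

    # Illustrative map/reduce pattern for tile-based image processing.
    from functools import reduce

    tiles = [[3, 5, 7], [2, 8], [6, 1, 9, 4]]  # pixel values per tile (toy)

    # Map: per-tile partial statistics (sum, count) computed independently,
    # which is what makes the work embarrassingly parallel across nodes.
    partials = map(lambda t: (sum(t), len(t)), tiles)

    # Reduce: merge partial statistics into a global mean.
    total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), partials)
    print(f"scene mean pixel value: {total / count:.2f}")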

  16. An Exploratory Account of Incentives for Underexploitation in an Open Innovation Environment

    DEFF Research Database (Denmark)

    Piirainen, Kalle; Raivio, Tuomas; Lähteenmäki-smith, Kaisa

    2014-01-01

    This paper presents an empirical account of incentives for underexploiting intellectual property in an open innovation setting. In this exploratory empirical account the phenomenon is observed in a research, development and innovation program where participants are required to share intellectual... such an event is not only costly in terms of time and resources, but can in fact render IPR effectively worthless in terms of commercial exploitation and block innovation. This finding is pertinent to policy makers designing research, development and innovation instruments, as well as to managers who need... to make choices about how to implement open practices in innovation.

  17. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    Science.gov (United States)

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  18. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    Science.gov (United States)

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  19. Search and Result Presentation in Scientific Workflow Repositories

    OpenAIRE

    Davidson, Susan B.; Huang, Xiaocheng; Stoyanovich, Julia; Yuan, Xiaojie

    2013-01-01

    We study the problem of searching a repository of complex hierarchical workflows whose component modules, both composite and atomic, have been annotated with keywords. Since keyword search does not use the graph structure of a workflow, we develop a model of workflows using context-free bag grammars. We then give efficient polynomial-time algorithms that, given a workflow and a keyword query, determine whether some execution of the workflow matches the query. Based on these algorithms we deve...
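
    The matching semantics can be illustrated with a small sketch: model a workflow as atomic modules, sequences (all children execute) and choices (exactly one child executes), and ask whether some execution covers all query keywords. This brute-force enumeration is exponential in general; the paper's contribution is polynomial-time algorithms via context-free bag grammars. All module names below are hypothetical.

    # Toy matching semantics: a node is ("atomic"|"seq"|"choice", keywords,
    # children). Some execution matches a query if the keywords along one
    # admissible execution contain every query keyword.
    from itertools import product

    def executions(node):
        """Yield the keyword set of every possible execution of `node`."""
        kind, keywords, children = node
        if kind == "atomic":
            yield frozenset(keywords)
        elif kind == "choice":  # exactly one child executes
            for child in children:
                for ks in executions(child):
                    yield frozenset(keywords) | ks
        elif kind == "seq":  # all children execute
            for combo in product(*(list(executions(c)) for c in children)):
                yield frozenset(keywords).union(*combo)

    def matches(node, query):
        return any(set(query) <= ks for ks in executions(node))

    align = ("atomic", {"blast", "alignment"}, [])
    clustal = ("atomic", {"clustal", "alignment"}, [])
    plot = ("atomic", {"visualisation"}, [])
    wf = ("seq", {"genomics"}, [("choice", set(), [align, clustal]), plot])

    print(matches(wf, {"clustal", "visualisation"}))  # True
    print(matches(wf, {"blast", "clustal"}))          # False: exclusive choice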

  20. Opening the gas market - Effects on energy consumption, energy prices and the environment and compensation measures; Marktoeffnung im Gasbereich

    Energy Technology Data Exchange (ETDEWEB)

    Dettli, R.; Signer, B.; Kaufmann, Y.

    2001-07-01

    This final report for the Swiss Federal Office of Energy (SFOE) examines the effects of a future liberalisation of the gas market in Switzerland. The report first examines the current situation of the gas supply industry in Switzerland. The contents of the European Union guidelines are described and their implementation in Switzerland is discussed. Experience already gained in other countries is reviewed, including the market opening already implemented in the USA and Great Britain. The effect of market opening on gas prices is discussed; the various components of the gas price are examined and comparisons are made with international figures. The pressure of competition on the individual sectors of the gas industry is examined, and the perspectives in the gas purchasing market are assessed. The report presents basic scenarios developed from these considerations. Further effects resulting from a market opening are discussed, including those on the structure of the gas industry, its participants, electricity generation, energy use and the environment, consumers in general, security of supply and the national economy. Possible compensatory measures are discussed, as are factors for increasing efficiency and promoting a competitive environment. In the appendix, two price scenarios are presented.

  1. How Workflow Documentation Facilitates Curation Planning

    Science.gov (United States)

    Wickett, K.; Thomer, A. K.; Baker, K. S.; DiLauro, T.; Asangba, A. E.

    2013-12-01

    The description of the specific processes and artifacts that led to the creation of a data product provides a detailed picture of data provenance in the form of a workflow. The Site-Based Data Curation project, hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, has been investigating how workflows can be used in developing curation processes and policies that move curation "upstream" in the research process. The team has documented an individual workflow for geobiology data collected during a single field trip to Yellowstone National Park. This specific workflow suggests a generalized three-stage process for field data collection: a Planning Stage, a Fieldwork Stage, and a Processing and Analysis Stage. Beyond supplying an account of data provenance, the workflow has allowed the team to identify 1) points of intervention for curation processes and 2) data products that are likely candidates for sharing or deposit. Although these objects may be viewed by individual researchers as 'intermediate' data products, discussions with geobiology researchers have suggested that with appropriate packaging and description they may serve as valuable observational data for other researchers. Curation interventions may include the introduction of regularized data formats during the planning process, data description procedures, the identification and use of established controlled vocabularies, and data quality and validation procedures. We propose a poster that shows the individual workflow and our generalization into a three-stage process. We plan to discuss with attendees how well the three-stage view applies to other types of field-based research, likely points of intervention, and what kinds of interventions are appropriate and feasible in the example workflow.

  2. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Given this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines, plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  3. Open Data Distribution Service (DDS) for Use in a Real Time Simulation Laboratory Environment

    Science.gov (United States)

    2012-02-29

    definitions are constrained to define only data that can be transported by the DDS service. The model will be used to generate CORBA Interface Definition... static discovery mechanisms to support small-footprint and embedded system applications. CORBA Component Model: integrate OpenDDS with the OMG CORBA Component Model (DDS4CCM) abstraction. Delay Tolerant Networking: implement an RFC 5050 DTN capability, including bundle processing and a

  4. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  5. Telecommuting Academics within an Open Distance Education Environment of South Africa: More Content, Productive, and Healthy?

    Science.gov (United States)

    Tustin, Deon Harold

    2014-01-01

    Outside an academic setting, telecommuting has become fairly popular in recent years. However, research on telecommuting practices within a higher education environment is fairly sparse, especially within the higher distance education sphere. Drawing on existing literature on telecommuting and the outcome of a valuation study on the success of an…

  6. Yabi: An online research environment for grid, high performance and cloud computing

    Directory of Open Access Journals (Sweden)

    Hunter Adam A

    2012-02-01

    Full Text Available Abstract Background There is a significant demand for creating pipelines or workflows in the life science discipline that chain a number of discrete compute and data intensive analysis tasks into sophisticated analysis procedures. This need has led to the development of general as well as domain-specific workflow environments that are either complex desktop applications or Internet-based applications. Complexities can arise when configuring these applications in heterogeneous compute and storage environments if the execution and data access models are not designed appropriately. These complexities manifest themselves through limited access to available HPC resources, significant overhead required to configure tools, and the inability for users to simply manage files across heterogeneous HPC storage infrastructure. Results In this paper, we describe the architecture of a software system that is adaptable to a range of both pluggable execution and data backends in an open source implementation called Yabi. Enabling seamless and transparent access to heterogeneous HPC environments at its core, Yabi then provides an analysis workflow environment that can create and reuse workflows as well as manage large amounts of both raw and processed data in a secure and flexible way across geographically distributed compute resources. Yabi can be used via a web-based environment to drag-and-drop tools to create sophisticated workflows. Yabi can also be accessed through the Yabi command line, which is designed for users who are more comfortable with writing scripts or for enabling external workflow environments to leverage the features in Yabi. Configuring tools can be a significant overhead in workflow environments. Yabi greatly simplifies this task by enabling system administrators to configure as well as manage running tools via a web-based environment and without the need to write or edit software programs or scripts. In this paper, we highlight Yabi's capabilities

  7. Pricing risk analysis for competitive electricity trading in deregulated and open access environment

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Z.; Sparrow, F.T. [Purdue Univ., West Lafayette, IN (United States)

    1998-12-31

    This paper presents two models for evaluating pricing risks in a future competitive electricity market with open transmission access. The first model measures the risk of losing a contract when a seller's price is higher than that of one of its competitors. The second model measures the risk of benefit loss when a bidder's price is so low that the seller has unrealized benefit. Both models are probabilistic and are expressed as probability distributions. The models are developed for evaluating bilateral contract pricing risks; however, the idea can be extended to a competitive power pool situation.
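
    Since both models are expressed as probability distributions, the first risk can be illustrated with a short Monte Carlo sketch. The normal price distributions and all numbers below are assumptions for illustration, not the paper's analytical models.

    # Hypothetical illustration of the first model: the risk of losing the
    # contract because the seller's price exceeds a competitor's price.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 100_000
    seller = rng.normal(loc=42.0, scale=3.0, size=n)      # $/MWh, our bid
    competitor = rng.normal(loc=40.5, scale=4.0, size=n)  # $/MWh, rival bid

    risk_of_losing = np.mean(seller > competitor)
    # Loosely in the spirit of the second model: benefit left unrealized
    # when the bid is lower than necessary to win.
    unrealized = np.mean(np.maximum(competitor - seller, 0.0))
    print(f"P(lose contract) ~ {risk_of_losing:.3f}")
    print(f"mean unrealized margin ~ {unrealized:.2f} $/MWh")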

  8. Low Latency Workflow Scheduling and an Application of Hyperspectral Brightness Temperatures

    Science.gov (United States)

    Nguyen, P. T.; Chapman, D. R.; Halem, M.

    2012-12-01

    New system analytics for Big Data computing holds the promise of major scientific breakthroughs and discoveries from the exploration and mining of the massive data sets becoming available to the science community. However, such data-intensive scientific applications face severe challenges in accessing, managing and analyzing petabytes of data. While the Hadoop MapReduce environment has been successfully applied to data-intensive problems arising in business, there are still many scientific problem domains where limitations in the functionality of MapReduce systems prevent its wide adoption by those communities. This is mainly because MapReduce does not readily support unique science discipline needs such as special science data formats, graphic and computational data analysis tools, maintaining high degrees of computational accuracy, and interfacing with an application's existing components across heterogeneous computing processors. We address some of these limitations by exploiting the MapReduce programming model for satellite data-intensive scientific problems, and address scalability, reliability, scheduling, and data management issues when dealing with climate data records and their complex observational challenges. In addition, we present techniques to support unique Earth science discipline needs such as dealing with special science data formats (HDF and NetCDF). We have developed a Hadoop task scheduling algorithm that improves latency by 2x for a scientific workflow including the gridding of the EOS AIRS hyperspectral Brightness Temperatures (BT). This workflow processing algorithm has been tested on the Multicore Computing Center's private Hadoop-based Intel Nehalem cluster, as well as in a virtual mode under the open-source Eucalyptus cloud. The 55 TB AIRS hyperspectral L1b Brightness Temperature record has been gridded at a resolution of 0.5x1.0 degrees, and we have computed a 0.9 annual anti-correlation to the El Nino Southern Oscillation in
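
    The gridding step at the heart of this workflow reduces to binning swath observations into lat/lon cells and averaging. A minimal numpy sketch at the stated 0.5 x 1.0 degree resolution follows; the data are synthetic, and the Hadoop scheduling algorithm itself is not reproduced.

    # Minimal sketch of the gridding step: average swath brightness
    # temperatures (BT) onto a 0.5 deg (lat) x 1.0 deg (lon) grid.
    import numpy as np

    rng = np.random.default_rng(0)
    lat = rng.uniform(-90, 90, 1_000_000)
    lon = rng.uniform(-180, 180, 1_000_000)
    bt = 250 + 40 * np.cos(np.radians(lat)) + rng.normal(0, 2, lat.size)  # K

    ilat = np.clip(((lat + 90) / 0.5).astype(int), 0, 359)   # 360 lat bins
    ilon = np.clip(((lon + 180) / 1.0).astype(int), 0, 359)  # 360 lon bins
    flat = ilat * 360 + ilon                                 # flattened index

    total = np.bincount(flat, weights=bt, minlength=360 * 360)
    count = np.bincount(flat, minlength=360 * 360)
    grid = np.where(count > 0, total / np.maximum(count, 1), np.nan)
    grid = grid.reshape(360, 360)
    print(f"grid cells filled: {int((count > 0).sum())} of {grid.size}")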

  9. TLSpy: An Open-Source Addition to Terrestrial Lidar Workflows

    Science.gov (United States)

    Frechette, J. D.; Weissmann, G. S.; Wawrzyniec, T. F.

    2008-12-01

    Terrestrial lidar scanners (TLS) that capture three-dimensional (3D) geometry with cm-scale precision present many new opportunities in the Earth sciences and related fields. However, the lack of domain-specific tools impedes full and efficient utilization of the information contained in these datasets. Most processing and analysis is performed using a variety of manufacturing, surveying, airborne lidar, and GIS software. Although much overlap exists, inevitably some needs are not addressed by these applications. TLSpy provides a plugin-driven framework with 3D visualization capabilities that encourages researchers to fill these gaps. The goal is to free researchers from the intellectual overhead imposed by user- and data-interface design, enabling rapid development of TLS-specific processing and analysis algorithms. We present two plugins as examples of problems that TLSpy is being applied to. The first plugin corrects for the strong influence of target orientation on TLS-measured reflectance intensities. It calculates the distribution of incidence angles and intensities in an input scan and assists the user in fitting a reflectance model to the distribution. The model is then used to normalize input intensities, minimizing the impact of surface orientation and simplifying the extraction of quantitative data from reflectance measurements. Although reasonable default models can be determined, the large number of factors influencing reflectance values requires that the plugin be designed for maximum flexibility, allowing the user to adjust all model parameters and define new reflectance models as needed. The second plugin helps eliminate multipath reflections from water surfaces. Characterized by a lower-intensity mirror image of the subaerial bank appearing below the water surface, these reflections are a common problem in scans containing water. These erroneous reflections can be removed by manually selecting points that lie on the waterline, fitting a plane to the points, and deleting points below that plane. This plugin simplifies the process by automatically identifying waterline points using characteristic changes in geometry and intensity. Automatic identification is often faster and more reliable than manual identification; however, manual control is retained as a fallback for degenerate cases.
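
    The geometric core of the second plugin can be sketched in a few lines: fit a plane to waterline points by least squares and discard points below it. The point cloud below is synthetic, and TLSpy's automatic waterline detection is not reproduced.

    # Sketch of the multipath-removal idea: fit a plane z = a*x + b*y + c
    # to waterline points, then drop points below the fitted plane.
    import numpy as np

    rng = np.random.default_rng(3)

    # Waterline points on a nearly horizontal water surface (z ~ 1.2 m).
    wl = np.column_stack([rng.uniform(0, 50, 200),
                          rng.uniform(0, 50, 200),
                          1.2 + rng.normal(0, 0.01, 200)])

    # Least-squares plane fit: solve [x y 1] @ [a, b, c] = z.
    A = np.column_stack([wl[:, 0], wl[:, 1], np.ones(len(wl))])
    (a, b, c), *_ = np.linalg.lstsq(A, wl[:, 2], rcond=None)

    # Keep only cloud points on or above the fitted water plane.
    cloud = rng.uniform([0, 0, -2], [50, 50, 10], size=(100_000, 3))
    above = cloud[:, 2] >= a * cloud[:, 0] + b * cloud[:, 1] + c
    cleaned = cloud[above]
    print(f"removed {len(cloud) - len(cleaned)} suspected multipath points")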

  10. Toward Transparent and Reproducible Science: Using Open Source "Big Data" Tools for Water Resources Assessment

    Science.gov (United States)

    Buytaert, W.; Zulkafli, Z. D.; Vitolo, C.

    2014-12-01

    Transparency and reproducibility are fundamental properties of good science. In the current era of large and diverse datasets and long and complex workflows for data analysis and inference, ensuring such transparency and reproducibility is challenging. Hydrological science is a good case in point, because the discipline typically uses a large variety of datasets ranging from local observations to large-scale remotely sensed products. These data are often obtained from various different sources and integrated using complex yet uncertain modelling tools. In this paper, we present and discuss methods of ensuring transparency and reproducibility in scientific workflows for hydrological data analysis for the purpose of water resources assessment, using relevant examples of emerging open source "big data" tools. First, we discuss standards for data storage, access, and processing that allow improving the modularity of a hydrological analysis workflow. In particular, standards emerging from the Open Geospatial Consortium, such as the Sensor Observation Service and the Web Coverage Service, hold promise. However, some bottlenecks, such as the availability of data models and the ability to work with spatio-temporal subsets of large datasets, need further development. Next, we focus on available methods to build transparent data processing workflows. Again, standards such as OGC's Web Processing Service are being developed to facilitate web-based analytics. Yet, in practice, the experimental nature of these standards and of web services in general often requires a more pragmatic approach. The availability of web technologies in popular open source data analysis environments such as R and Python often makes them an attractive solution for workflow creation and sharing. Lastly, we elaborate on the potential that open source solutions hold in the context of participatory approaches to data collection and knowledge generation. Using examples from the tropical Andes and the Himalayas, we
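
    As an example of the kind of standards-based, scriptable data access step discussed here, the sketch below issues an OGC SOS 2.0 GetObservation request over the standard KVP binding. The endpoint, offering and observed property are hypothetical placeholders, not a real service.

    # Sketch of a reproducible data access step: an OGC Sensor Observation
    # Service (SOS) 2.0 GetObservation request via the standard KVP binding.
    import requests

    SOS_URL = "https://example.org/sos"  # hypothetical SOS endpoint

    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": "discharge_gauges",         # hypothetical offering
        "observedProperty": "river_discharge",  # hypothetical property
        "temporalFilter": "om:phenomenonTime,2014-01-01/2014-01-31",
    }
    response = requests.get(SOS_URL, params=params, timeout=30)
    response.raise_for_status()
    print(response.text[:200])  # O&M XML, to be parsed downstream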

  11. Time-dependent density functional theory for open systems with a positivity-preserving decomposition scheme for environment spectral functions

    Energy Technology Data Exchange (ETDEWEB)

    Wang, RuLin [Beijing Computational Science Research Center, No. 3 He-Qing Road, Beijing 100084 (China); Zheng, Xiao, E-mail: xz58@ustc.edu.cn [Hefei National Laboratory for Physical Sciences at the Microscale, University of Science and Technology of China, Hefei, Anhui 230026 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Kwok, YanHo; Xie, Hang; Chen, GuanHua [Department of Chemistry, The University of Hong Kong, Pokfulam Road, Hong Kong (China); Yam, ChiYung, E-mail: yamcy@csrc.ac.cn [Beijing Computational Science Research Center, No. 3 He-Qing Road, Beijing 100084 (China); Department of Chemistry, The University of Hong Kong, Pokfulam Road, Hong Kong (China)

    2015-04-14

    Understanding electronic dynamics on material surfaces is fundamentally important for applications including nanoelectronics, inhomogeneous catalysis, and photovoltaics. Practical approaches based on time-dependent density functional theory for open systems have been developed to characterize the dissipative dynamics of electrons in bulk materials. The accuracy and reliability of such approaches depend critically on how the electronic structure and memory effects of surrounding material environment are accounted for. In this work, we develop a novel squared-Lorentzian decomposition scheme, which preserves the positive semi-definiteness of the environment spectral matrix. The resulting electronic dynamics is guaranteed to be both accurate and convergent even in the long-time limit. The long-time stability of electronic dynamics simulation is thus greatly improved within the current decomposition scheme. The validity and usefulness of our new approach are exemplified via two prototypical model systems: quasi-one-dimensional atomic chains and two-dimensional bilayer graphene.
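
    The abstract does not state the functional form; as an illustrative sketch only (scalar case assumed, whereas the paper treats the full spectral matrix), a squared-Lorentzian expansion is non-negative term by term, which is what makes such a decomposition positivity-preserving:

    J(\omega) \approx \sum_k A_k \left[ \frac{\eta_k}{(\omega - \Omega_k)^2 + \eta_k^2} \right]^2, \qquad A_k \ge 0,

    so each term, and hence the fitted spectral function, satisfies J(\omega) \ge 0 for every \omega.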

  12. Time-dependent density functional theory for open systems with a positivity-preserving decomposition scheme for environment spectral functions

    Science.gov (United States)

    Wang, RuLin; Zheng, Xiao; Kwok, YanHo; Xie, Hang; Chen, GuanHua; Yam, ChiYung

    2015-04-01

    Understanding electronic dynamics on material surfaces is fundamentally important for applications including nanoelectronics, inhomogeneous catalysis, and photovoltaics. Practical approaches based on time-dependent density functional theory for open systems have been developed to characterize the dissipative dynamics of electrons in bulk materials. The accuracy and reliability of such approaches depend critically on how the electronic structure and memory effects of surrounding material environment are accounted for. In this work, we develop a novel squared-Lorentzian decomposition scheme, which preserves the positive semi-definiteness of the environment spectral matrix. The resulting electronic dynamics is guaranteed to be both accurate and convergent even in the long-time limit. The long-time stability of electronic dynamics simulation is thus greatly improved within the current decomposition scheme. The validity and usefulness of our new approach are exemplified via two prototypical model systems: quasi-one-dimensional atomic chains and two-dimensional bilayer graphene.

  13. Time-dependent density functional theory for open systems with a positivity-preserving decomposition scheme for environment spectral functions.

    Science.gov (United States)

    Wang, RuLin; Zheng, Xiao; Kwok, YanHo; Xie, Hang; Chen, GuanHua; Yam, ChiYung

    2015-04-14

    Understanding electronic dynamics on material surfaces is fundamentally important for applications including nanoelectronics, inhomogeneous catalysis, and photovoltaics. Practical approaches based on time-dependent density functional theory for open systems have been developed to characterize the dissipative dynamics of electrons in bulk materials. The accuracy and reliability of such approaches depend critically on how the electronic structure and memory effects of surrounding material environment are accounted for. In this work, we develop a novel squared-Lorentzian decomposition scheme, which preserves the positive semi-definiteness of the environment spectral matrix. The resulting electronic dynamics is guaranteed to be both accurate and convergent even in the long-time limit. The long-time stability of electronic dynamics simulation is thus greatly improved within the current decomposition scheme. The validity and usefulness of our new approach are exemplified via two prototypical model systems: quasi-one-dimensional atomic chains and two-dimensional bilayer graphene.

  14. Telecommuting Academics Within an Open Distance Education Environment of South Africa: More Content, Productive, and Healthy?

    OpenAIRE

    Deon Harold Tustin

    2014-01-01

    Outside an academic setting, telecommuting has become fairly popular in recent years. However, research on telecommuting practices within a higher education environment is fairly sparse, especially within the higher distance education sphere. Drawing on existing literature on telecommuting and the outcome of a valuation study on the success of an experimental telecommuting programme at the largest distance education institution in South Africa, this article presents discerning findings on tel...

  15. Open Integrated Personal Learning Environment: Towards a New Conception of the ICT-Based Learning Processes

    Science.gov (United States)

    Conde, Miguel Ángel; García-Peñalvo, Francisco José; Casany, Marià José; Alier Forment, Marc

    Learning processes are changing in response to technological and sociological evolution; taking this into account, a new learning strategy must be considered. Specifically, what is needed is an effective step towards the consolidation of eLearning 2.0 environments. This implies fusing the advantages of the traditional LMS (Learning Management System), which is more oriented towards control and planning of formative programmes, with the social learning and flexibility of web 2.0 educational applications.

  16. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data-intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests, ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about network bandwidth, caches and what kind of storage to utilise. In the end, a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value
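
    The figure of merit is simple enough to state in code; the sketch below compares hypothetical infrastructure configurations under the ETC metric (all numbers invented, higher is better).

    # The contribution's figure of merit: ETC = Events / Time / Cost.
    def etc(events: float, hours: float, cost_usd: float) -> float:
        return events / hours / cost_usd

    configs = {
        "8 cores, no overcommit": etc(events=12_000, hours=10.0, cost_usd=40.0),
        "8 cores, 2x overcommit": etc(events=15_000, hours=10.0, cost_usd=40.0),
        "16 cores, staggered":    etc(events=26_000, hours=10.0, cost_usd=85.0),
    }
    for name, value in sorted(configs.items(), key=lambda kv: -kv[1]):
        print(f"{name}: ETC = {value:.1f} events per hour per dollar")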

  17. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2017-01-01

    This contribution reports on the feasibility of executing data-intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests, ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about network bandwidth, caches and what kind of storage to utilise. In the end, a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.

  18. Logical provenance in data-oriented workflows

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
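
    A toy version of the specification language's two ingredients, attribute mappings and filters, shows how tracing works. The mini-language below is hypothetical and far simpler than the paper's; it only illustrates the idea for a single select-project transformation.

    # Hypothetical mini-version of a logical-provenance spec: an attribute
    # mapping (output attribute <- input attribute) plus a filter predicate.
    # Tracing returns the input rows consistent with a given output row.
    def trace(output_row, inputs, mapping, filter_pred):
        candidates = []
        for row in inputs:
            if not filter_pred(row):
                continue  # filtered-out rows cannot contribute provenance
            if all(output_row[o] == row[i] for o, i in mapping.items()):
                candidates.append(row)
        return candidates

    # Toy SPJ-style transformation:
    #   SELECT name, dept FROM staff WHERE salary > 50000
    staff = [
        {"name": "ada", "dept": "cs", "salary": 90_000},
        {"name": "bob", "dept": "cs", "salary": 45_000},
        {"name": "eve", "dept": "ee", "salary": 70_000},
    ]
    mapping = {"name": "name", "dept": "dept"}  # attribute mappings

    def salary_filter(row):
        return row["salary"] > 50_000           # filter

    print(trace({"name": "ada", "dept": "cs"}, staff, mapping, salary_filter))
    # -> [{'name': 'ada', 'dept': 'cs', 'salary': 90000}]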

  19. Engineering solutions for open microalgae mass cultivation and realistic indoor simulation of outdoor environments.

    Science.gov (United States)

    Apel, Andreas Christoph; Weuster-Botz, Dirk

    2015-06-01

    Microalgae could become an important renewable source of chemicals, food, and energy if process costs can be reduced. In the past 60 years, the relevant factors in open outdoor mass cultivation of microalgae were identified and elaborate solutions regarding bioprocesses and bioreactors were developed. An overview of these solutions is presented. Since the cost of most microalgal products from current mass cultivation systems is still prohibitively high, further development is required. The application of complex computational techniques for cost-effective process and reactor development will become more important if experimental validation of simulation results can easily be achieved. Due to difficulties inherent to outdoor experimentation, it can be useful to conduct validation experiments indoors. Considerations and approaches for realistic indoor reproduction of the most important environmental conditions in microalgae cultivation experiments (light, temperature, and substance concentrations) are discussed.

  20. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
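
    A toy combination of the two ingredients named in the abstract, Petri-net workflow state plus role assignment, can be sketched as follows. This is a hypothetical model for illustration, not the paper's formal WACM definition.

    # Toy Petri-net access control: a transition fires only when its input
    # places are marked AND the firing user holds the required role.
    marking = {"order_received": 1, "order_checked": 0, "order_shipped": 0}

    transitions = {
        "check_order": {"in": ["order_received"], "out": ["order_checked"],
                        "role": "clerk"},
        "ship_order":  {"in": ["order_checked"], "out": ["order_shipped"],
                        "role": "warehouse"},
    }
    user_roles = {"alice": {"clerk"}, "bob": {"warehouse"}}

    def fire(name, user):
        t = transitions[name]
        if t["role"] not in user_roles.get(user, set()):
            raise PermissionError(f"{user} lacks role {t['role']!r}")
        if any(marking[p] < 1 for p in t["in"]):
            raise RuntimeError(f"{name} not enabled in current marking")
        for p in t["in"]:
            marking[p] -= 1
        for p in t["out"]:
            marking[p] += 1

    fire("check_order", "alice")  # ok: authorized and enabled
    fire("ship_order", "bob")     # ok; fire("ship_order", "alice") raises
    print(marking)                # order_shipped now holds the token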

  1. Digital workflow management for quality assessment in pathology.

    Science.gov (United States)

    Kalinski, Thomas; Sel, Saadettin; Hofmann, Harald; Zwönitzer, Ralf; Bernarding, Johannes; Roessner, Albert

    2008-01-01

    Information systems (IS) are well established in the multitude of departments and practices of pathology. Apart from being collections of doctors' reports, IS can be used to organize and evaluate workflow processes. We report on such digital workflow management using the IS at the Department of Pathology, University Hospital Magdeburg, Germany, and present an evaluation of workflow data collected over a whole year. This allows us to measure workflow processes and to distinguish the effects of alterations in the workflow for quality assessment. Moreover, digital workflow management provides the basis for the integration of diagnostic virtual microscopy.

  2. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in the accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
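
    The kind of quantity such an analysis yields can be hand-rolled for a tiny example: the expected cumulative reward (here, drug doses consumed) before a probabilistic workflow terminates. A probabilistic model checker automates this; the workflow structure and numbers below are invented.

    # Expected total reward of a toy probabilistic workflow, computed by
    # solving the linear system x = r + P_tt @ x over transient states.
    import numpy as np

    # States: triage -> treat (may repeat) -> discharge (absorbing).
    P = np.array([
        [0.0, 1.0, 0.0],   # triage always proceeds to treatment
        [0.0, 0.3, 0.7],   # treatment repeats with probability 0.3
        [0.0, 0.0, 1.0],   # discharge is absorbing
    ])
    reward = np.array([0.0, 1.0, 0.0])  # one dose consumed per treatment

    transient = [0, 1]                        # triage, treat
    Ptt = P[np.ix_(transient, transient)]
    x = np.linalg.solve(np.eye(len(transient)) - Ptt, reward[transient])
    print(f"expected doses per patient starting at triage: {x[0]:.2f}")
    # -> 1.43, i.e. provision roughly 1.43 doses per admitted patient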

  3. Workflow-driven clinical decision support for personalized oncology.

    Science.gov (United States)

    Bucur, Anca; van Leeuwen, Jasper; Christodoulou, Nikolaos; Sigdel, Kamana; Argyri, Katerina; Koumakis, Lefteris; Graf, Norbert; Stamatakos, Georgios

    2016-07-21

    The adoption in oncology of Clinical Decision Support (CDS) may help clinical users to efficiently deal with the high complexity of the domain, lead to improved patient outcomes, and reduce the current knowledge gap between clinical research and practice. While significant effort has been invested in the implementation of CDS, the uptake in the clinic has been limited. The barriers to adoption have been extensively discussed in the literature. In oncology, current CDS solutions are not able to support the complex decisions required for stratification and personalized treatment of patients and to keep up with the high rate of change in therapeutic options and knowledge. To address these challenges, we propose a framework enabling efficient implementation of meaningful CDS that incorporates a large variety of clinical knowledge models to bring to the clinic comprehensive solutions leveraging the latest domain knowledge. We use both literature-based models and models built within the p-medicine project using the rich datasets from clinical trials and care provided by the clinical partners. The framework is open to the biomedical community, enabling reuse of deployed models by third-party CDS implementations and supporting collaboration among modelers, CDS implementers, biomedical researchers and clinicians. To increase adoption and cope with the complexity of patient management in oncology, we also support and leverage the clinical processes adhered to by healthcare organizations. We design an architecture that extends the CDS framework with workflow functionality. The clinical models are embedded in the workflow models and executed at the right time, when and where the recommendations are needed in the clinical process. In this paper we present our CDS framework developed in p-medicine and the CDS implementation leveraging the framework. To support complex decisions, the framework relies on clinical models that encapsulate relevant clinical knowledge. Next to

  4. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    Science.gov (United States)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email-parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web-services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.

  5. CaGrid Workflow Toolkit: A taverna based workflow tool for cancer grid

    Directory of Open Access Journals (Sweden)

    Sulakhe Dinanath

    2010-11-01

    Full Text Available Abstract Background In the biological and medical domains, the use of web services has made data and computation functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research-related resources, and caGrid is its underlying service-based computation infrastructure. CaBIG requires that services are composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. Results CaGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology and support for a wide range of web services, its plug-in architecture to cater for easy integration of third-party extensions, etc. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases in using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-discipline domain. Conclusions By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is

  6. CaGrid Workflow Toolkit: a Taverna based workflow tool for cancer grid.

    Science.gov (United States)

    Tan, Wei; Madduri, Ravi; Nenadic, Alexandra; Soiland-Reyes, Stian; Sulakhe, Dinanath; Foster, Ian; Goble, Carole A

    2010-11-02

    In the biological and medical domains, the use of web services has made data and computation functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research-related resources, and caGrid is its underlying service-based computation infrastructure. CaBIG requires that services are composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. CaGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology and support for a wide range of web services, its plug-in architecture to cater for easy integration of third-party extensions, etc. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases in using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-discipline domain. By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is general enough to be applicable and reusable within other

  7. DEVELOPING A CONCEPTUAL FRAMEWORK FOR REVISING SELF-LEARNING MATERIALS (SLMs) OF THE OPEN SCHOOL (OS) OF BANGLADESH OPEN UNIVERSITY (BOU) IN A DIGITAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Sabina YEASMIN

    2011-10-01

    Full Text Available The Open School of Bangladesh Open University (BOU) uses first-generation self-learning materials (SLMs) written, an era ago, following an in-house style and template. The concerned faculty members correct the texts every year before reprinting, but this is limited to spelling mistakes, factual errors and page make-up only. The University has adopted a policy and taken steps to revise the texts as a whole, which is still limited to the previous process. Meanwhile, the current government is implementing its digital Bangladesh agenda, which will certainly influence the texts with respect to template, learner instructions, gender-sensitivity, context and content. In addition, education theory has shifted from instructivism to constructivism, which is being experimented with and implemented, using new texts, by the ministerial project entitled Teaching Quality Improvement (TQI) in partnership with the BOU School of Education. Times change and new things are adopted; the Open School also needs to revise its texts in line with the government's current agenda of implementing digital Bangladesh. This study collects data from tutors, distance educators, writers and reviewers, and finally develops a framework for revising the OS SLMs in a digital environment.

  8. Publicly Open Virtualized Gaming Environment For Simulation of All Aspects Related to '100 Year Starship Study'

    Science.gov (United States)

    Obousy, R. K.

    2012-09-01

    Sending a mission to distant stars will require our civilization to develop new technologies and change the way we live. The complexity of the task is enormous [1]; thus, the thought is to involve people from around the globe through the "citizen scientist" paradigm. The suggestion is a "Gaming Virtual Reality Network" (GVRN) to simulate the sociological and technological aspects involved in this project. Work is currently being done [2] on developing a technology which will construct computer games within GVRN. This technology will provide quick and easy ways for individuals to develop game scenarios related to various aspects of the "100YSS" project. People will be involved in solving certain tasks just by playing games. Players will be able to modify conditions, add new technologies, geological conditions and social movements, and assemble new strategies just by writing scenarios. The system will interface with textual and video information, extract scenarios written in millions of texts and use them to assemble new games. Thus, players will be able to simulate an enormous number of possibilities. The information technologies involved will require us to build the system in such a way that any module can easily be replaced. Thus, GVRN should be modular and open to the community.

  9. ISAMBARD: an open-source computational environment for biomolecular analysis, modelling and design.

    Science.gov (United States)

    Wood, Christopher W; Heal, Jack W; Thomson, Andrew R; Bartlett, Gail J; Ibarra, Amaurys A; Leo Brady, R; Sessions, Richard B; Woolfson, Derek N

    2017-06-05

    The rational design of biomolecules is becoming a reality. However, further computational tools are needed to facilitate and accelerate this, and to make it accessible to more users. Here we introduce ISAMBARD, a tool for structural analysis, model building and rational design of biomolecules. ISAMBARD is open-source, modular, computationally scalable and intuitive to use. These features allow non-experts to explore biomolecular design in silico. ISAMBARD addresses a standing issue in protein design, namely, how to introduce backbone variability in a controlled manner. This is achieved through the generalisation of tools for parametric modelling, describing the overall shape of proteins geometrically, and without input from experimentally determined structures. This will allow backbone conformations for entire folds and assemblies not observed in nature to be generated de novo, that is, to access the 'dark matter of protein-fold space'. We anticipate that ISAMBARD will find broad applications in biomolecular design, biotechnology and synthetic biology. A current stable build can be downloaded from the Python package index (https://pypi.python.org/pypi/isambard/) with development builds available on GitHub (https://github.com/woolfson-group/) along with documentation, tutorial material and all the scripts used to generate the data described in this paper. d.n.woolfson@bristol.ac.uk or chris.wood@bristol.ac.uk. Supplementary data are available at Bioinformatics online.

  10. An Open Architecture to Support Social and Health Services in a Smart TV Environment.

    Science.gov (United States)

    Costa, Carlos Rivas; Anido-Rifon, Luis E; Fernandez-Iglesias, Manuel J

    2017-03-01

    To design, implement, and test a solution that provides social and health services for the elderly at home based on smart TV technologies, with access to all services through the TV. The architecture proposed is based on an open software platform and standard personal computing hardware. This provides great flexibility to develop new applications over the underlying infrastructure or to integrate new devices, for instance to monitor a broad range of vital signs in those cases where home monitoring is required. An actual system was designed, implemented, and deployed as a proof of concept. Applications range from social network clients to vital signs monitoring, and from interactive TV contests to conventional online care applications such as medication reminders or telemedicine. In both cases, the results have been very positive, confirming the initial perception of the TV as a convenient, easy-to-use technology for providing social and health care. The TV set is a much more familiar computing interface for most senior users, and as a consequence smart TVs become a most convenient solution for the design and implementation of applications and services targeted at this user group. This proposal has been tested in a real setting with 62 senior people at their homes. Users included both individuals with experience using computers and others reluctant to use them.

  11. SegMine workflows for semantic microarray data analysis in Orange4WS

    Directory of Open Access Journals (Sweden)

    Kulovesi Kimmo

    2011-10-01

    Full Text Available Abstract Background In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. Results We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Conclusions Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  12. SegMine workflows for semantic microarray data analysis in Orange4WS.

    Science.gov (United States)

    Podpečan, Vid; Lavrač, Nada; Mozetič, Igor; Novak, Petra Kralj; Trajkovski, Igor; Langohr, Laura; Kulovesi, Kimmo; Toivonen, Hannu; Petek, Marko; Motaln, Helena; Gruden, Kristina

    2011-10-26

    In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  13. OPEN RADIATION: a collaborative project for radioactivity measurement in the environment by the public

    Science.gov (United States)

    Bottollier-Depois, Jean-François; Allain, E.; Baumont, G.; Berthelot, N.; Clairand, I.; Couvez, C.; Darley, G.; Henry, B.; Jolivet, T.; Laroche, P.; Lebau-Livé, A.; Lejeune, V.; Miss, J.; Monange, W.; Quéinnec, F.; Richet, Y.; Simon, C.; Trompier, F.; Vayron, F.

    2017-09-01

    After the Fukushima accident, initiatives emerged from the public to carry out their own measurements of radioactivity in the environment with various devices, among them smartphones, and to share data and experiences through collaborative tools and social networks. Such measurements have two major interests: on the one hand, to enable each individual member of the public to assess his or her own risk regarding radioactivity and, on the other hand, to provide "real-time" data from the field at various locations, especially in the early phase of an emergency situation, which could be very useful for emergency management. The objective of the OPENRADIATION project is to offer the public the opportunity to be an actor in measurements of radioactivity in the environment using connected dosimetric applications on smartphones. The challenge is to operate such a system on a sustainable basis in normal times and for it to be useful in case of emergency. In a "peaceful situation", this project is based on a collaborative approach with the aim of obtaining data complementary to the existing ones, consolidating the radiation background, generating alerts in case of problems, and providing education and training and enhanced pedagogical approaches for a clear understanding of measurements by the public. In an emergency situation, data will be available "spontaneously" from the field in "real time", providing an opportunity for emergency management and communication with the public. ... The practical objective is i) to develop a website centralising data from various systems/dosimeters, providing dose maps with raw and filtered data and creating dedicated areas for specific initiatives and exchanges of data, and ii) to develop a data acquisition protocol and a dosimetric application using a connected dosimeter with a Bluetooth connection. This project is conducted within a partnership between organisations representative of the scientific community and associations to create links

  14. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  15. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  16. Planning bioinformatics workflows using an expert system.

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu.
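
    The backward-chaining idea BETSY rests on, working from a goal data type back to the available inputs and emitting the tool invocations along the way, can be illustrated with a toy knowledge base. The rules and tool names below are hypothetical stand-ins and do not reflect BETSY's actual rule format.

    ```python
    # Toy backward-chaining planner in the spirit of BETSY; the real
    # system models data attributes and software capabilities far more
    # richly. Rule and tool names are hypothetical.
    RULES = {  # output type -> (tool, required input types)
        "genome_index":  ("bt2build", ["reference_fasta"]),
        "aligned_reads": ("bowtie2",  ["fastq", "genome_index"]),
        "variant_calls": ("caller",   ["aligned_reads", "reference_fasta"]),
    }

    def plan(goal, available, steps=None):
        """Ordered (tool, output) steps deriving goal from the available
        input types, or None if no rule chain exists."""
        steps = [] if steps is None else steps
        if goal in available:
            return steps
        if goal not in RULES:
            return None                      # underivable data type
        tool, inputs = RULES[goal]
        for needed in inputs:                # satisfy each input first
            if plan(needed, available, steps) is None:
                return None
        steps.append((tool, goal))
        return steps

    print(plan("variant_calls", {"fastq", "reference_fasta"}))
    # [('bt2build', 'genome_index'), ('bowtie2', 'aligned_reads'),
    #  ('caller', 'variant_calls')]
    ```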

  17. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    We demonstrate the usability of our theory on case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL.

  19. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  20. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  2. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  3. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
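
    The Atom-based management of workflow resources described above amounts to POSTing an Atom entry that describes the workflow to a collection URI, following the Atom Publishing Protocol. A minimal sketch follows; the endpoint URL and entry fields are illustrative only.

    ```python
    # Registering a workflow resource as an Atom entry (Atom Publishing
    # Protocol). The collection URL and entry contents are hypothetical.
    import requests

    ENTRY = """<?xml version="1.0"?>
    <entry xmlns="http://www.w3.org/2005/Atom">
      <title>NO2 plume processing workflow</title>
      <id>urn:uuid:0f3c6d1e-demo</id>
      <updated>2012-10-01T00:00:00Z</updated>
      <summary>BPEL workflow chaining sensor access and NO2 retrieval</summary>
      <content type="application/xml" src="http://example.org/wf/no2.bpel"/>
    </entry>"""

    resp = requests.post(
        "http://example.org/collections/workflows",  # hypothetical collection
        data=ENTRY.encode("utf-8"),
        headers={"Content-Type": "application/atom+xml;type=entry"},
    )
    resp.raise_for_status()
    print("created:", resp.headers.get("Location"))  # URI of the new resource
    ```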

  4. The Architecture of an Intuitive Scientific Workflow System for Spatial Planning

    Directory of Open Access Journals (Sweden)

    Tiberiu Florescu

    2018-06-01

    Full Text Available In recent years, the quantity of open data has multiplied dramatically. This newly found wealth of data has raised a number of issues for any research exercise within the field of spatial planning: underpowered traditional tools and methods, difficulties in tracking data, transparency issues, sharing difficulties and, above all, an erratic workflow. Some new research tools to counter this irksome tendency do exist at the moment, but, unfortunately, we feel that there is still ample room for improvement. We have therefore commenced the development of our own research instrument, based on the Scientific Workflow System concept. This paper lays the foundation for its architecture. Once completed, both the instrument and the resulting data shall be freely available, truthful to the spirit of the open source and open data tradition. We strongly believe that spatial planning professionals and researchers might find it interesting and worthwhile for increasing the quality and speed of their work.

  5. Changing environment and urban identity following open-cast mining and thermic power plant in Turkey: case of Soma.

    Science.gov (United States)

    Karadag, Arife

    2012-03-01

    This paper is a summary of a project that became a book entitled "Changing Environment, City and Identity in Soma with Geographical Evaluations", published in May 2005. In this research, Soma, one of the most economically notable districts of Manisa in western Anatolia, is assessed in terms of its physical environment potential, developing economic activities and changing socio-economic structure. Owing to the open coal basins in the northeast and southwest of the district, where lignite is produced, and the impact of the thermic power plant near the city centre, Soma has changed on a large scale. This change has introduced environmental problems into the district such as the devastation of forest land; the infertility of farming land; and soil, water and air pollution. Even though the change under discussion has led to many problems, it has also influenced the socio-economic structure to a large extent and produced a new type of inhabitant with different life expectations and aims. In conclusion, this article discusses the changing environment and city structure after lignite processing and the establishment of the thermic power station in Soma, through the effective geographical factors. The new city profile formed by the local dynamics in question is evaluated according to data obtained from studies made in the neighbourhood.

  6. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory…

  7. Telecommuting Academics Within an Open Distance Education Environment of South Africa: More Content, Productive, and Healthy?

    Directory of Open Access Journals (Sweden)

    Deon Harold Tustin

    2014-07-01

    Full Text Available Outside an academic setting, telecommuting has become fairly popular in recent years. However, research on telecommuting practices within a higher education environment is fairly sparse, especially within the higher distance education sphere. Drawing on existing literature on telecommuting and the outcome of an evaluation study on the success of an experimental telecommuting programme at the largest distance education institution in South Africa, this article presents discerning findings on telecommuting practices. In fact, the research builds on evolutionary telecommuting assessment methods of the direct or indirect effect (work-based) and affective impact (emotional) on multiple stakeholder groups. This holistic approach allowed for comparative analysis between telecommuting and non-telecommuting academics with regard to the impact of telecommuting practices. The research reveals high levels of support for telecommuting practices that are associated with high levels of work productivity and satisfaction, lower levels of emotional and physical fatigue, and reduced work stress, frustration, and overload. The study also reveals higher levels of student satisfaction with academic support from telecommuters than non-telecommuters. Overall, the critique presents insightful findings on telecommuting practices within an academic setting, which clearly signal a potential shift in the office culture of higher distance education institutions in the years to come. The study makes a significant contribution to a limited collection of empirical research on telecommuting practices within the higher distance education sector and guides institutions in refining and/or redefining future telecommuting strategies or programmes.

  8. The application of workflows to digital heritage systems

    OpenAIRE

    Al-Barakati, Abdullah

    2012-01-01

    Digital heritage systems usually handle a rich and varied mix of digital objects, accompanied by complex and intersecting workflows and processes. However, they usually lack effective workflow management within their components, as is evident in the lack of integrated solutions that include workflow components. There are a number of reasons for this limitation in workflow management utilization, including some technical challenges, the unique nature of each digital resource and the challenges impo...

  9. Perception of performance management system by academic staff in an open distance learning higher education environment

    Directory of Open Access Journals (Sweden)

    Esther M. Maimela

    2016-02-01

    Full Text Available Orientation: Institutions of higher learning in South Africa are fast embracing performance management systems (PMS) as a mechanism for the achievement of teaching excellence and enhancement of research productivity. However, the literature provides evidence that the application of PMS in the private sector has failed to drive competition, efficiency and productivity. Research purpose: The main purpose of this article was to evaluate the perception of academic staff members of an open distance learning institution regarding the implementation of a PMS. Motivation for the study: PMS as a mechanism through which the performance of academics is measured has been described as inconsistent with the long tradition of academic freedom, scholarship and collegiality in the academy. Moreover, previous research on the implementation of PMS was limited to private sector organisations, resulting in a dearth of empirical literature relating to its practice in service-driven public sector institutions. Research design, approach and method: The article adopted a quantitative research approach using census survey methodology. Data were collected from 492 academic staff from the surveyed institution using a self-developed questionnaire that was tested for high content validity, with a consolidated Cronbach’s alpha value of 0.83. Data were analysed using a one-sample t-test because of the one-measurement nature of the variable under investigation. Main findings: Major findings of the study indicated that respondents were satisfied with the implementation of the PMS by management. However, the payment of performance bonuses was not considered sufficiently motivating, thus necessitating a pragmatic review by management. Practical/managerial implications: The findings of this article provide a practical guide to managers on the implementation and management of PMS as an employee performance reward mechanism in non-profit and service-oriented organisations.

  10. Open-source Peer-to-Peer Environment to Enable Sensor Web Architecture: Application to Geomagnetic Observations and Modeling

    Science.gov (United States)

    Holland, M.; Pulkkinen, A.

    2007-12-01

    A flexible, dynamic, and reliable secure peer-to-peer (P2P) communication environment is under development at NASA's Goddard Space Flight Center (GSFC). Popular open-source P2P software technology provides a self-organizing, self-healing ad hoc "virtual network overlay" protocol-suite. The current effort builds a proof-of-concept geomagnetic Sensor Web upon this foundation. Our long-term objective is to enable an evolution of many types of distributed Earth system sensors and related processing/storage components into elements of an operational Sensor Web via integration into this P2P Environment. In general, the Environment distributes data communication tasks among the sensors (viewed as peers, each assigned a peer-role) and controls the flow of data. This work encompasses dynamic discovery, monitoring, control, and configuration as well as autonomous operations, real-time modeling and data processing, and secure ubiquitous communications. We currently restrict our communications to the secure GSFC network environment, and have integrated "simulated" (via historical data) geomagnetic sensors. Each remote sensor has operating modes that can be managed from remote interfaces and is designed to have features nearly indistinguishable from a live magnetometer. We have implemented basic identity management features (organized around GSFC identity-management practices), providing mechanisms which restrict data-serving privileges to authorized users and allow improved trust and accountability among users of the Environment. Data-serving peers digitally "sign" their services, and their data-browsing counterparts will only accept the products of services whose signature (and hence identity) can be verified. The current usage scenario involves modeling-peers, which operate within the same Environment as the sensors and likewise have remotely manageable operating modes, portraying a near-real-time global representation of geomagnetic activity from dynamic sensor
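
    The sign-then-verify exchange between data-serving and data-browsing peers can be sketched with a modern signature primitive. The snippet below uses Ed25519 from the Python cryptography package purely for illustration; the abstract does not name the signature scheme the GSFC Environment actually uses, and the payload is invented.

    ```python
    # Illustration of peers signing served products and counterparts
    # verifying before acceptance; scheme and payload are stand-ins.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    payload = b'{"station": "demo-mag-1", "bz_nT": -4.2}'  # hypothetical record

    private_key = Ed25519PrivateKey.generate()  # data-serving peer's key
    signature = private_key.sign(payload)       # attached to the product

    public_key = private_key.public_key()       # published with the service
    try:                                        # data-browsing peer's check
        public_key.verify(signature, payload)
        print("signature valid: product accepted")
    except InvalidSignature:
        print("signature invalid: product rejected")
    ```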

  11. Reduction of Hospital Physicians' Workflow Interruptions: A Controlled Unit-Based Intervention Study

    Directory of Open Access Journals (Sweden)

    Matthias Weigl

    2012-01-01

    Full Text Available Highly interruptive clinical environments may cause work stress and suboptimal clinical care. This study features an intervention to reduce workflow interruptions by re-designing work and organizational practices in hospital physicians providing ward coverage. A prospective, controlled intervention was conducted in two surgical and two internal wards. The intervention was based on physician quality circles - a participative technique to involve employees in the development of solutions to overcome work-related stressors. Outcome measures were the frequency of observed workflow interruptions. Workflow interruptions by fellow physicians and nursing staff were significantly lower after the intervention. However, a similar decrease was also observed in control units. Additional interviews to explore process-related factors suggested that there might have been spill-over effects in the sense that solutions were not strictly confined to the intervention group. Recommendations for further research on the effectiveness and consequences of such interventions for professional communication and patient safety are discussed.

  12. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B.; Sánchez, G.

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  13. Workflows for automated downstream data analysis and visualization in large-scale computational mass spectrometry.

    Science.gov (United States)

    Aiche, Stephan; Sachsenberg, Timo; Kenar, Erhan; Walzer, Mathias; Wiswedel, Bernd; Kristl, Theresa; Boyles, Matthew; Duschl, Albert; Huber, Christian G; Berthold, Michael R; Reinert, Knut; Kohlbacher, Oliver

    2015-04-01

    MS-based proteomics and metabolomics are rapidly evolving research fields driven by the development of novel instruments, experimental approaches, and analysis methods. Monolithic analysis tools perform well on single tasks but lack the flexibility to cope with the constantly changing requirements and experimental setups. Workflow systems, which combine small processing tools into complex analysis pipelines, allow custom-tailored and flexible data-processing workflows that can be published or shared with collaborators. In this article, we present the integration of established tools for computational MS from the open-source software framework OpenMS into the workflow engine Konstanz Information Miner (KNIME) for the analysis of large datasets and production of high-quality visualizations. We provide example workflows to demonstrate combined data processing and visualization for three diverse tasks in computational MS: isobaric mass tag based quantitation in complex experimental setups, label-free quantitation and identification of metabolites, and quality control for proteomics experiments. © 2015 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Compact and robust open-loop fiber-optic gyroscope for applications in harsh environments

    Science.gov (United States)

    Moslehi, Behzad M.; Yahalom, Ram; Faridian, Ferey; Black, Richard J.; Taylor, Edward W.; Ooi, Teng; Corder, Aaron

    2010-09-01

    Next generation navigation systems demand performance enhancements to support new applications with longer range capabilities, provide robust operation in severe thermal and vibration environments while simultaneously reducing weight, size and power dissipation. Compact, inexpensive, advanced guidance components are essential for such applications. In particular, Inertial Reference Units (IRUs) that can provide high-resolution stabilization and accurate inertial pointing knowledge are needed. For space applications, an added requirement is radiation hardening up to 300 krad over 5 to 15 years. Manufacturing specifications for the radiation-induced losses are not readily available and empirical test data is required for all components in order to optimize the system performance. Interferometric Fiber-Optic Gyroscopes (IFOGs) have proven to be a leading technology for tactical and navigational systems. The sensors have no moving parts. This ensures high reliability and a long life compared to the mechanical gyroscopes and dithered ring laser gyroscopes. However, the available architectures limit the potential size and cost of the IFOG. The work reported here describes an innovative approach for the design, fabrication, and testing of the IFOG and enables the production of a small, robust and low cost gyro with excellent noise and bandwidth characteristics with high radiation tolerance. The development is aimed at achieving a sensor volume architecture, where the light source, electronics and receiver are integrated in an external package, while the sensor head is integrated in a robust and environmentally rigid package. The sensor package design is compatible with the most severe environmental requirements foreseen for the target applications. This paper presents the current state-of-the-art performance of the prototype gyros and the potential for further reduction of size with improved performance. The gyro sample and data rates are extremely high and can be close

  15. Information and Communication Technologies in Schools A Handbook for Teachers or How ICT Can Create New, Open Learning Environments

    Directory of Open Access Journals (Sweden)

    Ramazan Güzel

    2017-02-01

    Full Text Available Information and Communication Technologies in Schools, a Handbook for Teachers or How ICT Can Create New, Open Learning Environments delivers a very detailed presentation of the utilization of ICT in education. This publication is a very good resource for teachers and teacher educators. In reviewing this book, the first thing that attracts the reader's attention is the layout of the publication. Content, organization, and reference sources are more than sufficient for a publication that aims to help teachers form new, open learning environments with ICT. However, the cover page image and the watermark image in the first nine pages are not very relevant to the use of ICT in education. The globe in the UNESCO Headquarters garden and the Eiffel Tower have little to do with ICT; instead of this image, a more suitable one could have been selected. This publication allows the reader to easily follow the use of ICT in the classroom by giving authentic examples. The book is divided into seven chapters, and the first chapter starts with background information on ICT. The second chapter explains in detail the ICT tools used for education. Some tools mentioned in this chapter under the storage heading have already become outdated, which shows how fast technology changes and how quickly it renders older technology obsolete. The third chapter discusses the change in the learning environment brought about by the use of ICT, examining it from the teachers' and students' points of view. The fourth chapter proposes new pedagogical methods in learning and teaching. In my opinion, this chapter is the foremost part of the publication: it explains the organization of the learning process with the use of ICT, and its examples can easily be implemented in classrooms. The fifth chapter describes the place of ICT in school learning activities and also defines how to structure ICT in school curricula. It gives very good examples, but these examples do not relate directly to the teachers because

  16. Applying Idea Management System (IMS) Approach to Design and Implement a Collaborative Environment in Public Service Related Open Innovation Processes

    Directory of Open Access Journals (Sweden)

    Marco Alessi

    2015-12-01

    Full Text Available Novel ideas are the key ingredients of innovation processes, and an Idea Management System (IMS) plays a prominent role in managing ideas captured from external stakeholders and internal actors within an Open Innovation process. Considering a specific case study, Lecce, Italy, we have designed and implemented a collaborative environment which provides an ideal platform for government, citizens, etc. to share ideas and co-create the value of innovative public services in Lecce. In this study, the application of IMS with six main steps, including idea generation, idea improvement, idea selection, refinement, idea implementation, and monitoring, shows that this remarkably helps service providers to exploit the intellectual capital and initiatives of the regional stakeholders and citizens, and assists service providers in staying in line with the needs of society. Moreover, we have developed two support tools to foster collaboration and transparency: a sentiment analysis tool and a gamification application.

  17. Open public spaces and street furniture: the potential for increased use of photovoltaics in the built environment

    Energy Technology Data Exchange (ETDEWEB)

    Abbate-Gardner, C. [Officine di Architettura, Rome (Italy)]

    1996-07-01

    The trend toward industrializing architectural components, the increasing complexity and multifunctional purpose of buildings and the concern for the CO2 emissions common to our cities are pushing design research to experiment with new environmentally friendly construction technology. Current experiments in integrating photovoltaic (PV) systems in buildings and the built environment have already been proven to offer numerous advantages. This article focuses on the notable flexibility and adaptability of PV integration in urban structures due to the features of its industrial components. To illustrate this point, I should like to offer a brief overview of some selected examples of the use of PV in public open spaces to demonstrate that it is possible to achieve a positive integration with its environmental context, while enhancing the architectural quality of the PV material and respecting its technological efficiency. (author)

  18. A New Open Data Open Modeling Framework for the Geosciences Community (Invited)

    Science.gov (United States)

    Liang, X.; Salas, D.; Navarro, M.; Liang, Y.; Teng, W. L.; Hooper, R. P.; Restrepo, P. J.; Bales, J. D.

    2013-12-01

    A prototype Open Hydrospheric Modeling Framework (OHMF), also called Open Data Open Modeling framework, has been developed to address two key modeling challenges faced by the broad research community: (1) accessing external data from diverse sources and (2) execution, coupling, and evaluation/intercomparison of various and complex models. The former is achieved via the Open Data architecture, while the latter is achieved via the Open Modeling architecture. The Open Data architecture adopts a common internal data model and representation, to facilitate the integration of various external data sources into OHMF, using Data Agents that handle remote data access protocols (e.g., OPeNDAP, Web services), metadata standards, and source-specific implementations. These Data Agents hide the heterogeneity of the external data sources and provide a common interface to the OHMF system core. The Open Modeling architecture allows different models or modules to be easily integrated into OHMF. The OHMF architectural design offers a general many-to-many connectivity between individual models and external data sources, instead of one-to-one connectivity from data access to model simulation results. OHMF adopts a graphical scientific workflow, offers tools to re-scale in space and time, and provides multi-scale data fusion and assimilation functionality. Notably, the OHMF system employs a strategy that does not require re-compiling or adding interface codes for a user's model to be integrated. Thus, a corresponding model agent can be easily developed by a user. Once an agent is available for a model, it can be shared and used by others. An example will be presented to illustrate the prototype OHMF system and the automatic flow from accessing data to model simulation results in a user-friendly workflow-controlled environment.
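
    The Data Agent pattern, one common interface with a protocol-specific implementation hidden behind it, is straightforward to sketch. Class and method names below are hypothetical; OHMF's actual internal data model and agent contract are richer than this.

    ```python
    # Sketch of the Data Agent pattern: the system core sees one common
    # interface, and each agent hides a source-specific protocol.
    from abc import ABC, abstractmethod

    class DataAgent(ABC):
        """Common interface the framework core talks to."""
        @abstractmethod
        def fetch(self, variable, bbox, t0, t1):
            """Return data in the common internal representation."""

    class OpendapAgent(DataAgent):
        def fetch(self, variable, bbox, t0, t1):
            # A real agent would issue an OPeNDAP request here.
            return {"source": "opendap", "variable": variable, "values": []}

    class WebServiceAgent(DataAgent):
        def fetch(self, variable, bbox, t0, t1):
            # A real agent would call a Web service and convert the reply.
            return {"source": "webservice", "variable": variable, "values": []}

    def run_model(agent):
        # The core neither knows nor cares which protocol is behind agent.
        forcing = agent.fetch("precip", (-80, 35, -75, 40),
                              "2013-01-01", "2013-02-01")
        print("driving model with", forcing["source"], "data")

    run_model(OpendapAgent())
    run_model(WebServiceAgent())
    ```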

  19. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of

  20. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

    Full Text Available Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are: current documentation processes are inadequate (with information recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to compensate for unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings on current work for the development of more effective interventions for acute care resuscitation scenarios.

  1. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries GS1 standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market is provided.

  2. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
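
    A drastically simplified sketch of the provisioning decision the framework automates, choosing a mix of resource classes that meets a deadline within a budget, follows. The throughput and cost figures are invented, and the greedy policy shown is only one of many an autonomic manager could apply.

    ```python
    # Greedy resource-mix selection under deadline and budget constraints;
    # class names and per-class figures are hypothetical.
    RESOURCE_CLASSES = [
        {"name": "hpc_node",  "tasks_per_hour": 40, "cost_per_hour": 0.0},
        {"name": "ec2_large", "tasks_per_hour": 25, "cost_per_hour": 0.34},
        {"name": "ec2_small", "tasks_per_hour": 10, "cost_per_hour": 0.085},
    ]

    def provision(tasks, deadline_h, budget, max_per_class=8):
        """Cheapest throughput first, until the deadline can be met."""
        needed = tasks / deadline_h          # required aggregate tasks/hour
        mix, rate, hourly = {}, 0.0, 0.0
        for rc in sorted(RESOURCE_CLASSES,
                         key=lambda r: r["cost_per_hour"] / r["tasks_per_hour"]):
            while rate < needed and mix.get(rc["name"], 0) < max_per_class:
                if (hourly + rc["cost_per_hour"]) * deadline_h > budget:
                    break                    # this class would bust the budget
                mix[rc["name"]] = mix.get(rc["name"], 0) + 1
                rate += rc["tasks_per_hour"]
                hourly += rc["cost_per_hour"]
        if rate < needed:
            return None                      # infeasible within constraints
        return mix, hourly * deadline_h

    print(provision(tasks=600, deadline_h=2.0, budget=5.0))
    # ({'hpc_node': 8}, 0.0) with the figures above
    ```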

  3. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide application in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The results of all experimental performance tests demonstrate that auto-scaling is applicable, with clear benefits in response time and cloud utilization. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.
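
    At its core, the scale-out/scale-in behaviour under study is a control loop that compares a load metric against thresholds once per evaluation period. In a real OpenStack deployment the policy would live in Heat scaling groups driven by telemetry alarms; the loop below is a self-contained illustration with invented thresholds and a simulated metric.

    ```python
    # Threshold-based auto-scaling control loop; the thresholds and the
    # metric source are illustrative only.
    import random

    def cpu_utilisation(servers):
        # Stand-in for a telemetry query: mean CPU % across the servers.
        return sum(random.uniform(20, 95) for _ in range(servers)) / servers

    servers, MIN_S, MAX_S = 1, 1, 8
    SCALE_OUT_AT, SCALE_IN_AT = 75.0, 30.0   # percent; hypothetical

    for tick in range(10):                   # one evaluation period per tick
        load = cpu_utilisation(servers)
        if load > SCALE_OUT_AT and servers < MAX_S:
            servers += 1                     # scale out: launch a server
        elif load < SCALE_IN_AT and servers > MIN_S:
            servers -= 1                     # scale in: terminate a server
        print(f"tick {tick}: load {load:5.1f}% -> {servers} server(s)")
    ```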

  4. Opening Reproducible Research

    Science.gov (United States)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Executable Research Compendia (ERCs) not only allow readers to reproduce the original research and hence recreate the original research results (figures, tables), but also facilitate interaction with them as well as their recombination with new data or methods. Building on existing open standards and software, this project develops standards and tools for ERCs, and will demonstrate and evaluate these, focusing on the geosciences domain. The project goes beyond a technical solution for ERCs by evaluating the system from the perspectives of geoscience researchers as participants in a scientific publication process. It will focus on the statistical environment R, but also evaluate larger run-time systems captured in virtual environments (Docker containers). ERCs are built upon and integrate well with both established day-to-day workflows of digital research and the scientific publication process. They make research accessible on different levels at any stage to anyone via open web platforms. Other scientists can transfer a compendium of software and tools to their own local environment and collaborate, while others make minimal changes and compare changed results in a web browser. Building on recent advances in mainstream IT, ORR envisions a new architecture for storing, executing and interacting with the original analysis environment alongside the corresponding research data and text. ORR bridges the gap between long-term archives, practical geoscience researchers, and publication media. Consequently, the project team seeks input and feedback from researchers working with geospatial data to ensure usable and useful open access publications as well as a publication process that minimizes effort while maximizing usability and re-usability. References: Pebesma, E., D. Nüst, R. Bivand, 2012. The R software environment in reproducible geoscientific research. Eos, Transactions American Geophysical Union 93(16), 163-164. http://dx.doi.org/10.1029/2012EO160003. Opening Reproducible Research project description and website

  5. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
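
    Since each Pilot job is a directed acyclic graph of tasks, the service's core scheduling question is which tasks may be dispatched next. A minimal sketch using Kahn's algorithm follows; the job-description format is illustrative, as the abstract does not give Pilot's wire format.

    ```python
    # Ordering the tasks of a DAG-shaped workflow job for dispatch;
    # task names and the description format are hypothetical.
    job = {
        "tasks": {
            "stage_in": {"depends": []},
            "simulate": {"depends": ["stage_in"]},
            "postproc": {"depends": ["simulate"]},
            "plots":    {"depends": ["postproc"]},
            "archive":  {"depends": ["postproc"]},
        }
    }

    def dispatch_order(tasks):
        """Topological order via Kahn's algorithm; raises on cycles."""
        pending = {name: set(t["depends"]) for name, t in tasks.items()}
        order = []
        while pending:
            ready = sorted(n for n, deps in pending.items() if not deps)
            if not ready:
                raise ValueError("job is not a DAG")
            for n in ready:                  # each would go to a WS-GRAM CE
                order.append(n)
                del pending[n]
            for deps in pending.values():
                deps.difference_update(ready)
        return order

    print(dispatch_order(job["tasks"]))
    # ['stage_in', 'simulate', 'postproc', 'archive', 'plots']
    ```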

  6. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.
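
    Dynamic Time Warping, one of the two methods named above, aligns two executions of the same procedure performed at different speeds. A minimal one-dimensional kernel is shown below; the actual system warps multidimensional signal vectors and layers Hidden Markov Models on top.

    ```python
    # Minimal dynamic time warping (DTW) distance between two 1-D signals.
    def dtw(a, b):
        INF = float("inf")
        n, m = len(a), len(b)
        D = [[INF] * (m + 1) for _ in range(n + 1)]
        D[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                # best of match, insertion, deletion
                D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
        return D[n][m]

    # Two recordings of the same phase sequence at different speeds
    # (0, 1, 2 encode hypothetical tool-usage states):
    fast = [0, 0, 1, 1, 2, 0]
    slow = [0, 0, 0, 1, 1, 1, 1, 2, 2, 0]
    print(dtw(fast, slow))  # 0.0: identical up to warping
    ```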

  7. Applying direct observation to model workflow and assess adoption.

    Science.gov (United States)

    Unertl, Kim M; Weinger, Matthew B; Johnson, Kevin B

    2006-01-01

    Lack of understanding about workflow can impair health IT system adoption. Observational techniques can provide valuable information about clinical workflow. A pilot study using direct observation was conducted in an outpatient chronic disease clinic. The goals of the study were to assess workflow and information flow and to develop a general model of workflow and information behavior. Over 55 hours of direct observation showed that the pilot site utilized many of the features of the informatics systems available to them, but also employed multiple non-electronic artifacts and workarounds. Gaps existed between clinic workflow and informatics tool workflow, as well as between institutional expectations of informatics tool use and actual use. Concurrent use of both paper-based and electronic systems resulted in duplication of effort and inefficiencies. A relatively short period of direct observation revealed important information about workflow and informatics tool adoption.

  8. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  9. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. It describes the step-by-step process by which image data are received at LLNL, then processed and made available to authorized personnel and collaborators. Throughout this document, references are made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  10. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium)]; Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)]

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  11. Reputation-controlled business process workflows

    OpenAIRE

    Aziz, Benjamin; Hamilton, G

    2013-01-01

    This paper presents a model solution for controlling the execution of BPEL business processes based on reputation constraints at the level of the services, the service providers and the BPEL workflow. The reputation constraints are expressed as part of a service level agreement and are then enforced at runtime by a reputation monitoring system. We use our model to demonstrate how trust requirements based on such reputation constraints can be upheld in a real world example of a distributed map...

  12. Quantifying nursing workflow in medication administration.

    Science.gov (United States)

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  13. EDMS based workflow for Printing Industry

    OpenAIRE

    Prathap Nayak; Anuradha Rao; Ramakrishna Nayak

    2013-01-01

    Information is an indispensable factor of any enterprise. It can be a record or a document generated for every transaction that is made, either paper-based or in electronic format, kept for future reference. A printing industry is one in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since each process, from the least bit of information to a printed product, is always depende...

  14. From chart tracking to workflow management.

    Science.gov (United States)

    Srinivasan, P; Vignes, G; Venable, C; Hazelwood, A; Cade, T

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system, and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system.
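
    The token game underlying the Petri net model is compact enough to sketch directly. The toy net below walks a paper chart through a clinic; place and transition names are hypothetical, and a real Parameterized Petri Net adds its parameterization on top of this basic firing rule.

    ```python
    # Minimal marked Petri net with the standard firing rule; the net
    # models a toy chart-tracking cycle and all names are hypothetical.
    net = {  # transition -> (input places, output places)
        "pull_chart":   (["archive"],    ["front_desk"]),
        "see_patient":  (["front_desk"], ["exam_room"]),
        "refile_chart": (["exam_room"],  ["archive"]),
    }
    marking = {"archive": 1, "front_desk": 0, "exam_room": 0}

    def enabled(t):
        return all(marking[p] > 0 for p in net[t][0])

    def fire(t):
        assert enabled(t), f"{t} is not enabled"
        for p in net[t][0]:
            marking[p] -= 1   # consume one token per input place
        for p in net[t][1]:
            marking[p] += 1   # produce one token per output place

    for t in ["pull_chart", "see_patient", "refile_chart"]:  # animate
        fire(t)
        print(t, "->", marking)
    ```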

  15. IQ-Station: A Low Cost Portable Immersive Environment

    Energy Technology Data Exchange (ETDEWEB)

    Eric Whiting; Patrick O'Leary; William Sherman; Eric Wernert

    2010-11-01

    The emergence of inexpensive 3D TVs, affordable input and rendering hardware and open-source software has created a yeasty atmosphere for the development of low-cost immersive environments (IE). A low cost IE system, or IQ-station, fashioned from commercial off the shelf technology (COTS), coupled with a targeted immersive application, can be a viable laboratory instrument for enhancing scientific workflow for exploration and analysis. The use of an IQ-station in a laboratory setting also has the potential of quickening the adoption of a more sophisticated immersive environment as a critical enabler in modern scientific and engineering workflows. Prior work in immersive environments generally required either a head mounted display (HMD) system or a large projector-based implementation, both of which have limitations in terms of cost, usability, or space requirements. The solution presented here provides an alternative platform providing a reasonable immersive experience that addresses those limitations. Our work brings together the needed hardware and software to create a fully integrated immersive display and interface system that can be readily deployed in laboratories and common workspaces. By doing so, it is now feasible for immersive technologies to be included in researchers’ day-to-day workflows. The IQ-Station sets the stage for much wider adoption of immersive environments outside the small communities of virtual reality centers.

  16. OpenGL Programming under Win32 Environment

    Institute of Scientific and Technical Information of China (English)

    郑竞华

    2006-01-01

    OpenGL is the widely recognized standard for 3D graphics development, and it is an ideal choice for developing high-quality, high-performance 3D graphics applications in the Win32 environment. This paper briefly introduces the concrete steps for performing graphics operations with OpenGL under Win32, and describes in detail the technical issues involved in OpenGL programming in this environment, such as associating the device context with the rendering context, graphics display and redraw, and real-time animation and interactive operation. A simple animation routine is written to give a fairly complete demonstration of OpenGL programming methods in the Win32 environment.

  17. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment, and its relevance for communication scholarship and activities.

  18. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    Directory of Open Access Journals (Sweden)

    Oberg Ryan

    2011-04-01

    Full Text Available Abstract Background Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. Results We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard, the XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions We

  19. Implementation of workflow engine technology to deliver basic clinical decision support functionality.

    Science.gov (United States)

    Huser, Vojtech; Rasmussen, Luke V; Oberg, Ryan; Starren, Justin B

    2011-04-10

    Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard, the XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. We describe an implementation of a free workflow technology
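
    Since XPDL is plain XML, the editor/engine split described above can be illustrated with a short Python sketch that loads a process definition and enumerates its activities and transitions. The file name and the XPDL 2.1 namespace URI are assumptions for illustration; this is not the authors' editor or engine.

    ```python
    # Minimal sketch: load an XPDL process definition and list its
    # activities and transitions. Assumes XPDL 2.x element names; the
    # file name and namespace URI are illustrative, not taken from the
    # paper's implementation.
    import xml.etree.ElementTree as ET

    XPDL_NS = {"xpdl": "http://www.wfmc.org/2008/XPDL2.1"}  # assumed namespace

    def load_process(path):
        root = ET.parse(path).getroot()
        for process in root.iterfind(".//xpdl:WorkflowProcess", XPDL_NS):
            print("Process:", process.get("Name"))
            for activity in process.iterfind(".//xpdl:Activity", XPDL_NS):
                print("  Activity:", activity.get("Id"), activity.get("Name"))
            for transition in process.iterfind(".//xpdl:Transition", XPDL_NS):
                print("  Transition:", transition.get("From"), "->", transition.get("To"))

    load_process("clinical_rule.xpdl")
    ```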

  20. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    Science.gov (United States)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year at RSNA we presented an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for offsite distribution. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.
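
    The query/retrieve pattern described above can be sketched with the open-source pynetdicom library, which is not what the authors' PDA software used but implements the same DICOM services. The host, port, and AE titles below are placeholders.

    ```python
    # Sketch of the DICOM Query/C-Move pattern described above, using the
    # open-source pynetdicom library (not the authors' PDA software).
    # Host, port, and AE titles are placeholders.
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

    ae = AE(ae_title="PDA_GATEWAY")
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

    query = Dataset()
    query.QueryRetrieveLevel = "STUDY"
    query.PatientID = "12345"

    assoc = ae.associate("pacs.example.org", 104)
    if assoc.is_established:
        # Ask the archive to push the study to the CD-burning station's AE.
        for status, _ in assoc.send_c_move(query, "CD_BURNER",
                                           StudyRootQueryRetrieveInformationModelMove):
            if status:
                print("C-MOVE status: 0x{0:04X}".format(status.Status))
        assoc.release()
    ```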

  1. Open intramedullary nailing for segmental long bone fractures: An effective alternative in a resource-restricted environment

    Directory of Open Access Journals (Sweden)

    Olasunkanmi M Babalola

    2016-01-01

    Full Text Available Background: Closed, locked intramedullary nailing has been accepted as the gold standard in the care of femoral fractures, with reported union rates as high as 98-100%. Closed, locked intramedullary nailing often requires expensive equipment, which is a challenge in developing countries. Segmental long bone fractures are often the result of high-energy trauma and hence are often associated with extensive injury to the surrounding soft tissues, which consequently results in higher rates of delayed union or nonunion. This study was proposed to review the outcome of management of segmental fractures with locked intramedullary nails, using an open method of reduction. Methods: A retrospective analysis was made of data obtained from all segmental long bone fractures treated with intramedullary nailing over a 1-year period. Records were retrieved from the folders of patients operated on from January 2011 to December 2011. Patients were followed up for a minimum of 1 year after surgery. Results: We managed a total of 12 segmental long bone fractures in 11 patients. Eight of the 12 fractures were femoral fractures, and 10 of the fractures were closed fractures. All but one fracture (91.7%) achieved union within 4 months with no major complications. Conclusions: The open method of locked intramedullary nailing achieves satisfactory results when used for the management of long bone fractures. The method can be used for segmental fractures of the humerus, femur, and tibia, with high union rates. This is particularly useful in low-income societies where intraoperative imaging may be unavailable or unaffordable. It gives patients in such societies a chance at comparable outcomes in terms of union rates as well as avoidance of major complications. Larger prospective studies will be necessary to conclusively validate the efficacy of this fixation method in this environment.

  2. The Distributed Workflow Management System--FlowAgent

    Institute of Scientific and Technical Information of China (English)

    王文军; 仲萃豪

    2000-01-01

    While mainframe or 2-tier client/server systems have serious problems in flexibility and scalability for large-scale business processes, 3-tier client/server architecture and object-oriented system modeling, which construct business processes from service components, seem to bring software systems some scalability. As an enabling infrastructure for object-oriented methodology, a distributed WFMS (Workflow Management System) can flexibly describe business rules among autonomous 'service tasks' and support the scalability of large-scale business processes. But current distributed WFMSs still have difficulty managing a large number of distributed tasks; the 'multi-TaskDomain' architecture of FlowAgent tries to solve this problem and provides a dynamic and distributed environment for task scheduling.

  3. A Model of Workflow-oriented Attribute Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    Full Text Available The emergence of the “Internet of Things” (IoT) breaks with traditional thinking by integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a large number of resources and much information in the IoT, so the computing and processing of information is the core support of the IoT. In this paper, we introduce “Service-Oriented Computing” (SOC) to solve this problem, whereby each device can offer its functionality as standard services. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attribute Based Access Control (WABAC) and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attributes, resource attributes, environment attributes, and the current task, meeting the access control requirements of SOC. The presented approach can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
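
    A toy Python rendering of the decision function such a model implies may help: a permission is granted only when subject, resource, and environment attributes together with the current workflow task satisfy a policy rule. All attribute names and the sample rule are invented for illustration; they are not taken from the paper.

    ```python
    # Toy sketch of a WABAC-style decision: grant a permission only when
    # subject, resource, and environment attributes AND the current
    # workflow task satisfy a policy rule. All attribute names and the
    # sample rule are illustrative, not from the paper.
    def permit(subject, resource, environment, task, rules):
        return any(rule(subject, resource, environment, task) for rule in rules)

    # Example rule: a nurse may read a sensor reading during the
    # "monitoring" task, but only from inside the hospital network.
    def nurse_read_rule(subject, resource, environment, task):
        return (subject.get("role") == "nurse"
                and resource.get("type") == "sensor_reading"
                and environment.get("network") == "internal"
                and task == "monitoring")

    print(permit({"role": "nurse"},
                 {"type": "sensor_reading"},
                 {"network": "internal"},
                 "monitoring",
                 [nurse_read_rule]))  # True
    ```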

  4. Using Simulations to Integrate Technology into Health Care Aides' Workflow

    Directory of Open Access Journals (Sweden)

    Sharla King

    2013-07-01

    Full Text Available Health care aides (HCAs) are critical to home care, providing a range of services to people who have chronic conditions, are aging, or are unable to care for themselves independently. The current HCA supply will not keep up with this increasing demand without fundamental changes in their work environment. One possible solution to some of the workflow challenges and workplace stress of HCAs is hand-held tablet technology. To introduce the use of tablets with HCAs, simulations were developed. Once an HCA was comfortable with the tablet, a simulated client was introduced. The HCA interacted with the simulated client and used the tablet applications to assist in providing care. After the simulations, the HCAs participated in a focus group. HCAs completed a survey before and after the tablet training and simulation to determine their perception and acceptance of the tablet. Future deployment and implementation of technologies in home care should be further evaluated for outcomes.

  5. A Low-Cost Rescheduling Policy for Efficient Mapping of Workflows on Grid Systems

    Directory of Open Access Journals (Sweden)

    Rizos Sakellariou

    2004-01-01

    Full Text Available Workflow management is emerging as an important service in Grid computing. A simple model that can be used for the representation of certain workflows is the directed acyclic graph. Although many heuristics have been proposed to schedule such graphs on heterogeneous environments, most of them assume accurate prediction of computation and communication costs. This limits their direct applicability to a dynamically changing environment such as the Grid. In this environment, an initial schedule may be built based on estimates, but run-time rescheduling may be needed to improve application performance. This paper presents a low-cost rescheduling policy, which considers rescheduling at a few, carefully selected points during the execution. This policy achieves performance results comparable with those achieved by a policy that dynamically attempts to reschedule before the execution of every task.
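
    The gist of the policy, rescheduling only at a few selected points, can be sketched as follows: run tasks from a static plan, accumulate the gap between observed and estimated run times, and re-plan the remaining tasks only when that gap crosses a tolerance. The scheduling heuristic, the delay measure, and the tolerance are illustrative stand-ins, not the paper's exact policy.

    ```python
    # Sketch of selective rescheduling: execute a task list, compare
    # observed vs. estimated run times, and re-plan the remaining tasks
    # only when the accumulated delay crosses a tolerance. Estimates,
    # the delay measure, and list_schedule() are illustrative stand-ins.
    import random

    def list_schedule(tasks):
        # Stand-in for a static DAG scheduling heuristic (e.g., list scheduling).
        return sorted(tasks, key=lambda t: -t["estimate"])

    def run_selective(tasks, tolerance=0.2):
        plan, delay, reschedules = list_schedule(tasks), 0.0, 0
        while plan:
            task, plan = plan[0], plan[1:]
            actual = task["estimate"] * random.uniform(0.5, 2.0)  # simulated run
            delay += actual - task["estimate"]
            if delay > tolerance * sum(t["estimate"] for t in plan):
                plan = list_schedule(plan)  # reschedule only at this point
                delay, reschedules = 0.0, reschedules + 1
        return reschedules

    tasks = [{"id": i, "estimate": random.uniform(1, 10)} for i in range(20)]
    print("reschedules triggered:", run_selective(tasks))
    ```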

  6. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S.; van der Aalst, Wil M.P.; Bakker, Piet J.M.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net (CWN) are used to close this gap. The care process of the Academic Medical Center (AMC) hospital is used as the reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems...

  7. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  8. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).
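
    Saved Galaxy workflows of this kind can also be driven outside the browser, for instance with the open-source BioBlend client for the Galaxy API. A hedged sketch follows, in which the server URL, API key, workflow name, file name, and input-step label are all placeholders.

    ```python
    # Sketch of running a saved Galaxy workflow programmatically with the
    # open-source BioBlend client. The URL, API key, file name, workflow
    # name, and input label are placeholders, not from the paper.
    from bioblend.galaxy import GalaxyInstance

    gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

    history = gi.histories.create_history(name="effector-screen")
    upload = gi.tools.upload_file("candidate_proteins.fasta", history["id"])
    dataset_id = upload["outputs"][0]["id"]

    workflow = gi.workflows.get_workflows(name="effector-prediction")[0]
    inputs = {"0": {"src": "hda", "id": dataset_id}}  # map workflow input step 0
    gi.workflows.invoke_workflow(workflow["id"], inputs=inputs,
                                 history_id=history["id"])
    ```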

  9. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    Virtual observatories have matured beyond their original domain and are becoming common practice for earth observation research and policy building. The term Virtual Observatory originally came from the astronomical research community, where virtual observatories provide universal access to the available astronomical data archives of space and ground-based observatories. Furthermore, as those virtual observatories aim at integrating heterogeneous resources provided by a number of participating organizations, the virtual observatory acts as a coordinating entity that strives for common data analysis techniques and tools based on common standards. The Sensor Web is on its way to becoming one of the major virtual observatories outside the astronomical research community. Like the original observatory, which consists of a number of telescopes, each observing a specific part of the wave spectrum, together with a collection of astronomical instruments, the Sensor Web provides a multi-eyed perspective on the current, past, and future situation of our planet and its surrounding spheres. The current view of the Sensor Web is that of a single worldwide collaborative, coherent, consistent and consolidated sensor data collection, fusion and distribution system. The Sensor Web can perform as an extensive monitoring and sensing system that provides timely, comprehensive, continuous and multi-mode observations. This technology is key to monitoring and understanding our natural environment, including key areas such as climate change, biodiversity, or natural disasters on local, regional, and global scales. The Sensor Web concept has been well established through ongoing global research and deployment of Sensor Web middleware and standards, and represents the foundation layer of systems like the Global Earth Observation System of Systems (GEOSS). The Sensor Web consists of a huge variety of physical and virtual sensors as well as observational data, made available on the Internet at standardized

  10. Elastic Spatial Query Processing in OpenStack Cloud Computing Environment for Time-Constraint Data Analysis

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-03-01

    Full Text Available Geospatial big data analysis (GBDA) is extremely significant for time-constraint applications such as disaster response. However, time-constraint analysis is not yet a trivial task in the cloud computing environment. Spatial query processing (SQP) is typically computation-intensive and indispensable for GBDA, and the spatial range query, join query, and nearest neighbor query algorithms are not scalable without using MapReduce-like frameworks. Parallel SQP algorithms (PSQPAs) are prone to skewed processing, a known issue in geoscience. To satisfy time-constrained GBDA, we propose an elastic SQP approach in this paper. First, Spark is used to implement PSQPAs. Second, Kubernetes-managed Core Operation System (CoreOS) clusters provide self-healing Docker containers for running Spark clusters in the cloud. Spark-based PSQPAs are submitted to Docker containers, where Spark master instances reside. Finally, the horizontal pod auto-scaler (HPA) scales Docker containers out and in to supply on-demand computing resources. Combined with an auto-scaling group of virtual instances, HPA helps to find each of the five nearest neighbors for 46,139,532 query objects from 834,158 spatial data objects in less than 300 s. The experiments conducted on an OpenStack cloud demonstrate that auto-scaling containers can satisfy time-constraint GBDA in clouds.
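
    The range-query building block can be sketched in a few lines of PySpark; the column names and input path are assumptions, and the CoreOS/Kubernetes elasticity described in the record operates below this level by resizing the cluster the session runs on.

    ```python
    # Minimal PySpark sketch of a spatial range query: keep the points
    # that fall inside a bounding box. Column names and the input path
    # are assumptions; cluster elasticity (CoreOS/Kubernetes HPA in the
    # paper) happens below this level by resizing the Spark cluster.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("spatial-range-query").getOrCreate()

    points = spark.read.parquet("hdfs:///data/points.parquet")  # id, x, y
    xmin, ymin, xmax, ymax = 116.0, 39.5, 117.0, 40.5           # query window

    hits = points.filter((col("x") >= xmin) & (col("x") <= xmax) &
                         (col("y") >= ymin) & (col("y") <= ymax))
    print(hits.count())
    ```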

  11. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    Science.gov (United States)

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most
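
    Since PGen is built on Pegasus-WMS, the general shape of such a workflow can be sketched with the Pegasus 5 Python API. The transformation names and files below are illustrative and are not PGen's actual job definitions.

    ```python
    # Sketch of a Pegasus-WMS workflow in the style PGen builds on, using
    # the Pegasus 5 Python API. Transformation names and file names are
    # illustrative; they are not PGen's actual job definitions.
    from Pegasus.api import File, Job, Workflow

    reads = File("sample.fastq")
    bam = File("sample.bam")
    vcf = File("sample.vcf")

    align = Job("aligner").add_args("-i", reads, "-o", bam) \
                          .add_inputs(reads).add_outputs(bam)
    call = Job("variant-caller").add_args("-i", bam, "-o", vcf) \
                                .add_inputs(bam).add_outputs(vcf)

    wf = Workflow("pgen-like-resequencing")
    wf.add_jobs(align, call)
    wf.write("workflow.yml")  # then plan and submit with the Pegasus tools
    ```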

  12. Integrated Development Environment Gesture for modeling workflow diagrams

    CERN Document Server

    Fernandez-y-Fernandez, Carlos Alberto

    2012-01-01

    Current software development tools show the same form of interaction as when they first appeared in the mid-70s. However, since the appearance of visual languages, and due to their very nature, such languages can be handled by tools whose input methods differ from conventional ones. By incorporating new motion-detection technology, it is intended that new forms of interaction be established: interactions that respond to the free movement of the hands, so that software developers see a substantial improvement in the user experience.

  13. Open visitation and nurse job satisfaction: An integrative review.

    Science.gov (United States)

    Monroe, Marissa; Wofford, Linda

    2017-06-15

    To explore the effect of open visitation on critical care nurse job satisfaction. Open visitation has many benefits for ICU patients and families. However, ICU open visitation also affects nurses' work environment. Because nurse satisfaction is crucial to overall patient satisfaction, nurses' perceptions, experiences and opinions about open visitation should be considered. A literature search was performed through CINAHL Complete, MEDLINE Complete, PubMed, ScienceDirect, Academic Search Premier and PsychINFO. Key terms, inclusion criteria and exclusion criteria were applied. Duplicates and articles focusing on family presence during end of life care were excluded. Articles were evaluated for data quality, resulting in 14 articles. Family presence can negatively affect nurses' workflow and environment. Nurses report a loss of control, interruptions in care and increased workloads with open visitation. Therefore, nurses prefer restricted visitation. The literature evidence encourages visitation policies that support nurses in managing the additional work and stress of meeting patient and family needs. Implementation of evidence-based strategies to support nursing staff in stressful ICU environments can improve job satisfaction. The conclusions of this integrative review can be utilized by hospitals considering or implementing ICU open visitation. © 2017 John Wiley & Sons Ltd.

  14. Open Plot Project: an open-source toolkit for 3-D structural data analysis

    Directory of Open Access Journals (Sweden)

    S. Tavani

    2010-12-01

    Full Text Available In this work we present the Open Plot Project, a software package for structural data analysis that includes a 3-D environment. This first alpha release represents a stand-alone toolkit for structural data analysis and, thanks to its many import/export facilities and its 3-D environment, is also a candidate for incorporation into workflows for 3-D geological modelling.

    The software (for both Windows and Linux O.S.), the User Manual, a set of example movies, and the source code are provided as a Supplement. It is our intent that publication of the source code lay the groundwork for the development of public, free software that, hopefully, the structural geology community will use, modify, and extend. The creation of additional public controls/tools is strongly encouraged.

  15. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple-CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  16. Integrated Cloud-Based Services for Medical Workflow Systems

    Directory of Open Access Journals (Sweden)

    Gharbi Nada

    2016-12-01

    Full Text Available Recent years have witnessed significant progress in workflow systems in different business areas. In the medical domain, however, workflow systems remain comparatively scarcely researched, despite workflows being just as important there as in other areas. In fact, the flow of information in the healthcare industry is even more critical than in other industries. Workflow can provide a new way of looking at how processes and procedures are completed in particular medical systems, and it can help improve decision-making in these systems. Despite the potential capabilities of workflow systems, medical systems still often face critical challenges in maintaining patient medical information, which results in difficulties in accessing patient data from different systems. In this paper, a new cloud-based service-oriented architecture is proposed. This architecture supports a medical workflow system integrated with cloud services aligned with medical standards to improve the healthcare system.

  17. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    Science.gov (United States)

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-07-12

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, interoperability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced, and the computer-aided process planning (CAPP) system's adaptability and stability are improved. In the detailed implementation, according to the products' BOM (bill of materials) in structural design, task assignment, management control, automatic process making, process examination and process sanction are combined into a unified management scheme to make adjustment, control and management convenient.

  19. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow (analyzing each step to reveal the gaps and problems) at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  20. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is first presented. Then, an enterprise ontology established with TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only in model verification, optimization and simulation, but also in model translation.

  1. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  2. Workflow-based Context-aware Control of Surgical Robots

    OpenAIRE

    Beyl, Tim

    2015-01-01

    Surgical assistance systems such as medical robots have enhanced the capabilities of medical procedures in recent decades. This work presents a new perspective on the use of workflows with surgical robots in order to improve the technical capabilities and the ease of use of such systems. This is accomplished by a 3D perception system for the supervision of the surgical operating room and a workflow-based controller that makes it possible to monitor the surgical process using workflow-tracking techniques.

  3. Teaching, Doing, and Sharing Project Management in a Studio Environment: The Development of an Instructional Design Open-Source Project Management Textbook

    Science.gov (United States)

    Randall, Daniel L.; Johnson, Jacquelyn C.; West, Richard E.; Wiley, David A.

    2013-01-01

    In this article, the authors present an example of a project-based course within a studio environment that taught collaborative innovation skills and produced an open-source project management textbook for the field of instructional design and technology. While innovation plays an important role in our economy, and many have studied how to teach…

  5. Rethinking Distance Tutoring in E-Learning Environments: A Study of the Priority of Roles and Competencies of Open University Tutors in China

    Science.gov (United States)

    Li, Shuang; Zhang, Jingjing; Yu, Chen; Chen, Li

    2017-01-01

    This study aims to identify the priority of the roles and competencies of tutors working in the e-learning environments where the tutors are experiencing the changes brought by reforming traditional TV and broadcasting university to open universities. The mixed methods, DACUM, non-participatory observation, and questionnaires were used to identify…

  6. A Framework System for Intelligent Support in Open Distributed Learning Environments--A Look Back from 16 Years Later

    Science.gov (United States)

    Hoppe, H. Ulrich

    2016-01-01

    The 1998 paper by Martin Mühlenbrock, Frank Tewissen, and myself introduced a multi-agent architecture and a component engineering approach for building open distributed learning environments to support group learning in different types of classroom settings. It took up prior work on "multiple student modeling" as a method to configure…

  7. Training Language Teachers to Sustain Self-Directed Language Learning: An Exploration of Advisers' Experiences on a Web-Based Open Virtual Learning Environment

    Science.gov (United States)

    Bailly, Sophie; Ciekanski, Maud; Guély-Costa, Eglantine

    2013-01-01

    This article describes the rationale for pedagogical, technological and organizational choices in the design of a web-based and open virtual learning environment (VLE) promoting and sustaining self-directed language learning. Based on the last forty years of research on learner autonomy at the CRAPEL according to Holec's definition (1988), we…

  8. Automation of lidar-based hydrologic feature extraction workflows using GIS

    Science.gov (United States)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows, which can take researchers a lot of time when executed and supervised manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically streams and drainages, irrigation networks, and inland wetlands, from LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human error when providing inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for streams, irrigation network and inland wetland extraction.
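
    The stream-extraction step of such a toolbox might look like the following arcpy sketch, which chains the standard Spatial Analyst surface-hydrology tools; the DEM path and the flow-accumulation threshold are placeholders rather than the PHD Toolkit's actual parameters.

    ```python
    # Sketch of a stream-extraction step in the style of the PHD Toolkit,
    # chaining standard ArcGIS Spatial Analyst tools. The DEM path and the
    # flow-accumulation threshold are placeholders, not the toolkit's
    # actual parameters.
    import arcpy
    from arcpy.sa import Con, Fill, FlowAccumulation, FlowDirection, StreamToFeature

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\phd\work"

    filled = Fill("lidar_dtm.tif")            # remove sinks in the DTM
    flow_dir = FlowDirection(filled)          # D8 flow directions
    flow_acc = FlowAccumulation(flow_dir)     # upstream cell counts
    streams = Con(flow_acc > 5000, 1)         # threshold defines channels
    StreamToFeature(streams, flow_dir, "streams.shp")
    ```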

  9. Beginning WF Windows Workflow in .NET 4.0

    CERN Document Server

    Collins, M

    2010-01-01

    Windows Workflow Foundation is a ground-breaking addition to the core of the .NET Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed

  10. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  11. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous pi-calculus and Webπ∞ to model the design and to verify it against the stated requirements...

  12. Model Checking Workflow Net Based on Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Conghua; CHEN Zhenyu

    2006-01-01

    Soundness is a very important criterion for the correctness of a workflow. Specifying soundness in Computation Tree Logic (CTL) allows us to verify it with symbolic model checkers, so the state-explosion problem in verifying soundness can be overcome efficiently. When the property is not satisfied by the system, model checking can give a counter-example, which can guide us in correcting the workflow. In addition, relaxed soundness is another important criterion for workflows. We also prove that Computation Tree Logic* (CTL*) can be used to characterize the relaxed soundness of a workflow.
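
    For a workflow net whose final marking puts a single token in the sink place o, the "option to complete" part of soundness has a textbook CTL rendering: from every reachable marking, the final marking remains reachable. This is the standard encoding, not necessarily the authors' exact formula:

    ```latex
    % Option to complete, for a WF-net whose final marking is [o]:
    % on every path, at every marking, a path to [o] still exists.
    % Standard textbook encoding; not necessarily the authors' exact formula.
    \varphi_{\mathrm{sound}} \;=\; \mathbf{AG}\,\mathbf{EF}\,\bigl(M = [o]\bigr)
    ```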

  13. Workflow Management for a Cosmology Collaboratory

    Institute of Scientific and Technical Information of China (English)

    Stewart C. Loken; Charles McParland

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  14. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  15. Scalable Scientific Workflows Management System SWFMS

    Directory of Open Access Journals (Sweden)

    M. Abdul Rahman

    2016-11-01

    Full Text Available In today's electronic world, conducting scientific experiments, especially in the natural sciences, has become more and more challenging for domain scientists, since "science" has grown more complex along two dimensions: one, assorted and complex computational (analytical) applications; and two, the increasingly large volume and heterogeneity of the scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments, such as sensors and machines, makes scientific data management even more challenging, since the data generated by such instruments are highly complex. To reduce the complexity of conducting scientific experiments as much as possible, an integrated framework that transparently implements the conceptual separation between the two dimensions is direly needed. To facilitate scientific experiments, 'workflow' technology has in recent years emerged in scientific disciplines like biology, bioinformatics, geology, environmental science, and eco-informatics. Much research work has been done to develop scientific workflow systems. However, our analysis of these existing systems shows that they lack a well-structured conceptual modeling methodology to deal with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses this two-dimensional complexity.

  16. The Prosthetic Workflow in the Digital Era

    Science.gov (United States)

    De Franco, Michele; Bosetti, Giovanni

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient. PMID:27829834

  17. Workflow management for a cosmology collaboratory

    Energy Technology Data Exchange (ETDEWEB)

    Loken, Stewart C.; McParland, Charles

    2001-07-20

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most ''explosive'' activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  18. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber

  19. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, and KML, and for using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use data assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability, which provides a framework for scalable processing. geoKepler workflows can be executed via an IPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  20. Meeting the electronic health record "meaningful use" criterion for the HL7 infobutton standard using OpenInfobutton and the Librarian Infobutton Tailoring Environment (LITE).

    Science.gov (United States)

    Cimino, James J; Jing, Xia; Del Fiol, Guilherme

    2012-01-01

    Infobuttons are clinical decision support tools that use information about the clinical context (institution, user, patient) in which an information need arises to provide direct access to relevant information from knowledge resources. Two freely available resources make infobutton implementation possible for virtually any EHR system. OpenInfobutton is an HL7-compliant system that accepts context parameters from an EHR and, using its knowledge base of resources and information needs, generates a set of links that direct the user to relevant information. The Librarian Infobutton Tailoring Environment (LITE) is a second system that allows institutional librarians to specify which resources should be selected in a given context by OpenInfobutton. This paper describes the steps needed to use LITE to customize OpenInfobutton and to integrate OpenInfobutton into an EHR.
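
    In this standard, an infobutton request is essentially an HTTP GET whose query parameters encode the clinical context. A hedged Python sketch follows; the endpoint and coded values are placeholders, while the parameter names follow the HL7 Context-Aware Knowledge Retrieval (Infobutton) URL conventions.

    ```python
    # Sketch of an HL7 infobutton-style request: an HTTP GET whose query
    # parameters carry the clinical context. Endpoint and coded values are
    # placeholders; parameter names follow the HL7 Context-Aware Knowledge
    # Retrieval (Infobutton) URL binding.
    import requests

    params = {
        "mainSearchCriteria.v.c": "E11.9",      # problem code (here ICD-10-CM)
        "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.90",  # code system OID
        "taskContext.c.c": "PROBLISTREV",       # task: problem list review
        "patientPerson.administrativeGenderCode.c": "F",
        "age.v.v": "54",
        "age.v.u": "a",                          # age unit: years
    }
    response = requests.get("https://openinfobutton.example.org/infobutton",
                            params=params, timeout=10)
    print(response.url)   # the assembled context-aware request
    ```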

  1. The BioExtract Server: a web-based bioinformatic workflow platform.

    Science.gov (United States)

    Lushbough, Carol M; Jennewein, Douglas M; Brendel, Volker P

    2011-07-01

    The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet.

  2. From Process Orientation to Workflow Management - Part 2: Process Management, Workflow Management, Workflow Management Systems

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management and workflow management systems are still not clearly defined and delimited from one another. Starting from a particular understanding of process orientation (working paper WI No. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management constitutes the more formal, strongly IT-related component of process management and...

  3. Application of IPL and OpenCV in the Visual C++ Environment

    Institute of Scientific and Technical Information of China (English)

    吕学刚; 于明; 刘翠响

    2003-01-01

    This paper introduces IPL and OpenCV, two powerful libraries for image processing and computer vision programming, and explains their main data structures. It discusses the software configuration needed to call the relevant functions in the VC++ environment, and finally gives a typical example of image acquisition and processing. The paper is of practical value for the design of image processing and computer vision applications.
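
    The record above pairs the legacy IPL/C-era OpenCV API with VC++; for orientation, a minimal modern equivalent of such an acquire-and-process step, sketched with OpenCV's Python bindings (the file names are hypothetical):

        # Minimal sketch of a typical acquire-then-process step with modern
        # OpenCV (Python bindings); file names are hypothetical.
        import cv2

        img = cv2.imread("frame.png")                    # acquire: load a captured frame
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)     # process: convert to grayscale
        edges = cv2.Canny(gray, 100, 200)                # process: detect edges
        cv2.imwrite("frame_edges.png", edges)            # store the result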

  4. Open Content for eLearning: Cross-Institutional Collaboration for Education and Training in a Digital Environment

    Science.gov (United States)

    Marshall, Stewart; Kinuthia, Wanjira; Richards, Griff

    2012-01-01

    The University of the West Indies Open Campus and Athabasca University conducted a pilot workshop to see if open educational resources (OER) could be used to construct curricula. UWIOC was interested in increasing distance education offerings and Athabasca University was interested in expanding programming to offer an online graduate program in…

  5. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Science.gov (United States)

    Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G; Hogg, David C; Stricker, Didier

    2015-01-01

    Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen technology as a

  6. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Directory of Open Access Journals (Sweden)

    Gabriele Bleser

    Full Text Available Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen

  7. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, the Chang'E-3 data pre-processing system (CEDPS), based on scientific workflow, is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following capabilities: • describe a data processing task, including: 1) define input and output data, 2) define the data relationships, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between tasks and data; • process tasks automatically. Accordingly, describing a task is the key point of whether the system is flexible. We designed a workflow designer, a visual environment for capturing processes as workflows; its three-level model is discussed: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of process workflow constructs, including Sequence, Loop, Merge and Fork, are composable with one another; 3) to reduce the modeling complexity of the mathematical formulas using the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
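
    The Sequence/Fork/Merge composition over a DAG that CEDPS describes can be sketched with Python's standard topological sorter; the task names and bodies are hypothetical placeholders.

        # Hedged sketch: executing a small fork/merge task DAG in dependency
        # order, in the spirit of a DAG-based process model (tasks hypothetical).
        from graphlib import TopologicalSorter

        dag = {
            "calibrate": {"read_raw"},       # Sequence: read_raw -> calibrate
            "geo_correct": {"read_raw"},     # Fork: read_raw also feeds geo_correct
            "merge_products": {"calibrate", "geo_correct"},  # Merge: join both branches
        }

        def run(task):
            print(f"running {task}")         # placeholder for the real processing step

        for task in TopologicalSorter(dag).static_order():
            run(task)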

  8. Making Sense of Complexity with FRE, a Scientific Workflow System for Climate Modeling (Invited)

    Science.gov (United States)

    Langenhorst, A. R.; Balaji, V.; Yakovlev, A.

    2010-12-01

    A workflow is a description of a sequence of activities that is both precise and comprehensive. Capturing the workflow of climate experiments provides a record which can be queried or compared, and allows reproducibility of the experiments - sometimes even to the bit level of the model output. This reproducibility helps to verify the integrity of the output data, and enables easy perturbation experiments. GFDL's Flexible Modeling System Runtime Environment (FRE) is a production-level software project which defines and implements the building blocks of the workflow as command line tools. The scientific, numerical and technical input needed to complete the workflow of an experiment is recorded in an experiment description file in XML format. Several key features add convenience and automation to the FRE workflow: ● Experiment inheritance makes it possible to define a new experiment with only a reference to the parent experiment and the parameters to override. ● Testing is a basic element of the FRE workflow: experiments define short test runs which are verified before the main experiment is run, and a set of standard experiments is verified with each new code release. ● FRE is flexible enough to support everything from short runs with mere megabytes of data to high-resolution experiments that run on thousands of processors for months, producing terabytes of output data. Experiments run in segments of model time; after each segment, the state is saved and the model can be checkpointed at that level. Segment length is defined by the user, but the number of segments per system job is calculated to fit optimally within the batch scheduler's requirements. FRE provides job control across multiple segments, and tools to monitor and alter the state of long-running experiments. ● Experiments are entered into a Curator Database, which stores queryable metadata about the experiment and the experiment's output. ● FRE includes a set of standardized post-processing functions as well as the ability
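
    Experiment inheritance of the kind FRE describes amounts to overlaying a child's overrides on its parent's settings; a minimal sketch, with hypothetical parameter names standing in for the XML experiment description:

        # Hedged sketch of experiment inheritance: a child experiment references
        # its parent and overrides only selected parameters (names hypothetical).
        parent = {"model": "CM2.1", "segment_length": "1 year", "resolution": "2deg"}
        child_overrides = {"resolution": "0.5deg"}   # everything else is inherited

        experiment = {**parent, **child_overrides}
        print(experiment)  # {'model': 'CM2.1', 'segment_length': '1 year', 'resolution': '0.5deg'}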

  9. From political opportunities to niche-openings: the dilemmas of mobilizing for immigrant rights in inhospitable environments

    NARCIS (Netherlands)

    Nicholls, W.J.

    2014-01-01

    This article examines how undocumented immigrants mobilize for greater rights in inhospitable political and discursive environments. We would expect that such environments would dissuade this particularly vulnerable group of immigrants from mobilizing in high profile campaigns because such campaigns

  10. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  11. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  13. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    Sinderen, van Marten J.; Joosten, Stef M.M.; Guareis de Farias, Clever R.

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the corres

  14. An architecture including network QoS in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    The quality of the network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune the execution quality of workflows neglecting network resources, and by selecting only optimal software services and computing resources. One

  15. Research of Web-based Workflow Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The state of the art of workflow management techniques in research is introduced. The research and development trends of Workflow Management Systems (WFMS) are presented. On the basis of an analysis and comparison of various WFMSs, a WFMS based on Web technology and distributed object management is proposed. Finally, the application of the WFMS in supply chain management is described in detail.

  16. Piloting an empirical study on measures for workflow similarity

    NARCIS (Netherlands)

    Wombacher, Andreas; Rozie, M.

    Service discovery of state dependent services has to take workflow aspects into account. To increase the usability of a service discovery, the result list of services should be ordered with regard to the relevance of the services. Means of ordering a list of workflows due to their similarity with

  17. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, most workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of lean supply chains (LSC) using Petri nets. First, the article describes the assumed conditions of cross-organisational workflow nets according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, the article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSCs in real-world settings.
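
    Beneath the labels and timing of LTWNs sits the ordinary Petri net firing rule; the sketch below shows that rule on a hypothetical two-transition net, with labels and time intervals omitted and markings simplified to sets (a 1-safe net).

        # Hedged sketch of the basic Petri net firing rule underlying (labelled
        # time) workflow nets; the net is hypothetical, markings are sets.
        pre  = {"t1": {"start"}, "t2": {"mid"}}   # input places of each transition
        post = {"t1": {"mid"},   "t2": {"end"}}   # output places of each transition

        def enabled(t, marking):
            return pre[t] <= marking              # all input places hold a token

        def fire(t, marking):
            assert enabled(t, marking)
            return (marking - pre[t]) | post[t]   # consume inputs, produce outputs

        m = {"start"}                             # initial marking: token on 'start'
        for t in ("t1", "t2"):
            m = fire(t, m)
        print(m)                                  # {'end'}: the completion marking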

  18. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference

  20. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Joosten, Stef M.M.; Guareis de farias, Cléver

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the

  1. GENII-LIN-2.1: an open source software system for calculating radiation dose and risk from radionuclides released to the environment.

    Science.gov (United States)

    Teodori, Francesco; Sumini, Marco

    2008-12-01

    GENII-LIN is an open source radiation protection environmental software system running on the Linux operating system. It has capabilities for calculating radiation dose and risk to individuals or populations from radionuclides released to the environment and from pre-existing environmental contamination. It can handle exposure pathways that include ingestion, inhalation and direct exposure to air, water and soil. The package is available for free and is completely open source, i.e., transparent to the users, who have full access to the source code of the software.
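
    For a single pathway such as ingestion, the core bookkeeping behind a code like GENII-LIN reduces to intake multiplied by a nuclide- and pathway-specific dose coefficient, summed over nuclides; a toy sketch (the coefficient values are illustrative, not authoritative):

        # Hedged sketch: committed effective dose = intake (Bq) x dose
        # coefficient (Sv/Bq), summed over nuclides (values illustrative only).
        intake_bq = {"Cs-137": 2.0e3, "Sr-90": 5.0e2}      # annual ingestion intakes
        dose_coeff = {"Cs-137": 1.3e-8, "Sr-90": 2.8e-8}   # Sv/Bq (illustrative)

        dose_sv = sum(intake_bq[n] * dose_coeff[n] for n in intake_bq)
        print(f"committed effective dose: {dose_sv:.2e} Sv")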

  2. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  3. Implementing and Running a Workflow Application on Cloud Resources

    Directory of Open Access Journals (Sweden)

    Gabriela Andreea MORAR

    2011-01-01

    Full Text Available Scientists need to run applications that are time- and resource-consuming, but not all of them have the required knowledge to run these applications in parallel using grid, cluster or cloud resources. In the past few years, many workflow-building frameworks have been developed to help scientists take better advantage of computing resources by designing workflows based on their applications and executing them on heterogeneous resources. This paper presents a case study of implementing and running a workflow for an eBay data retrieval application. The workflow was designed using the Askalon framework and executed on cloud resources. The purpose of this paper is to demonstrate how workflows and cloud resources can be used by scientists to achieve speedup for their applications without the need to spend large amounts of money on computational resources.

  4. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominant technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services which realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, which performs three phases: transformation, conformance testing and execution. The core components of the framework are proposed, with emphasis on how these components promote interoperability. In particular, dynamic binding and re-composition of workflows in terms of web service testing are presented. In addition, an example of business-to-business (B2B) collaboration is provided to illustrate how to perform composition and conformance testing.

  5. An Integrated Workflow for DNA Methylation Analysis

    Institute of Scientific and Technical Information of China (English)

    Pingchuan Li; Feray Demirci; Gayathri Mahalingam; Caghan Demirci; Mayumi Nakano; Blake C.Meyers

    2013-01-01

    The analysis of cytosine methylation provides a new way to assess and describe epigenetic regulation at a whole-genome level in many eukaryotes. DNA methylation has a demonstrated role in genome stability and protection, regulation of gene expression and many other aspects of genome function and maintenance. BS-seq is a relatively unbiased method for profiling DNA methylation, with a resolution capable of measuring methylation at individual cytosines. Here we describe, as an example, a workflow to handle DNA methylation analysis, from BS-seq library preparation to data visualization. We describe some applications for the analysis and interpretation of these data. Our laboratory provides public access to plant DNA methylation data via visualization tools available at our "Next-Gen Sequence" websites (http://mpss.udel.edu), along with small RNA, RNA-seq and other data types.
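
    At the core of such a workflow is the per-cytosine methylation level: after bisulfite conversion, unmethylated cytosines read as thymines, so the level is the fraction of reads still showing C. A minimal sketch with hypothetical counts:

        # Hedged sketch: per-cytosine methylation level from BS-seq read counts.
        # Methylated cytosines resist bisulfite conversion and read as C;
        # unmethylated ones read as T. Counts below are hypothetical.
        counts = {("chr1", 10468): (14, 2),   # position -> (C reads, T reads)
                  ("chr1", 10470): (1, 19)}

        for (chrom, pos), (c, t) in counts.items():
            level = c / (c + t)               # fraction of reads supporting methylation
            print(f"{chrom}:{pos}\tmC level = {level:.2f}")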

  6. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    it as a general formal model for specification and execution of declarative, event-based business processes, as a generalization of a concurrency model, the classic event structures. The model allows for an intuitive operational semantics and a mapping of execution state by a notion of markings of the graphs, and we ... the declarative nature of the projected graphs (which are also DCR graphs). We have also provided semantics for distributed executions based on synchronous communication among a network of projected graphs and proved that global and distributed executions are equivalent. Further, to support modeling of processes ... Current business process technology is quite good at supporting well-structured business processes that aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, the exact operations needed to fulfill a business process/workflow may not always be possible to foresee...
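
    The operational semantics referred to above can be sketched compactly: a DCR marking is a triple of executed, included and pending event sets, and an included event is enabled once all of its included conditions have been executed. A minimal sketch; the two-event graph itself is hypothetical:

        # Hedged sketch of DCR graph execution: markings are (executed,
        # included, pending) sets; the two-event graph is hypothetical.
        conditions = {"sign": {"review"}}     # 'review' must precede 'sign'
        responses  = {"review": {"sign"}}     # executing 'review' makes 'sign' pending

        executed, included, pending = set(), {"review", "sign"}, set()

        def enabled(e):
            return e in included and (conditions.get(e, set()) & included) <= executed

        def execute(e):
            global executed, pending
            assert enabled(e)
            executed = executed | {e}
            pending = (pending - {e}) | responses.get(e, set())

        execute("review")                     # ok: 'review' has no conditions
        execute("sign")                       # now enabled, discharges the response
        print(executed, pending)              # {'review', 'sign'} set()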

  7. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Z. Zhao; A. Belloum; C. de Laat; P. Adriaans; B. Hertzberger

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of ex

  8. Clinical Simulation and Workflow by use of two Clinical Information Systems, the Electronic Health Record and Digital Dictation

    DEFF Research Database (Denmark)

    Schou Jensen, Iben; Koldby, Sven

    2013-01-01

    digital dictation and the EHR (electronic health record) were simulated in realistic and controlled clinical environments. Useful information dealing with workflow and patient safety were obtained. The clinical simulation demonstrated that the EHR locks during use of the integration of digital dictation...

  9. CaGrid workflow toolkit: A taverna based workflow tool for cancer grid

    OpenAIRE

    Sulakhe Dinanath; Soiland-Reyes Stian; Nenadic Alexandra; Madduri Ravi; Tan Wei; Foster Ian; Goble Carole A

    2010-01-01

    Abstract Background In the biological and medical domains, the use of web services has made data and computational functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research related resources and caGrid is its underly...

  10. Open Journal Systems and Dataverse Integration– Helping Journals to Upgrade Data Publication for Reusable Research

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2015-10-01

    Full Text Available This article describes novel open source tools for open data publication in open access journal workflows. These comprise a plugin for Open Journal Systems that supports a data submission, citation, review, and publication workflow, and an extension to the Dataverse system that provides a standard deposit API. We describe the function and design of these tools, provide examples of their use, and summarize their initial reception. We conclude by discussing future plans and potential impact.

  11. Bioaerosol emissions from open microalgal processes and their potential environmental impacts: what can be learned from natural and anthropogenic aquatic environments?

    Science.gov (United States)

    Sialve, Bruno; Gales, Amandine; Hamelin, Jérôme; Wery, Nathalie; Steyer, Jean-Philippe

    2015-06-01

    Open processes for microalgae mass cultivation and/or wastewater treatment present an air-water interface. Similarly to other open air-aquatic environments, they are subject to contamination, but as such, they also represent a source of bioaerosols. Indeed, meteorological, physico-chemical and biological factors cause aerial dispersion of the planktonic community. Operating conditions like liquid mixing or gas injection tend to both enhance microbial activity, as well as intensify aerosolization. Bacteria, virus particles, fungi and protozoa, in addition to microalgae, are all transient or permanent members of the planktonic community and can thus be emitted as aerosols. If they should remain viable, subsequent deposition on various habitats could instigate their colonization of other environments and the potential expression of their ecological function.

  12. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
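
    The finite state checking with retries described above can be sketched as a per-job walk through an ordered set of states, retrying each failed step a bounded number of times before flagging the job for an operator; the states and step function below are hypothetical simplifications.

        # Hedged sketch of finite state checking with bounded retries, in the
        # spirit of the framework described (states and steps are hypothetical).
        STATES = ["staged", "submitted", "reconstructed", "archived"]

        def advance(job, state):
            """Attempt one workflow step; return True on success (stubbed here)."""
            print(f"{job}: {state} step")
            return True                        # a real step would stage/submit/verify

        def run(job, max_retries=3):
            for state in STATES:
                for attempt in range(max_retries):
                    if advance(job, state):
                        break                  # state reached, move to the next one
                else:
                    return f"{job}: failed in '{state}', flagged for operator"
            return f"{job}: done"

        print(run("run12_raw_0042"))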

  13. An Open-Control Concept for a Holonic Multiagent System

    Science.gov (United States)

    Adam, Emmanuel; Berger, Thierry; Sallez, Yves; Trentesaux, Damien

    MAS are particularly adapted to dealing with distributed and dynamic environments. The management of business workflows, data flows, or flexible manufacturing systems (FMS) is typically a good application field for them. This kind of application requires centralization of data control and the flexibility to cope with changes in the network. In the context of FMS, where product and resource entities can be seen as active, this paper presents the open-control concept and gives an example of its instantiation within a holonic scheme. The open-control concept proposed in this paper comprises classic explicit control, as well as an innovative type of control called implicit control that allows system entities to be influenced via an Optimization Mechanism (OM). We illustrate our proposal with an implementation on a flexible assembly cell at our university.

  14. Supporting the Construction of Workflows for Biodiversity Problem-Solving Accessing Secure, Distributed Resources

    Directory of Open Access Journals (Sweden)

    J.S. Pahwa

    2006-01-01

    Full Text Available In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources such as data sets and analytical tools are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables in order to explain past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats and lack inherent interoperability. The present BDW system brings all these disparate units together so that the user can combine tools with little thought as to their original availability, data formats and interoperability. The new prototype BDW system architecture not only brings together heterogeneous resources but also enables utilisation of computational resources and provides a secure access to BDW resources via a federated security model. We describe features of the new BDW system and its security model which enable user authentication from a workflow application as part of workflow execution.

  15. Adaptive workflow scheduling in grid computing based on dynamic resource availability

    Directory of Open Access Journals (Sweden)

    Ritu Garg

    2015-06-01

    Full Text Available Grid computing enables large-scale resource sharing and collaboration for solving advanced science and engineering applications. Central to grid computing is the scheduling of application tasks to resources. Various strategies have been proposed, including static and dynamic ones: the former schedules tasks to resources before the actual execution time, while the latter schedules them at execution time. Static scheduling performs better, but it is not suitable for a dynamic grid environment. The lack of dedicated resources and variations in their availability at run time make this scheduling a great challenge. In this study, we propose an adaptive approach, based on rescheduling, for assigning workflow tasks (dependent tasks) to dynamic grid resources. It deals with a heterogeneous, dynamic grid environment in which fluctuations in the availability of computing nodes and in link bandwidth are inevitable due to local load or load from other users. The proposed adaptive workflow scheduling (AWS) approach involves initial static scheduling, resource monitoring and rescheduling, with the aim of achieving the minimum execution time for a workflow application. The approach differs from other techniques in the literature in that it considers changes in the availability of resources (hosts and links) and the impact of existing load on the grid resources. Simulation results using randomly generated task graphs and task graphs corresponding to real-world problems (GE and FFT) demonstrate that the proposed algorithm is able to deal with fluctuations in resource availability and provides overall optimal performance.
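
    The AWS cycle described above, static schedule first, then monitor and reschedule on drift, can be sketched as follows; the scheduling heuristic, monitoring probe and threshold are all hypothetical placeholders.

        # Hedged sketch of the adaptive cycle: static schedule first, then
        # monitor and reschedule when observed availability drifts
        # (all helpers are hypothetical placeholders).
        import random

        def static_schedule(tasks, resources):
            return {t: random.choice(resources) for t in tasks}       # placeholder heuristic

        def observe(resources):
            return {r: random.uniform(0.3, 1.0) for r in resources}   # available fraction

        tasks, resources = ["t1", "t2", "t3"], ["hostA", "hostB"]
        schedule = static_schedule(tasks, resources)
        assumed = observe(resources)

        for step in range(5):                  # monitoring loop (one tick per step)
            current = observe(resources)
            drift = max(abs(current[r] - assumed[r]) for r in resources)
            if drift > 0.3:                    # hypothetical rescheduling threshold
                schedule = static_schedule(tasks, resources)
                assumed = current
                print(f"step {step}: drift {drift:.2f} -> rescheduled {schedule}")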

  16. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    CERN Document Server

    Le Goff, J M; Bityukov, S; Estrella, F; Kovács, Z; Le Flour, T; Lieunard, S; McClatchey, R; Murray, S; Organtini, G; Vialle, J P; Bazan, A; Chevenier, G

    1997-01-01

    At a time when many companies are under pressure to reduce "times-to-market" the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly in the construction of high energy physics devices the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally in industry design engineers have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product developmen...

  17. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality Assurance

    Directory of Open Access Journals (Sweden)

    Mary K. Askren

    2016-02-01

    Full Text Available The contribution of this paper is to describe how we can program neuroimaging workflow using Make, a software development tool designed for describing how to build executables from source files. We show that we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make represents a large step towards these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials), and all data, scripts and makefiles necessary to run the practicals and examples are available in the makepipelines project at NITRC.
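
    Make decides whether a rule fires by comparing timestamps: a target is rebuilt when it is missing or older than any prerequisite. A minimal sketch of that decision rule in Python (the file names are hypothetical):

        # Hedged sketch of Make's rebuild rule: a target is rebuilt when it is
        # missing or older than any prerequisite (file names hypothetical).
        import os

        def out_of_date(target, prerequisites):
            if not os.path.exists(target):
                return True
            t = os.path.getmtime(target)
            return any(os.path.getmtime(p) > t for p in prerequisites)

        # e.g. a preprocessing step: skullstripped.nii depends on raw.nii
        if out_of_date("skullstripped.nii", ["raw.nii"]):
            print("rule fires: regenerate skullstripped.nii from raw.nii")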

  18. From Data to Knowledge to Discoveries: Artificial Intelligence and Scientific Workflows

    Directory of Open Access Journals (Sweden)

    Yolanda Gil

    2009-01-01

    Full Text Available Scientific computing has entered a new era of scale and sharing with the arrival of cyberinfrastructure facilities for computational experimentation. A key emerging concept is scientific workflows, which provide a declarative representation of complex scientific applications that can be automatically managed and executed in distributed shared resources. In the coming decades, computational experimentation will push the boundaries of current cyberinfrastructure in terms of inter-disciplinary scope and integrative models of scientific phenomena under study. This paper argues that knowledge-rich workflow environments will provide necessary capabilities for that vision by assisting scientists to validate and vet complex analysis processes and by automating important aspects of scientific exploration and discovery.

  19. Optimising workflow in andrology: a new electronic patient record and database

    Institute of Scientific and Technical Information of China (English)

    Frank Tüttelmann; C. Marc Luetjens; Eberhard Nieschlag

    2006-01-01

    Aim: To improve workflow and usability by the introduction of a new electronic patient record (EPR) and database. Methods: Establishment of an EPR based on open source technology (MySQL database and PHP scripting language) in a tertiary care andrology center at a university clinic. A workflow analysis, a benchmark comparing the two systems and a survey of usability and ergonomics were carried out. Results: Workflow optimizations (electronic ordering of laboratory analyses, elimination of transcription steps and automated referral letters) and the decrease in the time required for data entry per patient to 71% ± 27% (P<0.05) led to a workload reduction. The benchmark showed a significant performance increase (largest when starting the respective system: 1.3 ± 0.2 s vs. 11.1 ± 0.2 s, mean ± SD). In the survey, users rated the new system at least two ranks higher than its predecessor (P<0.01) in all sub-areas. Conclusion: With further improvements, today's EPRs can evolve to substitute paper records, saving time (and possibly costs), supporting user satisfaction and expanding the basis for scientific evaluation as more data becomes electronically available. Newly introduced systems should be versatile, adaptable by users, and workflow-oriented to yield the highest benefit. If ready-made software is purchased, customization should be implemented during rollout.

  20. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    Science.gov (United States)

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment, and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line) use and for new reconstructions of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
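
    The flat-fielding step named above normalizes each raw projection by dark- and flat-field images before filtered back projection; a minimal numpy sketch with stand-in arrays:

        # Hedged sketch of flat fielding: normalize a projection by dark- and
        # flat-field images (the arrays here are random stand-in data).
        import numpy as np

        rng = np.random.default_rng(0)
        proj = rng.uniform(100, 200, (4, 4))   # raw projection with sample in beam
        flat = rng.uniform(180, 220, (4, 4))   # beam profile without sample
        dark = rng.uniform(5, 10, (4, 4))      # detector offset with shutter closed

        corrected = (proj - dark) / (flat - dark)                # approximate transmission
        sinogram_line = -np.log(np.clip(corrected, 1e-6, None))  # Beer-Lambert linearization
        print(sinogram_line.round(3))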