WorldWideScience

Sample records for modeling workflow including

  1. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  2. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  3. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows are increasingly used for orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  4. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    Due to their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  5. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  6. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how access control can be enforced in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
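
    As an illustration only (none of this code comes from the paper; the class, place and role names are invented), the following Python sketch shows how a small Petri-net-style workflow can be guarded by roles so that a transition fires only for an authorised user:

        class Transition:
            def __init__(self, name, inputs, outputs, role):
                self.name, self.inputs, self.outputs, self.role = name, inputs, outputs, role

        class RoleGuardedWorkflowNet:
            def __init__(self, marking):
                self.marking = dict(marking)          # place -> token count
                self.transitions = []

            def add_transition(self, t):
                self.transitions.append(t)

            def fire(self, name, user_roles):
                t = next(tr for tr in self.transitions if tr.name == name)
                if t.role not in user_roles:
                    raise PermissionError(f"role '{t.role}' required to fire '{name}'")
                if any(self.marking.get(p, 0) < 1 for p in t.inputs):
                    raise RuntimeError(f"'{name}' is not enabled in the current marking")
                for p in t.inputs:
                    self.marking[p] -= 1
                for p in t.outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        # Example: an order must be approved (manager role) before it is shipped (clerk role).
        net = RoleGuardedWorkflowNet({"submitted": 1})
        net.add_transition(Transition("approve", ["submitted"], ["approved"], role="manager"))
        net.add_transition(Transition("ship", ["approved"], ["shipped"], role="clerk"))
        net.fire("approve", {"manager"})
        net.fire("ship", {"clerk"})
        print(net.marking)   # {'submitted': 0, 'approved': 0, 'shipped': 1}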

  7. Ship Repair Workflow Cost Model

    National Research Council Canada - National Science Library

    McDevitt, Mike

    2003-01-01

    The effects of intermittent work patterns and funding on the costs of ship repair and maintenance were modeled for the San Diego region in 2002 for Supervisor of Shipbuilding and Repair (SUPSHIP) San Diego...

  8. Rapid Energy Modeling Workflow Demonstration

    Science.gov (United States)

    2013-10-31

    Trial at AutodeskVasari.com. Considered a lightweight version of Revit for energy modeling and analysis. Many capabilities are in process of...

  9. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  10. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous pi-calculus and webπ∞ to model the design and to verify...

  11. A Model of Workflow Composition for Emergency Management

    Science.gov (United States)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough in dealing with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for the construction and composition of business process resources is implemented and integrated into the Emergency Plan Management Application System.
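
    The abstract does not name its four composition operations, so the short Python sketch below is purely illustrative: it assumes generic sequence/parallel operators and a constraint rule supplied as a predicate, simply to show how workflow segments might be composed under such rules:

        def compose(op, *segments, constraint=lambda segs: True):
            # Reject the composition if the supplied constraint rule is violated.
            if not constraint(segments):
                raise ValueError("constraint rule rejects this composition")
            return {"op": op, "segments": list(segments)}

        evacuate = {"op": "task", "name": "evacuate_area"}
        notify   = {"op": "task", "name": "notify_hospitals"}
        triage   = {"op": "task", "name": "field_triage"}

        # Example constraint: a parallel block may contain at most three segments.
        plan = compose("sequence",
                       compose("parallel", evacuate, notify,
                               constraint=lambda segs: len(segs) <= 3),
                       triage)
        print(plan)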

  12. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  13. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operational management of LSCs in real-world settings.
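
    The following Python fragment is a hedged sketch, not the authors' formalism: it assumes that a labelled time Petri net transition carries a label (used for cross-organisational matching) and a static firing interval, and shows how two organisations' nets could be related by shared labels. All names and values are invented for illustration.

        from dataclasses import dataclass, field
        from typing import List, Tuple

        @dataclass
        class LTPNTransition:
            name: str
            label: str                       # interface label shared across organisations
            inputs: List[str]
            outputs: List[str]
            interval: Tuple[float, float]    # (earliest, latest) firing time

        @dataclass
        class LTWN:
            places: List[str]
            transitions: List[LTPNTransition] = field(default_factory=list)

            def labels(self):
                return {t.label for t in self.transitions}

        supplier = LTWN(
            places=["order_received", "goods_ready"],
            transitions=[LTPNTransition("produce", "deliver", ["order_received"],
                                        ["goods_ready"], (2.0, 5.0))])
        buyer = LTWN(
            places=["order_sent", "goods_in"],
            transitions=[LTPNTransition("receive", "deliver", ["order_sent"],
                                        ["goods_in"], (0.0, 1.0))])

        # A cross-organisational net would be built by fusing transitions sharing a label.
        print(supplier.labels() & buyer.labels())   # {'deliver'}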

  14. Multi-perspective workflow modeling for online surgical situation models.

    Science.gov (United States)

    Franke, Stefan; Meixensberger, Jürgen; Neumuth, Thomas

    2015-04-01

    Surgical workflow management is expected to enable situation-aware adaptation and intelligent systems behavior in an integrated operating room (OR). The overall aim is to unburden the surgeon and OR staff from both manual maintenance and information seeking tasks. A major step toward intelligent systems behavior is a stable classification of the surgical situation from multiple perspectives based on performed low-level tasks. The present work proposes a method for the classification of surgical situations based on multi-perspective workflow modeling. A model network that interconnects different types of surgical process models is described. Various aspects of a surgical situation description were considered: low-level tasks, high-level tasks, patient status, and the use of medical devices. A study with sixty neurosurgical interventions was conducted to evaluate the performance of our approach and its robustness against incomplete workflow recognition input. A correct classification rate of over 90% was measured for high-level tasks and patient status. The device usage models for navigation and neurophysiology classified over 95% of the situations correctly, whereas the ultrasound usage was more difficult to predict. Overall, the classification rate decreased with an increasing level of input distortion. Autonomous adaptation of medical devices and intelligent systems behavior do not currently depend solely on low-level tasks. Instead, they require a more general type of understanding of the surgical condition. The integration of various surgical process models in a network provided a comprehensive representation of the interventions and allowed for the generation of extensive situation descriptions. Multi-perspective surgical workflow modeling and online situation models will be a significant pre-requisite for reliable and intelligent systems behavior. Hence, they will contribute to a cooperative OR environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  16. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    Science.gov (United States)

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated by 4 web-retrieved videos and 4 operation-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
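
    Purely as an illustration of the reflex-engine idea (the rules, node names, thresholds and actions below are invented, not taken from the framework), a fault management entity can be sketched as a small rule-driven state machine that consumes health samples and fires a mitigation action when a rule is violated:

        class ReflexEngine:
            def __init__(self, rules):
                self.rules = rules            # list of (predicate, action) pairs
                self.state = "NOMINAL"

            def observe(self, sample):
                # Check each pre-specified rule against the latest health sample.
                for predicate, action in self.rules:
                    if predicate(sample):
                        self.state = "FAULT"
                        return action(sample)
                self.state = "NOMINAL"
                return None

        engine = ReflexEngine(rules=[
            (lambda s: s["disk_free_gb"] < 5,  lambda s: f"migrate job off {s['node']}"),
            (lambda s: s["load"] > 0.95,       lambda s: f"throttle new jobs on {s['node']}"),
        ])

        print(engine.observe({"node": "n042", "disk_free_gb": 2, "load": 0.4}))
        print(engine.state)   # FAULT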

  18. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility on the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility, and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about task from organisational perspectives and resources perspectives to analyse and specify authorisation constraints. Moreover, we present a fine grained access control protocol to support delegation based on the TAC model.
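
    A minimal sketch of RBAC extended with per-task delegation, in the spirit of the TAC model but with all names and the permission-checking logic assumed for illustration only:

        class TaskAccessControl:
            def __init__(self):
                self.role_permissions = {}    # role -> set of task names
                self.user_roles = {}          # user -> set of roles
                self.delegations = set()      # (delegator, delegatee, task)

            def grant(self, role, task):
                self.role_permissions.setdefault(role, set()).add(task)

            def assign(self, user, role):
                self.user_roles.setdefault(user, set()).add(role)

            def delegate(self, delegator, delegatee, task):
                # Delegation of authority: only someone allowed to do the task may delegate it.
                if self.can_execute(delegator, task):
                    self.delegations.add((delegator, delegatee, task))

            def can_execute(self, user, task):
                by_role = any(task in self.role_permissions.get(r, set())
                              for r in self.user_roles.get(user, set()))
                by_delegation = any(d[1] == user and d[2] == task for d in self.delegations)
                return by_role or by_delegation

        tac = TaskAccessControl()
        tac.grant("clerk", "register_claim")
        tac.assign("alice", "clerk")
        tac.delegate("alice", "bob", "register_claim")   # bob temporarily covers for alice
        print(tac.can_execute("bob", "register_claim"))  # True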

  19. Automatic support for product based workflow design : generation of process models from a product data model

    NARCIS (Netherlands)

    Vanderfeesten, I.T.P.; Reijers, H.A.; Aalst, van der W.M.P.; Vogelaar, J.J.C.L.; Meersman, R.; Dillon, T.; Herrero, P.

    2010-01-01

    Product Based Workflow Design (PBWD) is one of the few scientific methodologies for the (re)design of workflow processes. It is based on an analysis of the product that is produced in the workflow process and derives a process model from the product structure. Until now this derivation has been a

  20. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  1. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    it as a general formal model for specification and execution of declarative, event-based business processes, as a generalization of a concurrency model, the classic event structures. The model allows for an intuitive operational semantics and mapping of execution state by a notion of markings of the graphs and we...

  2. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    Science.gov (United States)

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools, such as segmentation and meshing algorithms, are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it describes not only the final biomechanical simulation, but also the workflow by which the biomechanical model is constructed from tomographic data. In this way, the MSML can act as middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease-of-use of the approach.

  3. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    The app FormIt is used for conceptual modeling, with further refinement available in Revit or Vasari. Modeling can also be done in Revit (detailed and conceptual)... referenced building model while in the field. Autodesk® Revit is a BIM software application with integrated energy and carbon analyses driven by Green... FormIt, Revit and Vasari, and (3) comparative analysis. The energy results of these building analyses are represented as annual energy use for natural...

  4. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.

  5. A STRUCTURAL MODEL OF AN EXCAVATOR WORKFLOW CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. Gurko

    2016-12-01

    Improving earthwork operations is connected with excavator automation. In this paper, on the basis of an analysis of the problems that a hydraulic excavator control system has to solve, a hierarchical structure for the control system is proposed. The control process was decomposed, which allowed the development of a structural model that reflects the characteristics of a multilevel, spatially distributed control system for an excavator workflow.

  6. Widening the adoption of workflows to include human and human-machine scientific processes

    Science.gov (United States)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data that help scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, where, combined with cyber-infrastructure environments that provide on-demand access to data and tools, they result in powerful workbenches for scientists of those communities. The focus in these particular fields, however, has been more on automating rather than documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs intend to capture a scientist's perspective about the process of how she or he would collect, filter, curate, and manipulate data to create the artifacts that are relevant to her/his work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.

  7. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All of the workflow's methods that address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Coupling of a continuum ice sheet model and a discrete element calving model using a scientific workflow system

    Science.gov (United States)

    Memon, Shahbaz; Vallot, Dorothée; Zwinger, Thomas; Neukirchen, Helmut

    2017-04-01

    Scientific communities generate complex simulations through the orchestration of semi-structured analysis pipelines, which involves the execution of large workflows on multiple, distributed and heterogeneous computing and data resources. Modeling the ice dynamics of glaciers requires workflows consisting of many non-trivial, computationally expensive processing tasks which are coupled to each other. From this domain, we present an e-Science use case, a workflow, which requires the execution of a continuum ice flow model and a discrete element based calving model in an iterative manner. Apart from the execution, this workflow also contains data format conversion tasks that support the execution of ice flow and calving by means of transitions through sequential, nested and iterative steps. Thus, the management and monitoring of all the processing tasks of the workflow model, including data management and transfer, becomes more complex. From the implementation perspective, this workflow model was initially developed as a set of scripts using static data input and output references. In the course of application usage, when more scripts or modifications were introduced to meet user requirements, the debugging and validation of results became more cumbersome to achieve. To address these problems, we identified a need for a high-level scientific workflow tool through which all the above-mentioned processes can be achieved in an efficient and usable manner. We decided to make use of the e-Science middleware UNICORE (Uniform Interface to Computing Resources), which allows seamless and automated access to different heterogeneous and distributed resources and is supported by a scientific workflow engine. Based on this, we developed a high-level scientific workflow model for coupling of massively parallel High-Performance Computing (HPC) jobs: a continuum ice sheet model (Elmer/Ice) and a discrete element calving and crevassing model (HiDEM). In our talk we present how the use of a high
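
    The coupling pattern described above can be sketched schematically as follows; the functions are placeholders (they do not call Elmer/Ice, HiDEM or UNICORE, and the file names are invented) and only show the iterate-convert-execute structure of the workflow:

        def run_ice_flow(state):           # stand-in for the continuum ice flow step
            return {"geometry": state["geometry"], "velocity": "velocity.nc"}

        def to_particles(flow_output):     # stand-in for a data-format conversion task
            return {"particles": "front.particles"}

        def run_calving(particle_input):   # stand-in for the discrete element calving step
            return {"new_front": "front_retreated.csv"}

        def to_mesh(calving_output):       # conversion back to the continuum mesh format
            return {"geometry": calving_output["new_front"]}

        state = {"geometry": "initial_front.csv"}
        for cycle in range(3):             # three coupling iterations as an example
            flow = run_ice_flow(state)
            calved = run_calving(to_particles(flow))
            state = to_mesh(calved)
            print(f"cycle {cycle}: front geometry -> {state['geometry']}")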

  9. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  10. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  11. Integrate Data into Scientific Workflows for Terrestrial Biosphere Model Evaluation through Brokers

    Science.gov (United States)

    Wei, Y.; Cook, R. B.; Du, F.; Dasgupta, A.; Poco, J.; Huntzinger, D. N.; Schwalm, C. R.; Boldrini, E.; Santoro, M.; Pearlman, J.; Pearlman, F.; Nativi, S.; Khalsa, S.

    2013-12-01

    Terrestrial biosphere models (TBMs) have become integral tools for extrapolating local observations and process-level understanding of land-atmosphere carbon exchange to larger regions. Model-model and model-observation intercomparisons are critical to understand the uncertainties within model outputs, to improve model skill, and to improve our understanding of land-atmosphere carbon exchange. The DataONE Exploration, Visualization, and Analysis (EVA) working group is evaluating TBMs using scientific workflows in UV-CDAT/VisTrails. This workflow-based approach promotes collaboration and improved tracking of evaluation provenance. But challenges still remain. The multi-scale and multi-discipline nature of TBMs makes it necessary to include diverse and distributed data resources in model evaluation. These include, among others, remote sensing data from NASA, flux tower observations from various organizations including DOE, and inventory data from the US Forest Service. A key challenge is to make heterogeneous data from different organizations and disciplines discoverable and readily integrated for use in scientific workflows. This presentation introduces the brokering approach taken by the DataONE EVA to fill the gap between TBM evaluation workflows and cross-organization, cross-discipline data resources. The DataONE EVA started the development of an Integrated Model Intercomparison Framework (IMIF) that leverages standards-based discovery and access brokers to dynamically discover, access, and transform (e.g., subsetting and resampling) diverse data products from DataONE, Earth System Grid (ESG), and other data repositories into a format that can be readily used by scientific workflows in UV-CDAT/VisTrails. The discovery and access brokers serve as independent middleware that bridges existing data repositories and TBM evaluation workflows while introducing little overhead to either component. In the initial work, an OpenSearch-based discovery broker

  12. Open innovation: Towards sharing of data, models and workflows.

    Science.gov (United States)

    Conrado, Daniela J; Karlsson, Mats O; Romero, Klaus; Sarr, Céline; Wilkins, Justin J

    2017-11-15

    Sharing of resources across organisations to support open innovation is an old idea, but one that is being taken up by the scientific community at increasing speed, particularly with regard to public sharing. The ability to address new questions or provide more precise answers to old questions through merged information is among the attractive features of sharing. Increased efficiency through reuse, and increased reliability of scientific findings through enhanced transparency, are expected outcomes from sharing. In the field of pharmacometrics, efforts to publicly share data, models and workflows have recently started. Sharing of individual-level longitudinal data for modelling requires solving legal, ethical and proprietary issues similar to many other fields, but there are also pharmacometric-specific aspects regarding data formats, exchange standards, and database properties. Several organisations (CDISC, C-Path, IMI, ISoP) are working to solve these issues and propose standards. There are also a number of initiatives aimed at collecting disease-specific databases - Alzheimer's Disease (ADNI, CAMD), malaria (WWARN), oncology (PDS), Parkinson's Disease (PPMI), tuberculosis (CPTR, TB-PACTS, ReSeqTB) - suitable for drug-disease modelling. Organized sharing of pharmacometric executable model code and associated information has in the past been sparse, but a model repository (DDMoRe Model Repository) intended for the purpose has recently been launched. In addition, several other services can facilitate model sharing more generally. Pharmacometric workflows have matured over the last decades and initiatives to more fully capture those applied to analyses are ongoing. In order to maximize both the impact of pharmacometrics and the knowledge extracted from clinical data, the scientific community needs to take ownership of and create opportunities for open innovation. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    Science.gov (United States)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation that involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are from first principles, require considerable expertise to run and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with data is prohibitively inefficient. A major barrier to progress is that modeling workflows aren't deemed by practitioners to be a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models; an inability to keep pace with accelerating data production rates; and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and

  14. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  15. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  16. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events... The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals.

  17. BIM Workflow for Mechanical Ventilation Design : Object-Based Modeling with Autodesk Revit®

    OpenAIRE

    Bonduel, Mathias

    2016-01-01

    This study is conducted for the Belgian engineering firm CENERGIE, whose main business activities are within the fields of building systems and sustainable buildings. The company wants to change their current design workflows to adopt the use of Building Information Modeling (BIM) with Autodesk Revit. The research focused on the development of a BIM workflow where no models are exchanged between building partners. The aim of this study was to develop such a Revit BIM workflow for the desig...

  18. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention-preserving stochastic semantics able to model both probabilistic and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
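
    A toy illustration of the evolutionary search loop only: in the real framework candidate workflows are scored by stochastic model checking of BPMN models, whereas here an invented cost function and task names stand in for that analysis.

        import random

        random.seed(0)

        def mutate(workflow):
            # One possible restructuring move: swap two adjacent tasks.
            w = list(workflow)
            i = random.randrange(len(w) - 1)
            w[i], w[i + 1] = w[i + 1], w[i]
            return w

        def expected_fault_impact(workflow):
            # Stand-in for model-checking-derived quantities (resource use, failure cost).
            penalty = {"inspect": 1, "assemble": 5, "paint": 3, "ship": 2}
            return sum(penalty[t] * (i + 1) for i, t in enumerate(workflow))

        population = [["assemble", "inspect", "paint", "ship"] for _ in range(10)]
        for generation in range(50):
            population = sorted(population, key=expected_fault_impact)[:5]          # select
            population += [mutate(random.choice(population)) for _ in range(5)]     # vary

        best = min(population, key=expected_fault_impact)
        print(best, expected_fault_impact(best))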

  19. A three-level atomicity model for decentralized workflow management systems

    Science.gov (United States)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.

  20. VisTrails SAHM: visualization and workflow management for species habitat modeling

    Science.gov (United States)

    Morisette, Jeffrey T.; Jarnevich, Catherine S.; Holcombe, Tracy R.; Talbert, Colin B.; Ignizio, Drew A.; Talbert, Marian; Silva, Claudio; Koop, David; Swanson, Alan; Young, Nicholas E.

    2013-01-01

    The Software for Assisted Habitat Modeling (SAHM) has been created to both expedite habitat modeling and help maintain a record of the various input data, pre- and post-processing steps and modeling options incorporated in the construction of a species distribution model through the established workflow management and visualization VisTrails software. This paper provides an overview of the VisTrails:SAHM software including a link to the open source code, a table detailing the current SAHM modules, and a simple example modeling an invasive weed species in Rocky Mountain National Park, USA.

  1. Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems

    Science.gov (United States)

    Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.

    2016-12-01

    We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.

  2. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    Directory of Open Access Journals (Sweden)

    Gryk Michael R

    2007-01-01

    Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting

  3. Integration Of Externalized Decision Models In The Definition Of Workflows For Digital Pathology

    Directory of Open Access Journals (Sweden)

    J. van Leeuwen

    2016-06-01

    We proposed a workflow solution enabling the representation of decision models as externalized executable tasks in the process definition. Our approach separates the task implementations from the workflow model, ensuring scalability and allowing for the inclusion of complex decision logic in the workflow execution. We depict a simplified model of a pathology diagnosis workflow (starting with the digitization of the slides), represented according to the BPMN modeling conventions. The example shows a workflow sequence that automatically orders a HER2 FISH test when IHC is borderline according to defined customizable thresholds. The process model integrates an image analysis algorithm that scores images. Based on the score and the thresholds, the decision model evaluates the condition and recommends the pre-ordering of an additional test when the score falls between the two thresholds. This leads to faster diagnosis and allows balancing the cost of an additional test against the overhead for the pathologist by choosing the values of the thresholds.
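
    The decision logic described above can be made concrete in a few lines of Python; the threshold values and function name below are illustrative assumptions, not taken from the paper:

        def her2_reflex_order(ihc_score, negative_below=1.5, positive_above=2.5):
            """Return the recommended action for a given IHC image-analysis score."""
            if ihc_score < negative_below:
                return "report HER2-negative"
            if ihc_score > positive_above:
                return "report HER2-positive"
            # Borderline scores between the two customizable thresholds trigger the reflex test.
            return "borderline: pre-order HER2 FISH"

        for score in (1.0, 2.0, 3.0):
            print(score, "->", her2_reflex_order(score))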

  4. Aligning observed and modelled behaviour based on workflow decomposition

    Science.gov (United States)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the amount of event log data. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
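
    As a rough illustration of the decomposition idea only (not the authors' state-equation-based alignment), the sketch below splits an event log into fragments by activity subsets and checks each fragment against a simple allowed ordering; fragment names, activities and traces are invented:

        def project(trace, activities):
            # Keep only the events relevant to this fragment.
            return [a for a in trace if a in activities]

        def fits(projected, allowed_order):
            # Subsequence check: the projected trace must follow the allowed order.
            it = iter(allowed_order)
            return all(any(a == b for b in it) for a in projected)

        fragments = {
            "ordering": {"activities": {"create", "approve"}, "order": ["create", "approve"]},
            "shipping": {"activities": {"pack", "ship"},      "order": ["pack", "ship"]},
        }

        log = [["create", "approve", "pack", "ship"],
               ["create", "ship", "approve", "pack"]]   # second trace ships before packing

        for name, frag in fragments.items():
            ok = all(fits(project(t, frag["activities"]), frag["order"]) for t in log)
            print(name, "conforms:", ok)    # ordering: True, shipping: False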

  5. modeling workflow management in a distributed computing system

    African Journals Online (AJOL)

    Dr Obe

    ...communication system, which allows for computerized support. Keywords: Distributed computing system; Petri nets; Workflow management.

  6. The OCoN approach to workflow modeling in object-oriented systems

    NARCIS (Netherlands)

    Wirtz, G.; Weske, M.H.; Giese, H.

    2001-01-01

    Workflow management aims at modeling and executing application processes in complex technical and organizational environments. Modern information systems are often based on object-oriented design techniques, for instance, the Unified Modeling Language (UML). These systems consist of application

  7. Structuring research methods and data with the research object model: genomics workflows as a case study.

    Science.gov (United States)

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide the necessary metadata for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428 and the Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
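
    To make the query capability above concrete, here is a minimal sketch using an invented in-memory structure rather than the actual RO ontology; all field names and values are placeholders.

```python
# Toy aggregation of a workflow-centric Research Object: the structure links a
# hypothesis to workflow runs, their inputs and their conclusions, so simple
# provenance questions can be answered programmatically.
research_object = {
    "hypothesis": "metabolite levels vary with genotype",
    "workflow_runs": [
        {
            "workflow": "wf_association_analysis",
            "inputs": ["genotypes.csv", "metabolite_levels.csv"],
            "conclusions": ["an association between genotype and metabolite level"],
        }
    ],
}


def inputs_used_to_test(ro, hypothesis):
    """Which data was input to the workflows that tested a given hypothesis?"""
    if ro["hypothesis"] != hypothesis:
        return []
    return [i for run in ro["workflow_runs"] for i in run["inputs"]]


print(inputs_used_to_test(research_object, "metabolite levels vary with genotype"))
```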

  8. Advanced computational workflow for the multi-scale modeling of the bone metabolic processes.

    Science.gov (United States)

    Dao, Tien Tuan

    2017-06-01

    Multi-scale modeling of the musculoskeletal system plays an essential role in the deep understanding of complex mechanisms underlying biological phenomena and processes such as bone metabolic processes. Current multi-scale models suffer from the isolation of sub-models at each anatomical scale. The objective of the present work was to develop a new fully integrated computational workflow for simulating bone metabolic processes at multiple scales. The organ-level model employs multi-body dynamics to estimate body boundary and loading conditions from body kinematics. The tissue-level model uses the finite element method to estimate the tissue deformation and mechanical loading under body loading conditions. Finally, the cell-level model includes a bone remodeling mechanism through an agent-based simulation under tissue loading. A case study on the bone remodeling process located in the human jaw was performed and presented. The developed multi-scale model of the human jaw was validated using literature-based data at each anatomical level. Simulation outcomes fall within the literature-based ranges of values for estimated muscle force, tissue loading and cell dynamics during the bone remodeling process. This study opens perspectives for accurately simulating bone metabolic processes using a fully integrated computational workflow, leading to a better understanding of the musculoskeletal system function across multiple length scales as well as providing new informative data for clinical decision support and industrial applications.

  9. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for parameters of interest....

  10. Populating a Library of Reusable H-BOMs: Assessment of a Feasible Image-Based Modeling Workflow

    Science.gov (United States)

    Santagati, C.; Lo Turco, M.; D'Agostino, G.

    2017-08-01

    The paper shows the intermediate results of a research activity aimed at populating a library of reusable Historical Building Object Models (H-BOMs) by testing a fully digital workflow that takes advantage of Structure from Motion (SfM) models and is centered on the geometrical/stylistic/material analysis of the architectural element (portal, window, altar). The aim is to find common (invariant) and uncommon (variant) features in terms of the identification of architectural parts and their relationships, geometrical rules, dimensions and proportions, construction materials and measurement units, in order to model archetypal shapes from which it is possible to derive all the style variations. In this regard, a set of 14th-16th century Gothic portals of the Catalan-Aragonese architecture in the Etnean area of eastern Sicily has been studied and used to assess the feasibility of the identified workflow. This approach tries to answer the increasing demand for guidelines and standards in the field of Cultural Heritage Conservation to create and manage semantic-aware 3D models able to include all the information (both geometrical and alphanumerical) concerning historical buildings and able to be reused in several projects.

  11. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    Science.gov (United States)

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.

  12. Integration of 3D photogrammetric outcrop models in the reservoir modelling workflow

    Science.gov (United States)

    Deschamps, Remy; Joseph, Philippe; Lerat, Olivier; Schmitz, Julien; Doligez, Brigitte; Jardin, Anne

    2014-05-01

    3D technologies are now widely used in geosciences to reconstruct outcrops in 3D. The technology used for the 3D reconstruction is usually based on Lidar, which provides very precise models. Such datasets offer the possibility to build well-constrained outcrop analogue models for reservoir study purposes. Photogrammetry is an alternative methodology whose principles are based on determining the geometric properties of an object from photographs taken from different angles. Outcrop data acquisition is easy, and this methodology allows constructing 3D outcrop models with many advantages, such as light and fast acquisition, moderate processing time (depending on the size of the area of interest), and integration of field data and 3D outcrops into the reservoir modelling tools. Whatever the method, the advantages of digital outcrop models are numerous, as already highlighted by Hodgetts (2013), McCaffrey et al. (2005) and Pringle et al. (2006): collection of data from otherwise inaccessible areas, access to different angles of view, increase of the possible measurements, attribute analysis, fast rate of data collection, and of course training and communication. This paper proposes a workflow where 3D geocellular models are built by integrating all sources of information from outcrops (surface picking, sedimentological sections, structural and sedimentary dips…). The 3D geomodels that are reconstructed can be used at the reservoir scale, in order to compare the outcrop information with subsurface models: the detailed facies models of the outcrops are transferred into petrophysical and acoustic models, which are used to test different scenarios of seismic and fluid flow modelling. The detailed 3D models are also used to test new techniques of static reservoir modelling, based either on geostatistical approaches or on deterministic (process-based) simulation techniques. A modelling workflow has been designed to model reservoir geometries and properties from

  13. Integrated workflow for characterizing and modeling fracture network in unconventional reservoirs using microseismic data

    Science.gov (United States)

    Ayatollahy Tafti, Tayeb

    We develop a new method for integrating information and data from different sources. We also construct a comprehensive workflow for characterizing and modeling a fracture network in unconventional reservoirs using microseismic data. The methodology is based on a combination of several mathematical and artificial intelligence techniques, including geostatistics, fractal analysis, fuzzy logic, and neural networks. The study contributes to the scholarly knowledge base on the characterization and modeling of fractured reservoirs in several ways, including a versatile workflow with novel objective functions. Some of the characteristics of the methods are listed below: 1. The new method is an effective fracture characterization procedure that estimates different fracture properties. Unlike existing methods, the new approach does not depend on the location of events and is able to integrate multi-scaled and diverse fracture information from different methodologies. 2. It offers an improved procedure to create compressional and shear velocity models as a preamble for delineating anomalies, mapping structures of interest, and correlating velocity anomalies with fracture swarms and other reservoir properties of interest. 3. It offers an effective way to obtain the fractal dimension of microseismic events and to identify the pattern complexity, connectivity, and mechanism of the created fracture network. 4. It offers an innovative method for monitoring fracture movement in different stages of stimulation that can be used to optimize the process. 5. Our newly developed MDFN approach allows a discrete fracture network model to be created using only microseismic data, with potential cost reduction. It also imposes the fractal dimension as a constraint on other fracture modeling approaches, which increases the visual similarity between the modeled networks and the real network over the simulated volume.

  14. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    Science.gov (United States)

    Mirvis, E.; Iredell, M.

    2015-12-01

    The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of more than 20 in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (like netcdf, HDF, ESMF, jasper, xml, etc.) and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improve the flexibility of the HPC environment by building the elements and a foundation for an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches to collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the

  15. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    International Nuclear Information System (INIS)

    Rampino, S.; Monari, A.; Rossi, E.; Evangelisti, S.; Laganà, A.

    2012-01-01

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS assembled on the European Grid allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: ► The grid-based GEMS simulator accurately models small chemical systems. ► Q5Cost and D5Cost file formats provide interoperability in the workflow. ► Benchmark runs on H + H₂ highlight the Grid empowering. ► Calculated k(T)'s for O + O₂ and N + N₂ fall within the error bars of the experiment. - Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid-based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H₂, N + N₂ and O + O₂ gas-phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.
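
    The thermal rate coefficients mentioned above are typically reported in a modified-Arrhenius form; the sketch below shows only the form of such an evaluation, with made-up parameters that are not results of the cited study.

```python
# Illustrative evaluation of a modified-Arrhenius rate coefficient
# k(T) = A * T^n * exp(-Ea / (R*T)); all parameter values are placeholders.
import math

R = 8.314462618  # molar gas constant, J mol^-1 K^-1


def rate_coefficient(temperature, A, n, Ea):
    """Ea in J/mol, temperature in K; units of k depend on the units of A."""
    return A * temperature**n * math.exp(-Ea / (R * temperature))


for T in (1000.0, 2000.0, 3000.0):
    print(T, rate_coefficient(T, A=1.0e-16, n=0.5, Ea=3.0e5))
```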

  16. Load Composition Model Workflow (BPA TIP-371 Deliverable 1A)

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Cezar, Gustavo V.; /SLAC

    2017-07-17

    This project is funded under Bonneville Power Administration (BPA) Strategic Partnership Project (SPP) 17-005 between BPA and SLAC National Accelerator Laboratory. The project is a BPA Technology Improvement Project (TIP) that builds on and validates the Composite Load Model developed by the Western Electricity Coordinating Council's (WECC) Load Modeling Task Force (LMTF). The composite load model is used by the WECC Modeling and Validation Work Group to study the stability and security of the western electricity interconnection. The work includes development of load composition data sets, collection of load disturbance data, and model development and validation. This work supports reliable and economic operation of the power system. This report was produced for Deliverable 1A of the BPA TIP-371 Project entitled "TIP 371: Advancing the Load Composition Model". The deliverable documents the proposed workflow for the Composite Load Model, which provides the basis for the instrumentation, data acquisition, analysis and data dissemination activities addressed by later phases of the project.

  17. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James

    2018-01-30

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(^1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allow future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetics consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  18. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James; Needham, Craig D.; Baskaran, Sathya; Sarathy, Mani; Burke, Michael P.; West, Richard H.; Frenklach, Michael; Westmoreland, Phillip R.

    2018-01-01

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(^1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allow future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetics consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  19. WebWorkFlow : An Object-Oriented Workflow Modeling Language for Web Applications

    NARCIS (Netherlands)

    Hemel, Z.; Verhaaf, R.; Visser, E.

    2008-01-01

    Preprint of paper published in: MODELS 2008 - International Conference on Model Driven Engineering Languages and Systems, Lecture Notes in Computer Science 5301, 2008; doi:10.1007/978-3-540-87875-9_8 Workflow languages are designed for the high-level description of processes and are typically not

  20. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
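
    The idea of treating workflow patterns as logical primitives can be sketched in a few lines; the pattern set, activity names and formula templates below are invented and far simpler than the generation procedure described above.

```python
# Each workflow pattern maps to a temporal-logic formula template; a model
# assembled from patterns becomes a conjunction of instantiated formulas.
PATTERN_TEMPLATES = {
    # sequence(A, B): whenever A completes, B eventually follows.
    "sequence": "G({a} -> F {b})",
    # exclusive_choice(A, B): A and B never both occur.
    "exclusive_choice": "G(!({a} & {b}))",
}


def logical_specification(patterns):
    """patterns: iterable of (pattern_name, activity_a, activity_b) triples."""
    formulas = [PATTERN_TEMPLATES[name].format(a=a, b=b) for name, a, b in patterns]
    return " & ".join(f"({f})" for f in formulas)


model = [("sequence", "receive_order", "check_stock"),
         ("exclusive_choice", "approve", "reject")]
print(logical_specification(model))
```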

  1. An organizational model to support the flexible workflow based on ontology

    International Nuclear Information System (INIS)

    Yuan Feng; Li Xudong; Zhu Guangying; Zhang Xiankun

    2012-01-01

    Based on ontology theory, the paper addresses an organizational model for flexible workflow. Firstly, the paper describes the conceptual model of the organizational model as an ontology chart, which provides a consistent semantic framework for the organization. Secondly, the paper gives the formalization of the model and describes its six key ontology elements in detail. Finally, the paper discusses in depth how the model supports flexible workflow and indicates that the model has the advantages of cross-area, cross-organization and cross-domain operation, multi-process support and scalability. In particular, because the model is represented by an ontology, the paper concludes that the model overcomes the lack of sharing found in traditional models while being more capable and flexible. (authors)

  2. Model-based Vestibular Afferent Stimulation: Modular Workflow for Analyzing Stimulation Scenarios in Patient Specific and Statistical Vestibular Anatomy

    Directory of Open Access Journals (Sweden)

    Michael Handler

    2017-12-01

    Full Text Available Our sense of balance and spatial orientation strongly depends on the correct functionality of our vestibular system. Vestibular dysfunction can lead to blurred vision and impaired balance and spatial orientation, causing a significant decrease in quality of life. Recent studies have shown that vestibular implants offer a possible treatment for patients with vestibular dysfunction. The close proximity of the vestibular nerve bundles, the facial nerve and the cochlear nerve poses a major challenge to targeted stimulation of the vestibular system. Modeling the electrical stimulation of the vestibular system allows for an efficient analysis of stimulation scenarios prior to time- and cost-intensive in vivo experiments. Current models are based on animal data or CAD models of human anatomy. In this work, a (semi-)automatic modular workflow is presented for the stepwise transformation of segmented vestibular anatomy data of human vestibular specimens into an electrical model, which is subsequently analyzed. The steps of this workflow include (i) the transformation of labeled datasets into a tetrahedral mesh, (ii) nerve fiber anisotropy and fiber computation as a basis for neuron models, (iii) the inclusion of arbitrary electrode designs, (iv) the simulation of quasistationary potential distributions, and (v) the analysis of the effect of stimulus waveforms on the stimulation outcome. Results obtained by the workflow based on human datasets and the average shape of a statistical model revealed a high qualitative agreement and a quantitatively comparable range compared to data from the literature, respectively. Based on our workflow, a detailed analysis of intra- and extra-labyrinthine electrode configurations with various stimulation waveforms and electrode designs can be performed on patient-specific anatomy, making this framework a valuable tool for current optimization questions concerning vestibular implants in humans.

  3. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane, but critical step in the application workflows. Translation steps can introduce errors, misrepresentations of data, slow execution time, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
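
    A possible reading of the "self-contained composable units of code" idea, under assumptions of our own (record layout, variable names and the composition mechanism are invented):

```python
# Each translation unit is a small, self-contained callable doing one task
# (unit conversion, spatial subsetting); units compose into one streaming
# pipeline, so intermediate results never have to be written to disk.
def kelvin_to_celsius(record):
    out = dict(record)
    out["t2_c"] = out.pop("t2_k") - 273.15
    return out


def subset_latitude(lat_min, lat_max):
    def unit(record):
        return record if lat_min <= record["lat"] <= lat_max else None
    return unit


def compose(*units):
    """Chain units left to right; a None result drops the record."""
    def pipeline(record):
        for unit in units:
            if record is None:
                return None
            record = unit(record)
        return record
    return pipeline


forcing_pipeline = compose(subset_latitude(42.0, 45.0), kelvin_to_celsius)
print(forcing_pipeline({"lat": 43.6, "t2_k": 290.0}))  # translated record
print(forcing_pipeline({"lat": 50.0, "t2_k": 285.0}))  # None: outside subset
```

    Because each unit is tiny and free of side effects, it can be unit-tested in isolation and the composed pipeline can be parallelized over records, which is the reliability and performance argument made above.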

  4. Unified Geomorphological Analysis Workflows with Benthic Terrain Modeler

    Directory of Open Access Journals (Sweden)

    Shaun Walbridge

    2018-03-01

    Full Text Available High resolution remotely sensed bathymetric data is rapidly increasing in volume, but analyzing this data requires a mastery of a complex toolchain of disparate software, including computing derived measurements of the environment. Bathymetric gradients play a fundamental role in energy transport through the seascape. Benthic Terrain Modeler (BTM) uses bathymetric data to enable simple characterization of benthic biotic communities and geologic types, and produces a collection of key geomorphological variables known to affect marine ecosystems and processes. BTM has received continual improvements since its 2008 release; here we describe the tools and morphometrics BTM can produce, the research context which this enables, and we conclude with an example application using data from a protected reef in St. Croix, US Virgin Islands.

  5. Modeling workflow to design machine translation applications for public health practice.

    Science.gov (United States)

    Turner, Anne M; Brownstein, Megumu K; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2015-02-01

    Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to perform access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
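
    A minimal sketch of role-based task authorization with one dynamic constraint; it is not the WACM formalization itself, and the roles, tasks and separation-of-duty rule are invented.

```python
# Static permissions map roles to tasks; a dynamic check enforces separation
# of duty between entering and approving the same order.
ROLE_TASKS = {
    "clerk": {"enter_order"},
    "manager": {"approve_order"},
}


def authorized(user, user_roles, task, case_history):
    permitted = any(task in ROLE_TASKS.get(role, set()) for role in user_roles)
    if task == "approve_order" and case_history.get("enter_order") == user:
        return False  # the approver must not be the person who entered the order
    return permitted


case_history = {"enter_order": "alice"}
print(authorized("bob", {"manager"}, "approve_order", case_history))    # True
print(authorized("alice", {"manager"}, "approve_order", case_history))  # False
```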

  7. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    Science.gov (United States)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.

  8. Design Principles as a Guide for Constraint Based and Dynamic Modeling: Towards an Integrative Workflow.

    Science.gov (United States)

    Sehr, Christiana; Kremling, Andreas; Marin-Sanguino, Alberto

    2015-10-16

    During the last 10 years, systems biology has matured from a fuzzy concept combining omics, mathematical modeling and computers into a scientific field in its own right. In spite of its incredible potential, the multilevel complexity of its objects of study makes it very difficult to establish a reliable connection between data and models. The great number of degrees of freedom often results in situations where many different models can explain/fit all available datasets. This has resulted in a shift of paradigm from the initially dominant, maybe naive, idea of inferring the system out of a number of datasets to the application of different techniques that reduce the degrees of freedom before any data set is analyzed. There is a wide variety of techniques available, each of which can contribute a piece of the puzzle and include different kinds of experimental information. But the challenge that remains is their meaningful integration. Here we show some theoretical results that enable some of the main modeling approaches to be applied sequentially in a complementary manner, and how this workflow can benefit from evolutionary reasoning to keep the complexity of the problem in check. As a proof of concept, we show how the synergies between these modeling techniques can provide insight into some well studied problems: Ammonia assimilation in bacteria and an unbranched linear pathway with end-product inhibition.

  9. Design Principles as a Guide for Constraint Based and Dynamic Modeling: Towards an Integrative Workflow

    Directory of Open Access Journals (Sweden)

    Christiana Sehr

    2015-10-01

    Full Text Available During the last 10 years, systems biology has matured from a fuzzy concept combining omics, mathematical modeling and computers into a scientific field in its own right. In spite of its incredible potential, the multilevel complexity of its objects of study makes it very difficult to establish a reliable connection between data and models. The great number of degrees of freedom often results in situations where many different models can explain/fit all available datasets. This has resulted in a shift of paradigm from the initially dominant, maybe naive, idea of inferring the system out of a number of datasets to the application of different techniques that reduce the degrees of freedom before any data set is analyzed. There is a wide variety of techniques available, each of which can contribute a piece of the puzzle and include different kinds of experimental information. But the challenge that remains is their meaningful integration. Here we show some theoretical results that enable some of the main modeling approaches to be applied sequentially in a complementary manner, and how this workflow can benefit from evolutionary reasoning to keep the complexity of the problem in check. As a proof of concept, we show how the synergies between these modeling techniques can provide insight into some well studied problems: Ammonia assimilation in bacteria and an unbranched linear pathway with end-product inhibition.

  10. Workflow Generation from the Two-Hemisphere Model

    Directory of Open Access Journals (Sweden)

    Gusarovs Konstantīns

    2017-12-01

    Full Text Available Model-Driven Software Development (MDSD) is a trend in Software Development that focuses on code generation from various kinds of models. To perform such a task, it is necessary to develop an algorithm that performs source model transformation into the target model, which ideally is an actual software code written in some kind of a programming language. However, at present a lot of methods focus on Unified Modelling Language (UML) diagram generation. The present paper describes a result of the authors' research on Two-Hemisphere Model (2HM) processing for easier code generation.

  11. On Lifecycle Constraints of Artifact-Centric Workflows

    Science.gov (United States)

    Kucukoguz, Esra; Su, Jianwen

    Data plays a fundamental role in modeling and management of business processes and workflows. Among the recent "data-aware" workflow models, artifact-centric models are particularly interesting. (Business) artifacts are the key data entities that are used in workflows and can reflect both the business logic and the execution states of a running workflow. The notion of artifacts succinctly captures the fluidity aspect of data during workflow executions. However, much of the technical dimension concerning artifacts in workflows is not well understood. In this paper, we study a key concept of an artifact "lifecycle". In particular, we allow declarative specifications/constraints of artifact lifecycle in the spirit of DecSerFlow, and formulate the notion of lifecycle as the set of all possible paths an artifact can navigate through. We investigate two technical problems: (Compliance) does a given workflow (schema) contain only lifecycles allowed by a constraint? And (automated construction) from a given lifecycle specification (constraint), is it possible to construct a "compliant" workflow? The study is based on a new formal variant of artifact-centric workflow model called "ArtiNets" and two classes of lifecycle constraints named "regular" and "counting" constraints. We present a range of technical results concerning compliance and automated construction, including: (1) compliance is decidable when the workflow is atomic or the constraints are regular, (2) for each constraint, we can always construct a workflow that satisfies the constraint, and (3) sufficient conditions where atomic workflows can be constructed.
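
    The flavour of a "regular" lifecycle constraint and its compliance check can be illustrated as follows; the artifact states and the constraint are invented, and real ArtiNet compliance checking is considerably more involved.

```python
# The allowed lifecycle is a regular language over artifact states; a workflow
# is compliant if every path it can generate belongs to that language.
import re

# Constraint: created, any number of revisions, then shipped or cancelled once.
LIFECYCLE = re.compile(r"created(,revised)*,(shipped|cancelled)")


def compliant(paths):
    return all(LIFECYCLE.fullmatch(",".join(path)) for path in paths)


workflow_paths = [
    ["created", "revised", "shipped"],
    ["created", "cancelled"],
]
print(compliant(workflow_paths))                        # True
print(compliant([["created", "shipped", "revised"]]))   # False
```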

  12. Development of a Sampling-Based Global Sensitivity Analysis Workflow for Multiscale Computational Cancer Models

    Science.gov (United States)

    Wang, Zhihui; Deisboeck, Thomas S.; Cristini, Vittorio

    2014-01-01

    There are two challenges that researchers face when performing global sensitivity analysis (GSA) on multiscale in silico cancer models. The first is increased computational intensity, since a multiscale cancer model generally takes longer to run than does a scale-specific model. The second problem is the lack of a best GSA method that fits all types of models, which implies that multiple methods and their sequence need to be taken into account. In this article, we therefore propose a sampling-based GSA workflow consisting of three phases – pre-analysis, analysis, and post-analysis – by integrating Monte Carlo and resampling methods with the repeated use of analysis of variance (ANOVA); we then exemplify this workflow using a two-dimensional multiscale lung cancer model. By accounting for all parameter rankings produced by multiple GSA methods, a summarized ranking is created at the end of the workflow based on the weighted mean of the rankings for each input parameter. For the cancer model investigated here, this analysis reveals that ERK, a downstream molecule of the EGFR signaling pathway, has the most important impact on regulating both the tumor volume and expansion rate in the algorithm used. PMID:25257020
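
    The summarizing step at the end of the workflow can be sketched as a weighted mean of per-method ranks; the methods, weights and parameter names below are placeholders rather than values from the cited study.

```python
# Combine rankings from several GSA methods into one summarized ranking by
# taking the weighted mean rank of each input parameter.
def summarized_ranking(rankings, weights):
    """rankings: {method: {parameter: rank}}; weights: {method: weight}."""
    parameters = next(iter(rankings.values())).keys()
    total_weight = sum(weights.values())
    mean_rank = {
        p: sum(weights[m] * rankings[m][p] for m in rankings) / total_weight
        for p in parameters
    }
    return sorted(mean_rank.items(), key=lambda item: item[1])


rankings = {
    "anova_screening": {"ERK": 1, "EGFR": 2, "MEK": 3},
    "resampling":      {"ERK": 1, "EGFR": 3, "MEK": 2},
}
weights = {"anova_screening": 0.6, "resampling": 0.4}
print(summarized_ranking(rankings, weights))  # ERK ends up ranked first
```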

  13. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability and in particular the interdependencies of workflow design, execution environment and system architecture are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  14. A Workflow to Model Microbial Loadings in Watersheds

    Science.gov (United States)

    Many watershed models simulate overland and instream microbial fate and transport, but few actually provide loading rates on land surfaces and point sources to the water body network. This paper describes the underlying general equations for microbial loading rates associated wit...

  15. A Workflow to Model Microbial Loadings in Watersheds (proceedings)

    Science.gov (United States)

    Many watershed models simulate overland and instream microbial fate and transport, but few actually provide loading rates on land surfaces and point sources to the water body network. This paper describes the underlying general equations for microbial loading rates associated wit...

  16. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    International Nuclear Information System (INIS)

    Ward, Logan Timothy; Hackenberg, Robert Errol

    2016-01-01

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a 'XML Schema' look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.

  17. Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Logan Timothy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hackenberg, Robert Errol [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-31

    These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.

  18. High‐resolution trench photomosaics from image‐based modeling: Workflow and error analysis

    Science.gov (United States)

    Reitman, Nadine G.; Bennett, Scott E. K.; Gold, Ryan D.; Briggs, Richard; Duross, Christopher

    2015-01-01

    Photomosaics are commonly used to construct maps of paleoseismic trench exposures, but the conventional process of manually using image-editing software is time consuming and produces undesirable artifacts and distortions. Herein, we document and evaluate the application of image-based modeling (IBM) for creating photomosaics and 3D models of paleoseismic trench exposures, illustrated with a case-study trench across the Wasatch fault in Alpine, Utah. Our results include a structure-from-motion workflow for the semiautomated creation of seamless, high-resolution photomosaics designed for rapid implementation in a field setting. Compared with conventional manual methods, the IBM photomosaic method provides a more accurate, continuous, and detailed record of paleoseismic trench exposures in approximately half the processing time and 15%–20% of the user input time. Our error analysis quantifies the effect of the number and spatial distribution of control points on model accuracy. For this case study, an ∼87 m² exposure of a benched trench photographed at viewing distances of 1.5–7 m yields a model with <2 cm root mean square error (RMSE) with as few as six control points. RMSE decreases as more control points are implemented, but the gains in accuracy are minimal beyond 12 control points. Spreading control points throughout the target area helps to minimize error. We propose that 3D digital models and corresponding photomosaics should be standard practice in paleoseismic exposure archiving. The error analysis serves as a guide for future investigations that seek balance between speed and accuracy during photomosaic and 3D model construction.
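
    The accuracy metric used above is the root mean square error of model coordinates at surveyed control points; a small sketch of that computation, with invented coordinates, follows.

```python
# RMSE over 3D control-point residuals (model coordinates vs. surveyed ones).
import math


def rmse(modelled, surveyed):
    """Both arguments are sequences of (x, y, z) coordinates in metres."""
    squared = [
        sum((m - s) ** 2 for m, s in zip(pt_m, pt_s))
        for pt_m, pt_s in zip(modelled, surveyed)
    ]
    return math.sqrt(sum(squared) / len(squared))


modelled = [(0.010, 0.005, 1.002), (2.012, 0.990, 0.998)]
surveyed = [(0.000, 0.000, 1.000), (2.000, 1.000, 1.000)]
print(f"{rmse(modelled, surveyed):.3f} m")  # about 0.014 m for these points
```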

  19. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    Science.gov (United States)

    Bistacchi, Andrea

    2015-04-01

    Different Remote Sensing technologies, including photogrammetry and LIDAR, allow collecting 3D datasets that can be used to create 3D digital representations of outcrop surfaces, called Digital Outcrop Models (DOM), or sometimes Virtual Outcrop Models (VOM). Irrespective of the Remote Sensing technique used, DOMs can be represented either by photorealistic point clouds (PC-DOM) or textured surfaces (TS-DOM). The former are datasets composed of millions of points with XYZ coordinates and RGB colour, whilst the latter are triangulated surfaces onto which images of the outcrop have been mapped or "textured" (applying a technology originally developed for movies and videogames). Here we present a workflow that allows both kinds of dataset, PC-DOMs and TS-DOMs, to be exploited in an integrated, efficient, yet flexible way. The workflow is composed of three main steps: (1) data collection and processing, (2) interpretation, and (3) modelling. Data collection can be performed with photogrammetry, LIDAR, or other techniques. The quality of photogrammetric datasets obtained with Structure From Motion (SFM) techniques has shown a tremendous improvement over the past few years, and this is becoming the most effective way to collect DOM datasets. The main advantages of photogrammetry over LIDAR are the very simple and lightweight field equipment (a digital camera) and the arbitrary spatial resolution, which can be increased simply by getting closer to the outcrop or by using a different lens. It must be noted that concerns about the precision of close-range photogrammetric surveys, which were justified in the past, are no longer a problem if modern software and acquisition schemes are applied. In any case, LIDAR is a well-tested technology and it is still very common. Irrespective of the data collection technology, the output will be a photorealistic point cloud and a collection of oriented photos, plus additional imagery in special projects (e.g. infrared images

  20. Transformation from BPMN to WF model of document management system workflows

    OpenAIRE

    Kisly, Miroslav

    2008-01-01

    This work examines two widespread workflow models – BPMN and Windows Workflow Foundation (WF) – and investigates how BPMN workflows can be transformed into the WF execution model. BPMN and WF are fundamentally different languages, and the main transformation problem is related to the elimination of unstructured cycles, i.e., their conversion into structured constructs that are equivalent with respect to the process. The work adopts a known algorithm intended for transforming similar cyclic constructs using continu...

  1. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Integrating prediction, provenance, and optimization into high energy workflows

    Energy Technology Data Exchange (ETDEWEB)

    Schram, M.; Bansal, V.; Friese, R. D.; Tallent, N. R.; Yin, J.; Barker, K. J.; Stephan, E.; Halappanavar, M.; Kerbyson, D. J.

    2017-10-01

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
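
    One piece of the framework, optimization-based scheduling driven by predicted runtimes, can be sketched with a simple list-scheduling heuristic; the task names and runtimes are hypothetical, and the scheduler described above is considerably more sophisticated.

```python
# Greedy list scheduling: predicted runtimes (longest first) are assigned to
# whichever resource becomes available earliest.
import heapq


def schedule(predicted_runtimes, n_resources):
    """Return (makespan, {task: resource}) for a simple greedy heuristic."""
    available = [(0.0, r) for r in range(n_resources)]  # (free_at, resource)
    heapq.heapify(available)
    assignment = {}
    for task, runtime in sorted(predicted_runtimes.items(),
                                key=lambda item: -item[1]):
        free_at, resource = heapq.heappop(available)
        assignment[task] = resource
        heapq.heappush(available, (free_at + runtime, resource))
    return max(t for t, _ in available), assignment


print(schedule({"simulate": 40.0, "reconstruct": 25.0, "analyze": 10.0}, 2))
```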

  3. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  4. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  5. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    Science.gov (United States)

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
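
    The separation described above can be sketched schematically: the workflow engine only routes the case, while medical knowledge sits behind a decision interface it calls. This is our own illustration in plain Python, not the Activiti/ARDENSUITE implementation, and the rule content is a simplified placeholder.

```python
# The workflow (business logic) is agnostic about medicine; the decision
# function stands in for an external rule engine and can be swapped per site.
def hbv_prophylaxis_rule(case):
    """Placeholder for the externalized medical decision logic."""
    if case["mother_hbsag_positive"]:
        return {"recommend": ["HBIG", "hepatitis B vaccine at birth"]}
    return {"recommend": ["routine vaccination schedule"]}


def run_workflow(case, decide=hbv_prophylaxis_rule):
    steps = ["register_case", "collect_maternal_serology"]   # business logic
    decision = decide(case)                                  # medical knowledge
    steps.append("notify_neonatology" if "HBIG" in decision["recommend"]
                 else "document_routine_care")
    return steps, decision


print(run_workflow({"mother_hbsag_positive": True}))
```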

  6. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján Antolík

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  7. A data model for analyzing user collaborations in workflow-driven e-Science

    NARCIS (Netherlands)

    Altintas, I.; Anand, M.K.; Vuong, T.N.; Bowers, S.; Ludäscher, B.; Sloot, P.M.A.

    2011-01-01

    Scientific discoveries are often the result of methodical execution of many interrelated scientific workflows, where workflows and datasets published by one set of users can be used by other users to perform subsequent analyses, leading to implicit or explicit collaboration. In this paper, we

  8. Modeling Boston: A workflow for the efficient generation and maintenance of urban building energy models from existing geospatial datasets

    International Nuclear Information System (INIS)

    Cerezo Davila, Carlos; Reinhart, Christoph F.; Bemis, Jamie L.

    2016-01-01

    City governments and energy utilities are increasingly focusing on the development of energy efficiency strategies for buildings as a key component in emission reduction plans and energy supply strategies. To support these diverse needs, a new generation of Urban Building Energy Models (UBEM) is currently being developed and validated to estimate citywide hourly energy demands at the building level. However, in order for cities to rely on UBEMs, effective model generation and maintenance workflows are needed based on existing urban data structures. Within this context, the authors collaborated with the Boston Redevelopment Authority to develop a citywide UBEM based on official GIS datasets and a custom building archetype library. Energy models for 83,541 buildings were generated and assigned one of 52 use/age archetypes within the CAD modelling environment Rhinoceros3D. The buildings were then simulated using the US DOE EnergyPlus simulation program, and results for buildings of the same archetype were crosschecked against data from the US national energy consumption surveys. A district-level intervention combining photovoltaics with demand side management is presented to demonstrate the ability of UBEM to provide actionable information. Lack of widely available archetype templates and metered energy data were identified as key barriers within existing workflows that may impede cities from effectively applying UBEM to guide energy policy. - Highlights: • Data requirements for Urban Building Energy Models are reviewed. • A workflow for UBEM generation from available GIS datasets is developed. • A citywide demand simulation model for Boston is generated and tested. • Limitations for UBEM in current urban data systems are identified and discussed. • Model application for energy management policy is shown in an urban PV scenario.
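
    The archetype-assignment step of such a workflow can be sketched as a simple lookup from GIS attributes to a simulation template. The field names, age bands and template files below are illustrative assumptions, not the Boston archetype library itself.

```python
# Hypothetical sketch: map each GIS building record to a use/age archetype and
# its simulation template. All names and thresholds are placeholder assumptions.
ARCHETYPES = {
    ("residential", "pre1950"): "res_pre1950.idf",
    ("residential", "post1980"): "res_post1980.idf",
    ("office", "post1980"): "off_post1980.idf",
}


def age_band(year_built: int) -> str:
    """Coarse construction-age band used as part of the archetype key."""
    if year_built < 1950:
        return "pre1950"
    return "post1980" if year_built >= 1980 else "mid_century"


def assign_archetype(building: dict) -> str:
    key = (building["use"], age_band(building["year_built"]))
    try:
        return ARCHETYPES[key]
    except KeyError:
        raise ValueError(f"no archetype template defined for {key}")


buildings = [{"id": 1, "use": "residential", "year_built": 1925}]
print([assign_archetype(b) for b in buildings])   # -> ['res_pre1950.idf']
```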

  9. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on “incident patterns” with four operators inspired from Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
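
    A minimal illustration of querying a workflow log directly, rather than going through an ETL/warehouse pipeline, is sketched below. The "follows" relation is a stand-in for one incident-pattern-style operator; it is not the paper's algebra and all event data are made up.

```python
# Hypothetical sketch: an ad hoc query over raw workflow log events, finding all
# cases in which task `then` occurs after task `first` (a minimal pattern query).
from collections import defaultdict

log = [
    {"case": "c1", "task": "review", "ts": 1},
    {"case": "c1", "task": "approve", "ts": 2},
    {"case": "c2", "task": "review", "ts": 1},
]


def follows(events, first, then):
    """Return the cases where `then` appears after `first` in the same enactment."""
    by_case = defaultdict(list)
    for event in sorted(events, key=lambda e: e["ts"]):
        by_case[event["case"]].append(event["task"])
    return [case for case, tasks in by_case.items()
            if first in tasks and then in tasks[tasks.index(first) + 1:]]


print(follows(log, "review", "approve"))   # -> ['c1']
```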

  10. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  11. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  12. Scientific workflow and support for high resolution global climate modeling at the Oak Ridge Leadership Computing Facility

    Science.gov (United States)

    Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.

    2012-04-01

    The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on

  13. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  14. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we describe the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  15. 3d-modelling workflows for trans-nationally shared geological models - first approaches from the project GeoMol

    Science.gov (United States)

    Rupf, Isabel

    2013-04-01

    To meet the EU's ambitious targets for carbon emission reduction, renewable energy production has to be strongly upgraded and made more efficient for grid energy storage. Alpine Foreland Basins feature a unique geological inventory which can contribute substantially to tackle these challenges. They offer a geothermal potential and storage capacity for compressed air, as well as space for underground storage of CO2. Exploiting these natural subsurface resources will strongly compete with existing oil and gas claims and groundwater issues. The project GeoMol will provide consistent 3-dimensional subsurface information about the Alpine Foreland Basins based on a holistic and transnational approach. Core of the project GeoMol is a geological framework model for the entire Northern Molasse Basin, complemented by five detailed models in pilot areas, also in the Po Basin, which are dedicated to specific questions of subsurface use. The models will consist of up to 13 litho-stratigraphic horizons ranging from the Cenozoic basin fill down to Mesozoic and late Paleozoic sedimentary rocks and the crystalline basement. More than 5000 wells and 28 000 km seismic lines serve as input data sets for the geological subsurface model. The data have multiple sources and various acquisition dates, and their interpretations have gone through several paradigm changes. Therefore, it is necessary to standardize the data with regards to technical parameters and content prior to further analysis (cf. Capar et al. 2013, EGU2013-5349). Each partner will build its own geological subsurface model with different software solutions for seismic interpretation and 3d-modelling. Therefore, 3d-modelling follows different software- and partner-specific workflows. One of the main challenges of the project is to ensure a seamlessly fitting framework model. It is necessary to define several milestones for cross border checks during the whole modelling process. Hence, the main input data set of the

  16. Constructing reservoir-scale 3D geomechanical FE-models. A refined workflow for model generation and calculation

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, K.; Henk, A. [Technische Univ. Darmstadt (Germany). Inst. fuer Angewandte Geowissenschaften]

    2013-08-01

    The tectonic stress field strongly affects the optimal exploitation of conventional and unconventional hydrocarbon reservoirs. Amongst others, wellbore stability, orientation of hydraulically induced fractures and - particularly in fractured reservoirs - permeability anisotropies depend on the magnitudes and orientations of the recent stresses. Geomechanical reservoir models can provide unique insights into the tectonic stress field revealing the local perturbations resulting from faults and lithological changes. In order to provide robust predictions, such numerical models are based on the finite element (FE) method and account for the complexities of real reservoirs with respect to subsurface geometry, inhomogeneous material distribution and nonlinear rock mechanical behavior. We present a refined workflow for geomechanical reservoir modeling which allows for an easier set-up of the model geometry, high resolution submodels and faster calculation times due to element savings in the load frame. Transferring the reservoir geometry from the geological subsurface model, e.g., a Petrel® project, to the FE model represents a special challenge as the faults are discontinuities in the numerical model and no direct interface exists between the two software packages used. Point clouds displaying faults and lithostratigraphic horizons can be used for geometry transfer but this labor-intensive approach is not feasible for complex field-scale models with numerous faults. Instead, so-called Coons patches based on horizon lines, i.e. the intersection lines between horizons and faults, are well suited to re-generate the various surfaces in the FE software while maintaining their topology. High-resolution submodels of individual fault blocks can be incorporated into the field-scale model. This makes it possible to consider both a locally refined mechanical stratigraphy and the impact of the large-scale fault pattern. A pressure load on top of the model represents the

  17. An integrated workflow for stress and flow modelling using outcrop-derived discrete fracture networks

    DEFF Research Database (Denmark)

    Bisdom, Kevin; Nick, Hamid; Bertotti, Giovanni

    2017-01-01

    stress-sensitive fracture permeability and matrix flow to determine the full permeability tensor. The applicability of this workflow is illustrated using an outcropping carbonate pavement in the Potiguar basin in Brazil, from which 1082 fractures are digitised. The permeability tensor for a range of matrix

  18. Model for safety reports including descriptive examples

    International Nuclear Information System (INIS)

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  19. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2018-01-01

    From 2025 onwards, the ATLAS collaboration at the Large Hadron Collider (LHC) at CERN will experience a massive increase in data quantity as well as complexity. Including mitigating factors, the prevalent computing power by that time will only fulfil one tenth of the requirement. This contribution will focus on Cloud computing as an approach to help overcome this challenge by providing flexible hardware that can be configured to the specific needs of a workflow. Experience with Cloud computing exists, but there is large uncertainty as to whether, and to what degree, it can reduce the burden by 2025. In order to understand and quantify the benefits of Cloud computing, the "Workflow and Infrastructure Model" was created. It estimates the viability of Cloud computing by combining different inputs from the workflow side with infrastructure specifications. The model delivers metrics that enable the comparison of different Cloud configurations as well as different Cloud offerings with each other. A wide range of r...

  20. SEEPAGE MODEL FOR PA INCLUDING DRIFT COLLAPSE

    International Nuclear Information System (INIS)

    C. Tsang

    2004-01-01

    The purpose of this report is to document the predictions and analyses performed using the seepage model for performance assessment (SMPA) for both the Topopah Spring middle nonlithophysal (Tptpmn) and lower lithophysal (Tptpll) lithostratigraphic units at Yucca Mountain, Nevada. Look-up tables of seepage flow rates into a drift (and their uncertainty) are generated by performing numerical simulations with the seepage model for many combinations of the three most important seepage-relevant parameters: the fracture permeability, the capillary-strength parameter 1/a, and the percolation flux. The percolation flux values chosen take into account flow focusing effects, which are evaluated based on a flow-focusing model. Moreover, multiple realizations of the underlying stochastic permeability field are conducted. Selected sensitivity studies are performed, including the effects of an alternative drift geometry representing a partially collapsed drift from an independent drift-degradation analysis (BSC 2004 [DIRS 166107]). The intended purpose of the seepage model is to provide results of drift-scale seepage rates under a series of parameters and scenarios in support of the Total System Performance Assessment for License Application (TSPA-LA). The SMPA is intended for the evaluation of drift-scale seepage rates under the full range of parameter values for three parameters found to be key (fracture permeability, the van Genuchten 1/a parameter, and percolation flux) and drift degradation shape scenarios in support of the TSPA-LA during the period of compliance for postclosure performance [Technical Work Plan for: Performance Assessment Unsaturated Zone (BSC 2002 [DIRS 160819], Section I-4-2-1)]. The flow-focusing model in the Topopah Spring welded (TSw) unit is intended to provide an estimate of flow focusing factors (FFFs) that (1) bridge the gap between the mountain-scale and drift-scale models, and (2) account for variability in local percolation flux due to
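
    The look-up-table idea described above can be sketched as tabulating a response over a grid of the three key parameters and interpolating between grid points. The seepage function below is a throwaway placeholder, not the SMPA process model, and the parameter ranges are assumptions chosen only for illustration.

```python
# Hypothetical sketch of a parameter look-up table: tabulate a response over a
# grid of the three key parameters and interpolate for arbitrary combinations.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

log10_perm = np.linspace(-14.0, -11.0, 7)     # fracture permeability [log10 m^2]
inv_alpha = np.linspace(100.0, 1000.0, 10)    # capillary-strength parameter [Pa]
perc_flux = np.linspace(1.0, 100.0, 12)       # percolation flux [mm/yr]

P, A, Q = np.meshgrid(log10_perm, inv_alpha, perc_flux, indexing="ij")
# Placeholder response surface standing in for the seepage-rate simulations.
seepage = np.maximum(0.0, Q * (1.0 - 0.002 * A) * 10 ** (0.1 * (P + 12)))

table = RegularGridInterpolator((log10_perm, inv_alpha, perc_flux), seepage)
print(table([[-12.5, 400.0, 25.0]]))   # seepage estimate for one parameter set
```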

  1. A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery

    Directory of Open Access Journals (Sweden)

    Philippe Lejeune

    2013-11-01

    Full Text Available The recent development of operational small unmanned aerial systems (UASs) opens the door for their extensive use in forest mapping, as both the spatial and temporal resolution of UAS imagery better suit local-scale investigation than traditional remote sensing tools. This article focuses on the use of combined photogrammetry and “Structure from Motion” approaches in order to model the forest canopy surface from low-altitude aerial images. An original workflow, using the open source and free photogrammetric toolbox MICMAC (acronym for Multi Image Matches for Auto Correlation Methods), was set up to create a digital canopy surface model of deciduous stands. In combination with a co-registered light detection and ranging (LiDAR) digital terrain model, the elevation of vegetation was determined, and the resulting hybrid photo/LiDAR canopy height model was compared to data from a LiDAR canopy height model and from forest inventory data. Linear regressions predicting dominant height and individual height from plot metrics and crown metrics showed that the photogrammetric canopy height model was of good quality for deciduous stands. Although photogrammetric reconstruction significantly smooths the canopy surface, the use of this workflow has the potential to take full advantage of the flexible revisit period of drones in order to refresh the LiDAR canopy height model and to collect dense multitemporal canopy height series.
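
    The central height-differencing step can be sketched in a few lines: subtract the co-registered terrain model from the photogrammetric surface model to obtain canopy height. The arrays below are toy values; in practice both rasters would be read from georeferenced files on a common grid.

```python
# Minimal sketch, assuming both rasters are already loaded on the same grid:
# canopy height model (CHM) = photogrammetric surface model (DSM) - LiDAR terrain (DTM).
import numpy as np


def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Canopy height as surface minus terrain; NaNs mark missing data."""
    chm = dsm - dtm                        # NaNs propagate where either raster has gaps
    return np.where(chm < 0, 0.0, chm)     # clamp negative heights (reconstruction noise)


dsm = np.array([[105.2, 118.7], [103.9, np.nan]])   # photogrammetric surface [m]
dtm = np.array([[100.0, 100.5], [101.0, 100.8]])    # LiDAR terrain [m]
print(canopy_height_model(dsm, dtm))
```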

  2. Grand unified models including extra Z bosons

    International Nuclear Information System (INIS)

    Li Tiezhong

    1989-01-01

    The grand unified theories (GUT) of the simple Lie groups including extra Z bosons are discussed. Under the author's hypothesis there are only SU(5+m), SO(6+4n) and E6 groups. A general discussion of SU(5+m) is given, then SU(6) and SU(7) are considered. In SU(6) the 15+6*+6* fermion representations are used, which are not the same as others in fermion content, Yukawa coupling and broken scales. A concept of clans of particles, which are not families, is suggested. These clans consist of extra Z bosons and the corresponding fermions of the scale. All of the fermions in the clans are down quarks, except for the standard model, which consists of Z bosons and 15 fermions; therefore, the spectrum of the hadrons composed of these down quarks differs from that of the hadrons known at present.

  3. From DNS to RANS: A Multi-model workflow to understand the Influence of Hurricanes on Generating Turbidity Currents in the Gulf of Mexico

    Science.gov (United States)

    Syvitski, J. P.; Arango, H.; Harris, C. K.; Meiburg, E. H.; Jenkins, C. J.; Auad, G.; Hutton, E.; Kniskern, T. A.; Radhakrishnan, S.

    2016-12-01

    A loosely coupled numerical workflow is developed to address land-sea pathways for sediment routing from terrestrial and coastal sources, across the continental shelf and ultimately down the continental slope canyon system of the northern Gulf of Mexico (GOM). Model simulations represent a range of environmental conditions that might lead to the generation of turbidity-currents. The workflow comprises: 1) A simulator for the water and sediment discharged from rivers into the GOM using WMBsedv2, calibrated against USGS and USACE gauged river data; 2) Domain grids and bathymetry (ETOPO2) for the ocean models and realistic seabed sediment texture grids (dbSEABED) for the sediment transport models; 3) A spectral wave action simulator (10 km resolution) (WaveWatch III) driven by GFDL - GFS winds; 4) A simulator for ocean dynamics (ROMS) forced with ECMWF ERA winds; 5) A simulator for seafloor resuspension and transport (CSTMS); 6) Simulators (HurriSlip) of seafloor failure and flow ignition locations for boundary input to a turbidity current model; and 7) A RANS turbidity current model (TURBINS) to route sediment flows down GOM canyons, providing estimates of bottom shear stresses. TURBINS was developed first as a DNS model and then converted to an LES model wherein a dynamic turbulence closure scheme was employed. Like most DNS to LES model comparisons (these being done by the UCSB team), turbulence scaling allowed for higher Re applications but was still found not capable of simulating field-scale (GOM continental canyons) environments. The LES model was next converted to a non-hydrostatic RANS model capable of field scale applications, but only with a daisy-chain approach to multiple model runs along the simulated canyon floor. These model adaptations allowed the workflow to be tested for the year 1-Oct-2007 to 30-Sep-2008, which included two domain Hurricanes (Ike and Gustav). The RANS-TURBINS employed further boundary simplifications on both sediment erosion and
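
    The loosely coupled, daisy-chained character of such a workflow can be sketched generically as a sequence of stages, each consuming the outputs of the previous one. The functions and numbers below are placeholders standing in for the actual component models, not results from this study.

```python
# Generic sketch of a loosely coupled, daisy-chained workflow: each stage reads
# what the previous stage wrote into a shared state dict, mimicking models that
# exchange files. All stage logic and numbers are placeholder assumptions.
def river_discharge(state):
    state["discharge_m3s"] = 12000.0                 # river water/sediment forcing


def wave_and_ocean_forcing(state):
    state["bed_shear_pa"] = 0.8 + 1e-5 * state["discharge_m3s"]


def shelf_resuspension(state):
    state["ignition"] = state["bed_shear_pa"] > 0.5  # failure/flow-ignition criterion


def turbidity_current(state):
    state["canyon_flux"] = 50.0 if state["ignition"] else 0.0


state = {}
for step in (river_discharge, wave_and_ocean_forcing,
             shelf_resuspension, turbidity_current):
    step(state)                                      # daisy-chained execution
print(state)
```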

  4. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitates easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, which were conducted in order to identify those workflows and the relations among the tasks they include. These workflows were first modeled in BPMN and then generalized. As a following step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  5. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Science.gov (United States)

    Wolfson, Adele J.; Hall, Mona L.; Branham, Thomas R.

    1996-11-01

    ) experience with methods of protein purification; (iii) incorporation of appropriate controls into experiments; (iv) use of basic statistics in data analysis; (v) writing papers and grant proposals in accepted scientific style; (vi) peer review; (vii) oral presentation of results and proposals; and (viii) introduction to molecular modeling. Figure 1 illustrates the modular nature of the lab curriculum. Elements from each of the exercises can be separated and treated as stand-alone exercises, or combined into short or long projects. We have been able to offer the opportunity to use sophisticated molecular modeling in the final module through funding from an NSF-ILI grant. However, many of the benefits of the research proposal can be achieved with other computer programs, or even by literature survey alone. Figure 1.Design of project-based biochemistry laboratory. Modules (projects, or portions of projects) are indicated as boxes. Each of these can be treated independently, or used as part of a larger project. Solid lines indicate some suggested paths from one module to the next. The skills and knowledge required for protein purification and design are developed in three units: (i) an introduction to critical assays needed to monitor degree of purification, including an evaluation of assay parameters; (ii) partial purification by ion-exchange techniques; and (iii) preparation of a grant proposal on protein design by mutagenesis. Brief descriptions of each of these units follow, with experimental details of each project at the end of this paper. Assays for Lysozyme Activity and Protein Concentration (4 weeks) The assays mastered during the first unit are a necessary tool for determining the purity of the enzyme during the second unit on purification by ion exchange. These assays allow an introduction to the concept of specific activity (units of enzyme activity per milligram of total protein) as a measure of purity. In this first sequence, students learn a turbidimetric assay

  6. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, Cirano; Chiao, Carolina; Hess, Guillermo; Nascimento, Gleison; Thom, Lucinéia; Reichert, Manfred

    2007-01-01

    In order to build process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not

  7. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications.

  8. Seepage Model for PA Including Drift Collapse

    International Nuclear Information System (INIS)

    Li, G.; Tsang, C.

    2000-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the predictions and analysis performed using the Seepage Model for Performance Assessment (PA) and the Disturbed Drift Seepage Submodel for both the Topopah Spring middle nonlithophysal and lower lithophysal lithostratigraphic units at Yucca Mountain. These results will be used by PA to develop the probability distribution of water seepage into waste-emplacement drifts at Yucca Mountain, Nevada, as part of the evaluation of the long term performance of the potential repository. This AMR is in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (CRWMS M&O 2000 [153447]). This purpose is accomplished by performing numerical simulations with stochastic representations of hydrological properties, using the Seepage Model for PA, and evaluating the effects of an alternative drift geometry representing a partially collapsed drift using the Disturbed Drift Seepage Submodel. Seepage of water into waste-emplacement drifts is considered one of the principal factors having the greatest impact of long-term safety of the repository system (CRWMS M&O 2000 [153225], Table 4-1). This AMR supports the analysis and simulation that are used by PA to develop the probability distribution of water seepage into drift, and is therefore a model of primary (Level 1) importance (AP-3.15Q, ''Managing Technical Product Inputs''). The intended purpose of the Seepage Model for PA is to support: (1) PA; (2) Abstraction of Drift-Scale Seepage; and (3) Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR). Seepage into drifts is evaluated by applying numerical models with stochastic representations of hydrological properties and performing flow simulations with multiple realizations of the permeability field around the drift. The Seepage Model for PA uses the distribution of permeabilities derived from air injection testing in niches and in the cross drift to

  9. Seepage Model for PA Including Drift Collapse

    Energy Technology Data Exchange (ETDEWEB)

    G. Li; C. Tsang

    2000-12-20

    The purpose of this Analysis/Model Report (AMR) is to document the predictions and analysis performed using the Seepage Model for Performance Assessment (PA) and the Disturbed Drift Seepage Submodel for both the Topopah Spring middle nonlithophysal and lower lithophysal lithostratigraphic units at Yucca Mountain. These results will be used by PA to develop the probability distribution of water seepage into waste-emplacement drifts at Yucca Mountain, Nevada, as part of the evaluation of the long term performance of the potential repository. This AMR is in accordance with the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (CRWMS M&O 2000 [153447]). This purpose is accomplished by performing numerical simulations with stochastic representations of hydrological properties, using the Seepage Model for PA, and evaluating the effects of an alternative drift geometry representing a partially collapsed drift using the Disturbed Drift Seepage Submodel. Seepage of water into waste-emplacement drifts is considered one of the principal factors having the greatest impact of long-term safety of the repository system (CRWMS M&O 2000 [153225], Table 4-1). This AMR supports the analysis and simulation that are used by PA to develop the probability distribution of water seepage into drift, and is therefore a model of primary (Level 1) importance (AP-3.15Q, ''Managing Technical Product Inputs''). The intended purpose of the Seepage Model for PA is to support: (1) PA; (2) Abstraction of Drift-Scale Seepage; and (3) Unsaturated Zone (UZ) Flow and Transport Process Model Report (PMR). Seepage into drifts is evaluated by applying numerical models with stochastic representations of hydrological properties and performing flow simulations with multiple realizations of the permeability field around the drift. The Seepage Model for PA uses the distribution of permeabilities derived from air injection testing in

  10. Enhanced battery model including temperature effects

    NARCIS (Netherlands)

    Rosca, B.; Wilkins, S.

    2013-01-01

    Within electric and hybrid vehicles, batteries are used to provide/buffer the energy required for driving. However, battery performance varies throughout the temperature range specific to automotive applications, and as such, models that describe this behaviour are required. This paper presents a

  11. Image-Rich Radiology Reports: A Value-Based Model to Improve Clinical Workflow.

    Science.gov (United States)

    Patel, Bhavik N; Lopez, Jose M; Jiang, Brian G; Roth, Christopher J; Nelson, Rendon C

    2017-01-01

    Our aim was to determine the value of image-rich radiology reports (IRRR) by evaluating the interest and preferences of referring physicians, potential impact on clinical workflow, and the willingness of radiologists to create them. Referring physicians and radiologists were interviewed in this prospective, HIPAA-compliant study. Subject willingness to participate in the study was determined by e-mail. A single investigator conducted all interviews using a standard questionnaire. All subjects reviewed a video mockup demonstration of IRRR and three methods for viewing embedded images, as follows: (1) clickable hyperlinks to access a scrollable stack of images, (2) scrollable and enlargeable small-image thumbnails, and (3) scrollable but not enlargeable medium-sized images. Questionnaire responses, free comments, and general impressions were captured and analyzed. Seventy-two physicians (36 clinicians, 36 radiologists) were interviewed. Thirty-one clinicians (86%) expressed interest in using IRRR. Seventy-seven percent of subjects believed IRRR would improve communication. Ten clinicians (28%) preferred method 1, 18 (50%) preferred method 2, and 8 (22%) preferred method 3 for embedding images. Thirty clinicians (83%) stated that IRRR would improve efficiency. Twenty-two radiologists (61%) preferred selecting a tool button with a mouse and right-clicking images to embed them, 13 (36%) preferred pressing a function key, and 11 (31%) preferred dictating series and image numbers. The average time radiologists were willing to expend for embedding images was 66.7 seconds. Referring physicians and radiologists both believe IRRR would add value by improving communication, with the potential to improve the workflow efficiency of referring physicians. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  13. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL]; Endeve, Eirik [ORNL]; Ovchinnikov, Oleg S [ORNL]; Borreguero Calvo, Jose M [ORNL]; Park, Byung H [ORNL]; Archibald, Richard K [ORNL]; Symons, Christopher T [ORNL]; Kalinin, Sergei V [ORNL]; Messer, Bronson [ORNL]; Shankar, Mallikarjun [ORNL]; Jesse, Stephen [ORNL]

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate on how BEAM's design and infrastructure tackle those needs, and present a small sub-set of user cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  14. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  15. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  16. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs.The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  17. Multidetector-row CT: economics and workflow

    International Nuclear Information System (INIS)

    Pottala, K.M.; Kalra, M.K.; Saini, S.; Ouellette, K.; Sahani, D.; Thrall, J.H.

    2005-01-01

    With rapid evolution of multidetector-row CT (MDCT) technology and applications, several factors such as technology upgrades and turf battles over sharing cost and profitability affect MDCT workflow and economics. MDCT workflow optimization can enhance productivity and reduce unit costs as well as increase profitability, in spite of decreasing reimbursement rates. Strategies for workflow management include standardization, automation, and constant assessment of the various steps involved in MDCT operations. In this review article, we describe issues related to MDCT economics and workflow. (orig.)

  18. A Geometric Processing Workflow for Transforming Reality-Based 3D Models into Volumetric Meshes Suitable for FEA

    Science.gov (United States)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue, and structural changes and damages can influence the mechanical behaviour of artefacts and buildings. Finite Element Methods (FEM) are largely used for mechanical analysis and for modelling stress behaviour. The typical workflow involves the use of CAD 3D models made by Non-Uniform Rational B-splines (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of CH has been widely developed through reality-based approaches, but the models are not suitable for direct use in FEA: the mesh in fact has to be converted to a volumetric one, and the density has to be reduced, since the computational complexity of an FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method aiming to generate the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining the accuracy of the high-resolution polygonal models in the solid ones. The approach proposed is based on a wise use of retopology procedures and a transformation of this model to a mathematical one made of NURBS surfaces suitable for being processed by volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency possible with the retopology step is used for maintaining as much coherence as possible between the original acquired mesh and the simplified model, creating in the meantime a topology that is more favourable for the automatic NURBS conversion.

  19. GPGPU accelerated Krylov methods for compact modeling of on-chip passive integrated structures within the Chameleon-RF workflow

    Directory of Open Access Journals (Sweden)

    Sebastian Gim

    2012-11-01

    Full Text Available Continued device scaling into the nanometer region and high frequencies of operation well into the multi-GHz region have given rise to new effects that previously had negligible impact but now present greater challenges and unprecedented complexity to designing successful mixed-signal silicon. The Chameleon-RF project was conceived to address these challenges. Creative use of domain decomposition, multigrid techniques or reduced-order modeling (ROM) techniques can be selectively applied at all levels of the process to efficiently prune down degrees of freedom (DoFs). However, the simulation of complex systems within a reasonable amount of time remains a computational challenge. This paper presents work done in the incorporation of GPGPU technology to accelerate Krylov-based algorithms used for compact modeling of on-chip passive integrated structures within the workflow of the Chameleon-RF project. Based upon insight gained from work done above, a novel GPGPU accelerated algorithm was developed for the Krylov ROM (kROM) methods and is described here for the benefit of the wider community.
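
    A Krylov subspace solver of the kind being accelerated can be sketched with the classic conjugate gradient iteration. The sketch below uses NumPy; because CuPy mirrors the NumPy array API, the same loop can be offloaded to a GPU by swapping the import. It illustrates the class of algorithm only, not the project's kROM implementation.

```python
# Minimal conjugate gradient (a Krylov method) for a symmetric positive definite
# system; swap `numpy` for `cupy` to run the same loop on a GPU.
import numpy as np


def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    p = r.copy()                  # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x


A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD test system
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))          # ~ [0.0909, 0.6364]
```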

  20. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    International Nuclear Information System (INIS)

    Karvonen, T.

    2013-11-01

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed of at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where the overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, the shafts and the repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts in the present day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min from the present day leakages to the access tunnel, 25 l/min from

  1. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Karvonen, T. [WaterHope, Helsinki (Finland)]

    2013-11-15

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed of at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep highly saline groundwater, and this is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where the overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones HZ056, HZ146, BFZ100 and HZ039 were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of the modelling of the long-term influence of ONKALO, the shafts and the repository tunnels is to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and shafts in the present day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for HZ20 zones was proposed to be 40 l/min: about 5 l/min from the present day leakages to the access tunnel, 25 l/min from

  2. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a messagedriven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
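
    The message-driven execution rule described above (a task fires only once a message has arrived on every one of its inputs) can be sketched generically as follows. This is not Decaf's actual Python API; all names and structure are assumptions chosen only to illustrate the idea.

```python
# Generic sketch of a message-driven runtime (not Decaf's API): a task executes
# only after a message has been received on each of its declared input ports.
from collections import defaultdict


class MessageDrivenGraph:
    def __init__(self):
        self.tasks = {}                    # name -> (required input ports, callback)
        self.inbox = defaultdict(dict)     # name -> {port: payload}

    def add_task(self, name, inputs, fn):
        self.tasks[name] = (set(inputs), fn)

    def send(self, task, port, payload):
        required, fn = self.tasks[task]
        self.inbox[task][port] = payload
        if required <= self.inbox[task].keys():   # all required messages received
            fn(self.inbox.pop(task))              # execute task, then reset inbox


g = MessageDrivenGraph()
g.add_task("density", ["particles", "mesh"],
           lambda msgs: print("estimating density from", sorted(msgs)))
g.send("density", "particles", object())   # waits: only one of two inputs so far
g.send("density", "mesh", object())        # second message arrives -> task runs
```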

  3. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    In this paper, we propose a notion of concurrency for declarative process models, formulated in the context of Dynamic Condition Response (DCR) graphs, and exploiting the so-called “true concurrency” semantics of Labelled Asynchronous Transition Systems. We demonstrate how this semantic underpinning of concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example; moreover, the theoretical ...

  4. Workflow in Almaraz NPP

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    2000-01-01

    Almaraz NPP decided to incorporate Workflow into its information system in response to the need to provide exhaustive follow-up and monitoring of each phase of the different procedures it manages. Oracle's Workflow was chosen for this purpose and it was integrated with previously developed applications. The objectives to be met in the incorporation of Workflow were as follows: Strict monitoring of procedures and processes. Detection of bottlenecks in the flow of information. Notification of those affected by pending tasks. Flexible allocation of tasks to user groups. Improved monitoring of management procedures. Improved communication. Similarly, special care was taken to: Integrate workflow processes with existing control panels. Synchronize workflow with installation procedures. Ensure that the system reflects use of paper forms. At present the Corrective Maintenance Request module is being operated using Workflow and the Work Orders and Notice of Order modules are about to follow suit. (Author)

  5. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    Science.gov (United States)

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another hint of its validity and therefore its usefulness. Copyright © 2016 Elsevier Ireland
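
    As an illustration of the weighting idea, the following sketch derives weights from factor loadings and aggregates item scores into a composite score; the loadings, item scores and item set are invented and do not come from the study.

        # Illustrative sketch (not the published instrument): weight items by their
        # factor loadings and aggregate into a composite score.
        import numpy as np

        loadings = np.array([0.81, 0.74, 0.68, 0.77, 0.63, 0.72])   # hypothetical loadings
        item_scores = np.array([1.0, 0.5, 0.0, 1.0, 0.5, 1.0])      # hypothetical IT-support scores (0..1)

        weights = loadings / loadings.sum()    # normalise so the weights sum to one
        composite = float(weights @ item_scores)
        print(f"workflow composite score: {composite:.3f}")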

  6. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  7. Toward Design, Modelling and Analysis of Dynamic Workflow Reconfigurations - A Process Algebra Perspective

    DEFF Research Database (Denmark)

    Mazzara, M.; Abouzaid, F.; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the dynamic reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous π-calculus and Web1 to model the design and to ve...

  8. The informational system model of Ukrainian national transport workflow improvement based on electronic signature introduction management

    Directory of Open Access Journals (Sweden)

    Grigoriy NECHAEY

    2007-01-01

    Full Text Available The proposed model of an informational system assumes the improvement of a new conceptual method for working with e-signatures in transport informational systems. Problems and aims that may be solved with the help of this system, as well as the most important economic and technical advantages of the proposed system in comparison with traditional methods of e-signature use, are outlined.

  9. Capturing microbial sources distributed in a mixed-use watershed within an integrated environmental modeling workflow

    Science.gov (United States)

    Many watershed models simulate overland and instream microbial fate and transport, but few provide loading rates on land surfaces and point sources to the waterbody network. This paper describes the underlying equations for microbial loading rates associated with 1) land-applied ...

  10. Business process management demystified : a tutorial on models, systems and standards for workflow management

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Desel, J.; Reisig, W.; Rozenberg, G.

    2004-01-01

    Over the last decade there has been a shift from data-aware information systems to process-aware information systems. To support business processes an enterprise information system needs to be aware of these processes and their organizational context. Business Process Management (BPM) includes

  11. A workflow for mathematical modeling of subcellular metabolic pathways in leaf metabolism of Arabidopsis thaliana

    Directory of Open Access Journals (Sweden)

    Thomas Nägele

    2013-12-01

    Full Text Available During the last decade genome sequencing has experienced a rapid technological development resulting in numerous sequencing projects and applications in life science. In plant molecular biology, the availability of sequence data on whole genomes has enabled the reconstruction of metabolic networks. Enzymatic reactions are predicted by the sequence information. Pathways arise due to the participation of chemical compounds as substrates and products in these reactions. Although several of these comprehensive networks have been reconstructed for the genetic model plant Arabidopsis thaliana, the integration of experimental data is still challenging. Particularly the analysis of subcellular organization of plant cells limits the understanding of regulatory instances in these metabolic networks in vivo. In this study, we develop an approach for the functional integration of experimental high-throughput data into such large-scale networks. We present a subcellular metabolic network model comprising 524 metabolic intermediates and 548 metabolic interactions derived from a total of 2769 reactions. We demonstrate how to link the metabolite covariance matrix of different Arabidopsis thaliana accessions with the subcellular metabolic network model for the inverse calculation of the biochemical Jacobian, finally resulting in the calculation of a matrix which satisfies a Lyapunov equation involving a covariance matrix. In this way, differential strategies of metabolite compartmentation and involved reactions were identified in the accessions when exposed to low temperature.
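
    As a pointer to the underlying mathematics, the sketch below shows the forward direction of the fluctuation-dissipation (Lyapunov) relation that links a Jacobian J and a covariance matrix C, J C + C J^T = -2 D, using invented values; the workflow described above solves the corresponding inverse problem (recovering J from a measured C), which is not reproduced here.

        # Forward Lyapunov relation J C + C J^T = -2 D with made-up values.
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        J = np.array([[-1.0, 0.3],
                      [ 0.2, -0.8]])      # assumed stable Jacobian
        D = np.eye(2) * 0.05              # assumed fluctuation matrix

        # solve_continuous_lyapunov(A, Q) returns X such that A X + X A^T = Q.
        C = solve_continuous_lyapunov(J, -2.0 * D)

        residual = J @ C + C @ J.T + 2.0 * D   # should be ~0
        print("covariance C:\n", C)
        print("max |residual|:", np.abs(residual).max())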

  12. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, conceptual and concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 (transparent, translucent, and opaque) are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational product offerings after a proper peer review of the models is conducted. Automated workflow composition has been demonstrated successfully based on ontologies and artificial

  13. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in parametric acoustics workflows and to influence architectural designs. The first step consists of the testing and benchmarking of different tools on the basis of accuracy, speed...... and interoperability with Grasshopper 3d. The focus is placed on the benchmarking of three different acoustic analysis tools based on raytracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters...... included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining at different design stages the most suitable acoustic tool. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used......

  14. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
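
    The backwards-chaining idea can be illustrated with a toy planner; the rules, data types and goal below are invented for illustration and are not part of the BETSY knowledge base.

        # Toy backwards-chaining planner: each rule states which software step
        # produces a given output data type from its input data types.
        RULES = [
            {"step": "align_reads",   "inputs": ["fastq"], "output": "bam"},
            {"step": "call_variants", "inputs": ["bam"],   "output": "vcf"},
            {"step": "annotate",      "inputs": ["vcf"],   "output": "annotated_vcf"},
        ]

        def plan(goal, available, rules=RULES):
            """Chain backwards from `goal` to the data in `available`; return ordered steps or None."""
            if goal in available:
                return []
            for rule in rules:
                if rule["output"] != goal:
                    continue
                steps = []
                for needed in rule["inputs"]:
                    sub = plan(needed, available, rules)
                    if sub is None:
                        break
                    steps += sub
                else:
                    return steps + [rule["step"]]
            return None

        print(plan("annotated_vcf", available={"fastq"}))
        # -> ['align_reads', 'call_variants', 'annotate']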

  15. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased
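
    A minimal sketch of the catalog-search step with OWSLib is shown below; the endpoint URL and filter text are illustrative, and OWSLib call signatures and record attributes may differ slightly between versions.

        # Catalog-driven search step: query a CSW endpoint, then list the service
        # endpoints (SOS, OPeNDAP, ...) advertised in each matching record.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        CSW_URL = "https://data.ioos.us/csw"   # example endpoint

        csw = CatalogueServiceWeb(CSW_URL, timeout=60)
        query = PropertyIsLike("csw:AnyText", literal="%sea_water_temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for record_id, record in csw.records.items():
            print(record.title)
            for ref in record.references:
                print("   ", ref.get("scheme"), ref.get("url"))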

  16. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic

  17. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists of the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR (exclusive-or, ⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
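
    The role of the input logic operators can be illustrated with a small sketch (not the paper's formalism); the firing rule below is a plain reading of AND, OR and XOR joins.

        # Decide whether a join task can fire, given which incoming transitions completed.
        def can_fire(operator, incoming_done):
            """operator: 'AND', 'OR' or 'XOR'; incoming_done: list of booleans."""
            if operator == "AND":
                return all(incoming_done)
            if operator == "OR":
                return any(incoming_done)
            if operator == "XOR":
                return sum(incoming_done) == 1
            raise ValueError(f"unknown operator {operator!r}")

        print(can_fire("AND", [True, False, True]))   # False: one branch still pending
        print(can_fire("XOR", [False, True, False]))  # True: exactly one branch completed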

  18. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies such as TCP/IP-based Web technologies and CORBA for the underlying communications. Very often these have been considered only from a theoretical point of view, mainly because of the lack of concrete possibilities for flexible execution. MAT (Mobile Agent Technology) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow systems. This paper focuses on improving the performance of workflow systems by using MAT. First, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed; second, a performance comparison is presented by introducing a mathematical model of each kind of data interaction process. Finally, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  19. Analysis and classification of oncology activities on the way to workflow based single source documentation in clinical information systems.

    Science.gov (United States)

    Wagner, Stefan; Beckmann, Matthias W; Wullich, Bernd; Seggewies, Christof; Ries, Markus; Bürkle, Thomas; Prokosch, Hans-Ulrich

    2015-12-22

    Today, cancer documentation is still a tedious task involving many different information systems even within a single institution, and it is rarely supported by appropriate documentation workflows. In a comprehensive 14-step analysis we compiled diagnostic and therapeutic pathways for 13 cancer entities using a mixed approach of document analysis, workflow analysis, expert interviews, workflow modelling and feedback loops. These pathways were stepwise classified and categorized to create a final set of grouped pathways and workflows including electronic documentation forms. A total of 73 workflows for the 13 entities, based on 82 paper documentation forms in addition to computer-based documentation systems, were compiled in a 724-page document comprising 130 figures, 94 tables and 23 tumour classifications as well as 12 follow-up tables. Stepwise classification made it possible to derive grouped diagnostic and therapeutic pathways for the three major classes: solid entities with surgical therapy, solid entities with surgical and additional therapeutic activities, and non-solid entities. For these classes it was possible to deduce common documentation workflows to support workflow-guided single-source documentation. Clinical documentation activities within a Comprehensive Cancer Center can likely be realized in a set of three documentation workflows with conditional branching in a modern workflow-supporting clinical information system.

  20. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully for process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.

  1. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully for process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  2. Health information exchange technology on the front lines of healthcare: workflow factors and patterns of use

    Science.gov (United States)

    Johnson, Kevin B; Lorenzi, Nancy M

    2011-01-01

    Objective The goal of this study was to develop an in-depth understanding of how a health information exchange (HIE) fits into clinical workflow at multiple clinical sites. Materials and Methods The ethnographic qualitative study was conducted over a 9-month period in six emergency departments (ED) and eight ambulatory clinics in Memphis, Tennessee, USA. Data were collected using direct observation, informal interviews during observation, and formal semi-structured interviews. The authors observed for over 180 h, during which providers used the exchange 130 times. Results HIE-related workflow was modeled for each ED site and ambulatory clinic group and substantial site-to-site workflow differences were identified. Common patterns in HIE-related workflow were also identified across all sites, leading to the development of two role-based workflow models: nurse based and physician based. The workflow elements framework was applied to the two role-based patterns. An in-depth description was developed of how providers integrated HIE into existing clinical workflow, including prompts for HIE use. Discussion Workflow differed substantially among sites, but two general role-based HIE usage models were identified. Although providers used HIE to improve continuity of patient care, patient–provider trust played a significant role. Types of information retrieved related to roles, with nurses seeking to retrieve recent hospitalization data and more open-ended usage by nurse practitioners and physicians. User and role-specific customization to accommodate differences in workflow and information needs may increase the adoption and use of HIE. Conclusion Understanding end users' perspectives towards HIE technology is crucial to the long-term success of HIE. By applying qualitative methods, an in-depth understanding of HIE usage was developed. PMID:22003156

  3. Stochastic modelling of two-phase flows including phase change

    International Nuclear Information System (INIS)

    Hurisse, O.; Minier, J.P.

    2011-01-01

    Stochastic modelling has already been developed and applied for single-phase flows and incompressible two-phase flows. In this article, we propose an extension of this modelling approach to two-phase flows including phase change (e.g. for steam-water flows). Two aspects are emphasised: a stochastic model accounting for phase transition and a modelling constraint which arises from volume conservation. To illustrate the whole approach, some remarks are eventually proposed for two-fluid models. (authors)

  4. Unsteady panel method for complex configurations including wake modeling

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2008-01-01

    Full Text Available implementations of the DLM are however not very versatile in terms of geometries that can be modeled. The ZONA6 code offers a versatile surface panel body model including a separated wake model, but uses a pressure panel method for lifting surfaces. This paper...

  5. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image-guided surgery, or computer-assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As the first result we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image-guided surgery, augmented reality, and simulation. By now, the models are stored and transferred in several file formats devoid of contextual information. The standardization of data types including contextual information and specifications for the handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  6. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  7. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  8. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path to create a sustainable building block toward Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built a prototype of a data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web-service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  9. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection...... can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate...... the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  10. MODEL OF THE TOKAMAK EDGE DENSITY PEDESTAL INCLUDING DIFFUSIVE NEUTRALS

    International Nuclear Information System (INIS)

    BURRELL, K.H.

    2003-01-01

    OAK-B135 Several previous analytic models of the tokamak edge density pedestal have been based on diffusive transport of plasma plus free-streaming of neutrals. This latter neutral model includes only the effect of ionization and neglects charge exchange. The present work models the edge density pedestal using diffusive transport for both the plasma and the neutrals. In contrast to the free-streaming model, a diffusion model for the neutrals includes the effect of both charge exchange and ionization and is valid when charge exchange is the dominant interaction. Surprisingly, the functional forms for the electron and neutral density profiles from the present calculation are identical to the results of the previous analytic models. There are some differences in the detailed definition of various parameters in the solution. For experimentally relevant cases where ionization and charge exchange rate are comparable, both models predict approximately the same width for the edge density pedestal

  11. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  12. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  13. Progressive IRP Models for Power Resources Including EPP

    Directory of Open Access Journals (Sweden)

    Yiping Zhu

    2017-01-01

    Full Text Available With a view to optimizing regional power supply and demand, the paper develops effective planning and scheduling of supply- and demand-side resources, including energy efficiency power plants (EPP), to achieve the target of benefit, cost, and environmental constraints. In order to highlight the characteristics of different supply and demand resources under economic, environmental, and carbon constraints, three planning models with progressive constraints are constructed. Results of the three models on the same example show that the best solutions to different models are different. The planning model including EPP has obvious advantages considering pollutant and carbon emission constraints, which confirms the advantages of low cost and emissions of EPP. The construction of progressive IRP models for power resources considering EPP has a certain reference value for guiding the planning and layout of EPP within other power resources and achieving cost and environmental objectives.

  14. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
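
    To illustrate the idea of a formal pipeline description with job partitioning, the following hypothetical sketch (not the COSMOS API) expands stages into per-sample jobs; stage names, samples and dependencies are invented.

        # Describe a pipeline as stages with dependencies, then expand each stage
        # into one job per partition element (e.g. per sample).
        SAMPLES = ["s1", "s2", "s3"]

        PIPELINE = [
            {"stage": "align",      "depends_on": [],        "partition_by": SAMPLES},
            {"stage": "sort",       "depends_on": ["align"], "partition_by": SAMPLES},
            {"stage": "joint_call", "depends_on": ["sort"],  "partition_by": [None]},
        ]

        def expand_jobs(pipeline):
            """Expand each stage into concrete jobs, keeping stage-level dependencies."""
            jobs = []
            for stage in pipeline:
                for part in stage["partition_by"]:
                    name = stage["stage"] if part is None else f'{stage["stage"]}.{part}'
                    jobs.append({"job": name, "after": stage["depends_on"]})
            return jobs

        for job in expand_jobs(PIPELINE):
            print(job)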

  15. A workflow for handling heterogeneous 3D models with the TOUGH2 family of codes: Applications to numerical modeling of CO 2 geological storage

    Science.gov (United States)

    Audigane, Pascal; Chiaberge, Christophe; Mathurin, Frédéric; Lions, Julie; Picot-Colbeaux, Géraldine

    2011-04-01

    This paper is addressed to the TOUGH2 user community. It presents a new tool for handling simulations run with the TOUGH2 code, with specific application to CO2 geological storage. This tool is composed of separate FORTRAN subroutines (or modules) that can be run independently, using input and output files in ASCII format for TOUGH2. These modules have been developed specifically for modeling carbon dioxide geological storage, and their use is described with TOUGH2 and the Equation of State module ECO2N (dedicated to CO2-water-salt mixture systems), with TOUGHREACT (an adaptation of TOUGH2 with ECO2N and geochemical fluid-rock interactions), and with TOUGH2 and the EOS7C module (dedicated to CO2-CH4 gas mixtures). The objective is to save time in the pre-processing, execution and visualization of complex geometry for geological system representation. The workflow is rapid and user-friendly, and future implementation for other TOUGH2 EOS modules in other contexts (e.g. nuclear waste disposal, geothermal production) is straightforward. Three examples are shown for validation: (i) leakage of CO2 up through an abandoned well; (ii) 3D reactive transport modeling of CO2 in a sandy aquifer formation in the Sleipner Gas Field (North Sea, Norway); and (iii) an estimation of enhanced gas recovery technology using CO2 as the injected and stored gas to produce methane in the K12B Gas Field (North Sea, Denmark).

  16. A hydrodynamic model for granular material flows including segregation effects

    Science.gov (United States)

    Gilberg, Dominik; Klar, Axel; Steiner, Konrad

    2017-06-01

    The simulation of granular flows including segregation effects in large industrial processes using particle methods is accurate, but very time-consuming. To overcome the long computation times, a macroscopic model is a natural choice. Therefore, we couple a mixture-theory-based segregation model to a hydrodynamic model of Navier-Stokes type, describing the flow behavior of the granular material. The granular flow model is a hybrid model derived from kinetic theory and a soil mechanical approach to cover the regime of fast dilute flow, as well as slow dense flow, where the density of the granular material is close to the maximum packing density. The segregation model was originally formulated by Thornton and Gray for idealized avalanches. It is modified and adapted to be in the preferred form for the coupling. In the final coupled model the segregation process depends on the local state of the granular system. On the other hand, the granular system changes as differently mixed regions of the granular material differ, e.g., in the packing density. The modeling focuses on dry granular material flows of two particle types differing only in size, but it can easily be extended to arbitrary granular mixtures of different particle size and density. To solve the coupled system a finite volume approach is used. To test the model the rotational mixing of small and large particles in a tumbler is simulated.

  17. Modelling a linear PM motor including magnetic saturation

    NARCIS (Netherlands)

    Polinder, H.; Slootweg, J.G.; Compter, J.C.; Hoeijmakers, M.J.

    2002-01-01

    The use of linear permanent-magnet (PM) actuators increases in a wide variety of applications because of the high force density, robustness and accuracy. The paper describes the modelling of a linear PM motor applied in, for example, wafer steppers, including magnetic saturation. This is important

  18. Simple suggestions for including vertical physics in oil spill models

    International Nuclear Information System (INIS)

    D'Asaro, Eric; University of Washington, Seatle, WA

    2001-01-01

    Current models of oil spills include no vertical physics. They neglect the effect of vertical water motions on the transport and concentration of floating oil. Some simple ways to introduce vertical physics are suggested here. The major suggestion is to routinely measure the density stratification of the upper ocean during oil spills in order to develop a database on the effect of stratification. (Author)

  19. Ab initio chemical safety assessment: A workflow based on exposure considerations and non-animal methods.

    Science.gov (United States)

    Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y; Exner, Thomas; Leite, Sofia; Grunsven, Leo A van; Worth, Andrew; Mahony, Catherine

    2017-11-01

    We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, but instead constructing a hypothesis based on existing data, in silico modelling, biokinetic considerations and then targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological Concern (TTC) approach and read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.
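
    The tiered structure with exit points can be sketched schematically as below; the tier names, thresholds and margin-of-safety rule are invented for illustration and do not reflect the actual decision criteria of the SEURAT-1 workflow.

        # Schematic tiered assessment with early exit points (all doses in mg/kg bw/day).
        def assess(exposure, ttc, read_across_pod=None, in_vitro_pod=None):
            """Return (decision, tier) walking through increasingly data-rich tiers."""
            if exposure < ttc:
                return "acceptable", "tier 0: TTC"
            for tier, pod in (("tier 1: read-across", read_across_pod),
                              ("tier 2: ab initio (in vitro + modelling)", in_vitro_pod)):
                if pod is not None and pod / exposure >= 100:   # assumed margin of safety
                    return "acceptable", tier
            return "not acceptable / refine further", "no exit reached"

        print(assess(exposure=0.05, ttc=0.0025, read_across_pod=10.0))
        # -> ('acceptable', 'tier 1: read-across')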

  20. Workflow management: an overview

    NARCIS (Netherlands)

    Ouyang, C.; Adams, M.; Wynn, M.T.; Hofstede, ter A.H.M.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Workflow management has its origin in the office automation systems of the seventies, but it was not until fairly recently that conceptual and technological breakthroughs led to its widespread adoption. In fact, nowadays, process-awareness has become an accepted and integral part of various types

  1. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  2. Aggregated Demand Modelling Including Distributed Generation, Storage and Demand Response

    OpenAIRE

    Marzooghi, Hesamoddin; Hill, David J.; Verbic, Gregor

    2014-01-01

    It is anticipated that penetration of renewable energy sources (RESs) in power systems will increase further in the next decades mainly due to environmental issues. In the long term of several decades, which we refer to in terms of the future grid (FG), balancing between supply and demand will become dependent on demand actions including demand response (DR) and energy storage. So far, FG feasibility studies have not considered these new demand-side developments for modelling future demand. I...

  3. Modeling Electric Double-Layers Including Chemical Reaction Effects

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2014-01-01

    A physicochemical and numerical model for the transient formation of an electric double-layer between an electrolyte and a chemically-active flat surface is presented, based on a finite element integration of the nonlinear Nernst-Planck-Poisson model including chemical reactions. The model works...... for symmetric and asymmetric multi-species electrolytes and is not limited to a range of surface potentials. Numerical simulations are presented for the case of a CaCO3 electrolyte solution in contact with a surface with rate-controlled protonation/deprotonation reactions. The surface charge and potential...... are determined by the surface reactions, and therefore they depend on the bulk solution composition and concentration...
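
    For reference, the standard form of the coupled Nernst-Planck-Poisson system is given below (notation assumed here, not quoted from the paper; c_i is the concentration of species i, z_i its charge number, D_i its diffusivity, phi the electric potential, and r_i a reaction source term):

        \frac{\partial c_i}{\partial t} = \nabla \cdot \left( D_i \nabla c_i + \frac{z_i F}{R T}\, D_i c_i \nabla \phi \right) + r_i ,
        \qquad
        \nabla^2 \phi = -\frac{F}{\varepsilon} \sum_i z_i c_i .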

  4. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
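
    A toy version of such a black-box model, using only workflow length, width and data size with invented platform parameters, might look like the following sketch; none of the numbers come from the paper.

        # Coarse makespan/cost estimate per platform from black-box workflow characteristics.
        PLATFORMS = {
            "local_cluster": {"cores": 64,  "task_s": 120, "mbps": 1000, "usd_per_core_h": 0.0},
            "cloud":         {"cores": 256, "task_s": 150, "mbps": 250,  "usd_per_core_h": 0.05},
        }

        def estimate(length, width, data_gb, p):
            """length: critical-path tasks; width: parallel tasks; data_gb: data to move."""
            waves = -(-width // p["cores"])            # ceil: batches of parallel tasks
            compute_s = length * waves * p["task_s"]
            transfer_s = data_gb * 8000 / p["mbps"]    # GB -> megabits over the link
            hours = (compute_s + transfer_s) / 3600
            cost = hours * p["cores"] * p["usd_per_core_h"]
            return hours, cost

        for name, p in PLATFORMS.items():
            h, c = estimate(length=5, width=500, data_gb=50, p=p)
            print(f"{name}: ~{h:.1f} h, ~${c:.2f}")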

  5. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis.The system can be accessed either through a

  6. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub- workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  7. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Science.gov (United States)

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system

  8. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB out of which 11TB of data was saved

  9. Exclusive queueing model including the choice of service windows

    Science.gov (United States)

    Tanaka, Masahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro

    2018-01-01

    In a queueing system involving multiple service windows, choice behavior is a significant concern. This paper incorporates the choice of service windows into a queueing model with a floor represented by discrete cells. We devised a logit-based choice algorithm for agents that considers the numbers of agents at, and the distances to, all service windows. Simulations were conducted with various parameters of agent choice preference for these two elements and for different floor configurations, including the floor length and the number of service windows. We investigated the model from the viewpoint of transit times and entrance block rates. The influences of the parameters on these factors were surveyed in detail, and we determined that there are optimum floor lengths that minimize the transit times. In addition, we observed that the transit times were determined almost entirely by the entrance block rates. The results of the presented model are relevant to understanding queueing systems including the choice of service windows and can be employed to optimize facility design and floor management.
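
    A minimal sketch of a logit-based window choice of the kind described above; the utility form, preference weights, and numbers are illustrative assumptions, not the authors' exact algorithm:

        import numpy as np

        def choose_window(queue_lengths, distances, beta_q=1.0, beta_d=0.5, rng=None):
            """Pick a service window via a multinomial logit over queue length and distance.

            The utility of window i decreases with the number of waiting agents and with
            the distance to it; beta_q and beta_d are assumed preference weights.
            """
            rng = rng or np.random.default_rng()
            utilities = -beta_q * np.asarray(queue_lengths) - beta_d * np.asarray(distances)
            expu = np.exp(utilities - utilities.max())   # numerically stabilised softmax
            probabilities = expu / expu.sum()
            return rng.choice(len(probabilities), p=probabilities), probabilities

        # Example: three windows with different congestion and walking distance
        idx, p = choose_window(queue_lengths=[4, 1, 7], distances=[2.0, 6.0, 3.0])
        print(idx, p)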

  10. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    Science.gov (United States)

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  11. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  12. Applying a Global Sensitivity Analysis Workflow to Improve the Computational Efficiencies in Physiologically-Based Pharmacokinetic Modeling

    Directory of Open Access Journals (Sweden)

    Nan-Hung Hsieh

    2018-06-01

    Full Text Available Traditionally, the solution to reduce parameter dimensionality in a physiologically-based pharmacokinetic (PBPK) model is through expert judgment. However, this approach may lead to bias in parameter estimates and model predictions if important parameters are fixed at uncertain or inappropriate values. The purpose of this study was to explore the application of global sensitivity analysis (GSA) to ascertain which parameters in the PBPK model are non-influential, and therefore can be assigned fixed values in Bayesian parameter estimation with minimal bias. We compared the elementary effect-based Morris method and three variance-based Sobol indices in their ability to distinguish “influential” parameters to be estimated and “non-influential” parameters to be fixed. We illustrated this approach using a published human PBPK model for acetaminophen (APAP) and its two primary metabolites, APAP-glucuronide and APAP-sulfate. We first applied GSA to the original published model, comparing Bayesian model calibration results using all the 21 originally calibrated model parameters (OMP), determined by an “expert judgment”-based approach, vs. the subset of original influential parameters (OIP), determined by GSA from the OMP. We then applied GSA to all the PBPK parameters, including those fixed in the published model, comparing the model calibration results using this full set of 58 model parameters (FMP) vs. the full set of influential parameters (FIP), determined by GSA from the FMP. We also examined the impact of different cut-off points to distinguish the influential and non-influential parameters. We found that Sobol indices calculated by eFAST provided the best combination of reliability (consistency with other variance-based methods) and efficiency (lowest computational cost to achieve convergence) in identifying influential parameters. We identified several originally calibrated parameters that were not influential, and could be fixed to improve computational
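
    A minimal screening sketch using the SALib package, contrasting Morris elementary effects with eFAST variance-based indices; the toy three-parameter model, parameter names, and bounds are assumptions and stand in for a far larger PBPK model:

        import numpy as np
        from SALib.sample import morris as morris_sample
        from SALib.analyze import morris as morris_analyze
        from SALib.sample import fast_sampler
        from SALib.analyze import fast

        # Toy stand-in for a PBPK output (e.g., a plasma AUC) as a function of 3 parameters
        def model(x):
            return x[:, 0] * np.exp(-x[:, 1]) + 0.01 * x[:, 2]

        problem = {"num_vars": 3,
                   "names": ["CLint", "Fup", "Kp_liver"],
                   "bounds": [[0.1, 10.0], [0.01, 1.0], [0.5, 5.0]]}

        # Elementary-effects (Morris) screening
        X = morris_sample.sample(problem, N=100, num_levels=4)
        Si_morris = morris_analyze.analyze(problem, X, model(X), num_levels=4)

        # Variance-based first-order and total indices via eFAST
        Xf = fast_sampler.sample(problem, N=1000)
        Si_fast = fast.analyze(problem, model(Xf))

        print(Si_morris["mu_star"], Si_fast["S1"], Si_fast["ST"])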

  13. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  14. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  15. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models has been proposed in the past. None of these models seems completely adequate, though, because workflow management

  16. Kinetic models of gene expression including non-coding RNAs

    Energy Technology Data Exchange (ETDEWEB)

    Zhdanov, Vladimir P., E-mail: zhdanov@catalysis.r

    2011-03-15

    In cells, genes are transcribed into mRNAs, and the latter are translated into proteins. Due to the feedbacks between these processes, the kinetics of gene expression may be complex even in the simplest genetic networks. The corresponding models have already been reviewed in the literature. A new avenue in this field is related to the recognition that the conventional scenario of gene expression is fully applicable only to prokaryotes whose genomes consist of tightly packed protein-coding sequences. In eukaryotic cells, in contrast, such sequences are relatively rare, and the rest of the genome includes numerous transcript units representing non-coding RNAs (ncRNAs). During the past decade, it has become clear that such RNAs play a crucial role in gene expression and accordingly influence a multitude of cellular processes both in the normal state and during diseases. The numerous biological functions of ncRNAs are based primarily on their abilities to silence genes via pairing with a target mRNA and subsequently preventing its translation or facilitating degradation of the mRNA-ncRNA complex. Many other abilities of ncRNAs have been discovered as well. Our review is focused on the available kinetic models describing the mRNA, ncRNA and protein interplay. In particular, we systematically present the simplest models without kinetic feedbacks, models containing feedbacks and predicting bistability and oscillations in simple genetic networks, and models describing the effect of ncRNAs on complex genetic networks. Mathematically, the presentation is based primarily on temporal mean-field kinetic equations. The stochastic and spatio-temporal effects are also briefly discussed.
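
    A minimal mean-field sketch of the simplest mRNA-ncRNA-protein interplay of the type reviewed, in which the ncRNA pairs with the mRNA and the duplex is degraded; all rate constants are arbitrary illustrative values:

        from scipy.integrate import solve_ivp

        k_m, k_s, k_p = 1.0, 0.8, 2.0      # synthesis rates of mRNA, ncRNA, protein
        g_m, g_s, g_p = 0.1, 0.1, 0.05     # first-order degradation rates
        k_on = 0.5                         # mRNA-ncRNA association (silencing) rate

        def rhs(t, y):
            m, s, p = y                    # mRNA, ncRNA, protein concentrations
            return [k_m - g_m * m - k_on * m * s,
                    k_s - g_s * s - k_on * m * s,
                    k_p * m - g_p * p]

        sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0, 0.0])
        print(sol.y[:, -1])                # late-time levels of mRNA, ncRNA, protein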

  17. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  18. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  19. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  20. Progress Towards an LES Wall Model Including Unresolved Roughness

    Science.gov (United States)

    Craft, Kyle; Redman, Andrew; Aikens, Kurt

    2015-11-01

    Wall models used in large eddy simulations (LES) are often based on theories for hydraulically smooth walls. While this is reasonable for many applications, there are also many where the impact of surface roughness is important. A previously developed wall model has been used primarily for jet engine aeroacoustics. However, jet simulations have not accurately captured thick initial shear layers found in some experimental data. This may partly be due to nozzle wall roughness used in the experiments to promote turbulent boundary layers. As a result, the wall model is extended to include the effects of unresolved wall roughness through appropriate alterations to the log-law. The methodology is tested for incompressible flat plate boundary layers with different surface roughness. Correct trends are noted for the impact of surface roughness on the velocity profile. However, velocity deficit profiles and the Reynolds stresses do not collapse as well as expected. Possible reasons for the discrepancies as well as future work will be presented. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. Computational resources on TACC Stampede were provided under XSEDE allocation ENG150001.
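
    A minimal sketch of the kind of log-law alteration described, here using the fully rough form u+ = (1/kappa) ln(y/k_s) + 8.5; the constants and the single-point matching are illustrative and are not the authors' wall model:

        import numpy as np

        KAPPA, B_ROUGH = 0.41, 8.5   # von Karman constant and fully rough intercept

        def u_plus_fully_rough(y, k_s):
            """Fully rough log law: u+ = (1/kappa) * ln(y / k_s) + 8.5."""
            return np.log(y / k_s) / KAPPA + B_ROUGH

        def friction_velocity(u_sample, y_sample, k_s):
            """In the fully rough regime u+ does not depend on u_tau, so the wall-model
            matching condition u_sample / u_tau = u+(y_sample, k_s) is explicit."""
            return u_sample / u_plus_fully_rough(y_sample, k_s)

        def wall_shear_stress(u_tau, rho=1.2):
            """Wall shear stress fed back to the LES as a boundary condition."""
            return rho * u_tau**2

        u_tau = friction_velocity(u_sample=10.0, y_sample=5e-3, k_s=2e-4)
        print(u_tau, wall_shear_stress(u_tau))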

  1. Extending Primitive Spatial Data Models to Include Semantics

    Science.gov (United States)

    Reitsma, F.; Batcheller, J.

    2009-04-01

    Our traditional geospatial data model involves associating some measurable quality, such as temperature, or observable feature, such as a tree, with a point or region in space and time. When capturing data we implicitly subscribe to some kind of conceptualisation. If we can make this explicit in an ontology and associate it with the captured data, we can leverage formal semantics to reason with the concepts represented in our spatial data sets. To do so, we extend our fundamental representation of geospatial data by including in the basic data model a URI that links it to the ontology defining our conceptualisation. We thus extend Goodchild et al.'s geo-atom [1] with the addition of a URI: (x, Z, z(x), URI). This provides us with pixel- or feature-level knowledge and the ability to create layers of data from a set of pixels or features that might be drawn from a database based on their semantics. Using open-source tools, we present a prototype that involves simple reasoning as a proof of concept. References [1] M.F. Goodchild, M. Yuan, and T.J. Cova. Towards a general theory of geographic representation in GIS. International Journal of Geographical Information Science, 21(3):239-260, 2007.
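
    A minimal sketch of the extended geo-atom (x, Z, z(x), URI) as a data structure; the field names, example values, and ontology URI are assumptions for illustration only:

        from dataclasses import dataclass
        from typing import Tuple

        @dataclass(frozen=True)
        class GeoAtom:
            """Extended geo-atom: location x, property Z, value z(x), plus a URI
            linking the observation to the ontology concept that defines it."""
            x: Tuple[float, float]   # (longitude, latitude); a time stamp could be added
            Z: str                   # observed property, e.g. "air_temperature"
            z: float                 # value of Z at x
            uri: str                 # link to the defining ontology concept

        atom = GeoAtom(x=(174.77, -41.29), Z="air_temperature", z=18.5,
                       uri="http://example.org/ontology#AirTemperature")

        # A "layer" is then simply a selection of atoms sharing the same semantics
        layer = [a for a in [atom] if a.uri.endswith("#AirTemperature")]
        print(layer)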

  2. Accurate SHAPE-directed RNA secondary structure modeling, including pseudoknots.

    Science.gov (United States)

    Hajdin, Christine E; Bellaousov, Stanislav; Huggins, Wayne; Leonard, Christopher W; Mathews, David H; Weeks, Kevin M

    2013-04-02

    A pseudoknot forms in an RNA when nucleotides in a loop pair with a region outside the helices that close the loop. Pseudoknots occur relatively rarely in RNA but are highly overrepresented in functionally critical motifs in large catalytic RNAs, in riboswitches, and in regulatory elements of viruses. Pseudoknots are usually excluded from RNA structure prediction algorithms. When included, these pairings are difficult to model accurately, especially in large RNAs, because allowing this structure dramatically increases the number of possible incorrect folds and because it is difficult to search the fold space for an optimal structure. We have developed a concise secondary structure modeling approach that combines SHAPE (selective 2'-hydroxyl acylation analyzed by primer extension) experimental chemical probing information and a simple, but robust, energy model for the entropic cost of single pseudoknot formation. Structures are predicted with iterative refinement, using a dynamic programming algorithm. This melded experimental and thermodynamic energy function predicted the secondary structures and the pseudoknots for a set of 21 challenging RNAs of known structure ranging in size from 34 to 530 nt. On average, 93% of known base pairs were predicted, and all pseudoknots in well-folded RNAs were identified.
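
    A minimal sketch of the per-nucleotide SHAPE pseudo-free-energy term commonly used in such melded energy functions, of the form m*ln(reactivity + 1) + b; the slope and intercept shown are commonly cited default values and are assumed here for illustration, not taken from this paper:

        import math

        def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
            """Pseudo-free-energy change (kcal/mol) added for each base pair a
            nucleotide participates in: stabilising (negative) for unreactive
            positions, destabilising (positive) for highly reactive ones."""
            return m * math.log(reactivity + 1.0) + b

        # Example: an unreactive (likely paired) and a highly reactive (likely unpaired) position
        for r in (0.05, 1.8):
            print(r, round(shape_pseudo_energy(r), 3))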

  3. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
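
    A minimal sketch of a declarative provenance query against a Neo4j-backed store using the official Python driver; the bolt URI, credentials, and node/relationship labels are assumptions for illustration and do not reflect PBase's actual schema:

        from neo4j import GraphDatabase

        # Hypothetical connection details and graph schema
        driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

        query = """
        MATCH (wf:Workflow {name: $name})-[:HAS_EXECUTION]->(run:Execution)
              -[:GENERATED]->(d:Data)
        RETURN run.startTime AS started, d.uri AS artifact
        """

        with driver.session() as session:
            for record in session.run(query, name="alignment-pipeline"):
                print(record["started"], record["artifact"])

        driver.close()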

  4. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  5. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....

  6. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    care process of the Academic Medical Center (AMC) hospital is used as a reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems......Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where...... an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology

  7. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  8. A speech production model including the nasal Cavity

    DEFF Research Database (Denmark)

    Olesen, Morten

    In order to obtain articulatory analysis of speech production the model is improved. The standard model, as used in LPC analysis, to a large extent only models the acoustic properties of the speech signal, as opposed to articulatory modelling of speech production. In spite of this, the LPC model...... is by far the most widely used model in speech technology....

  9. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, such as those used by the many text-processing applications that have appeared over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or WordPerfect, it is not a problem to save an object of this type in ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as a migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace human interaction with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system

  10. Hydromechanical modeling of clay rock including fracture damage

    Science.gov (United States)

    Asahina, D.; Houseworth, J. E.; Birkholzer, J. T.

    2012-12-01

    Argillaceous rock typically acts as a flow barrier, but under certain conditions significant and potentially conductive fractures may be present. Fracture formation is well-known to occur in the vicinity of underground excavations in a region known as the excavation disturbed zone. Such problems are of particular importance for low-permeability, mechanically weak rock such as clays and shales because fractures can be relatively transient as a result of fracture self-sealing processes. Perhaps not as well appreciated is the fact that natural fractures can form in argillaceous rock as a result of hydraulic overpressure caused by phenomena such as disequilibrium compaction, changes in tectonic stress, and mineral dehydration. Overpressure conditions can cause hydraulic fracturing if the fluid pressure leads to tensile effective stresses that exceed the tensile strength of the material. Quantitative modeling of this type of process requires coupling between hydrogeologic processes and geomechanical processes including fracture initiation and propagation. Here we present a computational method for three-dimensional, coupled hydromechanical processes including fracture damage. Fractures are represented as discrete features in a fracture network that interact with a porous rock matrix. Fracture configurations are mapped onto an unstructured, three-dimensional Voronoi grid, which is based on a random set of spatial points. Discrete fracture networks (DFN) are represented by the connections of the edges of Voronoi cells. This methodology has the advantage that fractures can be more easily introduced in response to coupled hydro-mechanical processes and generally eliminates several potential issues associated with the geometry of DFN and numerical gridding. A geomechanical and fracture-damage model is developed here using the Rigid-Body-Spring-Network (RBSN) numerical method. The hydrogeologic and geomechanical models share the same geometrical information from a 3D Voronoi
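
    A minimal sketch of mapping a random point set onto a Voronoi grid and treating shared cell faces as candidate discrete-fracture connections; the point count, domain, and damage rule are illustrative and this is not the RBSN implementation itself:

        import numpy as np
        from scipy.spatial import Voronoi

        rng = np.random.default_rng(42)
        points = rng.random((50, 3))          # random generator points in a unit cube
        vor = Voronoi(points)

        # Each ridge separates two Voronoi cells; in a lattice-type model the cell
        # pair defines a bond that can later be flagged as fractured.
        connections = [tuple(pair) for pair in vor.ridge_points]

        # Hypothetical damage rule for illustration: fracture bonds crossing the plane z = 0.5
        fractured = [(i, j) for i, j in connections
                     if (points[i, 2] - 0.5) * (points[j, 2] - 0.5) < 0]

        print(len(connections), "bonds,", len(fractured), "flagged as fracture faces")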

  11. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    Nilsen, Dimitri; Weber, Pavel

    2014-01-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  12. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers that are not necessarily experts for any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimizes mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  13. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value
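
    A minimal sketch of the ETC figure of merit used to compare configurations; the cost model and the numbers are illustrative assumptions:

        def etc(events, wall_time_hours, cost_usd):
            """ETC = Events / Time / Cost; higher is better."""
            return events / wall_time_hours / cost_usd

        # Compare two hypothetical configurations: 8 cores vs. 16 cores with overcommit
        configs = {
            "8 cores, no overcommit":  etc(events=20000, wall_time_hours=10.0, cost_usd=4.0),
            "16 cores, 2x overcommit": etc(events=20000, wall_time_hours=6.5,  cost_usd=5.2),
        }
        print(max(configs, key=configs.get), configs)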

  14. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.

  15. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real-world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these could include Kepler and Taverna or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term Provenir “to come from”, is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual
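
    A minimal sketch of recording workflow lineage with the W3C PROV data model via the Python prov package; the entity, activity, and agent names and the namespace are assumptions for illustration:

        from prov.model import ProvDocument

        doc = ProvDocument()
        doc.add_namespace("ex", "http://example.org/")

        raw    = doc.entity("ex:raw-dataset")
        result = doc.entity("ex:derived-map")
        run    = doc.activity("ex:workflow-run-42")
        engine = doc.agent("ex:workflow-engine")

        doc.used(run, raw)                    # the run consumed the raw data
        doc.wasGeneratedBy(result, run)       # ...and produced the derived product
        doc.wasAssociatedWith(run, engine)    # the workflow engine executed the run
        doc.wasDerivedFrom(result, raw)       # lineage link between the two entities

        print(doc.get_provn())                # human-readable PROV-N serialisation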

  16. Impact of CGNS on CFD Workflow

    Science.gov (United States)

    Poinot, M.; Rumsey, C. L.; Mani, M.

    2004-01-01

    CFD tools are an integral part of industrial and research processes, for which the amount of data is increasing at a high rate. These data are used in a multi-disciplinary fluid dynamics environment, including structural, thermal, chemical or even electrical topics. We show that the data specification is an important challenge that must be tackled to achieve an efficient workflow for use in this environment. We compare the process with other software techniques, such as network or database type, where past experiences showed how difficult it was to bridge the gap between completely general specifications and dedicated specific applications. We show two aspects of the use of CFD General Notation System (CGNS) that impact CFD workflow: as a data specification framework and as a data storage means. Then, we give examples of projects involving CFD workflows where the use of the CGNS standard leads to a useful method either for data specification, exchange, or storage.

  17. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of

  18. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  19. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention is being paid to business processes during the past decades, the design of business processes and particularly workflow processes is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  20. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  1. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model-Informed Drug Discovery and Development.

    Science.gov (United States)

    Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya

    2017-05-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  2. Including spatial data in nutrient balance modelling on dairy farms

    Science.gov (United States)

    van Leeuwen, Maricke; van Middelaar, Corina; Stoof, Cathelijne; Oenema, Jouke; Stoorvogel, Jetse; de Boer, Imke

    2017-04-01

    The Annual Nutrient Cycle Assessment (ANCA) calculates the nitrogen (N) and phosphorus (P) balance at a dairy farm, while taking into account the subsequent nutrient cycles of the herd, manure, soil and crop components. Since January 2016, Dutch dairy farmers are required to use ANCA in order to increase understanding of nutrient flows and to minimize nutrient losses to the environment. A nutrient balance calculates the difference between nutrient inputs and outputs. Nutrients enter the farm via purchased feed, fertilizers, deposition and fixation by legumes (nitrogen), and leave the farm via milk, livestock, manure, and roughages. A positive balance indicates to which extent N and/or P are lost to the environment via gaseous emissions (N), leaching, run-off and accumulation in soil. A negative balance indicates that N and/or P are depleted from soil. ANCA was designed to calculate average nutrient flows on farm level (for the herd, manure, soil and crop components). ANCA was not designed to perform calculations of nutrient flows at the field level, as it uses averaged nutrient inputs and outputs across all fields, and it does not include field specific soil characteristics. Land management decisions, however, such as the level of N and P application, are typically taken at the field level given the specific crop and soil characteristics. Therefore the information that ANCA provides is likely not sufficient to support farmers' decisions on land management to minimize nutrient losses to the environment. This is particularly a problem when land management and soils vary between fields. For an accurate estimate of nutrient flows in a given farming system that can be used to optimize land management, the spatial scale of nutrient inputs and outputs (and thus the effect of land management and soil variation) could be essential. Our aim was to determine the effect of the spatial scale of nutrient inputs and outputs on modelled nutrient flows and nutrient use efficiencies
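
    A minimal sketch of the farm-gate balance arithmetic, first farm-averaged and then per field; the field names, flows, and areas are illustrative numbers, not ANCA's calculation rules:

        # N flows in kg N per field (inputs: feed, fertiliser, deposition, fixation;
        # outputs: milk, livestock, manure, and roughage leaving the field or farm)
        fields = {
            "maize_sandy": {"inputs": 310.0, "outputs": 240.0, "area_ha": 4.0},
            "grass_peat":  {"inputs": 420.0, "outputs": 300.0, "area_ha": 6.0},
            "grass_clay":  {"inputs": 380.0, "outputs": 350.0, "area_ha": 5.0},
        }

        def balance(inputs, outputs):
            """Positive = potential loss to the environment; negative = soil depletion."""
            return inputs - outputs

        farm_surplus = balance(sum(f["inputs"] for f in fields.values()),
                               sum(f["outputs"] for f in fields.values()))
        per_field = {name: balance(f["inputs"], f["outputs"]) / f["area_ha"]
                     for name, f in fields.items()}

        print(f"farm-level surplus: {farm_surplus:.0f} kg N")
        print("per-field surplus (kg N/ha):", per_field)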

  3. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    , we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL....

  4. Workflow standardization of a novel team care model to improve chronic care: a quasi-experimental study.

    Science.gov (United States)

    Panattoni, Laura; Hurlimann, Lily; Wilson, Caroline; Durbin, Meg; Tai-Seale, Ming

    2017-04-19

    Team-based chronic care models have not been widely adopted in community settings, partly due to their varying effectiveness in randomized control trials, implementation challenges, and concerns about physician acceptance. The Palo Alto Medical Foundation designed and implemented "Champion," a novel team-based model that includes new standard work (e.g. proactive patient outreach, pre-visit schedule grooming, depression screening, care planning, health coaching) to support patients' self-management of hypertension and diabetes. We investigated whether Champion improved clinical outcomes. We conducted a quasi-experimental study comparing the Champion clinic-level intervention (n = 38 physicians) with a usual care clinic (n = 37 physicians) in Northern California. The primary outcomes, blood pressure and glycohemoglobin (A1c), were analyzed using a piecewise linear growth curve model for patients exposed to a Champion physician visit (n = 3156) or usual care visit (n = 8034) in the two years prior and one year post implementation. Secondary outcomes were provider experience, compared at baseline and 12 months in both the intervention and usual care clinics using multi-level ordered logistic modeling, and electronic health record based fidelity measures. Compared to usual care, in the first 6 months after a Champion physician visit, diabetes patients aged 18-75 experienced an additional -1.13 mm Hg (95% CI: -2.23 to -0.04) decline in diastolic blood pressure and -0.47 (95% CI: -0.61 to -0.33) decline in A1c. There were no additional improvements in blood pressure or A1c 6 to 12 months post physician visit. At 12 months, Champion physicians reported improved experience with managing chronic care patients in 6 of 7 survey items (p work was uneven; depression screening was the most commonly documented element (85% of patients), while care plans were the least (30.8% of patients). Champion standard work improved glycemic control over the first 6
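
    A minimal sketch of a piecewise linear growth-curve specification of the kind described, using a mixed model with a random intercept per patient; the formula, variable names, and the input file are illustrative assumptions and do not reproduce the study's exact model:

        import pandas as pd
        import statsmodels.formula.api as smf

        # Assumed layout of visits.csv, one row per visit:
        #   dbp          diastolic blood pressure at the visit
        #   months_pre   months before the index visit (0 afterwards)
        #   months_post  months after the index visit (0 beforehand)
        #   champion     1 if the patient saw a Champion physician, else 0
        #   patient_id   grouping variable for the random intercept
        df = pd.read_csv("visits.csv")

        model = smf.mixedlm(
            "dbp ~ months_pre + months_post + champion + champion:months_post",
            data=df,
            groups=df["patient_id"],
        )
        result = model.fit()
        print(result.summary())   # champion:months_post captures the post-period slope difference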

  5. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp......

  6. Single-Phase Bundle Flows Including Macroscopic Turbulence Model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Jun; Yoon, Han Young [KAERI, Daejeon (Korea, Republic of); Yoon, Seok Jong; Cho, Hyoung Kyu [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    To deal with the various thermal-hydraulic phenomena caused by rapid changes of fluid properties when an accident happens, securing mechanistic approaches as far as possible may reduce the uncertainty arising from improper application of experimental models. In this study, the turbulence mixing model, which is defined from experiments in subchannel analysis codes such as VIPRE, COBRA, and MATRA, is replaced by a macroscopic k-e turbulence model, which is derived mathematically. The performance of CUPID with the macroscopic turbulence model is validated against several rod bundle experiments: the CNEN 4x4 and PNL 7x7 tests. In this study, the macroscopic k-e model has been validated for application to subchannel analysis. It has been implemented in the CUPID code and validated against the CNEN 4x4 and PNL 7x7 rod bundle tests. The results showed that the macroscopic k-e turbulence model reproduces the experiments properly.

  7. Understanding the dispensary workflow at the Birmingham Free Clinic: a proposed framework for an informatics intervention.

    Science.gov (United States)

    Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P

    2016-02-19

    The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC

  8. The impact of electronic medical record systems on outpatient workflows: a longitudinal evaluation of its workflow effects.

    Science.gov (United States)

    Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter

    2010-11-01

    empirically from the conceptual model. They included 85 items that measured physician perceptions of the EMR's workflow effect on the following eight issues: (1) administration, (2) efficiency in patient processing, (3) basic clinical processes, (4) documentation of patient encounter, (5) economic challenges and reimbursement, (6) technical issues, (7) patient safety and care, and (8) communication and confidentiality. The items were used to track expectations prior to implementation and they served as retrospective measures of satisfaction with the EMR in post-implementation Time 1 and Time 2. The findings suggest that physicians conceptualize EMRs as an incremental extension of older computerized provider order entries (CPOEs) rather than as a new innovation. The EMRs major functional advantages are seen to be very similar to, if not the same as, those of CPOEs. Technology learning curves play a statistically significant though minor role in shaping physician perceptions. The physicians' expectations from the EMR are based on their prior beliefs rather than on a rational evaluation of the EMR's fit, functionality, or performance. Their decision regarding the usefulness of the EMR is made very early, within the first few months of use of the EMR. These early perceptions then remain stable and become the lens through which subsequent experience with the EMR is interpreted. The findings suggest a need for communication based interventions aimed at explaining the value, fit, and usefulness of EMRs to physicians early in the pre- and immediate post-EMR implementation stages. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent, customizable interfaces and tools that combine different technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long-running, data- and compute-intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring
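
    The notion described here — computational steps ("actors") chained by an explicit data flow, with provenance captured automatically — can be sketched in a few lines of ordinary code. The snippet below is only an illustration of that idea in Python under invented names; it is not Kepler code or any part of the Kepler system.

```python
# Minimal illustration of a scientific workflow: actors chained by explicit
# data flow, with simple provenance recorded for each step. Invented example,
# not Kepler code.
import time

def load_data(path):                      # hypothetical data-source actor
    return {"path": path, "values": [1.0, 2.5, 3.7]}

def transform(dataset):                   # data-transformation actor
    return [v * 2.0 for v in dataset["values"]]

def analyze(values):                      # analysis actor
    return {"mean": sum(values) / len(values)}

class Workflow:
    """Runs actors in sequence and records provenance for each step."""
    def __init__(self, steps):
        self.steps = steps
        self.provenance = []

    def run(self, initial_input):
        data = initial_input
        for step in self.steps:
            started = time.time()
            data = step(data)
            self.provenance.append({"actor": step.__name__,
                                    "seconds": time.time() - started})
        return data

if __name__ == "__main__":
    wf = Workflow([load_data, transform, analyze])
    print(wf.run("experiment_001.dat"))   # hypothetical input file name
    print(wf.provenance)
```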

  10. A unitarized meson model including color Coulomb interaction

    International Nuclear Information System (INIS)

    Metzger, Kees.

    1990-01-01

    Ch. 1 gives a general introduction to the problem field of the thesis. It discusses to what extent the internal structure of mesons is understood theoretically and which models exist. It discusses the problem of confinement from a phenomenological point of view and indicates how quark models of mesons may provide insight into this phenomenon. In ch. 2 the formal theory of scattering in a system with confinement is given. It is shown how a coupled channel (CC) description and the work of other authors fit into this general framework. Explicit examples and arguments are given to support the CC treatment of such a system. In ch. 3 the full coupled-channel model as employed in this thesis is presented. On the basis of arguments from the former chapters and the observed regularities in the experimental data, the choices underlying the model are supported. In this model confinement is described with a mass-dependent harmonic-oscillator potential, and the presence of open (meson-meson) channels plays an essential role. In ch. 4 the unitarized model is applied to light scalar meson resonances. In this regime the contribution of the open channels is considerable. It is demonstrated that the model parameters used for the description of the pseudo-scalar and vector mesons can be used unchanged for the description of these mesons. Ch. 5 treats the color-Coulomb interaction. There the effect of the Coulomb interaction is studied in simple models without decay. The results of incorporating the color-Coulomb interaction into the full CC model are given in ch. 6. Ch. 7 discusses the results of the previous chapters and the present status of the model. (author). 182 refs.; 16 figs.; 33 tabs

  11. Global atmospheric model for mercury including oxidation by bromine atoms

    Directory of Open Access Journals (Sweden)

    C. D. Holmes

    2010-12-01

    Full Text Available Global models of atmospheric mercury generally assume that gas-phase OH and ozone are the main oxidants converting Hg0 to HgII and thus driving mercury deposition to ecosystems. However, thermodynamic considerations argue against the importance of these reactions. We demonstrate here the viability of atomic bromine (Br) as an alternative Hg0 oxidant. We conduct a global 3-D simulation with the GEOS-Chem model assuming gas-phase Br to be the sole Hg0 oxidant (Hg + Br model) and compare to the previous version of the model with OH and ozone as the sole oxidants (Hg + OH/O3 model). We specify global 3-D Br concentration fields based on our best understanding of tropospheric and stratospheric Br chemistry. In both the Hg + Br and Hg + OH/O3 models, we add an aqueous photochemical reduction of HgII in cloud to impose a tropospheric lifetime for mercury of 6.5 months against deposition, as needed to reconcile observed total gaseous mercury (TGM) concentrations with current estimates of anthropogenic emissions. This added reduction would not be necessary in the Hg + Br model if we adjusted the Br oxidation kinetics downward within their range of uncertainty. We find that the Hg + Br and Hg + OH/O3 models are equally capable of reproducing the spatial distribution of TGM and its seasonal cycle at northern mid-latitudes. The Hg + Br model shows a steeper decline of TGM concentrations from the tropics to southern mid-latitudes. Only the Hg + Br model can reproduce the springtime depletion and summer rebound of TGM observed at polar sites; the snowpack component of GEOS-Chem suggests that 40% of HgII deposited to snow in the Arctic is transferred to the ocean and land reservoirs, amounting to a net deposition flux to the Arctic of 60 Mg a−1. Summertime events of depleted Hg0 at Antarctic sites due to subsidence are much better simulated by

  12. The prediction of the cavitation phenomena including population balance modeling

    Science.gov (United States)

    Bannari, Rachid; Hliwa, Ghizlane Zineb; Bannari, Abdelfettah; Belghiti, Mly Taib

    2017-07-01

    Cavitation is the principal reason behind the modification of the behavior of hydraulic turbines. However, experimental observations cannot cover all cases due to limitations in the measurement techniques. The mathematical models that have been implemented use the mixture multiphase framework. Moreover, most of the published work is limited by the assumption of a constant bubble size distribution, which is not realistic. The aim of this article is the implementation and use of a non-homogeneous multiphase model which solves two-phase transport equations. The evolution of bubble size is captured by the population balance equation. This study is based on the Eulerian-Eulerian model coupled with a cavitation model. All the inter-phase forces, such as drag, lift and virtual mass, are included.

  13. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study
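
    The risk achievement worth used here has a standard ratio form: RAW for a basic event is the risk metric (here the CDF) recomputed with that event's probability set to 1, divided by the baseline risk metric. The toy sketch below illustrates only that definition; the two-train fault-tree structure and probabilities are invented and unrelated to the case study.

```python
# Toy illustration of the risk achievement worth (RAW) importance measure:
# RAW_i = CDF(basic event i set to probability 1) / CDF(base case).
# The two-train model and probabilities below are invented for illustration only.

def cdf(p):
    """Core damage frequency for a toy system: initiator AND both trains fail.
    p is a dict of basic-event probabilities."""
    return p["initiator"] * p["train_A_fails"] * p["train_B_fails"]

base = {"initiator": 1e-2, "train_A_fails": 1e-3, "train_B_fails": 5e-3}
base_cdf = cdf(base)

for event in ("train_A_fails", "train_B_fails"):
    bumped = dict(base, **{event: 1.0})   # assume the event occurs with certainty
    raw = cdf(bumped) / base_cdf
    print(f"RAW({event}) = {raw:.1f}")
```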

  14. A virtual radiation therapy workflow training simulation

    International Nuclear Information System (INIS)

    Bridge, P.; Crowe, S.B.; Gibson, G.; Ellemor, N.J.; Hargrave, C.; Carmichael, M.

    2016-01-01

    Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment, patient setup with lasers; and image guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment

  15. The complete digital workflow in fixed prosthodontics: a systematic review.

    Science.gov (United States)

    Joda, Tim; Zarone, Fernando; Ferrari, Marco

    2017-09-19

    The continuous development in dental processing ensures new opportunities in the field of fixed prosthodontics in a complete virtual environment without any physical model situations. The aim was to compare fully digitalized workflows to conventional and/or mixed analog-digital workflows for the treatment with tooth-borne or implant-supported fixed reconstructions. A PICO strategy was executed using an electronic (MEDLINE, EMBASE, Google Scholar) plus manual search up to 2016-09-16 focusing on RCTs investigating complete digital workflows in fixed prosthodontics with regard to economics or esthetics or patient-centered outcomes with or without follow-up or survival/success rate analysis as well as complication assessment of at least 1 year under function. The search strategy was assembled from MeSH-Terms and unspecific free-text words: {(("Dental Prosthesis" [MeSH]) OR ("Crowns" [MeSH]) OR ("Dental Prosthesis, Implant-Supported" [MeSH])) OR ((crown) OR (fixed dental prosthesis) OR (fixed reconstruction) OR (dental bridge) OR (implant crown) OR (implant prosthesis) OR (implant restoration) OR (implant reconstruction))} AND {("Computer-Aided Design" [MeSH]) OR ((digital workflow) OR (digital technology) OR (computerized dentistry) OR (intraoral scan) OR (digital impression) OR (scanbody) OR (virtual design) OR (digital design) OR (cad/cam) OR (rapid prototyping) OR (monolithic) OR (full-contour))} AND {("Dental Technology" [MeSH) OR ((conventional workflow) OR (lost-wax-technique) OR (porcelain-fused-to-metal) OR (PFM) OR (implant impression) OR (hand-layering) OR (veneering) OR (framework))} AND {(("Study, Feasibility" [MeSH]) OR ("Survival" [MeSH]) OR ("Success" [MeSH]) OR ("Economics" [MeSH]) OR ("Costs, Cost Analysis" [MeSH]) OR ("Esthetics, Dental" [MeSH]) OR ("Patient Satisfaction" [MeSH])) OR ((feasibility) OR (efficiency) OR (patient-centered outcome))}. Assessment of risk of bias in selected studies was done at a 'trial level' including random sequence

  16. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main workflow processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of

  17. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Abstract Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main workflow processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  18. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site, located on the border of France and Switzerland near Geneva, along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of its newest scientific instrument, the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not be complete until 2005. Like many businesses in the current economic climate, CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence, do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  19. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  20. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Full Text Available Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  1. Transport modelling including radial electric field and plasma rotation

    International Nuclear Information System (INIS)

    Fukuyama, A.; Fuji, Y.; Itoh, S.-I.

    1994-01-01

    Using a simple turbulent transport model with a constant diffusion coefficient and a fixed temperature profile, the density profile in a steady state and the transient behaviour during the co and counter neutral beam injection are studied. More consistent analysis has been initiated with a turbulent transport model based on the current diffusive high-n ballooning mode. The enhancement of the radial electric field due to ion orbit losses and the reduction of the transport due to the poloidal rotation shear are demonstrated. The preliminary calculation indicates a sensitive temperature dependence of the density profile. (author)

  2. Identifying Clusters with Mixture Models that Include Radial Velocity Observations

    Science.gov (United States)

    Czarnatowicz, Alexis; Ybarra, Jason E.

    2018-01-01

    The study of stellar clusters plays an integral role in the study of star formation. We present a cluster mixture model that considers radial velocity data in addition to spatial data. Maximum likelihood estimation through the Expectation-Maximization (EM) algorithm is used for parameter estimation. Our mixture model analysis can be used to distinguish adjacent or overlapping clusters, and estimate properties for each cluster. Work supported by awards from the Virginia Foundation for Independent Colleges (VFIC) Undergraduate Science Research Fellowship and The Research Experience @Bridgewater (TREB).
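
    A mixture model of this kind, fitted by EM on position plus radial velocity, can be prototyped with standard tools. The sketch below uses synthetic stars and scikit-learn's GaussianMixture as a stand-in for the authors' own implementation.

```python
# Minimal sketch of clustering stars on position plus radial velocity with a
# Gaussian mixture model fitted by EM. Synthetic data; scikit-learn's
# GaussianMixture stands in for the authors' own mixture-model code.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two overlapping clusters plus a spread-out field: columns are (x, y, radial velocity).
cluster1 = rng.normal(loc=[0.0, 0.0, -15.0], scale=[1.0, 1.0, 2.0], size=(200, 3))
cluster2 = rng.normal(loc=[1.5, 1.0, 10.0], scale=[1.0, 1.0, 2.0], size=(150, 3))
field = rng.uniform(low=[-5, -5, -40], high=[5, 5, 40], size=(100, 3))
stars = np.vstack([cluster1, cluster2, field])

# EM-fitted mixture; one extra component loosely absorbs the field stars.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(stars)

for k in range(3):
    members = stars[labels == k]
    print(f"component {k}: {len(members)} stars, "
          f"mean RV = {members[:, 2].mean():.1f}")
```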

  3. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Full Text Available Cloud computing is emerging as a high-performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system. The key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that the scheduling strategies of the schedulers are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which has been gaining attention recently because workflows have emerged as a paradigm for representing complex computing problems. We propose a novel algorithm extending the nature-inspired Intelligent Water Drops (IWD) algorithm that optimizes the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within a workflow simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We compared the proposed IWD-based algorithm with other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO and C-PSO; the proposed algorithm presented noticeable enhancements in performance and cost in most situations.
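
    Of the baseline schedulers listed for comparison, MCT (minimum completion time) is the simplest to state: each task is greedily mapped to the virtual machine on which it would finish earliest. The sketch below illustrates that baseline only, not the proposed IWD algorithm; the task lengths and VM speeds are invented, and precedence constraints and data-transfer costs are ignored for brevity.

```python
# Sketch of the MCT (minimum completion time) baseline scheduler mentioned above:
# each task is greedily assigned to the VM where it would finish earliest.
# Task lengths, VM speeds and the absence of precedence constraints and data
# transfer costs are simplifying assumptions for illustration.

tasks = [("t1", 900), ("t2", 300), ("t3", 1200), ("t4", 600), ("t5", 450)]  # (name, MI)
vm_speeds = {"vm1": 100, "vm2": 250}        # millions of instructions per second
vm_ready = {vm: 0.0 for vm in vm_speeds}    # time at which each VM becomes free

schedule = []
for name, length in tasks:
    # completion time if the task were placed on each VM
    finish = {vm: vm_ready[vm] + length / vm_speeds[vm] for vm in vm_speeds}
    best_vm = min(finish, key=finish.get)
    vm_ready[best_vm] = finish[best_vm]
    schedule.append((name, best_vm, finish[best_vm]))

for name, vm, end in schedule:
    print(f"{name} -> {vm}, finishes at t={end:.1f}s")
print(f"makespan = {max(vm_ready.values()):.1f}s")
```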

  4. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  5. Constitutive modeling of multiphase materials including phase transformations

    NARCIS (Netherlands)

    Perdahcioglu, Emin Semih; Geijselaers, Hubertus J.M.; Khan, A.S.; Meredith, C; Farrokh, B

    2011-01-01

    A constitutive model is developed for materials involving two or more different phases in their microstructure such as DP (Dual Phase) or TRIP (TRansformation Induced Plasticity) steels. Homogenization of the response of the phases is achieved by the Mean-Field method. One of the phases in TRIP

  6. Development of realistic concrete models including scaling effects

    International Nuclear Information System (INIS)

    Carpinteri, A.

    1989-09-01

    Progressive cracking in structural elements of concrete is considered. Two simple models are applied, which, even though different, lead to similar predictions for the fracture behaviour. Both the Virtual Crack Propagation Model and the Cohesive Limit Analysis (Section 2) show a trend towards brittle behaviour and catastrophic events for large structural sizes. A numerical Cohesive Crack Model is proposed (Section 3) to describe strain softening and strain localization in concrete. Such a model is able to predict the size effects of fracture mechanics accurately. Whereas for Mode I only untying of the finite element nodes is applied to simulate crack growth, for Mixed Mode a topological variation is required at each step (Section 4). In the case of the four-point shear specimen, the load vs. deflection diagrams reveal snap-back instability for large sizes. By increasing the specimen size, such instability tends to reproduce the classical LEFM instability. Remarkable size effects are theoretically predicted and experimentally confirmed also for reinforced concrete (Section 5). The brittleness of flexural members increases with increasing size and/or decreasing steel content. On the basis of these results, the empirical code rules regarding the minimum amount of reinforcement could be considerably revised

  7. Dynamic model including piping acoustics of a centrifugal compression system

    NARCIS (Netherlands)

    Helvoirt, van J.; Jager, de A.G.

    2007-01-01

    This paper deals with low frequency pulsation phenomena in full-scale centrifugal compression systems associated with compressor surge. The Greitzer lumped parameter model is applied to describe the dynamic behavior of an industrial compressor test rig and experimental evidence is provided for the

  8. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation that, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.
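
    The core step, recovering a task graph from variable dependencies in a script, can be illustrated outside GridSpace. The sketch below works on a toy script, with Python's standard ast module standing in for the Ruby source analysis; the operation names in the script are hypothetical.

```python
# Illustration of turning a linear script into a dependency graph by resolving
# variable dependencies, in the spirit of the script-to-workflow conversion
# described above (Python's ast module stands in for the Ruby analysis).
import ast

script = """
raw = fetch("http://example.org/data")      # hypothetical remote operation
clean = preprocess(raw)
stats = analyze(clean)
plot = render(stats, raw)
"""

assigned_by = {}   # variable name -> producing statement
edges = []         # (producer statement, consumer statement)

for node in ast.parse(script).body:
    if not isinstance(node, ast.Assign):
        continue
    target = node.targets[0].id
    stmt = ast.unparse(node)
    # every name read on the right-hand side that an earlier statement produced
    for name in ast.walk(node.value):
        if isinstance(name, ast.Name) and name.id in assigned_by:
            edges.append((assigned_by[name.id], stmt))
    assigned_by[target] = stmt

for producer, consumer in edges:
    print(f"{producer!r} --> {consumer!r}")
```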

  9. Including lateral interactions into microkinetic models of catalytic reactions

    DEFF Research Database (Denmark)

    Hellman, Anders; Honkala, Johanna Karoliina

    2007-01-01

    In many catalytic reactions lateral interactions between adsorbates are believed to have a strong influence on the reaction rates. We apply a microkinetic model to explore the effect of lateral interactions and how to efficiently take them into account in a simple catalytic reaction. Three different approximations are investigated: site, mean-field, and quasichemical approximations. The obtained results are compared to accurate Monte Carlo numbers. In the end, we apply the approximations to a real catalytic reaction, namely ammonia synthesis.
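
    In the mean-field approximation referred to here, every adsorbate is assumed to feel the average coverage of its neighbours, so lateral interactions are typically folded into a coverage-dependent activation energy. The sketch below shows that idea for first-order desorption with repulsive interactions; all parameter values are invented and it is not the model used in the paper.

```python
# Schematic mean-field treatment of lateral interactions for first-order
# desorption: repulsive adsorbate-adsorbate interactions destabilize the
# adsorbed layer, so the desorption barrier falls linearly with the average
# coverage. All parameter values are invented for illustration only.
import math

kB = 8.617e-5          # Boltzmann constant, eV/K
T = 500.0              # temperature, K
prefactor = 1e13       # attempt frequency, 1/s
Ea0 = 1.2              # desorption barrier at zero coverage, eV
eps = 0.2              # repulsive interaction folded into the barrier, eV per ML

def desorption_rate(theta):
    """Mean-field rate: every adsorbate feels the average coverage theta."""
    Ea = Ea0 - eps * theta
    return prefactor * math.exp(-Ea / (kB * T)) * theta

for theta in (0.1, 0.5, 0.9):
    print(f"theta = {theta:.1f} ML -> rate = {desorption_rate(theta):.3e} ML/s")
```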

  10. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study a method of developing a workflow-based information system, taking the radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to the tasks and were integrated through transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process management functionality and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology in healthcare. (orig.)

  11. Parton recombination model including resonance production. RL-78-040

    International Nuclear Information System (INIS)

    Roberts, R.G.; Hwa, R.C.; Matsuda, S.

    1978-05-01

    Possible effects of resonance production on the meson inclusive distribution in the fragmentation region are investigated in the framework of the parton recombination model. From a detailed study of the data on vector-meson production, a reliable ratio of the vector-to-pseudoscalar rates is determined. Then the influence of the decay of the vector mesons on the pseudoscalar spectrum is examined, and the effect is found to be no more than 25% for x > 0.5. The normalizations of the non-strange antiquark distributions are still higher than those in a quiescent proton. The agreement between the calculated results and data remains very good. 36 references

  12. Parton recombination model including resonance production. RL-78-040

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R. G.; Hwa, R. C.; Matsuda, S.

    1978-05-01

    Possible effects of resonance production on the meson inclusive distribution in the fragmentation region are investigated in the framework of the parton recombination model. From a detailed study of the data on vector-meson production, a reliable ratio of the vector-to-pseudoscalar rates is determined. Then the influence of the decay of the vector mesons on the pseudoscalar spectrum is examined, and the effect is found to be no more than 25% for x > 0.5. The normalizations of the non-strange antiquark distributions are still higher than those in a quiescent proton. The agreement between the calculated results and data remains very good. 36 references.

  13. Extending PSA models including ageing and asset management - 15291

    International Nuclear Information System (INIS)

    Martorell, S.; Marton, I.; Carlos, S.; Sanchez, A.I.

    2015-01-01

    This paper proposes a new approach to Ageing Probabilistic Safety Assessment (APSA) modelling, which is intended to support risk-informed decisions on the effectiveness of maintenance management programs and technical specification requirements of critical equipment of Nuclear Power Plants (NPP) within the framework of Risk-Informed Decision Making according to R.G. 1.174 principles. This approach focuses on explicitly incorporating not only equipment ageing but also the effectiveness of maintenance and the efficiency of surveillance testing into APSA models and data. The methodology is applied to a motor-operated valve of the auxiliary feed water system (AFWS) of a PWR. This simple application example focuses on a critical piece of safety-related equipment of an NPP in order to evaluate the risk impact of considering different approaches to APSA and the combined effect of equipment ageing and maintenance and testing alternatives over the NPP design life. The risk impact of several alternatives in maintenance strategy is discussed.

  14. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    Energy Technology Data Exchange (ETDEWEB)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.; Dowling, Michelle V.; Feng, Mi

    2017-10-09

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  15. A data analysis workflow to enhance clay and organic carbon models using proximal Vis-NIR data

    DEFF Research Database (Denmark)

    Tabatabai, Salman; Knadel, Maria; Greve, Mogens Humlekrog

    Modelling proximal sensor data is becoming a norm in soil characterization and mapping. In many cases, these models still have low predictive capability and lack robustness due to the large amount of noise from several environmental factors. In this study we proposed a combination of extensive data preprocessing (a preprocessing survey) and two variable selection methods to significantly increase visible near-infrared spectroscopy (Vis-NIRS) model performance and stability. Spectra of eight agricultural fields were measured in the range of 350-2200 nm using a mobile sensor platform (Veris Technologies, USA) towed by a tractor. A fuzzy c-means clustering was performed on the first 3 principal components to select 15 representative sampling locations in each field. Clay and organic carbon (OC) were determined for all calibration samples using the pipette and ignition methods, respectively...
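
    The sampling-design step in this record (clustering the scores on the first principal components and picking one representative location per cluster) is straightforward to prototype. The sketch below uses synthetic spectra and ordinary k-means as a simpler stand-in for the fuzzy c-means clustering used in the study.

```python
# Sketch of selecting representative calibration locations from Vis-NIR spectra:
# reduce the spectra to a few principal components, cluster them, and take the
# spectrum closest to each cluster centre. Synthetic data; ordinary k-means is
# used here as a simpler stand-in for the fuzzy c-means mentioned in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_spectra, n_bands, n_sites = 500, 200, 15
spectra = rng.normal(size=(n_spectra, n_bands)).cumsum(axis=1)  # smooth-ish fake spectra

scores = PCA(n_components=3).fit_transform(spectra)             # first 3 PCs
km = KMeans(n_clusters=n_sites, n_init=10, random_state=0).fit(scores)

# index of the spectrum nearest each cluster centre = a representative location
representative = [
    int(np.argmin(np.linalg.norm(scores - centre, axis=1)))
    for centre in km.cluster_centers_
]
print("sample these locations for lab analysis:", sorted(representative))
```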

  16. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot efficiently describe distributed workflow services because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study is put forward to show that the proposed method is appropriate for modeling workflow business services in distributed environments.

  17. Developing a workflow to identify inconsistencies in volunteered geographic information: a phenological case study

    Science.gov (United States)

    Mehdipoor, Hamed; Zurita-Milla, Raul; Rosemartin, Alyssa; Gerst, Katharine L.; Weltzin, Jake F.

    2015-01-01

    Recent improvements in online information communication and mobile location-aware technologies have led to the production of large volumes of volunteered geographic information. Widespread, large-scale efforts by volunteers to collect data can inform and drive scientific advances in diverse fields, including ecology and climatology. Traditional workflows to check the quality of such volunteered information can be costly and time consuming as they heavily rely on human interventions. However, identifying factors that can influence data quality, such as inconsistency, is crucial when these data are used in modeling and decision-making frameworks. Recently developed workflows use simple statistical approaches that assume that the majority of the information is consistent. However, this assumption is not generalizable, and ignores underlying geographic and environmental contextual variability that may explain apparent inconsistencies. Here we describe an automated workflow to check inconsistency based on the availability of contextual environmental information for sampling locations. The workflow consists of three steps: (1) dimensionality reduction to facilitate further analysis and interpretation of results, (2) model-based clustering to group observations according to their contextual conditions, and (3) identification of inconsistent observations within each cluster. The workflow was applied to volunteered observations of flowering in common and cloned lilac plants (Syringa vulgaris and Syringa x chinensis) in the United States for the period 1980 to 2013. About 97% of the observations for both common and cloned lilacs were flagged as consistent, indicating that volunteers provided reliable information for this case study. Relative to the original dataset, the exclusion of inconsistent observations changed the apparent rate of change in lilac bloom dates by two days per decade, indicating the importance of inconsistency checking as a key step in data quality

  18. Developing a Workflow to Identify Inconsistencies in Volunteered Geographic Information: A Phenological Case Study.

    Science.gov (United States)

    Mehdipoor, Hamed; Zurita-Milla, Raul; Rosemartin, Alyssa; Gerst, Katharine L; Weltzin, Jake F

    2015-01-01

    Recent improvements in online information communication and mobile location-aware technologies have led to the production of large volumes of volunteered geographic information. Widespread, large-scale efforts by volunteers to collect data can inform and drive scientific advances in diverse fields, including ecology and climatology. Traditional workflows to check the quality of such volunteered information can be costly and time consuming as they heavily rely on human interventions. However, identifying factors that can influence data quality, such as inconsistency, is crucial when these data are used in modeling and decision-making frameworks. Recently developed workflows use simple statistical approaches that assume that the majority of the information is consistent. However, this assumption is not generalizable, and ignores underlying geographic and environmental contextual variability that may explain apparent inconsistencies. Here we describe an automated workflow to check inconsistency based on the availability of contextual environmental information for sampling locations. The workflow consists of three steps: (1) dimensionality reduction to facilitate further analysis and interpretation of results, (2) model-based clustering to group observations according to their contextual conditions, and (3) identification of inconsistent observations within each cluster. The workflow was applied to volunteered observations of flowering in common and cloned lilac plants (Syringa vulgaris and Syringa x chinensis) in the United States for the period 1980 to 2013. About 97% of the observations for both common and cloned lilacs were flagged as consistent, indicating that volunteers provided reliable information for this case study. Relative to the original dataset, the exclusion of inconsistent observations changed the apparent rate of change in lilac bloom dates by two days per decade, indicating the importance of inconsistency checking as a key step in data quality
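
    The three-step workflow described in the two records above (dimensionality reduction, model-based clustering of observations by their environmental context, and flagging of poorly fitting observations) can be sketched with standard tools. The snippet below uses synthetic data and flags low-likelihood observations under the fitted mixture, a simplification of the per-cluster test in the paper.

```python
# Sketch of the three-step inconsistency check described above:
# (1) dimensionality reduction, (2) model-based clustering of observations by
# their environmental context, (3) flagging observations that fit the fitted
# mixture poorly. Synthetic data and thresholds are illustrative assumptions only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# columns: e.g. elevation, spring temperature, day-of-year of reported flowering
observations = np.column_stack([
    rng.normal(500, 150, 1000),
    rng.normal(12, 3, 1000),
    rng.normal(120, 10, 1000),
])
observations[:5, 2] += 90            # inject a few implausible reports

reduced = PCA(n_components=2).fit_transform(observations)           # step 1
gmm = GaussianMixture(n_components=4, random_state=0).fit(reduced)  # step 2

log_density = gmm.score_samples(reduced)                            # step 3
threshold = np.percentile(log_density, 2)                           # flag lowest 2%
inconsistent = np.where(log_density < threshold)[0]
print(f"flagged {len(inconsistent)} of {len(observations)} observations")
```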

  19. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  20. Using Workflow Modeling to Identify Areas to Improve Genetic Test Processes in the University of Maryland Translational Pharmacogenomics Project.

    Science.gov (United States)

    Cutting, Elizabeth M; Overby, Casey L; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R; Beitelshees, Amber L

    Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease.

  1. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment grid workflow system routinely manages 250,000 to 500,000 concurrently running production and analysis jobs to process simulation and detector data. In total, more than 300 PB of data are distributed over more than 150 sites in the WLCG. At this scale, small improvements in software and computing performance and workflows can lead to significant resource usage gains. ATLAS is reviewing, together with CERN IT experts, several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation will review and explain the performance gains of several ATLAS simulation and data processing workflows and present analytics studies of the ATLAS grid workflows.

  2. Privacy-aware workflow management

    NARCIS (Netherlands)

    Alhaqbani, B.; Adams, M.; Fidge, C.J.; Hofstede, ter A.H.M.; Glykas, M.

    2013-01-01

    Information security policies play an important role in achieving information security. Confidentiality, Integrity, and Availability are classic information security goals attained by enforcing appropriate security policies. Workflow Management Systems (WfMSs) also benefit from inclusion of these

  3. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  4. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate results data. Error handling rules are formulated at the basic pattern level and at the level of a composite task that can combine several basic patterns as next-level subtasks. The cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the possible error handling options that can be specified by the user are also noted in the work.
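
    The requirement stated here (a failed tool must not silently invalidate downstream data or crash the whole composition) can be made concrete with a small wrapper around each workflow step. The retry-then-abort policy below is a generic illustration, not the specific rules formulated in the paper.

```python
# Generic illustration of error handling in an integration workflow: each step
# is executed under a policy that retries the tool and otherwise aborts the whole
# workflow, so that missing intermediate data never propagates silently.
# The policies and the failing "tool" below are illustrative only.
import random

class WorkflowError(Exception):
    """Raised to stop the whole workflow when a step cannot produce valid data."""

def run_step(step, data, retries=2):
    for attempt in range(retries + 1):
        try:
            return step(data)
        except Exception as exc:
            print(f"{step.__name__} failed (attempt {attempt + 1}): {exc}")
    raise WorkflowError(f"{step.__name__} produced no valid result; aborting workflow")

def flaky_solver(data):          # stands in for an external simulation tool
    if random.random() < 0.5:
        raise RuntimeError("license server unavailable")
    return data + ["solver output"]

def postprocess(data):
    return data + ["report"]

if __name__ == "__main__":
    random.seed(3)
    try:
        result = run_step(postprocess, run_step(flaky_solver, ["input deck"]))
        print("workflow finished:", result)
    except WorkflowError as err:
        print("workflow aborted:", err)
```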

  5. Designing Flexible E-Business Workflow Systems

    OpenAIRE

    Cătălin Silvestru; Codrin Nisioiu; Marinela Mircea; Bogdan Ghilic-Micu; Marian Stoica

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes, and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of...

  6. Workflow Lexicons in Healthcare: Validation of the SWIM Lexicon.

    Science.gov (United States)

    Meenan, Chris; Erickson, Bradley; Knight, Nancy; Fossett, Jewel; Olsen, Elizabeth; Mohod, Prerna; Chen, Joseph; Langer, Steve G

    2017-06-01

    For clinical departments seeking to successfully navigate the challenges of modern health reform, obtaining access to operational and clinical data to establish and sustain goals for improving quality is essential. More broadly, health delivery organizations are also seeking to understand performance across multiple facilities and often across multiple electronic medical record (EMR) systems. Interpreting operational data across multiple vendor systems can be challenging, as various manufacturers may describe different departmental workflow steps in different ways and sometimes even within a single vendor's installed customer base. In 2012, The Society for Imaging Informatics in Medicine (SIIM) recognized the need for better quality and performance data standards and formed SIIM's Workflow Initiative for Medicine (SWIM), an initiative designed to consistently describe workflow steps in radiology departments as well as defining operational quality metrics. The SWIM lexicon was published as a working model to describe operational workflow steps and quality measures. We measured the prevalence of the SWIM lexicon workflow steps in both academic and community radiology environments using real-world patient observations and correlated that information with automatically captured workflow steps from our clinical information systems. Our goal was to measure frequency of occurrence of workflow steps identified by the SWIM lexicon in a real-world clinical setting, as well as to correlate how accurately departmental information systems captured patient flow through our health facility.

  7. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they do not track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  8. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when workflow approach was applied to spatial information of activities, the analysis revealed the ability to provide meaningful data on individuals' mobility in different levels of life spaces from home to community. Results suggest that using workflows to characterize the daily activities of older adults will be helpful for clinicians and researchers in understanding their daily routines and preparing education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    Energy Technology Data Exchange (ETDEWEB)

    Copps, Kevin D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  10. Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.

    Science.gov (United States)

    Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob

    2017-06-12

    A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing the ion-exchange equilibrium of commercial and experimental IEX resins across a range of applications in which the water environment differs from site to site. Because of its much higher throughput, design of experiment (DOE) methodology can be easily applied for studying the effects of multiple factors on resin performance. Two case studies are presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion exchange resins have been screened for selective removal of NO3- and NO2- in water environments consisting of multiple other anions, varied pH, and ionic strength. A response surface model (RSM) is developed to statistically correlate the resin performance with the water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method have been applied for screening different cation exchange resins in terms of the selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time needed to address industry and application needs.
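
    As a rough illustration of the response-surface step described above, the sketch below fits a quadratic response surface to hypothetical screening data with scikit-learn; the factor names (pH, ionic strength, competing-anion level) and all numbers are invented for illustration and do not come from the study.

```python
# Hedged sketch: fit a quadratic response-surface model (RSM) to
# hypothetical high-throughput resin screening data. Factor names and
# values are illustrative only, not taken from the study.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Design matrix: columns are pH, ionic strength (M), competing anion (mg/L)
X = np.array([
    [6.0, 0.01,  10.0],
    [6.0, 0.10, 100.0],
    [7.5, 0.01, 100.0],
    [7.5, 0.10,  10.0],
    [9.0, 0.05,  50.0],
])
# Measured response: fraction of nitrate removed in each run (invented)
y = np.array([0.82, 0.55, 0.61, 0.74, 0.47])

# Quadratic RSM: linear, interaction, and squared terms
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Predict performance at a new water composition
new_water = np.array([[7.0, 0.05, 30.0]])
print(model.predict(poly.transform(new_water)))
```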

  11. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    Science.gov (United States)

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  12. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  13. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to produce suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final trained network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
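
    As a loose illustration of the weighting step, the sketch below trains a small feed-forward network on hypothetical criterion data and derives relative criterion weights via permutation importance; this is one generic way to extract importances from a trained network and is not necessarily the procedure used in the study. The criterion names and data are invented.

```python
# Hedged sketch: derive MCDA criteria weights from a feed-forward neural
# network, using permutation importance as a generic weighting proxy.
# Data and criterion names are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 500
# Hypothetical criteria: distance to water, slope, distance to roads
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Hypothetical suitability score (a toy target function plus noise)
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.02, n)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X, y)

# Relative importance of each criterion, normalised to sum to 1
imp = permutation_importance(net, X, y, n_repeats=20, random_state=0)
weights = imp.importances_mean / imp.importances_mean.sum()
print(dict(zip(["dist_water", "slope", "dist_roads"], weights.round(3))))
```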

  14. Worklist handling in workflow-enabled radiological application systems

    Science.gov (United States)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency, and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation, and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
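
    To make the contrast between the two strategies concrete, the sketch below mocks both in plain Python: a data-driven worklist built as a filtered view over an imaging "database", and a process-driven worklist produced by a toy workflow service that tracks explicit task state. The data, roles, and the mini process model are all hypothetical.

```python
# Hedged sketch contrasting the two worklist strategies described above.
# All names and the mini process model are hypothetical.
studies = [
    {"id": 1, "status": "acquired", "modality": "CT"},
    {"id": 2, "status": "reported", "modality": "MR"},
    {"id": 3, "status": "acquired", "modality": "MR"},
]

# 1) Data-driven approach: the worklist is just a filtered view on
#    application data; there is no explicit process or task ownership.
def data_driven_worklist(role):
    if role == "radiologist":
        return [s for s in studies if s["status"] == "acquired"]
    return []

# 2) Process-oriented approach: an explicit process model drives which
#    work items exist and to whom they are offered.
PROCESS = {"acquire": ("technologist", "report"),
           "report":  ("radiologist",  "verify"),
           "verify":  ("radiologist",  None)}

class WorkflowService:
    def __init__(self):
        self.items = [{"study": 1, "task": "report"},
                      {"study": 3, "task": "report"}]

    def worklist(self, role):
        return [i for i in self.items if PROCESS[i["task"]][0] == role]

    def complete(self, item):
        self.items.remove(item)
        nxt = PROCESS[item["task"]][1]
        if nxt:  # enact the next step of the explicit process model
            self.items.append({"study": item["study"], "task": nxt})

print(data_driven_worklist("radiologist"))
svc = WorkflowService()
print(svc.worklist("radiologist"))
svc.complete(svc.worklist("radiologist")[0])   # completing a task spawns the next one
print(svc.worklist("radiologist"))
```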

  15. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  16. O potencial das redes de Petri em modelagem e análise de processos de negócio The potential of Petri nets in modeling and analysis of workflow

    Directory of Open Access Journals (Sweden)

    Sílvia Inês Dallavalle de Pádua

    2004-04-01

    Full Text Available Workflow management technology seeks to offer a flexible solution in support of business processes by facilitating the modification and creation of processes. However, the lack of a well-formalised definition of the syntax and semantics of such techniques hinders more complex analyses of the models. In this setting, Petri nets have excellent potential: they offer a graphical representation, are easy to learn, act as a communication language among specialists from different areas, allow the description of both static and dynamic aspects of the system to be represented, and provide the mathematical formalism required by well-established analysis methods. The primary objective of this work is to offer an up-to-date view of the state of the art in Petri-net-based workflow modelling and to present an example of a workflow model with time and cost parameters associated with the transitions of the net. A secondary objective is to present key concepts of Petri nets, workflow, and process routes.
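
    A minimal sketch of the kind of model the abstract describes: a tiny place/transition net for a two-step approval workflow, with illustrative time and cost attached to each transition. The net structure, times, and costs are invented and are not taken from the paper.

```python
# Hedged sketch: a minimal Petri-net workflow with time and cost
# parameters on transitions. Structure and numbers are illustrative only.
marking = {"received": 1, "reviewed": 0, "approved": 0}

# Each transition: consumed places, produced places, (time in h, cost)
transitions = {
    "review":  ({"received": 1}, {"reviewed": 1}, (2.0, 50.0)),
    "approve": ({"reviewed": 1}, {"approved": 1}, (0.5, 20.0)),
}

def enabled(name):
    pre, _, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post, (t, c) = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n
    return t, c

total_time = total_cost = 0.0
for name in ["review", "approve"]:
    if enabled(name):
        t, c = fire(name)
        total_time += t
        total_cost += c

print(marking, total_time, total_cost)
```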

  17. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber

  18. Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

    Data.gov (United States)

    National Aeronautics and Space Administration — Enhance capabilities for collaborative data analysis and modeling in Earth sciences. Develop components for automatic workflow capture, archiving and management....

  19. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which local scripts and applications as well as web services can be integrated. Such workflows can be published and reused through specialised online libraries, so-called repositories. As these repositories grow in size, similarity measures for scientific workfl...

  20. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  1. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  2. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  3. BALANCED SCORECARDS EVALUATION MODEL THAT INCLUDES ELEMENTS OF ENVIRONMENTAL MANAGEMENT SYSTEM USING AHP MODEL

    Directory of Open Access Journals (Sweden)

    Jelena Jovanović

    2010-03-01

    Full Text Available The research is oriented toward improvement of an environmental management system (EMS) using the BSC (Balanced Scorecard) model, which presents a strategic model of measurement and improvement of organisational performance. The research presents an approach for involving environmental management objectives and metrics (proposed by the literature review) in the conventional BSC of the "Ad Barska plovidba" organisation. Further, we test the creation of an ECO-BSC model based on the business activities of non-profit organisations in order to improve the environmental management system in parallel with the other systems of management. Using this approach we obtain four models of BSC that include elements of the environmental management system for AD "Barska plovidba". Taking into account that implementation and evaluation need a long period of time in AD "Barska plovidba", the final choice will be based on ISO/IEC 14598 (Information technology - Software product evaluation) and ISO 9126 (Software engineering - Product quality) using the AHP method. Those standards are usually used for evaluating the quality of software products and computer programs that serve in an organisation as support and factors for development. The AHP model will therefore be based on evaluation criteria derived from the ISO 9126 standard and on types of evaluation from two evaluation teams. Members of team 1 will be experts in BSC and environmental management systems who are not employed in the AD "Barska Plovidba" organisation. The members of team 2 will be managers of the AD "Barska Plovidba" organisation (including managers from the environmental department). Merging the results of the two previously created AHP models, one can obtain the most appropriate BSC that includes elements of the environmental management system. The chosen model will at the same time suggest an approach for including ecological metrics in the conventional BSC model for a firm that has at least one ECO strategic orientation.

  4. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  5. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  6. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  7. Reenginering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of a part of the system was required. In this thesis we faced the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed based on an XPDL file exported from the mo...

  8. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting a CO2 storage process at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive [9] (Storage Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  9. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203

  10. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Background: Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results: We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  11. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  12. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  13. On the Support of Scientific Workflows over Pub/Sub Brokers

    Directory of Open Access Journals (Sweden)

    Edwin Cedeño

    2013-08-01

    Full Text Available The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.

  14. On the support of scientific workflows over Pub/Sub brokers.

    Science.gov (United States)

    Morales, Augusto; Robles, Tomas; Alcarria, Ramon; Cedeño, Edwin

    2013-08-20

    The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.
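
    A rough illustration of the task-to-topic mapping idea: the sketch below wires three workflow tasks to a minimal in-memory publish/subscribe broker, so that completion events drive downstream tasks. The broker, topic names, and tasks are hypothetical and far simpler than the broker-side interfaces and procedures the paper proposes.

```python
# Hedged sketch: mapping workflow tasks onto a Pub/Sub event layer.
# A tiny in-memory broker stands in for a real one; names are invented.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subs[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subs[topic]:
            handler(payload)

broker = Broker()

# Workflow: ingest -> transform -> store. Each task subscribes to the
# completion topic of its predecessor and publishes its own completion.
def ingest(payload):
    print("ingest", payload)
    broker.publish("ingest.done", {"records": 42})

def transform(payload):
    print("transform", payload)
    broker.publish("transform.done", {"records": payload["records"]})

def store(payload):
    print("store", payload)

broker.subscribe("workflow.start", ingest)
broker.subscribe("ingest.done", transform)
broker.subscribe("transform.done", store)

# Kicking off the workflow is just publishing the initial control event.
broker.publish("workflow.start", {"dataset": "example"})
```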

  15. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  17. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; van der Veer, Gerrit C.; Roos, M.; van Dijk, Elisabeth M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation level

  18. SMITH: a LIMS for handling next-generation sequencing workflows.

    Science.gov (United States)

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. Wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project developed SMITH. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description to each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available
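
    To illustrate the attribute-value metadata idea mentioned above, the sketch below creates a minimal sample table plus an attribute-value table in SQLite (via Python) and queries samples by free-form metadata. The table and column names are invented and do not reflect SMITH's actual schema.

```python
# Hedged sketch: an attribute-value table for unconstrained sample metadata,
# similar in spirit to the approach described above. Schema names are
# invented and do not reflect SMITH's actual database schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sample (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE sample_attribute (
    sample_id INTEGER REFERENCES sample(id),
    key       TEXT NOT NULL,
    value     TEXT NOT NULL
);
""")

con.execute("INSERT INTO sample VALUES (1, 'S001')")
con.execute("INSERT INTO sample VALUES (2, 'S002')")
attrs = [(1, "assay", "ChIP-Seq"), (1, "antibody", "H3K4me3"),
         (2, "assay", "RNA-Seq"), (2, "tissue", "liver")]
con.executemany("INSERT INTO sample_attribute VALUES (?, ?, ?)", attrs)

# Find all samples annotated as ChIP-Seq experiments
rows = con.execute("""
    SELECT s.name FROM sample s
    JOIN sample_attribute a ON a.sample_id = s.id
    WHERE a.key = 'assay' AND a.value = 'ChIP-Seq'
""").fetchall()
print(rows)
```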

  19. SMITH: a LIMS for handling next-generation sequencing workflows

    Science.gov (United States)

    2014-01-01

    Background Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, to maintain a high quality standard, to reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). Methods SMITH is a web application with a MySQL server at the backend. Wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project developed SMITH. The data base schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows associating an unconstrained textual description to each sample and all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. Results SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized making it easier for biologists and analysts to navigate the data. Automation also helps saving time. The

  20. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    textabstractSnakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically

  1. Behavioral technique for workflow abstraction and matching

    NARCIS (Netherlands)

    Klai, K.; Ould Ahmed M'bareck, N.; Tata, S.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    This work is in line with the CoopFlow approach dedicated for workflow advertisement, interconnection, and cooperation in virtual organizations. In order to advertise workflows into a registry, we present in this paper a novel method to abstract behaviors of workflows into symbolic observation

  2. A performance study of grid workflow engines

    NARCIS (Netherlands)

    Stratan, C.; Iosup, A.; Epema, D.H.J.

    2008-01-01

    To benefit from grids, scientists require grid workflow engines that automatically manage the execution of inter-related jobs on the grid infrastructure. So far, the workflows community has focused on scheduling algorithms and on interface tools. Thus, while several grid workflow engines have been

  3. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low privilege-management efficiency, an overburdened administrator, and the lack of a trust authority. After studying the security requirements of workflow systems in depth, a secure workflow model based on PKI/PMI is proposed. This model can achieve static and dynamic authorization by verifying the user's identity through a PKC (public key certificate) and validating the user's privilege information using an AC (attribute certificate) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security, but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.

  4. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...
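
    A very schematic sketch of the evolutionary loop described above: candidate processes are recombined and mutated, and a fitness function scores each candidate. Here the "process" is reduced to a vector of numeric parameters and the fitness is a placeholder for the stochastic model-checking step, which is far more involved in practice; all names and numbers are illustrative.

```python
# Hedged sketch of an evolutionary optimisation loop with a pluggable
# fitness function. In the paper the fitness comes from stochastic model
# checking of BPMN-derived models; here a toy cost function stands in.
import random

random.seed(0)

def fitness(candidate):
    # Placeholder for stochastic model checking of the candidate process:
    # lower is better (e.g. expected cost or cycle time).
    return sum((x - 3.0) ** 2 for x in candidate)

def mutate(candidate):
    i = random.randrange(len(candidate))
    child = list(candidate)
    child[i] += random.gauss(0, 0.5)
    return child

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.uniform(0, 10) for _ in range(4)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness)
    parents = population[:10]                      # keep the fittest half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print(min(fitness(c) for c in population))
```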

  5. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    2008-01-01

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  6. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85 and 74% success rates for the training set and test set, respectively). Diversity and similarity descriptors which demonstrated best performances in terms of their ability to select either diverse or focused sets of compounds from three databases (Drug Bank, CMC and CHEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software yet due to the usage of generic
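
    As a concrete illustration of the rule-based ADME/T filtering step, the sketch below applies Lipinski- and Veber-style cut-offs to a couple of molecules with RDKit. The SMILES strings are arbitrary examples and the thresholds are the commonly quoted values, not necessarily the exact implementation used in the workflow.

```python
# Hedged sketch: Lipinski/Veber-style filters for screening-library triage.
# SMILES are arbitrary examples; thresholds are the commonly cited values.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_rule_of_five(mol):
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5
            and Lipinski.NumHAcceptors(mol) <= 10)

def passes_veber(mol):
    return (Descriptors.NumRotatableBonds(mol) <= 10
            and Descriptors.TPSA(mol) <= 140)

library = {
    "aspirin":    "CC(=O)Oc1ccccc1C(=O)O",
    "sugar_like": "OCC1OC(OCC2OC(O)C(O)C(O)C2O)C(O)C(O)C1O",
}

for name, smiles in library.items():
    mol = Chem.MolFromSmiles(smiles)
    print(name, passes_rule_of_five(mol), passes_veber(mol))
```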

  7. Incorporating Workflow Interference in Facility Layout Design: The Quartic Assignment Problem

    OpenAIRE

    Wen-Chyuan Chiang; Panagiotis Kouvelis; Timothy L. Urban

    2002-01-01

    Although many authors have noted the importance of minimizing workflow interference in facility layout design, traditional layout research tends to focus on minimizing the distance-based transportation cost. This paper formalizes the concept of workflow interference from a facility layout perspective. A model, formulated as a quartic assignment problem, is developed that explicitly considers the interference of workflow. Optimal and heuristic solution methodologies are developed and evaluated.
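
    For orientation, a generic quartic assignment formulation (an illustrative form, not necessarily the exact model of the paper) assigns facilities to locations with binary variables x_{ip} and penalises interference between pairs of material flows, which introduces fourth-order terms:

```latex
% Illustrative quartic assignment formulation (assumed generic form)
\min_{x}\; \sum_{i,j,k,l}\sum_{p,q,r,s} c_{ijkl,pqrs}\, x_{ip}\, x_{jq}\, x_{kr}\, x_{ls}
\quad \text{s.t.} \quad
\sum_{p} x_{ip} = 1 \;\;\forall i, \qquad
\sum_{i} x_{ip} = 1 \;\;\forall p, \qquad
x_{ip} \in \{0,1\},
```

    where the coefficient c_{ijkl,pqrs} captures the interference between the flow from facility i to j (assigned to locations p and q) and the flow from facility k to l (assigned to locations r and s).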

  8. Text mining for the biocuration workflow

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  9. Detecting dissonance in clinical and research workflow for translational psychiatric registries.

    Science.gov (United States)

    Cofiel, Luciana; Bassi, Débora U; Ray, Ryan Kumar; Pietrobon, Ricardo; Brentani, Helena

    2013-01-01

    The interplay between the workflow for clinical tasks and research data collection is often overlooked, ultimately making it ineffective. To the best of our knowledge, no previous studies have developed standards that allow for the comparison of workflow models derived from clinical and research tasks toward the improvement of data collection processes. In this study we used the term dissonance for occurrences where there was a discord between clinical and research workflows. We developed workflow models for a translational research study in psychiatry and the clinic where its data collection was carried out. After identifying points of dissonance between clinical and research models we derived a corresponding classification system that ultimately enabled us to re-engineer the data collection workflow. We considered (1) the number of patients approached for enrollment and (2) the number of patients enrolled in the study as indicators of efficiency in the research workflow. We also recorded the number of dissonances before and after the workflow modification. We identified 22 episodes of dissonance across 6 dissonance categories: actor, communication, information, artifact, time, and space. We were able to eliminate 18 episodes of dissonance and increase the number of patients approached and enrolled in the research study through workflow modification. The classification developed in this study is useful for guiding the identification of dissonances and revealing the modifications required to align the workflow of data collection with the clinical setting. The methodology described in this study can be used by researchers to standardize the data collection process.

  10. Application of Workflow Technology for Big Data Analysis Service

    Directory of Open Access Journals (Sweden)

    Bin Zhang

    2018-04-01

    Full Text Available This study presents a lightweight representational state transfer-based cloud workflow system to construct a big data intelligent software-as-a-service (SaaS) platform. The system supports the dynamic construction and operation of an intelligent data analysis application, and realizes rapid development and flexible deployment of the business analysis process that can improve the interaction and response time of the process. The proposed system integrates offline-batch and online-streaming analysis models that allow users to conduct batch and streaming computing simultaneously. Users can rent cloud capabilities and customize a set of big data analysis applications in the form of workflow processes. This study elucidates the architecture and application modeling, customization, dynamic construction, and scheduling of a cloud workflow system. A chain workflow foundation mechanism is proposed to combine several analysis components into a chain component that can promote efficiency. Four practical application cases are provided to verify the analysis capability of the system. Experimental results show that the proposed system can support multiple users in accessing the system concurrently and effectively uses data analysis algorithms. The proposed SaaS workflow system has been used in network operators and has achieved good results.

  11. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the remaining problems for KAIZEN. Workflow line information included location information and action content information. These technologies suggest viewpoints that help improvement, for example, the exclusion of useless movement, the redesign of layout and the review of work procedures. In the manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time; concretely, this investigation showed that the system could suggest a more efficient layout. In the case of the hospital, similarly, the system pointed out problems with layout and setup operations based on the effective movement patterns of experts. This system could be adapted to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  12. Study of a diffusion flamelet model, with preferential diffusion effects included

    NARCIS (Netherlands)

    Delhaye, S.; Somers, L.M.T.; Bongers, H.; Oijen, van J.A.; Goey, de L.P.H.; Dias, V.

    2005-01-01

    The non-premixed flamelet model of Peters [1] (model1), which does not include preferential diffusion effects, is investigated. Two similar models are presented, but without the assumption of unity Lewis numbers. One of these models was derived by Peters & Pitsch [2] (model2), while the other one was

  13. How to plan workflow changes: a practical quality improvement tool used in an outpatient hospital pharmacy.

    Science.gov (United States)

    Aguilar, Christine; Chau, Connie; Giridharan, Neha; Huh, Youchin; Cooley, Janet; Warholak, Terri L

    2013-06-01

    A quality improvement tool is provided to improve pharmacy workflow with the goal of minimizing errors caused by workflow issues. This study involved workflow evaluation and reorganization, and staff opinions of these proposed changes. The study pharmacy was an outpatient pharmacy in the Tucson area. However, the quality improvement tool may be applied in all pharmacy settings, including but not limited to community, hospital, and independent pharmacies. This tool can help the user to identify potential workflow problem spots, such as high-traffic areas, through the creation of current and proposed workflow diagrams. Creating a visual representation can help the user to identify problem spots and to propose changes to optimize workflow. It may also be helpful to assess employees' opinions of these changes. The workflow improvement tool can be used to assess where improvements are needed in a pharmacy's floor plan and workflow. Suggestions for improvements in the study pharmacy included increasing the number of verification points and decreasing high-traffic areas in the workflow. The employees of the study pharmacy felt that the proposed changes displayed greater continuity, sufficiency, accessibility, and space within the pharmacy.

  14. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from the synthetic workflows and from general purpose cloud benchmarks, as well as from the data measured in our own experiments with Montage, an astronomical application, executed on Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
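
    To illustrate the flavor of the optimization problem, the sketch below uses a naive greedy heuristic: for each level of identical tasks it picks the cheapest hypothetical VM type that keeps the level's makespan within an equal share of the remaining deadline. It is not the AMPL/CMPL mathematical program from the paper, and all VM prices, speeds and level sizes are invented.

```python
import math

vm_types = [                      # hypothetical IaaS offerings, hourly billing
    {"name": "small", "price_per_hour": 0.10, "speedup": 1.0},
    {"name": "large", "price_per_hour": 0.40, "speedup": 4.0},
]

levels = [                        # hypothetical workflow levels of identical tasks
    {"tasks": 20, "hours_per_task": 1.0, "max_vms": 10},
    {"tasks": 5,  "hours_per_task": 2.0, "max_vms": 5},
]

def schedule(levels, vm_types, deadline_hours):
    total_cost, remaining = 0.0, deadline_hours
    for i, level in enumerate(levels):
        slack = remaining / (len(levels) - i)          # naive equal split of remaining time
        for vm in sorted(vm_types, key=lambda v: v["price_per_hour"]):
            hours_per_task = level["hours_per_task"] / vm["speedup"]
            makespan = math.ceil(level["tasks"] / level["max_vms"]) * hours_per_task
            if makespan <= slack:
                total_cost += level["tasks"] * math.ceil(hours_per_task) * vm["price_per_hour"]
                remaining -= makespan
                break
        else:
            raise ValueError("deadline cannot be met with the available VM types")
    return total_cost

print(schedule(levels, vm_types, deadline_hours=6.0))  # total rental cost in dollars
```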

  15. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms and scientists from different countries and organizations interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn would identify potentially related resources, schedule processing tasks and assemble all parts in workflows that may satisfy the query.

  16. Analog to digital workflow improvement: a quantitative study.

    Science.gov (United States)

    Wideman, Catherine; Gallet, Jacqueline

    2006-01-01

    This study tracked a radiology department's conversion from utilization of a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality specific activities was reduced by 43%. The percentage of repeat examinations experienced with the digital system also decreased to 8% vs. the level of 9.5% experienced with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.

  17. Assessment of the Nurse Medication Administration Workflow Process

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2016-01-01

    Full Text Available This paper presents findings of an observational study of the Registered Nurse (RN) Medication Administration Process (MAP) conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period with an average of 5 MAP observations per RN participant for both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the difference between unbundled and bundled workflows and their effects on simulated redesign interventions.
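
    As a rough illustration of how such a discrete-event comparison can be set up (using the SimPy library, which is assumed to be available; all durations and the single-nurse resource are invented, not the study's parameters):

```python
import random
import simpy

def care_episode(env, nurse, med_minutes, other_minutes, bundled, log):
    with nurse.request() as req:          # acquire the nurse for this episode
        yield req
        start = env.now
        if bundled:
            # medication tasks interleaved with other care responsibilities
            yield env.timeout(med_minutes + other_minutes)
        else:
            # unbundled: a dedicated medication-administration block only
            yield env.timeout(med_minutes)
        log.append(env.now - start)

def mean_episode_minutes(bundled, episodes=50, seed=1):
    random.seed(seed)
    env, log = simpy.Environment(), []
    nurse = simpy.Resource(env, capacity=1)
    for _ in range(episodes):
        env.process(care_episode(env, nurse,
                                 med_minutes=random.uniform(5, 10),
                                 other_minutes=random.uniform(5, 15),
                                 bundled=bundled, log=log))
    env.run()
    return sum(log) / len(log)

print("bundled  :", round(mean_episode_minutes(True), 1), "min per episode")
print("unbundled:", round(mean_episode_minutes(False), 1), "min per episode")
```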

  18. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  19. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  20. Deadline-constrained workflow scheduling algorithms for Infrastructure as a Service Clouds

    NARCIS (Netherlands)

    Abrishami, S.; Naghibzadeh, M.; Epema, D.H.J.

    2013-01-01

    The advent of Cloud computing as a new model of service provisioning in distributed systems encourages researchers to investigate its benefits and drawbacks on executing scientific applications such as workflows. One of the most challenging problems in Clouds is workflow scheduling, i.e., the

  1. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
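
    As a sketch of the "local specialized database of raw data plus extracted features" step, the snippet below builds a small SQLite table of per-cell features using only the Python standard library. The feature names, file paths and records are placeholders; this is not the authors' NeuroManager tooling or the ABI feature-extraction code.

```python
import sqlite3

def build_feature_db(db_path, feature_records):
    """feature_records: iterable of dicts such as
       {"cell_id": 1, "sweep": 2, "spike_rate_hz": 15.0, "raw_path": "raw/1_2.nwb"}"""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS features (
                       cell_id INTEGER, sweep INTEGER,
                       spike_rate_hz REAL, raw_path TEXT)""")
    con.executemany(
        "INSERT INTO features VALUES (:cell_id, :sweep, :spike_rate_hz, :raw_path)",
        feature_records)
    con.commit()
    return con

records = [{"cell_id": 1, "sweep": 1, "spike_rate_hz": 8.0,  "raw_path": "raw/1_1.nwb"},
           {"cell_id": 1, "sweep": 2, "spike_rate_hz": 15.0, "raw_path": "raw/1_2.nwb"}]
con = build_feature_db(":memory:", records)
print(con.execute("SELECT cell_id, AVG(spike_rate_hz) FROM features GROUP BY cell_id").fetchall())
```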

  2. Leveraging workflow control patterns in the domain of clinical practice guidelines.

    Science.gov (United States)

    Kaiser, Katharina; Marcos, Mar

    2016-02-10

    Clinical practice guidelines (CPGs) include recommendations describing appropriate care for the management of patients with a specific clinical condition. A number of representation languages have been developed to support executable CPGs, with associated authoring/editing tools. Even with tool assistance, authoring of CPG models is a labor-intensive task. We aim at facilitating the early stages of CPG modeling task. In this context, we propose to support the authoring of CPG models based on a set of suitable procedural patterns described in an implementation-independent notation that can be then semi-automatically transformed into one of the alternative executable CPG languages. We have started with the workflow control patterns which have been identified in the fields of workflow systems and business process management. We have analyzed the suitability of these patterns by means of a qualitative analysis of CPG texts. Following our analysis we have implemented a selection of workflow patterns in the Asbru and PROforma CPG languages. As implementation-independent notation for the description of patterns we have chosen BPMN 2.0. Finally, we have developed XSLT transformations to convert the BPMN 2.0 version of the patterns into the Asbru and PROforma languages. We showed that although a significant number of workflow control patterns are suitable to describe CPG procedural knowledge, not all of them are applicable in the context of CPGs due to their focus on single-patient care. Moreover, CPGs may require additional patterns not included in the set of workflow control patterns. We also showed that nearly all the CPG-suitable patterns can be conveniently implemented in the Asbru and PROforma languages. Finally, we demonstrated that individual patterns can be semi-automatically transformed from a process specification in BPMN 2.0 to executable implementations in these languages. We propose a pattern and transformation-based approach for the development of CPG models
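
    The transformation step can be pictured as below: an XSLT stylesheet is applied to a BPMN 2.0 pattern file to produce an executable representation. The file names are placeholders and the lxml package is assumed; the actual stylesheets are the authors' work and are not reproduced here.

```python
from lxml import etree

def transform_pattern(bpmn_path, xslt_path, out_path):
    stylesheet = etree.XSLT(etree.parse(xslt_path))   # e.g. a BPMN-to-PROforma stylesheet
    result = stylesheet(etree.parse(bpmn_path))       # e.g. an exclusive-choice pattern
    with open(out_path, "wb") as fh:
        fh.write(etree.tostring(result, pretty_print=True))

# Hypothetical usage:
# transform_pattern("exclusive_choice.bpmn", "bpmn_to_proforma.xsl", "exclusive_choice.pf")
```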

  3. BioModels: expanding horizons to include more modelling approaches and formats.

    Science.gov (United States)

    Glont, Mihai; Nguyen, Tung V N; Graesslin, Martin; Hälke, Robert; Ali, Raza; Schramm, Jochen; Wimalaratne, Sarala M; Kothamachu, Varun B; Rodriguez, Nicolas; Swat, Maciej J; Eils, Jurgen; Eils, Roland; Laibe, Camille; Malik-Sheriff, Rahuman S; Chelliah, Vijayalakshmi; Le Novère, Nicolas; Hermjakob, Henning

    2018-01-04

    BioModels serves as a central repository of mathematical models representing biological processes. It offers a platform to make mathematical models easily shareable across the systems modelling community, thereby supporting model reuse. To facilitate hosting a broader range of model formats derived from diverse modelling approaches and tools, a new infrastructure for BioModels has been developed that is available at http://www.ebi.ac.uk/biomodels. This new system allows submitting and sharing of a wide range of models with improved support for formats other than SBML. It also offers a version-control backed environment in which authors and curators can work collaboratively to curate models. This article summarises the features available in the current system and discusses the potential benefit they offer to the users over the previous system. In summary, the new portal broadens the scope of models accepted in BioModels and supports collaborative model curation which is crucial for model reproducibility and sharing. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Toward Exascale Seismic Imaging: Taming Workflow and I/O Issues

    Science.gov (United States)

    Lefebvre, M. P.; Bozdag, E.; Lei, W.; Rusmanugroho, H.; Smith, J. A.; Tromp, J.; Yuan, Y.

    2013-12-01

    Providing a better understanding of the physics and chemistry of Earth's interior through numerical simulations has always required tremendous computational resources. Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns on how to obtain optimum performance. Several issues are currently being investigated by the HPC community. To name a few, we can list energy consumption, fault resilience, scalability of the current parallel paradigms, large workflow management, I/O performance and feature extraction with large datasets. For this presentation, we focus on the last three issues. In the context of seismic imaging, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, obtaining a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts composing it. The usual approach is to speedup the purely computational parts by code tuning in order to reach higher FLOPS and better memory usage. This still remains an important concern, but larger scale experiments show that the imaging workflow suffers from a severe I/O bottleneck. This limitation occurs both for purely computational data and seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). In both cases, a parallel I/O library, ORNL's ADIOS, is used to drastically lessen the weight of disk access. Moreover, parallel visualization tools, such as VisIt, are able to take advantage of the metadata included in our ADIOS outputs to extract features and

  5. ATEFlap aerodynamic model, a dynamic stall model including the effects of trailing edge flap deflection

    Energy Technology Data Exchange (ETDEWEB)

    Bergami, L.; Gaunaa, M.

    2012-02-15

    The report presents the ATEFlap aerodynamic model, which computes the unsteady lift, drag and moment on a 2D airfoil section equipped with an Adaptive Trailing Edge Flap. The model captures the unsteady response related to the effects of the vorticity shed into the wake and the dynamics of flow separation: a thin-airfoil potential flow model is merged with a dynamic stall model of the Beddoes-Leishmann type. The inputs required by the model are steady data for lift, drag, and moment coefficients as functions of angle of attack and flap deflection. Further steady data used by the Beddoes-Leishmann dynamic stall model are computed in an external preprocessor application, which gives the user the possibility to verify, and eventually correct, the steady data passed to the aerodynamic model. The ATEFlap aerodynamic model is integrated in the aeroelastic simulation tool HAWC2, thus allowing simulation of the response of a wind turbine with trailing edge flaps on the rotor. The algorithms used by the preprocessor and by the aerodynamic model are presented, and modifications to previous implementations of the aerodynamic model are briefly discussed. The performance and the validity of the model are verified by comparing the dynamic response computed by the ATEFlap with solutions from CFD simulations. (Author)

  6. You’ve Got Email: a Workflow Management Extraction System

    NARCIS (Netherlands)

    P. Chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); M.J. McAleer (Michael)

    2017-01-01

    textabstractEmail is one of the most powerful tools for communication. Many businesses use email as the main channel for communication, so it is possible that substantial data are included in email content. In order to help businesses grow faster, a workflow management system may be required. The

  7. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  8. Integrated model of port oil piping transportation system safety including operating environment threats

    OpenAIRE

    Kołowrocki, Krzysztof; Kuligowska, Ewa; Soszyńska-Budny, Joanna

    2017-01-01

    The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process, including operating environment threats and considering its safety structures and its components' safety parameters, which vary at different operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.

  9. From shared data to sharing workflow: Merging PACS and teleradiology

    International Nuclear Information System (INIS)

    Benjamin, Menashe; Aradi, Yinon; Shreiber, Reuven

    2010-01-01

    Due to a host of technological, interface, operational and workflow limitations, teleradiology and PACS/RIS were historically developed as separate systems serving different purposes. PACS/RIS handled local radiology storage and workflow management while teleradiology addressed remote access to images. Today advanced PACS/RIS support complete site radiology workflow for attending physicians, whether on-site or remote. In parallel, teleradiology has emerged into a service providing remote, off-hours coverage for emergency radiology and, to a lesser extent, subspecialty reading to subscribing sites and radiology groups. When attending radiologists use teleradiology for remote access to a site, they may share all relevant patient data and participate in the site's workflow like their on-site peers. The operation gets cumbersome and time consuming when these radiologists serve multiple sites, each requiring different remote access, or when the sites do not employ the same PACS/RIS/Reporting Systems and do not share the same ownership. The least efficient operation is that of teleradiology companies engaged in reading for multiple facilities. As these services typically employ non-local radiologists, they are allowed to share some of the available patient data necessary to provide an emergency report but, by and large, they do not share the workflow of the sites they serve. Radiology stakeholders usually prefer to have their own radiologists perform all radiology tasks including interpretation of off-hour examinations. It is possible with current technology to create a system that combines the benefits of local radiology services to multiple sites with the advantages offered by adding subspecialty and off-hours emergency services through teleradiology. Such a system increases efficiency for the radiology groups by enabling all users, regardless of location, to work 'local' and fully participate in the workflow of every site. We refer to such a system as SuperPACS.

  10. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim to minimize financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bag of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real world applications, demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates to meet deadlines and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.
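
    One ingredient of such a strategy, grouping tasks into Bags of Tasks by dependency level, can be sketched in a few lines of Python; the toy DAG below is illustrative, and the real DSB algorithm also accounts for priorities, VM provisioning delays and billing periods.

```python
def group_into_bots(dag):
    """dag: {task: [predecessor tasks]} -> list of bags, each a list of tasks."""
    remaining, done, bags = dict(dag), set(), []
    while remaining:
        bag = [t for t, preds in remaining.items() if all(p in done for p in preds)]
        if not bag:
            raise ValueError("cycle detected: not a DAG")
        bags.append(bag)
        done.update(bag)
        for t in bag:
            del remaining[t]
    return bags

workflow = {"stage_in": [], "split": ["stage_in"],
            "map1": ["split"], "map2": ["split"], "reduce": ["map1", "map2"]}
print(group_into_bots(workflow))
# [['stage_in'], ['split'], ['map1', 'map2'], ['reduce']]
```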

  11. Financial and workflow analysis of radiology reporting processes in the planning phase of implementation of a speech recognition system

    Science.gov (United States)

    Whang, Tom; Ratib, Osman M.; Umamoto, Kathleen; Grant, Edward G.; McCoy, Michael J.

    2002-05-01

    The goal of this study is to determine the financial value and workflow improvements achievable by replacing traditional transcription services with a speech recognition system in a large, university hospital setting. Workflow metrics were measured at two hospitals, one of which exclusively uses a transcription service (UCLA Medical Center), and the other which exclusively uses speech recognition (West Los Angeles VA Hospital). Workflow metrics include time spent per report (the sum of time spent interpreting, dictating, reviewing, and editing), transcription turnaround, and total report turnaround. Compared to traditional transcription, speech recognition resulted in radiologists spending 13-32% more time per report, but it also resulted in reduction of report turnaround time by 22-62% and reduction of marginal cost per report by 94%. The model developed here helps justify the introduction of a speech recognition system by showing that the benefits of reduced operating costs and decreased turnaround time outweigh the cost of increased time spent per report. Whether the ultimate goal is to achieve a financial objective or to improve operational efficiency, it is important to conduct a thorough analysis of workflow before implementation.

  12. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    Science.gov (United States)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
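
    The core PCA step can be illustrated with plain NumPy (a conceptual stand-in, not the VIP library's API): the stellar point spread function is modeled from the frame cube as a low-rank reconstruction and subtracted from each frame.

```python
import numpy as np

def pca_psf_subtract(cube, ncomp=5):
    """cube: (n_frames, ny, nx) image cube; returns the residual cube."""
    n, ny, nx = cube.shape
    flat = cube.reshape(n, ny * nx)
    mean = flat.mean(axis=0)
    centered = flat - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # principal components
    basis = vt[:ncomp]                                        # (ncomp, ny*nx)
    model = centered @ basis.T @ basis + mean                 # low-rank PSF model per frame
    return (flat - model).reshape(n, ny, nx)

rng = np.random.default_rng(0)
fake_cube = rng.normal(size=(20, 64, 64))          # placeholder for real NIRC2 frames
print(pca_psf_subtract(fake_cube, ncomp=5).shape)  # (20, 64, 64)
```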

  13. Workflow-Oriented Cyberinfrastructure for Sensor Data Analytics

    Science.gov (United States)

    Orcutt, J. A.; Rajasekar, A.; Moore, R. W.; Vernon, F.

    2015-12-01

    Sensor streams comprise an increasingly large part of Earth Science data. Analytics based on sensor data require an easy way to perform operations such as acquisition, conversion to physical units, metadata linking, sensor fusion, analysis and visualization on distributed sensor streams. Furthermore, embedding real-time sensor data into scientific workflows is of growing interest. We have implemented a scalable networked architecture that can be used to dynamically access packets of data in a stream from multiple sensors, and perform synthesis and analysis across a distributed network. Our system is based on the integrated Rule Oriented Data System (irods.org), which accesses sensor data from the Antelope Real Time Data System (brtt.com), and provides virtualized access to collections of data streams. We integrate real-time data streaming from different sources, collected for different purposes, on different time and spatial scales, and sensed by different methods. iRODS, noted for its policy-oriented data management, brings to sensor processing features and facilities such as single sign-on, third party access control lists ( ACLs), location transparency, logical resource naming, and server-side modeling capabilities while reducing the burden on sensor network operators. Rich integrated metadata support also makes it straightforward to discover data streams of interest and maintain data provenance. The workflow support in iRODS readily integrates sensor processing into any analytical pipeline. The system is developed as part of the NSF-funded Datanet Federation Consortium (datafed.org). APIs for selecting, opening, reaping and closing sensor streams are provided, along with other helper functions to associate metadata and convert sensor packets into NetCDF and JSON formats. Near real-time sensor data including seismic sensors, environmental sensors, LIDAR and video streams are available through this interface. A system for archiving sensor data and metadata in Net

  14. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    Science.gov (United States)

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  15. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  16. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.
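
    The bookkeeping behind DVFS can be illustrated with a toy dynamic-power model, P = C·V²·f, in which lowering the operating point trades runtime against power; the numbers below are made up and are not WorkflowSim parameters.

```python
def energy_joules(cycles, capacitance, voltage, frequency_hz, static_watts):
    runtime_s = cycles / frequency_hz
    dynamic_watts = capacitance * voltage ** 2 * frequency_hz
    return (dynamic_watts + static_watts) * runtime_s, runtime_s

governor_states = {            # hypothetical (voltage, frequency) operating points
    "performance": (1.20, 2.0e9),
    "powersave":   (0.90, 1.0e9),
}

for name, (v, f) in governor_states.items():
    energy, runtime = energy_joules(cycles=4e9, capacitance=1e-9, voltage=v,
                                    frequency_hz=f, static_watts=5.0)
    print(f"{name:12s} runtime={runtime:.1f}s energy={energy:.1f}J")
```

    With non-negligible static power, the lower operating point is not automatically the more energy-efficient one, which is exactly the kind of trade-off such a simulator lets one explore.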

  17. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Iván Tomás Cotes-Ruiz

    Full Text Available Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.

  18. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  19. Verifying generalized soundness for workflow nets

    NARCIS (Netherlands)

    Hee, van K.M.; Oanea, O.I.; Sidorova, N.; Voorhoeve, M.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    We improve the decision procedure from [10] for the problem of generalized soundness of workflow nets. A workflow net is generalized sound iff every marking reachable from an initial marking with k tokens on the initial place terminates properly, i.e. it can reach a marking with k tokens on the

  20. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  1. Mathematical model of thyristor inverter including a series-parallel resonant circuit

    OpenAIRE

    Luft, M.; Szychta, E.

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, derived with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  2. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    OpenAIRE

    Miroslaw Luft; Elzbieta Szychta

    2008-01-01

    The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, derived with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.

  3. Mathematical Model of Thyristor Inverter Including a Series-parallel Resonant Circuit

    Directory of Open Access Journals (Sweden)

    Miroslaw Luft

    2008-01-01

    Full Text Available The article presents a mathematical model of a thyristor inverter including a series-parallel resonant circuit, derived with the aid of the state-variable method. Maple procedures are used to compute current and voltage waveforms in the inverter.
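
    As a hedged stand-in for the state-variable formulation mentioned in the three records above, the sketch below writes a plain series RLC branch (not the paper's exact series-parallel topology) as dx/dt = Ax + Bu and integrates it numerically with SciPy instead of Maple; the component values and square-wave source are invented.

```python
import numpy as np
from scipy.integrate import solve_ivp

R, L, C = 0.5, 1e-3, 10e-6                 # hypothetical component values
A = np.array([[-R / L, -1.0 / L],
              [1.0 / C, 0.0]])             # states: inductor current, capacitor voltage
B = np.array([1.0 / L, 0.0])

def rhs(t, x, u_amplitude=100.0, f=1000.0):
    u = u_amplitude * np.sign(np.sin(2 * np.pi * f * t))   # square-wave inverter output
    return A @ x + B * u

sol = solve_ivp(rhs, (0.0, 5e-3), y0=[0.0, 0.0], max_step=1e-6)
current, v_capacitor = sol.y
print(f"peak inductor current = {abs(current).max():.1f} A")
```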

  4. Modeling of Pem Fuel Cell Systems Including Controls and Reforming Effects for Hybrid Automotive Applications

    National Research Council Canada - National Science Library

    Boettner, Daisie

    2001-01-01

    .... This study develops models for a stand-alone Proton Exchange Membrane (PEM) fuel cell stack, a direct-hydrogen fuel cell system including auxiliaries, and a methanol reforming fuel cell system for integration into a vehicle performance simulator...

  5. Atmosphere-soil-vegetation model including CO2 exchange processes: SOLVEG2

    International Nuclear Information System (INIS)

    Nagai, Haruyasu

    2004-11-01

    A new atmosphere-soil-vegetation model named SOLVEG2 (SOLVEG version 2) was developed to study the heat, water, and CO2 exchanges between the atmosphere and land-surface. The model consists of one-dimensional multilayer sub-models for the atmosphere, soil, and vegetation. It also includes sophisticated processes for solar and long-wave radiation transmission in vegetation canopy and CO2 exchanges among the atmosphere, soil, and vegetation. Although the model usually simulates only vertical variation of variables in the surface-layer atmosphere, soil, and vegetation canopy by using meteorological data as top boundary conditions, it can be used by coupling with a three-dimensional atmosphere model. In this paper, details of SOLVEG2, which includes the function of coupling with atmosphere model MM5, are described. (author)

  6. Modeling of the Direct Current Generator Including the Magnetic Saturation and Temperature Effects

    Directory of Open Access Journals (Sweden)

    Alfonso J. Mercado-Samur

    2013-11-01

    Full Text Available In this paper, the inclusion of the temperature effect on the field resistance in the direct current generator model DC1A, which is valid for stability studies, is proposed. First, the linear generator model is presented; then the effect of magnetic saturation and the change in the resistance value due to the temperature produced by the field current are included. Experimental results are compared with model simulations to validate the model. A direct current generator model which is a better representation of the generator is obtained. Visual comparison between simulations and experimental results shows the success of the proposed model, because it presents the lowest error of the compared models. The accuracy of the proposed model is quantified via a Modified Normalized Sum of Squared Errors index equal to 3.8979%.
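
    The two added effects can be pictured with a small numerical sketch; the temperature coefficient is the generic value for copper, and the saturation curve and field-supply voltage are invented, not the coefficients identified in the paper.

```python
def field_resistance(r_20, temperature_c, alpha=0.00393):
    """Copper-like linear dependence: R(T) = R20 * (1 + alpha * (T - 20))."""
    return r_20 * (1.0 + alpha * (temperature_c - 20.0))

def saturated_emf(i_field, k_linear=120.0, k_sat=0.15):
    """Toy saturation curve: linear gain shaped by a soft-saturation denominator."""
    return k_linear * i_field / (1.0 + k_sat * i_field)

for t_celsius in (20.0, 75.0):
    r_f = field_resistance(2.0, t_celsius)
    i_f = 24.0 / r_f                      # hypothetical 24 V field supply
    print(f"T={t_celsius:4.0f} C  Rf={r_f:.3f} ohm  If={i_f:.2f} A  E={saturated_emf(i_f):.1f} V")
```

    Heating raises the field resistance, lowers the field current and, through the saturation curve, reduces the generated voltage, which is the behavior the extended model is meant to capture.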

  7. Enhanced UWB Radio Channel Model for Short-Range Communication Scenarios Including User Dynamics

    DEFF Research Database (Denmark)

    Kovacs, Istvan Zsolt; Nguyen, Tuan Hung; Eggers, Patrick Claus F.

    2005-01-01

    channel model represents an enhancement of the existing IEEE 802.15.3a/4a PAN channel model, where antenna and user-proximity effects are not included. Our investigations showed that significant variations of the received wideband power and time-delay signal clustering are possible due to the human body

  8. Influence of structural parameter included in nonlocal rock mass model on stress concentration around circular tunnel

    Science.gov (United States)

    Lavrikov, SV; Mikenina, OA; Revuzhenko, AF

    2018-03-01

    A model of an elastic body, including local curvature of an elementary volume, is matched with a nonlocal model with a linear structural parameter in the differential approximation. The problem of deformation of a rock mass around a circular cross-section tunnel is solved numerically. The contours of the calculated stresses are plotted. It is shown that inclusion of local bends in the model results in expansion of the influence zone of the tunnel and reduces the stress concentration factor at the tunnel boundary.

  9. Integrated model of port oil piping transportation system safety including operating environment threats

    Directory of Open Access Journals (Sweden)

    Kołowrocki Krzysztof

    2017-06-01

    Full Text Available The paper presents an integrated general model of a complex technical system, linking its multistate safety model with the model of its operation process, including operating environment threats and considering its safety structures and its components' safety parameters, which vary at different operation states. Under the assumption that the system has an exponential safety function, the safety characteristics of the port oil piping transportation system are determined.

  10. Including model uncertainty in the model predictive control with output feedback

    Directory of Open Access Journals (Sweden)

    Rodrigues M.A.

    2002-01-01

    Full Text Available This paper addresses the development of an efficient numerical output feedback robust model predictive controller for open-loop stable systems. Stability of the closed loop is guaranteed by using an infinite horizon predictive controller and a stable state observer. The performance and the computational burden of this approach are compared to a robust predictive controller from the literature. The case used for this study is based on an industrial gasoline debutanizer column.

  11. A constitutive model for the forces of a magnetic bearing including eddy currents

    Science.gov (United States)

    Taylor, D. L.; Hebbale, K. V.

    1993-01-01

    A multiple magnet bearing can be developed from N individual electromagnets. The constitutive relationships for a single magnet in such a bearing are presented. Analytical expressions are developed for a magnet with poles arranged circumferentially. Maxwell's field equations are used, so the model easily includes the effects of induced eddy currents due to the rotation of the journal. Eddy currents must be included in any dynamic model because they are the only speed-dependent parameter and may lead to a critical speed for the bearing. The model is applicable to bearings using attraction or repulsion.

  12. ADVANCED APPROACH TO PRODUCTION WORKFLOW COMPOSITION ON ENGINEERING KNOWLEDGE PORTALS

    OpenAIRE

    Novogrudska, Rina; Kot, Tatyana; Globa, Larisa; Schill, Alexander

    2016-01-01

    Background. In the environment of engineering knowledge portals, a great number of partial workflows is concentrated. Such workflows are composed into a general workflow aiming to perform a real, complex production task. The characteristics of partial workflows and of the general workflow structure have not been studied sufficiently, which makes dynamic composition of the general production workflow impossible. Objective. Creating an approach to dynamic composition of the general production workflow based on the partial wor...

  13. Images crossing borders: image and workflow sharing on multiple levels.

    Science.gov (United States)

    Ross, Peeter; Pohjonen, Hanna

    2011-04-01

    Digitalisation of medical data makes it possible to share images and workflows between related parties. In addition to linear data flow where healthcare professionals or patients are the information carriers, a new type of matrix of many-to-many connections is emerging. Implementation of shared workflow brings challenges of interoperability and legal clarity. Sharing images or workflows can be implemented on different levels with different challenges: inside the organisation, between organisations, across country borders, or between healthcare institutions and citizens. Interoperability issues vary according to the level of sharing and are either technical or semantic, including language. Legal uncertainty increases when crossing national borders. Teleradiology is regulated by multiple European Union (EU) directives and legal documents, which makes interpretation of the legal system complex. To achieve wider use of eHealth and teleradiology several strategic documents were published recently by the EU. Despite EU activities, responsibility for organising, providing and funding healthcare systems remains with the Member States. Therefore, the implementation of new solutions requires strong co-operation between radiologists, societies of radiology, healthcare administrators, politicians and relevant EU authorities. The aim of this article is to describe different dimensions of image and workflow sharing and to analyse legal acts concerning teleradiology in the EU.

  14. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  15. A thermal conductivity model for nanofluids including effect of the temperature-dependent interfacial layer

    International Nuclear Information System (INIS)

    Sitprasert, Chatcharin; Dechaumphai, Pramote; Juntasaro, Varangrat

    2009-01-01

    The interfacial layer of nanoparticles has been recently shown to have an effect on the thermal conductivity of nanofluids. There is, however, still no thermal conductivity model that includes the effects of temperature and nanoparticle size variations on the thickness and consequently on the thermal conductivity of the interfacial layer. In the present work, the stationary model developed by Leong et al. (J Nanopart Res 8:245-254, 2006) is initially modified to include the thermal dispersion effect due to the Brownian motion of nanoparticles. This model is called the 'Leong et al.'s dynamic model'. However, the Leong et al.'s dynamic model over-predicts the thermal conductivity of nanofluids in the case of the flowing fluid. This suggests that the enhancement in the thermal conductivity of the flowing nanofluids due to the increase in temperature does not come from the thermal dispersion effect. It is more likely that the enhancement in heat transfer of the flowing nanofluids comes from the temperature-dependent interfacial layer effect. Therefore, the Leong et al.'s stationary model is again modified to include the effect of temperature variation on the thermal conductivity of the interfacial layer for different sizes of nanoparticles. This present model is then evaluated and compared with the other thermal conductivity models for the turbulent convective heat transfer in nanofluids along a uniformly heated tube. The results show that the present model is more general than the other models in the sense that it can predict both the temperature and the volume fraction dependence of the thermal conductivity of nanofluids for both non-flowing and flowing fluids. Also, it is found to be more accurate than the other models due to the inclusion of the effect of the temperature-dependent interfacial layer. In conclusion, the present model can accurately predict the changes in thermal conductivity of nanofluids due to the changes in volume fraction and temperature for

  16. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Buchi-automata....

  17. Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network

    Science.gov (United States)

    Jones, A. S.; Horsburgh, J. S.

    2014-12-01

    Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban
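
    A tiny stand-in for the kind of Python-based quality-control post-processing described above: raw sensor values are flagged when they fall outside a plausible range or jump too quickly between readings. The thresholds and readings are illustrative, not iUTAH's operational settings.

```python
def quality_control(readings, valid_range, max_step):
    """readings: list of (timestamp, value) -> list of (timestamp, value, flag)."""
    flagged, previous = [], None
    for ts, value in readings:
        flag = "ok"
        if not (valid_range[0] <= value <= valid_range[1]):
            flag = "out_of_range"
        elif previous is not None and abs(value - previous) > max_step:
            flag = "spike"
        flagged.append((ts, value, flag))
        previous = value
    return flagged

raw = [("2014-07-01T00:00", 18.2), ("2014-07-01T00:15", 18.4),
       ("2014-07-01T00:30", 45.0), ("2014-07-01T00:45", 18.6)]
for row in quality_control(raw, valid_range=(-5.0, 40.0), max_step=5.0):
    print(row)
```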

  18. Direct-phase-variable model of a synchronous reluctance motor including all slot and winding harmonics

    International Nuclear Information System (INIS)

    Obe, Emeka S.; Binder, A.

    2011-01-01

    A detailed model in direct-phase variables of a synchronous reluctance motor operating at mains voltage and frequency is presented. The model includes the stator and rotor slot openings, the actual winding layout and the reluctance rotor geometry. Hence, all mmf and permeance harmonics are taken into account. It is seen that non-negligible harmonics introduced by slots are present in the inductances computed by the winding function procedure. These harmonics are usually ignored in d-q models. The machine performance is simulated in the stator reference frame to depict the difference between this new direct-phase model including all harmonics and the conventional rotor reference frame d-q model. Saturation is included by using a polynomial fitting the variation of d-axis inductance with stator current obtained by finite-element software FEMAG DC®. The detailed phase-variable model can yield torque pulsations comparable to those obtained from finite elements while the d-q model cannot.

  19. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    Science.gov (United States)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in

  20. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas.

    Science.gov (United States)

    Timmons, Joshua J; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T

    2017-10-12

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While the automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was less for auto-segmentation without versus auto-segmentation with post-processing, indicating convergence on a manually corrected model. An existent but marginal relative difference of electric field maps from models with manual correction versus those without was identified, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in

  1. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, have led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface and a web-interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potentials on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit both users with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and a comprehensive documentation are freely

  2. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired...... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  3. The Diabetic Retinopathy Screening Workflow

    Science.gov (United States)

    Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew

    2015-01-01

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However further multisite trialling of such systems’ use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence is increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630

  4. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Kamper, Lars; Meyn, Hannes; Rybacki, Konrad; Nordmeyer, Simone; Kempkes, Udo; Piroth, Werner; Isenmann, Stefan; Haage, Patrick

    2012-01-01

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  5. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal requirements of privacy, security and confidentiality motivated the development of a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To avoid gaps in data protection and data security we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, and we communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. A biometric feature was necessary to rule out mistaken identity of persons requesting access to the system. Only an invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, the Skymed™ Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.
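
    As a concrete illustration of the asymmetric cryptography step described above (and not of the actual Skymed/Agfa implementation), the following Python sketch signs a report with an RSA private key and verifies it with the matching public key, which is what provides authentication and integrity of the transmitted data; the key size and report payload are assumptions.

      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      # Each telepartner would hold a long-lived key pair; generated here for the demo.
      private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      public_key = private_key.public_key()

      report = b"Final radiology report, study 1234"  # placeholder payload
      signature = private_key.sign(
          report,
          padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
          hashes.SHA256(),
      )

      # Verification raises InvalidSignature if the report was altered in transit.
      public_key.verify(
          signature,
          report,
          padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
          hashes.SHA256(),
      )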

  6. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios eLadoukakis

    2014-11-01

    Full Text Available The rapid evolution of sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. These technologies combine high-throughput analytical protocols with delicate measuring techniques in order to discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e. Sanger sequencing). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management, and even the storage, of these data critical bottlenecks in the overall analytical endeavor. The complexity is further aggravated by the versatility of the available processing steps, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, yet non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, making the utilization of cloud computing infrastructures the sole realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing the major sequencing technologies and applications.

  7. Dipole model analysis of highest precision HERA data, including very low Q"2's

    International Nuclear Information System (INIS)

    Luszczak, A.; Kowalski, H.

    2016-12-01

    We analyse, within a dipole model, the final, inclusive HERA DIS cross section data in the low-x region, using fully correlated errors. We show that these highest precision data are very well described within the dipole model framework starting from Q² values of 3.5 GeV² up to the highest values of Q² = 250 GeV². To analyze the saturation effects we evaluated the data including also the very low Q² region, down to Q² = 0.35 GeV². The fits including this region show a preference for the saturation ansatz.

  8. Key Characteristics of Combined Accident including TLOFW accident for PSA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bo Gyung; Kang, Hyun Gook [KAIST, Daejeon (Korea, Republic of); Yoon, Ho Joon [Khalifa University of Science, Technology and Research, Abu Dhabi (United Arab Emirates)

    2015-05-15

    The conventional PSA techniques cannot adequately evaluate all events. The conventional PSA models usually focus on single internal events such as DBAs and on external hazards such as fire and seismic events. However, the Fukushima accident in Japan in 2011 revealed that very rare events need to be considered in the PSA model, both to prevent radioactive release to the environment caused by inadequate accident management under a lack of information and to improve the emergency operating procedures. In particular, the results from PSA can be used for decision making by regulators. Moreover, designers can consider the weaknesses of plant safety based on the quantified results and understand accident sequences based on human actions and system availability. This study addresses PSA modeling of combined accidents including the total loss of feedwater (TLOFW) accident. The TLOFW accident is a representative accident involving the failure of cooling through the secondary side. If the amount of heat transfer is not sufficient due to the failure of the secondary side, heat will accumulate on the primary side through continuous core decay heat. Transients with loss of feedwater include the total loss of feedwater accident, the loss of condenser vacuum accident, and closure of all MSIVs. When residual heat removal by the secondary side is terminated, safety injection into the RCS with direct primary depressurization provides alternative heat removal. This operation is called feed and bleed (F and B) operation. Combined accidents including the TLOFW accident are very rare events and are only partially considered in conventional PSA models. Since the necessity of F and B operation depends on plant conditions, PSA modeling for combined accidents including the TLOFW accident is necessary to identify design and operational vulnerabilities. The PSA is significant for assessing the risk of NPPs and for identifying design and operational vulnerabilities. Even though a combined accident is a very rare event, the consequence of combined

  9. Ab initio chemical safety assessment: A workflow based on exposure considerations and non-animal methods

    OpenAIRE

    Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y.; Exner, Thomas; Leite, Sofia; Grunsven, Leo A. van; Worth, Andrew; Mahony, Catherine

    2017-01-01

    Highlights • A workflow for an exposure driven chemical safety assessment to avoid animal testing. • Hypothesis based on existing data, in silico modelling and biokinetic considerations. • A tool to inform targeted and toxicologically relevant in vitro testing.

  10. A Distributed Collaborative Workflow Based Approach to Data Collection and Analysis

    National Research Council Canada - National Science Library

    Gerecke, William; Enas, Douglas; Gottschlich, Susan

    2004-01-01

    ...) and Modeling and Simulation (M&S) systems and architectures. In our work we have found that in order to be maximally effective, these capabilities must be designed with the military user workflow process in mind...

  11. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are, firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between these concepts. All three types of standards have to be utilised by the practicing scientist so that those who ultimately have to steward the data can ensure that it is preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observation or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to turn disaggregated data into scientific concepts. The first two steps provide the necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between

  12. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    Science.gov (United States)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  13. Conceptualizing a Dynamic Fall Risk Model Including Intrinsic Risks and Exposures.

    Science.gov (United States)

    Klenk, Jochen; Becker, Clemens; Palumbo, Pierpaolo; Schwickert, Lars; Rapp, Kilan; Helbostad, Jorunn L; Todd, Chris; Lord, Stephen R; Kerse, Ngaire

    2017-11-01

    Falls are a major cause of injury and disability in older people, leading to serious health and social consequences including fractures, poor quality of life, loss of independence, and institutionalization. To design and provide adequate prevention measures, accurate understanding and identification of a person's individual fall risk are important. However, to date, the performance of fall risk models is weak compared with models estimating, for example, cardiovascular risk. This deficiency may result from 2 factors. First, current models consider risk factors to be stable for each person and not to change over time, an assumption that does not reflect real-life experience. Second, current models do not consider the interplay of individual exposure, including the type of activity (eg, walking, undertaking transfers) and the environmental risks (eg, lighting, floor conditions) in which the activity is performed. Therefore, we posit a dynamic fall risk model consisting of intrinsic risk factors that vary over time and exposure (activity in context). eHealth sensor technology (eg, smartphones) is beginning to enable the continuous measurement of both of the above factors. We illustrate our model with examples of real-world falls from the FARSEEING database. This dynamic framework for fall risk adds important aspects that may improve understanding of fall mechanisms, fall risk models, and the development of fall prevention interventions. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
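
    One way to make the posited framework concrete is to write the instantaneous fall hazard as the product of a time-varying intrinsic-risk term and an exposure term; this is an illustrative formalization consistent with the description above, not a formula taken from the article:

      \lambda(t) \;=\; f\big(R_1(t),\dots,R_k(t)\big)\,\times\,E\big(a(t),\,c(t)\big),

    where R_1(t),...,R_k(t) are intrinsic risk factors that change over time, a(t) is the activity being performed and c(t) the environmental context in which it is performed.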

  14. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of the model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the Business Process Model and Notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. The model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0, as an automation language to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a communication method for sharing process knowledge between laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  15. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.

  16. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property
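
    For readers who wish to reproduce the baseline that such a developed model is compared against, the characteristic path length and clustering coefficient of the original Watts-Strogatz construction can be computed in a few lines. The sketch below uses networkx with illustrative parameters and does not implement the degree-distribution extension introduced in the article.

      import networkx as nx

      # Original Watts-Strogatz small-world model: n nodes, each joined to its k
      # nearest neighbours on a ring, with edges rewired with probability p.
      n, k, p = 1000, 10, 0.1
      G = nx.connected_watts_strogatz_graph(n, k, p)

      # The two quantities the developed model aims to match more closely to real networks.
      L = nx.average_shortest_path_length(G)
      C = nx.average_clustering(G)
      print(f"characteristic path length L = {L:.3f}, clustering coefficient C = {C:.3f}")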

  17. Modeling of cylindrical surrounding gate MOSFETs including the fringing field effects

    International Nuclear Information System (INIS)

    Gupta, Santosh K.; Baishya, Srimanta

    2013-01-01

    A physically based analytical model for surface potential and threshold voltage including the fringing gate capacitances in cylindrical surround gate (CSG) MOSFETs has been developed. Based on this a subthreshold drain current model has also been derived. This model first computes the charge induced in the drain/source region due to the fringing capacitances and considers an effective charge distribution in the cylindrically extended source/drain region for the development of a simple and compact model. The fringing gate capacitances taken into account are outer fringe capacitance, inner fringe capacitance, overlap capacitance, and sidewall capacitance. The model has been verified with the data extracted from 3D TCAD simulations of CSG MOSFETs and was found to be working satisfactorily. (semiconductor devices)

  18. Modeling of the dynamics of wind to power conversion including high wind speed behavior

    DEFF Research Database (Denmark)

    Litong-Palima, Marisciel; Bjerge, Martin Huus; Cutululis, Nicolaos Antonio

    2016-01-01

    This paper proposes and validates an efficient, generic and computationally simple dynamic model for the conversion of the wind speed at hub height into the electrical power by a wind turbine. This proposed wind turbine model was developed as a first step to simulate wind power time series...... for power system studies. This paper focuses on describing and validating the single wind turbine model, and is therefore neither describing wind speed modeling nor aggregation of contributions from a whole wind farm or a power system area. The state-of-the-art is to use static power curves for the purpose...... of power system studies, but the idea of the proposed wind turbine model is to include the main dynamic effects in order to have a better representation of the fluctuations in the output power and of the fast power ramping especially because of high wind speed shutdowns of the wind turbine. The high wind...

  19. Including Effects of Water Stress on Dead Organic Matter Decay to a Forest Carbon Model

    Science.gov (United States)

    Kim, H.; Lee, J.; Han, S. H.; Kim, S.; Son, Y.

    2017-12-01

    Decay of dead organic matter is a key process of carbon (C) cycling in forest ecosystems. The change in decay rate depends on temperature sensitivity and moisture conditions. The Forest Biomass and Dead organic matter Carbon (FBDC) model includes a decay sub-model considering temperature sensitivity, yet does not consider moisture conditions as drivers of the decay rate change. This study aimed to improve the FBDC model by including a water stress function to the decay sub-model. Also, soil C sequestration under climate change with the FBDC model including the water stress function was simulated. The water stress functions were determined with data from decomposition study on Quercus variabilis forests and Pinus densiflora forests of Korea, and adjustment parameters of the functions were determined for both species. The water stress functions were based on the ratio of precipitation to potential evapotranspiration. Including the water stress function increased the explained variances of the decay rate by 19% for the Q. variabilis forests and 7% for the P. densiflora forests, respectively. The increase of the explained variances resulted from large difference in temperature range and precipitation range across the decomposition study plots. During the period of experiment, the mean annual temperature range was less than 3°C, while the annual precipitation ranged from 720mm to 1466mm. Application of the water stress functions to the FBDC model constrained increasing trend of temperature sensitivity under climate change, and thus increased the model-estimated soil C sequestration (Mg C ha-1) by 6.6 for the Q. variabilis forests and by 3.1 for the P. densiflora forests, respectively. The addition of water stress functions increased reliability of the decay rate estimation and could contribute to reducing the bias in estimating soil C sequestration under varying moisture condition. Acknowledgement: This study was supported by Korea Forest Service (2017044B10-1719-BB01)
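
    A minimal sketch of how such a moisture modifier could enter a decay sub-model is given below; the functional forms and parameter values are hypothetical stand-ins for the fitted functions described above, intended only to illustrate where the precipitation-to-potential-evapotranspiration ratio would act.

      def temperature_modifier(temp_c, q10=2.0, ref_temp_c=10.0):
          """Q10-type temperature sensitivity of the decay rate (illustrative)."""
          return q10 ** ((temp_c - ref_temp_c) / 10.0)

      def water_stress_modifier(precip_mm, pet_mm, shape=1.5):
          """Moisture modifier driven by the precipitation/PET ratio; saturates at 1
          when water supply meets atmospheric demand (hypothetical functional form)."""
          ratio = precip_mm / pet_mm
          return min(1.0, ratio) ** (1.0 / shape)

      def decay_rate(k_ref, temp_c, precip_mm, pet_mm):
          """Decay rate of dead organic matter with temperature and water limitation."""
          return k_ref * temperature_modifier(temp_c) * water_stress_modifier(precip_mm, pet_mm)

      # Example: reference rate 0.25 per year, mean annual temperature 12 degrees C,
      # annual precipitation 900 mm, potential evapotranspiration 1000 mm.
      print(decay_rate(0.25, 12.0, 900.0, 1000.0))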

  20. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  1. Including an ocean carbon cycle model into iLOVECLIM (v1.0)

    NARCIS (Netherlands)

    Bouttes, N.; Roche, D.M.V.A.P.; Mariotti, V.; Bopp, L.

    2015-01-01

    The atmospheric carbon dioxide concentration plays a crucial role in the radiative balance and as such has a strong influence on the evolution of climate. Because of the numerous interactions between climate and the carbon cycle, it is necessary to include a model of the carbon cycle within a

  2. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, the algorithm and the model of the software running on the computer equipment hardware included in the Grid network, which will allow a cloud computing environment to be implemented using Grid technologies.

  3. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool that is deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. Use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Real-time generation of web pages to be able to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows
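
    The third use case, rendering a video animation from a gridded time series, can be sketched in a few lines of Python; the file name, variable name and grid layout below are illustrative assumptions rather than details of the actual workflow.

      from multiprocessing import Pool

      import matplotlib
      matplotlib.use("Agg")  # headless rendering for batch frame generation
      import matplotlib.pyplot as plt
      import netCDF4

      PATH = "sst_forecast.nc"  # assumed input file
      VAR = "sst"               # assumed variable with dimensions (time, lat, lon)

      def render_frame(t):
          """Render one time step of the gridded field to a PNG frame."""
          with netCDF4.Dataset(PATH) as ds:
              field = ds.variables[VAR][t, :, :]
          fig, ax = plt.subplots()
          im = ax.pcolormesh(field)
          fig.colorbar(im, ax=ax)
          ax.set_title(f"{VAR}, time step {t}")
          fig.savefig(f"frame_{t:04d}.png", dpi=100)
          plt.close(fig)

      if __name__ == "__main__":
          with netCDF4.Dataset(PATH) as ds:
              n_steps = ds.variables[VAR].shape[0]
          with Pool() as pool:  # one worker per core by default
              pool.map(render_frame, range(n_steps))
          # The numbered frames can then be assembled into a video, e.g. with ffmpeg.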

  4. Workflow Based Software Development Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  5. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  6. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.; Das Sarma, Akash; Widom, J.

    2013-01-01

    for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We

  7. A Multilevel Secure Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Sheth, Amit P; Kochut, Krys J; Miller, John A

    1999-01-01

    The Department of Defense (DoD) needs multilevel secure (MLS) workflow management systems to enable globally distributed users and applications to cooperate across classification levels to achieve mission critical goals...

  8. Workflow Based Software Development Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  9. Children and adolescents' internal models of food-sharing behavior include complex evaluations of contextual factors.

    Science.gov (United States)

    Markovits, Henry; Benenson, Joyce F; Kramer, Donald L

    2003-01-01

    This study examined internal representations of food sharing in 589 children and adolescents (8-19 years of age). Questionnaires, depicting a variety of contexts in which one person was asked to share a resource with another, were used to examine participants' expectations of food-sharing behavior. Factors that were varied included the value of the resource, the relation between the two depicted actors, the quality of this relation, and gender. Results indicate that internal models of food-sharing behavior showed systematic patterns of variation, demonstrating that individuals have complex contextually based internal models at all ages, including the youngest. Examination of developmental changes in use of individual patterns is consistent with the idea that internal models reflect age-specific patterns of interactions while undergoing a process of progressive consolidation.

  10. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording the frequency and duration of tasks performed. Tasks were categorized into separate IIT and NIT workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for NITs. Pre- and post-intervention data were compared. Following separation of the IIT and NIT workflows, time spent on IITs by the primary fellow increased from 53.8% to 73.2% while time spent on NITs decreased from 20.4% to 4.4%. Mean duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific NITs, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task switching events (TSEs) per hour. Rates of specific NITs for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Observational constraint on the interacting dark energy models including the Sandage-Loeb test

    Science.gov (United States)

    Zhang, Ming-Jian; Liu, Wen-Biao

    2014-05-01

    Two types of interacting dark energy models are investigated using type Ia supernovae (SNIa), observational Hubble data (OHD), the cosmic microwave background shift parameter, and the secular Sandage-Loeb (SL) test. In the investigation we have used two sets of parameter priors, WMAP-9 and Planck 2013, which show some interesting differences. We find that the inclusion of the SL test provides a markedly more stringent constraint on the parameters in both models. For the constant coupling model, the constraint on the interaction term is improved to about half of its original scale in terms of the corresponding errors. Compared with using only SNIa and OHD, we find that the inclusion of the SL test reduces the best-fit interaction almost to zero, which indicates that higher-redshift observations such as the SL test are necessary to track the evolution of the interaction. For the varying coupling model, data including the SL test constrain, with Planck priors, the parameter whose value characterizes the severity of the coincidence problem, indicating that the coincidence problem will be less severe. We then reconstruct the interaction and find that the best-fit interaction is also negative, similar to the constant coupling model, although at high redshift the interaction generally vanishes. We also find that phantom-like dark energy with w < -1 is favored over the ΛCDM model.
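
    For reference, the observable behind the Sandage-Loeb test is the slow drift of the redshift of distant sources over the observer's time span; in a Friedmann cosmology it takes the standard form (with E(z) = H(z)/H_0):

      \dot{z} \;=\; (1+z)\,H_0 - H(z),
      \qquad
      \Delta v \;=\; \frac{c\,\Delta z}{1+z} \;=\; c\,H_0\,\Delta t_{\mathrm{obs}}\left[1 - \frac{E(z)}{1+z}\right],

    so the measured velocity drift Δv probes H(z) directly at the high redshifts where interacting dark energy models differ most from each other.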

  12. MEMLS3&a: Microwave Emission Model of Layered Snowpacks adapted to include backscattering

    Directory of Open Access Journals (Sweden)

    M. Proksch

    2015-08-01

    Full Text Available The Microwave Emission Model of Layered Snowpacks (MEMLS was originally developed for microwave emissions of snowpacks in the frequency range 5–100 GHz. It is based on six-flux theory to describe radiative transfer in snow including absorption, multiple volume scattering, radiation trapping due to internal reflection and a combination of coherent and incoherent superposition of reflections between horizontal layer interfaces. Here we introduce MEMLS3&a, an extension of MEMLS, which includes a backscatter model for active microwave remote sensing of snow. The reflectivity is decomposed into diffuse and specular components. Slight undulations of the snow surface are taken into account. The treatment of like- and cross-polarization is accomplished by an empirical splitting parameter q. MEMLS3&a (as well as MEMLS is set up in a way that snow input parameters can be derived by objective measurement methods which avoid fitting procedures of the scattering efficiency of snow, required by several other models. For the validation of the model we have used a combination of active and passive measurements from the NoSREx (Nordic Snow Radar Experiment campaign in Sodankylä, Finland. We find a reasonable agreement between the measurements and simulations, subject to uncertainties in hitherto unmeasured input parameters of the backscatter model. The model is written in Matlab and the code is publicly available for download through the following website: http://www.iapmw.unibe.ch/research/projects/snowtools/memls.html.

  13. Safe distance car-following model including backward-looking and its stability analysis

    Science.gov (United States)

    Yang, Da; Jin, Peter Jing; Pu, Yun; Ran, Bin

    2013-03-01

    The focus of this paper is car-following behavior that includes backward-looking, simply called bi-directional looking car-following behavior. This study is motivated by the potential changes of the physical properties of traffic flow caused by the fast developing intelligent transportation system (ITS), especially the new connected vehicle technology. Existing studies on this topic have focused on General Motors (GM) models and optimal velocity (OV) models. The safe distance car-following model, Gipps' model, which is more widely used in practice, has not drawn much attention in the bi-directional looking context. This paper explores the properties of the bi-directional looking extension of Gipps' safe distance model. The stability condition of the proposed model is derived using linear stability theory and is verified using numerical simulations. The impacts of the driver and vehicle characteristics appearing in the proposed model on traffic flow stability are also investigated. It is found that taking the backward-looking effect into account in car-following has three types of effect on traffic flow: stabilizing, destabilizing and producing non-physical phenomena. This conclusion is more sophisticated than the results of studies based on the OV bi-directional looking car-following models. Moreover, drivers who have a smaller reaction time or a larger additional delay and who assume that other vehicles have larger maximum decelerations can stabilize traffic flow.
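
    For context, the forward-looking safe-distance relation of the classical Gipps model, which the article extends with a backward-looking term, limits the speed of vehicle n so that it can always stop safely behind its leader n-1. In its usual form (quoted here as the standard formulation, not as the article's bi-directional extension):

      v_n(t+\tau) \;=\; \min\left\{ v_n(t) + 2.5\,a_n\tau\left(1-\frac{v_n(t)}{V_n}\right)\sqrt{0.025+\frac{v_n(t)}{V_n}},\;\;
      b_n\tau + \sqrt{\,b_n^2\tau^2 - b_n\left[\,2\big(x_{n-1}(t)-s_{n-1}-x_n(t)\big) - v_n(t)\,\tau - \frac{v_{n-1}(t)^2}{\hat{b}}\,\right]}\right\}

    where a_n and b_n are the maximum acceleration and braking rates, V_n the desired speed, s_{n-1} the effective length of the leader, τ the reaction time and \hat{b} the driver's estimate of the leader's braking rate.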

  14. An imprecise Dirichlet model for Bayesian analysis of failure data including right-censored observations

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1997-01-01

    This paper is intended to make researchers in reliability theory aware of a recently introduced Bayesian model with imprecise prior distributions for statistical inference on failure data, that can also be considered as a robust Bayesian model. The model consists of a multinomial distribution with Dirichlet priors, making the approach basically nonparametric. New results for the model are presented, related to right-censored observations, where estimation based on this model is closely related to the product-limit estimator, which is an important statistical method to deal with reliability or survival data including right-censored observations. As for the product-limit estimator, the model considered in this paper aims at not using any information other than that provided by observed data, but our model fits into the robust Bayesian context which has the advantage that all inferences can be based on probabilities or expectations, or bounds for probabilities or expectations. The model uses a finite partition of the time-axis, and as such it is also related to life-tables
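
    For orientation, the product-limit (Kaplan-Meier) estimator referred to above estimates the survival function from the ordered failure times t_1 < t_2 < ..., with d_i failures observed and n_i items still at risk at t_i; right-censored observations leave the risk set without contributing a failure:

      \hat{S}(t) \;=\; \prod_{i:\; t_i \le t}\left(1 - \frac{d_i}{n_i}\right).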

  15. Workflow interruptions, cognitive failure and near-accidents in health care.

    Science.gov (United States)

    Elfering, Achim; Grebner, Simone; Ebener, Corinne

    2015-01-01

    Errors are frequent in health care. A specific model was tested that posits failure in cognitive action regulation as mediating the influence of nurses' workflow interruptions and safety conscientiousness on near-accidents in health care. One hundred and sixty-five nurses from seven Swiss hospitals participated in a questionnaire survey. Structural equation modelling confirmed the hypothesised mediation model. Cognitive failure in action regulation significantly mediated the influence of workflow interruptions on near-accidents (p < .05). The influence of safety conscientiousness on near-accidents via cognitive failure in action regulation was also significant (p < .05). Compliance with safety regulations was related to near-accidents; moreover, cognitive failure mediated the association between compliance and near-accidents (p < .05). Contrary to expectations, compliance with safety regulations was not related to workflow interruptions. Workflow interruptions caused by colleagues, patients and organisational constraints are likely to trigger errors in nursing. Work redesign is recommended to reduce cognitive failure and improve the safety of nurses and patients.

  16. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  17. Improving weather predictability by including land-surface model parameter uncertainty

    Science.gov (United States)

    Orth, Rene; Dutra, Emanuel; Pappenberger, Florian

    2016-04-01

    The land surface forms an important component of Earth system models and interacts nonlinearly with other parts such as ocean and atmosphere. To capture the complex and heterogenous hydrology of the land surface, land surface models include a large number of parameters impacting the coupling to other components of the Earth system model. Focusing on ECMWF's land-surface model HTESSEL we present in this study a comprehensive parameter sensitivity evaluation using multiple observational datasets in Europe. We select 6 poorly constrained effective parameters (surface runoff effective depth, skin conductivity, minimum stomatal resistance, maximum interception, soil moisture stress function shape, total soil depth) and explore their sensitivity to model outputs such as soil moisture, evapotranspiration and runoff using uncoupled simulations and coupled seasonal forecasts. Additionally we investigate the possibility to construct ensembles from the multiple land surface parameters. In the uncoupled runs we find that minimum stomatal resistance and total soil depth have the most influence on model performance. Forecast skill scores are moreover sensitive to the same parameters as HTESSEL performance in the uncoupled analysis. We demonstrate the robustness of our findings by comparing multiple best performing parameter sets and multiple randomly chosen parameter sets. We find better temperature and precipitation forecast skill with the best-performing parameter perturbations demonstrating representativeness of model performance across uncoupled (and hence less computationally demanding) and coupled settings. Finally, we construct ensemble forecasts from ensemble members derived with different best-performing parameterizations of HTESSEL. This incorporation of parameter uncertainty in the ensemble generation yields an increase in forecast skill, even beyond the skill of the default system. Orth, R., E. Dutra, and F. Pappenberger, 2016: Improving weather predictability by

  18. Finite element modeling of contaminant transport in soils including the effect of chemical reactions.

    Science.gov (United States)

    Javadi, A A; Al-Najjar, M M

    2007-05-17

    The movement of chemicals through soils to the groundwater is a major cause of degradation of water resources. In many cases, serious human and stock health implications are associated with this form of pollution. Recent studies have shown that the current models and methods are not able to adequately describe the leaching of nutrients through soils, often underestimating the risk of groundwater contamination by surface-applied chemicals, and overestimating the concentration of resident solutes. Furthermore, the effect of chemical reactions on the fate and transport of contaminants is not included in many of the existing numerical models for contaminant transport. In this paper a numerical model is presented for simulation of the flow of water and air and contaminant transport through unsaturated soils with the main focus being on the effects of chemical reactions. The governing equations of miscible contaminant transport including advection, dispersion-diffusion and adsorption effects together with the effect of chemical reactions are presented. The mathematical framework and the numerical implementation of the model are described in detail. The model is validated by application to a number of test cases from the literature and is then applied to the simulation of a physical model test involving transport of contaminants in a block of soil with particular reference to the effects of chemical reactions. Comparison of the results of the numerical model with the experimental results shows that the model is capable of predicting the effects of chemical reactions with very high accuracy. The importance of consideration of the effects of chemical reactions is highlighted.
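
    The governing equation referred to above is, for a single miscible species in variably saturated soil, an advection-dispersion equation with sorption and a reaction term; a generic statement (with notation assumed here, not copied from the paper) is

      \frac{\partial(\theta c)}{\partial t} + \rho_b\,\frac{\partial s}{\partial t}
      \;=\; \nabla\cdot\big(\theta\,\mathbf{D}\,\nabla c\big) \;-\; \nabla\cdot\big(\mathbf{q}\,c\big) \;+\; R(c),

    where c is the dissolved concentration, θ the volumetric water content, s the adsorbed concentration, ρ_b the bulk density, D the hydrodynamic dispersion tensor, q the Darcy flux and R(c) the source/sink term due to chemical reactions.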

  19. Developing 3D Imaging Programmes-Workflow and Quality Control

    OpenAIRE

    Hess, M.; Robson, S.; Serpico, M.; Amati, G.; Pridden, I.; Nelson, T.

    2016-01-01

    This article reports on a successful project for 3D imaging research, digital applications, and use of new technologies in the museum. The article will focus on the development and implementation of a viable workflow for the production of high-quality 3D models of museum objects, based on the 3D laser scanning and photogrammetry of selected ancient Egyptian artefacts. The development of a robust protocol for the complete process chain for imaging cultural heritage artefacts, from the acquisit...

  20. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether they are automated or manually performed, and regardless of the organizational unit in which they are carried out) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on a new standardization of the process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  1. A numerical model including PID control of a multizone crystal growth furnace

    Science.gov (United States)

    Panzarella, Charles H.; Kassemi, Mohammad

    1992-01-01

    This paper presents a 2D axisymmetric combined conduction and radiation model of a multizone crystal growth furnace. The model is based on a programmable multizone furnace (PMZF) designed and built at NASA Lewis Research Center for growing high quality semiconductor crystals. A novel feature of this model is a control algorithm which automatically adjusts the power in any number of independently controlled heaters to establish the desired crystal temperatures in the furnace model. The control algorithm eliminates the need for numerous trial and error runs previously required to obtain the same results. The finite element code, FIDAP, used to develop the furnace model, was modified to directly incorporate the control algorithm. This algorithm, which presently uses PID control, and the associated heat transfer model are briefly discussed. Together, they have been used to predict the heater power distributions for a variety of furnace configurations and desired temperature profiles. Examples are included to demonstrate the effectiveness of the PID controlled model in establishing isothermal, Bridgman, and other complicated temperature profiles in the sample. Finally, an example is given to show how the algorithm can be used to change the desired profile with time according to a prescribed temperature-time evolution.
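
    A minimal discrete PID update in the spirit of the control algorithm described above is sketched below; the gains and the single-zone lumped heat balance are illustrative assumptions, not the FIDAP implementation.

        def pid_step(error, integral, prev_error, dt, kp=50.0, ki=5.0, kd=1.0):
            """One discrete PID update; returns (heater power command, updated integral)."""
            integral += error * dt
            derivative = (error - prev_error) / dt
            return kp * error + ki * integral + kd * derivative, integral

        # toy single-zone furnace: temperature relaxes toward a value set by heater power
        T, setpoint, dt = 300.0, 1200.0, 0.1
        integral, prev_err = 0.0, 0.0
        for _ in range(2000):
            err = setpoint - T
            power, integral = pid_step(err, integral, prev_err, dt)
            prev_err = err
            power = max(0.0, power)                          # heaters cannot cool
            T += dt * (0.01 * power - 0.02 * (T - 300.0))    # crude lumped heat balance
        print(round(T, 1))                                   # settles near the 1200 K setpoint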

  2. A New Circuit Model for Spin-Torque Oscillator Including Perpendicular Torque of Magnetic Tunnel Junction

    Directory of Open Access Journals (Sweden)

    Hyein Lim

    2013-01-01

    Full Text Available Spin-torque oscillator (STO) is a promising new technology for future RF oscillators, which is based on the spin-transfer torque (STT) effect in magnetic multilayered nanostructures. It is expected to provide a larger tunability, smaller size, lower power consumption, and higher level of integration than the semiconductor-based oscillators. In our previous work, a circuit-level model of the giant magnetoresistance (GMR) STO was proposed. In this paper, we present a physics-based circuit-level model of the magnetic tunnel junction (MTJ)-based STO. The MTJ-STO model includes the effect of perpendicular torque that has been ignored in the GMR-STO model. The variations of three major characteristics, generation frequency, mean oscillation power, and generation linewidth of an MTJ-STO with respect to the amount of perpendicular torque, are investigated, and the results are applied to our model. The operation of the model was verified by HSPICE simulation, and the results show an excellent agreement with the experimental data. The results also prove that a full circuit-level simulation with MTJ-STO devices can be made with our proposed model.

  3. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide. Their ability to abstract the applications by wrapping application codes has also stressed the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. Also, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and effective robust multi-discipline simulations for the decades to come. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future.

  4. Resilient workflows for computational mechanics platforms

    Science.gov (United States)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide [26, 27]. Their ability to abstract the applications by wrapping application codes has also stressed the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. Also, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and effective robust multi-discipline simulations for the decades to come [28]. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future [23, 24, 29].

  5. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow satisfying the requirements of users. However, because DOWs are often developed in an open, distributed, and heterogeneous environment, different users can impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging. There is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph based model of DOWs with cost constraints, called the constrained data oriented workflow (CDW), which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs, which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, related experiments are conducted to show the effectiveness and efficiency of our approach.
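
    A schematic illustration of blending semantic and structural similarity under a cost constraint is given below; the weighting parameter alpha and the candidate fields are assumptions for illustration, not the paper's matrix-based distance measure.

        def combined_similarity(semantic_sim, structural_sim, alpha=0.5):
            """Weighted blend of two similarity scores assumed to lie in [0, 1]."""
            if not 0.0 <= alpha <= 1.0:
                raise ValueError("alpha must lie in [0, 1]")
            return alpha * semantic_sim + (1.0 - alpha) * structural_sim

        def rank_candidates(constraints, candidates, alpha=0.5):
            """Filter workflows by the user's cost constraint, then rank by blended similarity."""
            feasible = [c for c in candidates if c["cost"] <= constraints["max_cost"]]
            return sorted(feasible,
                          key=lambda c: combined_similarity(c["semantic"], c["structural"], alpha),
                          reverse=True)

        candidates = [
            {"name": "wf_a", "semantic": 0.9, "structural": 0.4, "cost": 10},
            {"name": "wf_b", "semantic": 0.6, "structural": 0.8, "cost": 25},
        ]
        print([c["name"] for c in rank_candidates({"max_cost": 20}, candidates)])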

  6. Fuzzy Control of Yaw and Roll Angles of a Simulated Helicopter Model Includes Articulated Manipulators

    Directory of Open Access Journals (Sweden)

    Hossein Sadegh Lafmejani

    2015-09-01

    Full Text Available A fuzzy logic controller (FLC) is a heuristic method based on If-Then rules that resembles human reasoning, and it is well suited to designing non-linear control systems. In this paper, a helicopter model including articulated manipulators has been simulated with the Matlab SimMechanics toolbox. Due to the difficulties of modeling this complex system, a fuzzy controller with simple fuzzy rules has been designed for its yaw and roll angles in order to stabilize the helicopter in the presence of disturbances or while its manipulators are moving for a task. Results reveal that a simple FLC can appropriately control this system.
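
    The flavour of an If-Then fuzzy rule base can be seen in the toy sketch below, which uses triangular membership functions and weighted-average defuzzification; the rule set, ranges and actuator commands are hypothetical and far simpler than the simulated helicopter controller.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_roll_command(roll_error_deg):
            """Two toy rules: IF error is negative THEN roll right; IF positive THEN roll left."""
            mu_neg = tri(roll_error_deg, -30.0, -15.0, 0.0)
            mu_pos = tri(roll_error_deg, 0.0, 15.0, 30.0)
            commands = {10.0: mu_neg, -10.0: mu_pos}   # hypothetical actuator commands (deg)
            total = sum(commands.values())
            return sum(c * m for c, m in commands.items()) / total if total else 0.0

        print(fuzzy_roll_command(-12.0))   # positive command corrects a negative roll error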

  7. A roller chain drive model including contact with guide-bars

    DEFF Research Database (Denmark)

    Pedersen, Sine Leergaard; Hansen, John Michael; Ambrósio, J. A. C.

    2004-01-01

    A model of a roller chain drive is developed and applied to the simulation and analysis of roller chain drives of large marine diesel engines. The model includes the impact with guide-bars that are the motion delimiter components on the chain strands between the sprockets. The main components...... and the sprocket centre, i.e. a constraint is added when such distance is less than the pitch radius. The unilateral kinematic constraint is removed when its associated constraint reaction force, applied on the roller, is in the direction of the root of the sprocket teeth. In order to improve the numerical...

  8. TS Fuzzy Model-Based Controller Design for a Class of Nonlinear Systems Including Nonsmooth Functions

    DEFF Research Database (Denmark)

    Vafamand, Navid; Asemani, Mohammad Hassan; Khayatiyan, Alireza

    2018-01-01

    This paper proposes a novel robust controller design for a class of nonlinear systems including hard nonlinearity functions. The proposed approach is based on Takagi-Sugeno (TS) fuzzy modeling, nonquadratic Lyapunov function, and nonparallel distributed compensation scheme. In this paper, a novel...... criterion, new robust controller design conditions in terms of linear matrix inequalities are derived. Three practical case studies, electric power steering system, a helicopter model and servo-mechanical system, are presented to demonstrate the importance of such class of nonlinear systems comprising...

  9. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    Science.gov (United States)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate, and realize efficiencies in, the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss the future steps towards creation of a common software framework.

  10. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform and analysis for better Geodesign support. Esri's City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used further is called procedural modeling, and it uses rules in order to extrude buildings, the street network, parcel zoning and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group or public consultation. In this way, problems like the impact of new constructions being built, re-arranging green spaces or changing routes for public transportation are revealed through impact and visibility analysis or shadowing analysis and are brought to the citizens' attention. This leads to better decisions.

  11. Validation of lumbar spine loading from a musculoskeletal model including the lower limbs and lumbar spine.

    Science.gov (United States)

    Actis, Jason A; Honegger, Jasmin D; Gates, Deanna H; Petrella, Anthony J; Nolasco, Luis A; Silverman, Anne K

    2018-02-08

    Low back mechanics are important to quantify to study injury, pain and disability. As in vivo forces are difficult to measure directly, modeling approaches are commonly used to estimate these forces. Validation of model estimates is critical to gain confidence in modeling results across populations of interest, such as people with lower-limb amputation. Motion capture, ground reaction force and electromyographic data were collected from ten participants without an amputation (five male/five female) and five participants with a unilateral transtibial amputation (four male/one female) during trunk-pelvis range of motion trials in flexion/extension, lateral bending and axial rotation. A musculoskeletal model with a detailed lumbar spine and the legs including 294 muscles was used to predict L4-L5 loading and muscle activations using static optimization. Model estimates of L4-L5 intervertebral joint loading were compared to measured intradiscal pressures from the literature and muscle activations were compared to electromyographic signals. Model loading estimates were only significantly different from experimental measurements during trunk extension for males without an amputation and for people with an amputation, which may suggest a greater portion of L4-L5 axial load transfer through the facet joints, as facet loads are not captured by intradiscal pressure transducers. Pressure estimates between the model and previous work were not significantly different for flexion, lateral bending or axial rotation. Timing of model-estimated muscle activations compared well with electromyographic activity of the lumbar paraspinals and upper erector spinae. Validated estimates of low back loading can increase the applicability of musculoskeletal models to clinical diagnosis and treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Modeling of Temperature-Dependent Noise in Silicon Nanowire FETs including Self-Heating Effects

    Directory of Open Access Journals (Sweden)

    P. Anandan

    2014-01-01

    Full Text Available Silicon nanowires are leading the CMOS era towards the downsizing limit, and their nature effectively suppresses short-channel effects. Accurate modeling of thermal noise in nanowires is crucial for RF applications of emerging nano-CMOS technologies. In this work, a temperature-dependent model for silicon nanowires including self-heating effects has been derived, and its effects on device parameters have been observed. The power spectral density as a function of thermal resistance shows significant improvement as the channel length decreases. The effects of thermal noise including self-heating of the device are explored. Moreover, the significant reduction in noise with respect to channel thermal resistance, gate length, and biasing is analyzed.
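
    For reference, self-heating and channel thermal noise are commonly coupled through the standard relations below (textbook forms shown for orientation, not the specific model derived in the paper):

        T = T_0 + P_{\mathrm{diss}} R_{\mathrm{th}}, \qquad
        S_{i_d} = 4 k_B T \gamma g_{d0}

    where R_th is the thermal resistance, P_diss the dissipated power, gamma the excess-noise factor and g_d0 the channel conductance at zero drain bias; raising T through self-heating raises the noise power spectral density.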

  13. Model for safety reports including descriptive examples; Mall foer saekerhetsrapporter med beskrivande exempel

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository.

  14. A High-Rate, Single-Crystal Model including Phase Transformations, Plastic Slip, and Twinning

    Energy Technology Data Exchange (ETDEWEB)

    Addessio, Francis L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bronkhorst, Curt Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Bolme, Cynthia Anne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Explosive Science and Shock Physics Division; Brown, Donald William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Cerreta, Ellen Kathleen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lebensohn, Ricardo A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Lookman, Turab [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Luscher, Darby Jon [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Mayeur, Jason Rhea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Theoretical Division; Morrow, Benjamin M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Materials Science and Technology Division; Rigg, Paulo A. [Washington State Univ., Pullman, WA (United States). Dept. of Physics. Inst. for Shock Physics

    2016-08-09

    An anisotropic, rate-dependent, single-crystal approach for modeling materials under the conditions of high strain rates and pressures is provided. The model includes the effects of large deformations, nonlinear elasticity, phase transformations, and plastic slip and twinning. It is envisioned that the model may be used to examine these coupled effects on the local deformation of materials that are subjected to ballistic impact or explosive loading. The model is formulated using a multiplicative decomposition of the deformation gradient. A plate impact experiment on a multi-crystal sample of titanium was conducted. The particle velocities at the back surface of three crystal orientations relative to the direction of impact were measured. Molecular dynamics simulations were conducted to investigate the details of the high-rate deformation and pursue issues related to the phase transformation for titanium. Simulations using the single crystal model were conducted and compared to the high-rate experimental data for the impact loaded single crystals. The model was found to capture the features of the experiments.
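
    The multiplicative decomposition mentioned above is typically written, in generic form, as

        F = F^{e} \, F^{t} \, F^{p}

    where F^e, F^t and F^p denote the elastic, phase-transformation and plastic (slip/twinning) parts of the deformation gradient; the exact partitioning and ordering used in the report may differ from this schematic form.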

  15. Collisional-radiative model including recombination processes for W27+ ion★

    Science.gov (United States)

    Murakami, Izumi; Sasaki, Akira; Kato, Daiji; Koike, Fumihiro

    2017-10-01

    We have constructed a collisional-radiative (CR) model for W27+ ions including 226 configurations with n ≤ 9 and l ≤ 5 for spectroscopic diagnostics. We newly include recombination processes in the model, and this is the first extreme ultraviolet spectrum calculated for a recombining plasma component. Calculated spectra in the 40-70 Å range for the ionizing and recombining plasma components show three similar strong lines and one line that is weak in the recombining plasma component at 45-50 Å, and many weak lines at 50-65 Å for both components. Recombination processes do not contribute much to the spectrum at around 60 Å for the W27+ ion. Dielectronic satellite lines also make only a minor contribution to the spectrum of the recombining plasma component. The dielectronic recombination (DR) rate coefficient from W28+ to W27+ ions is also calculated with the same atomic data in the CR model. We found that a larger set of energy levels including many autoionizing states gave larger DR rate coefficients, but our rates agree within a factor of 6 with other works at electron temperatures around 1 keV, at which W27+ and W28+ ions are usually observed in plasmas. Contribution to the Topical Issue "Atomic and Molecular Data and their Applications", edited by Gordon W.F. Drake, Jung-Sik Yoon, Daiji Kato, and Grzegorz Karwasz.
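
    The population balance solved in such a collisional-radiative model has the generic form below (written for orientation; the authors' specific rate matrix for the 226 configurations is not reproduced, and ionization out of level i and three-body recombination are omitted for brevity):

        \frac{d n_i}{dt} = \sum_{j \neq i} \left( n_e C_{j \to i} + A_{j \to i} \right) n_j
                         - \left( \sum_{j \neq i} n_e C_{i \to j} + \sum_{j} A_{i \to j} \right) n_i
                         + n_e \, n_{\mathrm{W}^{28+}} \left( \alpha_i^{\mathrm{RR}} + \alpha_i^{\mathrm{DR}} \right)

    where C are electron-impact excitation and de-excitation rate coefficients, A are radiative decay rates (non-zero only for downward transitions), and the last term is the radiative plus dielectronic recombination source from the next charge state that distinguishes the recombining plasma component.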

  16. How to include frequency dependent complex permeability Into SPICE models to improve EMI filters design?

    Science.gov (United States)

    Sixdenier, Fabien; Yade, Ousseynou; Martin, Christian; Bréard, Arnaud; Vollaire, Christian

    2018-05-01

    Electromagnetic interference (EMI) filter design is a rather difficult task where engineers have to choose adequate magnetic materials, design the magnetic circuit and choose the size and number of turns. The final design must achieve the attenuation requirements (constraints) and has to be as compact as possible (goal). Alternating current (AC) analysis is a powerful tool to predict the global impedance or attenuation of any filter. However, AC analyses are generally performed without taking into account the frequency-dependent complex permeability behaviour of soft magnetic materials. That is why we developed two frequency-dependent complex permeability models able to be included into SPICE models. After an identification process, the performances of each model are compared to measurements made on a realistic EMI filter prototype in common mode (CM) and differential mode (DM) to see the benefit of the approach. Simulation results are in good agreement with the measured ones, especially in the middle frequency range.
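
    One simple way to capture a frequency-dependent complex permeability before mapping it onto a SPICE netlist is a single-pole (Debye-type) relaxation; the sketch below is a hypothetical one-pole fit for illustration, not either of the two models identified in the paper.

        import numpy as np

        def debye_permeability(f, mu_static=2000.0, f_relax=1.0e6):
            """Single-pole complex relative permeability: mu(f) = 1 + (mu_s - 1) / (1 + j f / f_r)."""
            return 1.0 + (mu_static - 1.0) / (1.0 + 1j * f / f_relax)

        for f in np.logspace(3, 8, 6):            # 1 kHz to 100 MHz
            mu = debye_permeability(f)
            print(f"{f:12.0f} Hz   mu' = {mu.real:8.1f}   mu'' = {-mu.imag:8.1f}")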

  17. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    code and not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled...

  18. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, a flexible scheduling for activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  19. RELAP5-3D Code Includes ATHENA Features and Models

    International Nuclear Information System (INIS)

    Riemke, Richard A.; Davis, Cliff B.; Schultz, Richard R.

    2006-01-01

    Version 2.3 of the RELAP5-3D computer program includes all features and models previously available only in the ATHENA version of the code. These include the addition of new working fluids (i.e., ammonia, blood, carbon dioxide, glycerol, helium, hydrogen, lead-bismuth, lithium, lithium-lead, nitrogen, potassium, sodium, and sodium-potassium) and a magnetohydrodynamic model that expands the capability of the code to model many more thermal-hydraulic systems. In addition to the new working fluids along with the standard working fluid water, one or more noncondensable gases (e.g., air, argon, carbon dioxide, carbon monoxide, helium, hydrogen, krypton, nitrogen, oxygen, SF6, xenon) can be specified as part of the vapor/gas phase of the working fluid. These noncondensable gases were in previous versions of RELAP5-3D. Recently four molten salts have been added as working fluids to RELAP5-3D Version 2.4, which has had limited release. These molten salts will be in RELAP5-3D Version 2.5, which will have a general release like RELAP5-3D Version 2.3. Applications that use these new features and models are discussed in this paper. (authors)

  20. Double-gate junctionless transistor model including short-channel effects

    International Nuclear Information System (INIS)

    Paz, B C; Pavanello, M A; Ávila-Herrera, F; Cerdeira, A

    2015-01-01

    This work presents a physically based model for double-gate junctionless transistors (JLTs), continuous in all operation regimes. To describe short-channel transistors, short-channel effects (SCEs), such as increase of the channel potential due to drain bias, carrier velocity saturation and mobility degradation due to vertical and longitudinal electric fields, are included in a previous model developed for long-channel double-gate JLTs. To validate the model, an analysis is made by using three-dimensional numerical simulations performed in a Sentaurus Device Simulator from Synopsys. Different doping concentrations, channel widths and channel lengths are considered in this work. Besides that, the series resistance influence is numerically included and validated for a wide range of source and drain extensions. In order to check if the SCEs are appropriately described, besides drain current, transconductance and output conductance characteristics, the following parameters are analyzed to demonstrate the good agreement between model and simulation and the SCEs occurrence in this technology: threshold voltage (VTH), subthreshold slope (S) and drain-induced barrier lowering. (paper)

  1. Refitting density dependent relativistic model parameters including Center-of-Mass corrections

    International Nuclear Information System (INIS)

    Avancini, Sidney S.; Marinelli, Jose R.; Carlson, Brett Vern

    2011-01-01

    Full text: Relativistic mean field models have become a standard approach for precise nuclear structure calculations. After the seminal work of Serot and Walecka, which introduced a model Lagrangian density where the nucleons interact through the exchange of scalar and vector mesons, several models were obtained through its generalization, including other meson degrees of freedom, non-linear meson interactions, meson-meson interactions, etc. More recently, density dependent coupling constants were incorporated into the Walecka-like models, which are now extensively used. In particular, for these models a connection with density functional theory can be established. Due to the inherent difficulties presented by field theoretical models, only the mean field approximation is used for the solution of these models. In order to calculate finite nuclei properties in the mean field approximation, a reference frame has to be fixed and therefore the translational symmetry is violated. It is well known that in such cases spurious effects due to the center-of-mass (COM) motion are present, which are more pronounced for light nuclei. In a previous work we proposed a technique based on the Peierls-Yoccoz projection operator applied to the mean-field relativistic solution, in order to project out spurious COM contributions. In this work we obtain a new fit of the density dependent parameters of a density dependent hadronic model, taking into account the COM corrections. Our fit is obtained taking into account the charge radii and binding energies of 4He, 16O, 40Ca, 48Ca, 56Ni, 68Ni, 100Sn, 132Sn and 208Pb. We show that the nuclear observables calculated using our fit are of a quality comparable to others that can be found in the literature, with the advantage that a translationally invariant many-body wave function is now at our disposal. (author)

  2. Including policy and management in socio-hydrology models: initial conceptualizations

    Science.gov (United States)

    Hermans, Leon; Korbee, Dorien

    2017-04-01

    Socio-hydrology studies the interactions in coupled human-water systems. So far, the use of dynamic models that capture the direct feedback between societal and hydrological systems has been dominant. What has not yet been included with any particular emphasis is the policy or management layer, which is a central element in, for instance, integrated water resources management (IWRM) or adaptive delta management (ADM). Studying the direct interactions between human-water systems generates knowledge that eventually helps influence these interactions in ways that may ensure better outcomes - for society and for the health and sustainability of water systems. This influence sometimes occurs through spontaneous emergence, uncoordinated, by societal agents - the private sector, citizens, consumers, water users. However, the term 'management' in IWRM and ADM also implies an additional coordinated attempt through various public actors. This contribution is a call to include the policy and management dimension more prominently in the research focus of the socio-hydrology field, and it offers first conceptual variables that should be considered in attempts to include this policy or management layer in socio-hydrology models. This is done by drawing on existing frameworks for studying policy processes throughout both planning and implementation phases. These include frameworks such as the advocacy coalition framework, collective learning and policy arrangements, which all emphasize longer-term dynamics and feedbacks between actor coalitions in strategic planning and implementation processes. A case about longer-term dynamics in the management of the Haringvliet in the Netherlands is used to illustrate the paper.

  3. a Standardized Approach to Topographic Data Processing and Workflow Management

    Science.gov (United States)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

    An ever-increasing list of options exists for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and in what sequence they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user then downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and
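
    A minimal illustration of the archival idea - store each processing step together with the exact parameters used so the run can be repeated or shared - is sketched below; the tool names and parameters are hypothetical placeholders, not the portal's actual tool set.

        import json

        workflow = {
            "name": "dem_differencing_example",
            "steps": [
                {"tool": "point_cloud_filter", "params": {"method": "ground_only", "grid_m": 1.0}},
                {"tool": "decimate",           "params": {"keep_fraction": 0.25}},
                {"tool": "dem_difference",     "params": {"dem_new": "survey_2013.tif",
                                                          "dem_old": "survey_2012.tif"}},
            ],
        }

        # archiving the specification alongside the outputs makes the workflow repeatable
        with open("workflow_record.json", "w") as fh:
            json.dump(workflow, fh, indent=2)
        print(json.dumps(workflow["steps"][-1], indent=2))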

  4. Resident Workflow and Psychiatric Emergency Consultation: Identifying Factors for Quality Improvement in a Training Environment.

    Science.gov (United States)

    Blair, Thomas; Wiener, Zev; Seroussi, Ariel; Tang, Lingqi; O'Hora, Jennifer; Cheung, Erick

    2017-06-01

    Quality improvement to optimize workflow has the potential to mitigate resident burnout and enhance patient care. This study applied mixed methods to identify factors that enhance or impede workflow for residents performing emergency psychiatric consultations. The study population consisted of all psychiatry program residents (55 eligible, 42 participating) at the Semel Institute for Neuroscience and Human Behavior, University of California, Los Angeles. The authors developed a survey through iterative piloting, surveyed all residents, and then conducted a focus group. The survey included elements hypothesized to enhance or impede workflow, and measures pertaining to self-rated efficiency and stress. Distributional and bivariate analyses were performed. Survey findings were clarified in focus group discussion. This study identified several factors subjectively associated with enhanced or impeded workflow, including difficulty with documentation, the value of personal organization systems, and struggles to communicate with patients' families. Implications for resident education are discussed.

  5. Multilevel Workflow System in the ATLAS Experiment

    International Nuclear Information System (INIS)

    Borodin, M; De, K; Navarro, J Garcia; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs are executed across more than a hundred distributed computing sites by PanDA - the ATLAS job-level workload management system. On the outer level, the Database Engine for Tasks (DEfT) empowers production managers with templated workflow definitions. On the next level, the Job Execution and Definition Interface (JEDI) is integrated with PanDA to provide dynamic job definition tailored to the sites' capabilities. We report on scaling up the production system to accommodate a growing number of requirements from main ATLAS areas: Trigger, Physics and Data Preparation. (paper)

  6. SPheno 3.1: extensions including flavour, CP-phases and models beyond the MSSM

    Science.gov (United States)

    Porod, W.; Staub, F.

    2012-11-01

    We describe recent extensions of the program SPheno, including flavour aspects, CP-phases, R-parity violation and low energy observables. In case of flavour mixing all masses of supersymmetric particles are calculated including the complete flavour structure and all possible CP-phases at the 1-loop level. We give details on implemented seesaw models, low energy observables and the corresponding extension of the SUSY Les Houches Accord. Moreover, we comment on the possibilities to include MSSM extensions in SPheno. Catalogue identifier: ADRV_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADRV_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154062 No. of bytes in distributed program, including test data, etc.: 1336037 Distribution format: tar.gz Programming language: Fortran95. Computer: PC running under Linux, should run in every Unix environment. Operating system: Linux, Unix. Classification: 11.6. Catalogue identifier of previous version: ADRV_v1_0 Journal reference of previous version: Comput. Phys. Comm. 153(2003)275 Does the new version supersede the previous version?: Yes Nature of problem: The first issue is the determination of the masses and couplings of supersymmetric particles in various supersymmetric models, the R-parity conserved MSSM with generation mixing and including CP-violating phases, various seesaw extensions of the MSSM and the MSSM with bilinear R-parity breaking. Low energy data on Standard Model fermion masses, gauge couplings and electroweak gauge boson masses serve as constraints. Radiative corrections from supersymmetric particles to these inputs must be calculated. Theoretical constraints on the soft SUSY breaking parameters from a high scale theory are imposed and the parameters at the electroweak scale are obtained from the

  7. Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents. Proceedings of a Technical Meeting

    International Nuclear Information System (INIS)

    2015-11-01

    The demands on nuclear fuel have recently been increasing, and include transient regimes, higher discharge burnup and longer fuel cycles. This has resulted in an increase of loads on fuel and core internals. In order to satisfy these demands while ensuring compliance with safety criteria, new national and international programmes have been launched and advanced modelling codes are being developed. The Fukushima Daiichi accident has particularly demonstrated the need for adequate analysis of all aspects of fuel performance to prevent a failure and also to predict fuel behaviour were an accident to occur. This publication presents the Proceedings of the Technical Meeting on Modelling of Water Cooled Fuel Including Design Basis and Severe Accidents, which was hosted by the Nuclear Power Institute of China (NPIC) in Chengdu, China, following the recommendation made in 2013 at the IAEA Technical Working Group on Fuel Performance and Technology. This recommendation was in agreement with IAEA mid-term initiatives, linked to the post-Fukushima IAEA Nuclear Safety Action Plan, as well as the forthcoming Coordinated Research Project (CRP) on Fuel Modelling in Accident Conditions. At the technical meeting in Chengdu, major areas and physical phenomena, as well as types of code and experiment to be studied and used in the CRP, were discussed. The technical meeting provided a forum for international experts to review the state of the art of code development for modelling fuel performance of nuclear fuel for water cooled reactors with regard to steady state and transient conditions, and for design basis and early phases of severe accidents, including experimental support for code validation. A round table discussion focused on the needs and perspectives on fuel modelling in accident conditions. This meeting was the ninth in a series of IAEA meetings, which reflects Member States' continuing interest in nuclear fuel issues. The previous meetings were held in 1980 (jointly with

  8. Using semi-automated curation workflows to collect, organize, and curate the data and models necessary to support the EPA CompTox chemical dashboard (ACS Spring meeting)

    Science.gov (United States)

    With the increasing need to leverage data and models to perform cutting edge analyses within the environmental science community, collection and organization of that data into a readily accessible format for consumption is a pressing need. The EPA CompTox chemical dashboard is i...

  9. Modeling of in-vessel fission product release including fuel morphology effects for severe accident analyses

    International Nuclear Information System (INIS)

    Suh, K.Y.

    1989-10-01

    A new in-vessel fission product release model has been developed and implemented to perform best-estimate calculations of realistic source terms including fuel morphology effects. The proposed bulk mass transfer correlation determines the product of fission product release and equiaxed grain size as a function of the inverse fuel temperature. The model accounts for the fuel-cladding interaction over the temperature range between 770 K and 3000 K in the steam environment. A separate driver has been developed for the in-vessel thermal hydraulic and fission product behavior models that were developed by the Department of Energy for the Modular Accident Analysis Package (MAAP). Calculational results of these models have been compared to the results of the Power Burst Facility Severe Fuel Damage tests. The code predictions utilizing the mass transfer correlation agreed with the experimentally determined fractional release rates during the course of the heatup, power hold, and cooldown phases of the high temperature transients. Compared to such conventional literature correlations as the steam oxidation model and the NUREG-0956 correlation, the mass transfer correlation resulted in lower and less rapid releases in closer agreement with the on-line and grab sample data from the Severe Fuel Damage tests. The proposed mass transfer correlation can be applied for best-estimate calculations of fission product release from the UO2 fuel in both nominal and severe accident conditions. 15 refs., 10 figs., 2 tabs
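
    Schematically, a bulk mass-transfer correlation of the kind described - the product of fractional release rate and equiaxed grain size expressed as a function of inverse fuel temperature - has an Arrhenius-type form such as

        k \, d_g = a \exp\!\left( -\frac{b}{T} \right)

    where k is the fractional release rate, d_g the equiaxed grain size and T the fuel temperature; the coefficients a and b are placeholders here, not the fitted values of the proposed correlation.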

  10. Health Promotion Behavior of Chinese International Students in Korea Including Acculturation Factors: A Structural Equation Model.

    Science.gov (United States)

    Kim, Sun Jung; Yoo, Il Young

    2016-03-01

    The purpose of this study was to explain the health promotion behavior of Chinese international students in Korea using a structural equation model including acculturation factors. A survey using self-administered questionnaires was employed. Data were collected from 272 Chinese students who have resided in Korea for longer than 6 months. The data were analyzed using structural equation modeling. The p value of the final model is .31. The fit indices of the final model, such as the goodness of fit index, adjusted goodness of fit index, normed fit index, non-normed fit index, and comparative fit index, were more than .95. The root mean square residual and root mean square error of approximation also met the criteria. Self-esteem, perceived health status, acculturative stress and acculturation level had direct effects on the health promotion behavior of the participants, and the model explained 30.0% of the variance. The Chinese students in Korea with higher self-esteem, perceived health status, acculturation level, and lower acculturative stress reported higher health promotion behavior. The findings can be applied to develop health promotion strategies for this population. Copyright © 2016. Published by Elsevier B.V.

  11. Include dispersion in quantum chemical modeling of enzymatic reactions: the case of isoaspartyl dipeptidase.

    Science.gov (United States)

    Zhang, Hai-Mei; Chen, Shi-Lu

    2015-06-09

    The lack of dispersion in the B3LYP functional has been proposed to be the main origin of large errors in quantum chemical modeling of a few enzymes and transition metal complexes. In this work, the essential dispersion effects that affect quantum chemical modeling are investigated. With binuclear zinc isoaspartyl dipeptidase (IAD) as an example, dispersion is included in the modeling of enzymatic reactions by two different procedures, i.e., (i) geometry optimizations followed by single-point calculations of dispersion (approach I) and (ii) the inclusion of dispersion throughout geometry optimization and energy evaluation (approach II). Based on a 169-atom chemical model, the calculations show a qualitative consistency between approaches I and II in energetics and most key geometries, demonstrating that both approaches are viable, with the latter preferable since both geometry and energy are dispersion-corrected in approach II. When a smaller model without Arg233 (147 atoms) was used, an inconsistency was observed, indicating that the missing dispersion interactions are essentially responsible for determining equilibrium geometries. Other technical issues and mechanistic characteristics of IAD are also discussed, in particular with respect to the effects of Arg233.
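
    Dispersion corrections of the kind applied here add a pairwise term to the Kohn-Sham energy; in its simplest generic (Grimme-type) form,

        E_{\mathrm{DFT\text{-}D}} = E_{\mathrm{KS}} - \sum_{A<B} f_{\mathrm{damp}}(R_{AB}) \, \frac{C_6^{AB}}{R_{AB}^{6}}

    where the sum runs over atom pairs, C_6^{AB} are dispersion coefficients and f_damp is a short-range damping function; the specific correction scheme and damping used in the study are not reproduced here.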

  12. S5-4: Formal Modeling of Affordance in Human-Included Systems

    Directory of Open Access Journals (Sweden)

    Namhun Kim

    2012-10-01

    Full Text Available Although it is necessary to consider the modeling, analysis, and control of human-included systems, this has been considered a challenging problem because of the critical role of humans in complex systems and because of humans' capability of executing unanticipated actions - both beneficial and detrimental ones. Thus, to provide systematic approaches to modeling human actions as a part of system behaviors, a formal modeling framework is presented for human-involved systems in which humans play a controlling role based on their perceptual information. The theory of affordance provides definitions of human actions and their associated properties; Finite State Automata (FSA) based modeling is capable of mapping nondeterministic humans into computable components in the system representation. In this talk, we investigate the role of perception in human actions in the system operation and examine the representation of perceptual elements in affordance-based modeling formalism. The proposed framework is expected to capture the natural ways in which humans participate in the system as part of its operation. A human-machine cooperative manufacturing system control example and a human agent simulation example will be introduced for illustrative purposes at the end of the presentation.
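
    A toy finite-state sketch of the affordance idea is given below: a human action is admitted only when the perceived system state affords it. All states, actions and transitions are hypothetical illustrations, not the formalism of the talk.

        # transition table: (state, action) -> next state; a missing key means the
        # action is not afforded in that state
        TRANSITIONS = {
            ("idle", "load_part"): "loaded",
            ("loaded", "start_machine"): "running",
            ("running", "stop_machine"): "idle",
        }

        def afforded_actions(state):
            """Actions the current (perceived) state makes available to the human."""
            return [a for (s, a) in TRANSITIONS if s == state]

        def step(state, action):
            if (state, action) not in TRANSITIONS:
                raise ValueError(f"action '{action}' is not afforded in state '{state}'")
            return TRANSITIONS[(state, action)]

        state = "idle"
        for act in ["load_part", "start_machine", "stop_machine"]:
            state = step(state, act)
        print(state, afforded_actions(state))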

  13. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa, where the dream of general system and job communication in the printing industry can be first realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow.Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF (Job Definition Format: planning and calculation data for MIS (Management Information systems and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  14. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs) have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx) has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  15. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  16. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  17. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  18. HisT/PLIER: A two-fold Provenance Approach for Grid-enabled Scientific Workflows using WS-VLAM

    NARCIS (Netherlands)

    Gerhards, M.; Sander, V.; Belloum, A.; Vasunin, D.; Benabdelkader, A.; Jha, S.; Felde, N.G.; Buyya, R.; Fedak, G.

    2011-01-01

    Large scale scientific applications are frequently modeled as a workflow that is executed under the control of a workflow management system. One crucial requirement is the validation of the generated results, e.g. the traceability of the experiment execution path. The automated tracking and storage

  19. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  20. Impact of including surface currents on simulation of Indian Ocean variability with the POAMA coupled model

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Mei; Wang, Guomin; Hendon, Harry H.; Alves, Oscar [Bureau of Meteorology, Centre for Australian Weather and Climate Research, Melbourne (Australia)

    2011-04-15

    Impacts on the coupled variability of the Indo-Pacific by including the effects of surface currents on surface stress are explored in four extended integrations of an experimental version of the Bureau of Meteorology's coupled seasonal forecast model POAMA. The first pair of simulations differs only in their treatment of momentum coupling: one version includes the effects of surface currents on the surface stress computation and the other does not. The version that includes the effect of surface currents has less mean-state bias in the equatorial Pacific cold tongue but produces relatively weak coupled variability in the Tropics, especially that related to the Indian Ocean dipole (IOD) and El Nino/Southern Oscillation (ENSO). The version without the effects of surface currents has greater bias in the Pacific cold tongue but stronger IOD and ENSO variability. In order to diagnose the role of changes in local coupling from changes in remote forcing by ENSO for causing changes in IOD variability, a second set of simulations is conducted where effects of surface currents are included only in the Indian Ocean and only in the Pacific Ocean. IOD variability is found to be equally reduced by inclusion of the local effects of surface currents in the Indian Ocean and by the reduction of ENSO variability as a result of including effects of surface currents in the Pacific. Some implications of these results for predictability of the IOD and its dependence on ENSO, and for ocean subsurface data assimilation are discussed. (orig.)
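
    In bulk form, including surface currents in the stress computation means using the air-sea relative velocity (a standard bulk formulation, shown for reference rather than as the exact expression coded in POAMA):

        \boldsymbol{\tau} = \rho_a C_D \left| \mathbf{U}_a - \mathbf{u}_o \right| \left( \mathbf{U}_a - \mathbf{u}_o \right)

    where U_a is the near-surface wind, u_o the ocean surface current, rho_a the air density and C_D the drag coefficient; the run without the effects of surface currents simply sets u_o to zero in this expression.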

  1. Impact of Diabetes E-Consults on Outpatient Clinic Workflow.

    Science.gov (United States)

    Zoll, Brian; Parikh, Pratik J; Gallimore, Jennie; Harrell, Stephen; Burke, Brian

    2015-08-01

    An e-consult is an electronic communication system between clinicians, usually a primary care physician (PCP) and a medical or surgical specialist, regarding general or patient-specific, low complexity questions that would not need an in-person consultation. The objectives of this study were to understand and quantify the impact of the e-consult initiative on outpatient clinic workflow and outcomes. We collected data from 5 different Veterans Affairs (VA) outpatient clinics and interviewed several physicians and staff members. We then developed a simulation model for a primary care team at an outpatient clinic. A detailed experimental study was conducted to determine the effects of factors, such as e-consult demand, view-alert notification arrivals, walk-in patient arrivals, and PCP unavailability, on e-consult cycle time. Statistical tests indicated that 4 factors related to outpatient clinic workflow were significant, and levels within each of the 4 significant factors resulted in statistically different e-consult cycle times. The arrival rate of electronic notifications, along with patient walk-ins, had a considerable effect on cycle time. Splitting the workload of an unavailable PCP among the other PCPs, instead of the current practice of allocating it to a single PCP, increases the system's ability to handle a much larger e-consult demand. The full potential of e-consults can only be realized if the workflow at the outpatient clinics is designed or modified to support this initiative. This study furthers our understanding of how e-consult systems can be analyzed and alternative workflows tested using statistical and simulation modeling to improve care delivery and outcomes. © The Author(s) 2014.
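    The kind of analysis described above can be prototyped with a very small discrete-event simulation. The sketch below is purely illustrative and uses made-up arrival probabilities and handling times (not the study's VA clinic data); it shows how e-consult cycle time responds when a single PCP must interleave e-consults with higher-priority walk-ins and view-alert notifications.

    ```python
    import random
    from collections import deque

    random.seed(1)

    # Hypothetical per-minute arrival probabilities and mean handling times (minutes)
    P_ARRIVAL = {"walkin": 0.03, "alert": 0.10, "econsult": 0.05}
    MEAN_SVC = {"walkin": 15.0, "alert": 2.0, "econsult": 8.0}
    DAY = 8 * 60  # one clinic day of arrivals, in minutes

    def mean_econsult_cycle_time():
        queues = {k: deque() for k in P_ARRIVAL}
        busy_until, task, started = 0.0, None, 0
        cycle_times = []
        for t in range(2 * DAY):  # extra time so queues can drain
            if t < DAY:  # arrivals only during clinic hours
                for kind, p in P_ARRIVAL.items():
                    if random.random() < p:
                        queues[kind].append(t)
            if t >= busy_until:
                if task == "econsult":
                    cycle_times.append(t - started)  # cycle time = waiting + handling
                task = None
                # PCP priority: walk-ins first, then view alerts, then e-consults
                for kind in ("walkin", "alert", "econsult"):
                    if queues[kind]:
                        started = queues[kind].popleft()
                        busy_until = t + random.expovariate(1.0 / MEAN_SVC[kind])
                        task = kind
                        break
        return sum(cycle_times) / max(len(cycle_times), 1)

    print(f"mean e-consult cycle time ~ {mean_econsult_cycle_time():.1f} minutes")
    ```

    Raising the alert or walk-in arrival rates in this toy model lengthens the e-consult cycle time, which is qualitatively the effect the study quantifies with a far more detailed model.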

  2. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  3. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  4. Towards a workflow driven design for mHealth devices within temporary eye clinics in low-income settings.

    Science.gov (United States)

    Bolster, Nigel M; Bastawrous, Andrew; Giardini, Mario E

    2015-01-01

    Only a small minority of mobile healthcare technologies that have been successful in pilot studies have subsequently been integrated into healthcare systems. Understanding the reasons behind this discrepancy is crucial if such technologies are to be adopted. We believe that the mismatch is due to a breakdown in the relation between the technical soundness of the original mobile health (mHealth) device design and its integration into healthcare provision workflows. Quantitative workflow modelling provides an opportunity to test this hypothesis. In this paper we present our current progress in developing a clinical workflow model for mobile eye assessment in low-income settings. We test the model for determining the appropriateness of design parameters of an mHealth device within this workflow by assessing their impact on the entire clinical workflow performance.

  5. Implementation of a Workflow Management System for Non-Expert Users

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2016-01-01

    tools, the CLARIN-DK workflow management system (WMS) computes combinations of tools that will give the desired result. This advanced functionality was originally not envisaged, but came within reach by writing the WMS partly in Java and partly in a programming language for symbolic computation, Bracmat....... Handling LT tool profiles, including the computation of workflows, is easier with Bracmat's language constructs for tree pattern matching and tree construction than with the language constructs offered by mainstream programming languages....

  6. A new model for including the effect of fly ash on biochemical methane potential.

    Science.gov (United States)

    Gertner, Pablo; Huiliñir, César; Pinto-Villegas, Paula; Castillo, Alejandra; Montalvo, Silvio; Guerrero, Lorna

    2017-10-01

    The modelling of the effect of trace elements on anaerobic digestion, and specifically the effect of fly ash (FA), has been scarcely studied. Thus, the present work was aimed at the development of a new function that allows accumulated methane models to predict the effect of FA on the volume of methane accumulation. For this purpose, five fly ash concentrations (10, 25, 50, 250 and 500 mg/L) using raw and pre-treated sewage sludge were used to calibrate the new function, while three fly ash concentrations (40, 150 and 350 mg/L) were used for validation. Three models for accumulated methane volume (the modified Gompertz equation, the logistic function, and the transfer function) were evaluated. The results showed that methane production increased in the presence of FA when the sewage sludge was not pre-treated, while with pre-treated sludge there is inhibition of methane production at FA concentrations higher than 50 mg/L. In calibration, the proposed function fits the experimental data well under all conditions, including the inhibition and stimulating zones, with the values of the parameters of the methane production models falling in the range of those reported in the literature. For validation experiments, the model succeeded in representing the behavior of new experiments in both the stimulating and inhibiting zones, with NRMSE and R² ranging from 0.3577 to 0.03714 and from 0.2209 to 0.9911, respectively. Thus, the proposed model is robust and valid for the studied conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
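    The modified Gompertz curve named above gives cumulative methane as a function of the methane potential P, the maximum production rate Rm and the lag time λ. The sketch below simply evaluates that curve and scales P by a hypothetical fly-ash dose-response factor; the paper's actual correction function and fitted parameters are not reproduced here.

    ```python
    import math

    def modified_gompertz(t, P, Rm, lam):
        """Cumulative methane at time t (days): P = potential, Rm = max rate, lam = lag phase."""
        return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1.0))

    def fly_ash_factor(fa_mg_per_l, a=0.001, b=4e-6):
        """Hypothetical modifier: mild stimulation at low FA doses, inhibition at high doses."""
        return 1.0 + a * fa_mg_per_l - b * fa_mg_per_l ** 2

    for fa in (0, 10, 50, 250, 500):
        P_eff = 300.0 * fly_ash_factor(fa)  # assumed baseline potential, mL CH4/g VS
        v30 = modified_gompertz(30, P_eff, 20.0, 2.0)
        print(f"FA = {fa:3d} mg/L -> predicted methane at day 30: {v30:.0f} mL/g VS")
    ```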

  7. Models of epidemics: when contact repetition and clustering should be included

    Directory of Open Access Journals (Sweden)

    Scholz Roland W

    2009-06-01

    Full Text Available Abstract Background The spread of infectious disease is determined by biological factors, e.g. the duration of the infectious period, and social factors, e.g. the arrangement of potentially contagious contacts. Repetitiveness and clustering of contacts are known to be relevant factors influencing the transmission of droplet or contact transmitted diseases. However, we do not yet completely know under what conditions repetitiveness and clustering should be included for realistically modelling disease spread. Methods We compare two different types of individual-based models: One assumes random mixing without repetition of contacts, whereas the other assumes that the same contacts repeat day-by-day. The latter exists in two variants, with and without clustering. We systematically test and compare how the total size of an outbreak differs between these model types depending on the key parameters transmission probability, number of contacts per day, duration of the infectious period, different levels of clustering and varying proportions of repetitive contacts. Results The simulation runs under different parameter constellations provide the following results: The difference between both model types is highest for low numbers of contacts per day and low transmission probabilities. The number of contacts and the transmission probability have a higher influence on this difference than the duration of the infectious period. Even when only a minor part of the daily contacts is repetitive and clustered, there can be relevant differences compared to a purely random mixing model. Conclusion We show that random mixing models provide acceptable estimates of the total outbreak size if the number of contacts per day is high or if the per-contact transmission probability is high, as seen in typical childhood diseases such as measles. In the case of very short infectious periods, for instance, as in Norovirus, models assuming repeating contacts will also behave
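    A stripped-down version of the comparison described in the Methods can be coded in a few lines. The sketch below, with arbitrary toy parameters, contrasts a random-mixing variant with a repeated-contact variant (clustering is omitted for brevity) and reports the final outbreak size of each.

    ```python
    import random

    random.seed(0)
    N, K, P_TRANS, INF_DAYS, DAYS = 1000, 5, 0.05, 7, 120  # toy parameters, not from the paper

    def outbreak_size(repeat_contacts):
        # status: 0 susceptible, >0 remaining infectious days, -1 recovered
        status = [0] * N
        status[0] = INF_DAYS
        fixed = [random.sample(range(N), K) for _ in range(N)]  # fixed daily contacts per person
        for _ in range(DAYS):
            newly_infected = []
            for i in range(N):
                if status[i] > 0:
                    contacts = fixed[i] if repeat_contacts else random.sample(range(N), K)
                    for j in contacts:
                        if status[j] == 0 and random.random() < P_TRANS:
                            newly_infected.append(j)
            for i in range(N):
                if status[i] > 0:
                    status[i] -= 1
                    if status[i] == 0:
                        status[i] = -1
            for j in newly_infected:
                if status[j] == 0:
                    status[j] = INF_DAYS
        return sum(1 for s in status if s != 0)  # ever infected = infectious + recovered

    print("random mixing    :", outbreak_size(False))
    print("repeated contacts:", outbreak_size(True))
    ```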

  8. Particle-based modeling of heterogeneous chemical kinetics including mass transfer

    Science.gov (United States)

    Sengar, A.; Kuipers, J. A. M.; van Santen, Rutger A.; Padding, J. T.

    2017-08-01

    Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damkohler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.

  9. Analysis of electronic models for solar cells including energy resolved defect densities

    Energy Technology Data Exchange (ETDEWEB)

    Glitzky, Annegret

    2010-07-01

    We introduce an electronic model for solar cells including energy resolved defect densities. The resulting drift-diffusion model corresponds to a generalized van Roosbroeck system with additional source terms coupled with ODEs containing space and energy as parameters for all defect densities. The system has to be considered in heterostructures and with mixed boundary conditions from device simulation. We give a weak formulation of the problem. If the boundary data and the sources are compatible with thermodynamic equilibrium, the free energy along solutions decays monotonically. In other cases it may be increasing, but we estimate its growth. We establish boundedness and uniqueness results and prove the existence of a weak solution. This is done by considering a regularized problem, showing its solvability and the boundedness of its solutions independent of the regularization level. (orig.)

  10. Particle-based modeling of heterogeneous chemical kinetics including mass transfer.

    Science.gov (United States)

    Sengar, A; Kuipers, J A M; van Santen, Rutger A; Padding, J T

    2017-08-01

    Connecting the macroscopic world of continuous fields to the microscopic world of discrete molecular events is important for understanding several phenomena occurring at physical boundaries of systems. An important example is heterogeneous catalysis, where reactions take place at active surfaces, but the effective reaction rates are determined by transport limitations in the bulk fluid and reaction limitations on the catalyst surface. In this work we study the macro-micro connection in a model heterogeneous catalytic reactor by means of stochastic rotation dynamics. The model is able to resolve the convective and diffusive interplay between participating species, while including adsorption, desorption, and reaction processes on the catalytic surface. Here we apply the simulation methodology to a simple straight microchannel with a catalytic strip. Dimensionless Damkohler numbers are used to comment on the spatial concentration profiles of reactants and products near the catalyst strip and in the bulk. We end the discussion with an outlook on more complicated geometries and increasingly complex reactions.

  11. Effect of including decay chains on predictions of equilibrium-type terrestrial food chain models

    International Nuclear Information System (INIS)

    Kirchner, G.

    1990-01-01

    Equilibrium-type food chain models are commonly used for assessing the radiological impact to man from environmental releases of radionuclides. Usually these do not take into account build-up of radioactive decay products during environmental transport. This may be a potential source of underprediction. For estimating consequences of this simplification, the equations of an internationally recognised terrestrial food chain model have been extended to include decay chains of variable length. Example calculations show that for releases from light water reactors as expected both during routine operation and in the case of severe accidents, the build-up of decay products during environmental transport is generally of minor importance. However, a considerable number of radionuclides of potential radiological significance have been identified which show marked contributions of decay products to calculated contamination of human food and resulting radiation dose rates. (author)

  12. A temperature dependent cyclic plasticity model for hot work tool steel including particle coarsening

    Science.gov (United States)

    Jilg, Andreas; Seifert, Thomas

    2018-05-01

    Hot work tools are subjected to complex thermal and mechanical loads during hot forming processes. Locally, the stresses can exceed the material's yield strength in highly loaded areas as e.g. in small radii in die cavities. To sustain the high loads, the hot forming tools are typically made of martensitic hot work steels. While temperatures for annealing of the tool steels usually lie in the range between 400 and 600 °C, the steels may experience even higher temperatures during hot forming, resulting in softening of the material due to coarsening of strengthening particles. In this paper, a temperature dependent cyclic plasticity model for the martensitic hot work tool steel 1.2367 (X38CrMoV5-3) is presented that includes softening due to particle coarsening and that can be applied in finite-element calculations to assess the effect of softening on the thermomechanical fatigue life of hot work tools. To this end, a kinetic model for the evolution of the mean size of secondary carbides based on Ostwald ripening is coupled with a cyclic plasticity model with kinematic hardening. Mechanism-based relations are developed to describe the dependency of the mechanical properties on carbide size and temperature. The material properties of the mechanical and kinetic model are determined on the basis of tempering hardness curves as well as monotonic and cyclic tests.
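    The coupling of carbide coarsening to softening can be summarized, in a generic form that only illustrates the structure of such models (the paper's exact relations and constants are not reproduced), by a Lifshitz-Slyozov-Wagner-type growth law with Arrhenius temperature dependence and a particle-strengthening term that decays as the mean radius grows:

    ```latex
    % LSW-type Ostwald ripening of the mean carbide radius \bar{r}
    \bar{r}^{3}(t) - \bar{r}^{3}(t_0) = k(T)\,(t - t_0),
    \qquad
    k(T) = k_0 \exp\!\left(-\frac{Q}{R T}\right)

    % Illustrative particle-strengthening contribution (Orowan-type scaling)
    \sigma_{\mathrm{p}}(t) \propto \frac{1}{\bar{r}(t)}
    ```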

  13. Fluid-structure interaction including volumetric coupling with homogenised subdomains for modeling respiratory mechanics.

    Science.gov (United States)

    Yoshihara, Lena; Roth, Christian J; Wall, Wolfgang A

    2017-04-01

    In this article, a novel approach is presented for combining standard fluid-structure interaction with additional volumetric constraints to model fluid flow into and from homogenised solid domains. The proposed algorithm is particularly interesting for investigations in the field of respiratory mechanics as it enables the mutual coupling of airflow in the conducting part and local tissue deformation in the respiratory part of the lung by means of a volume constraint. In combination with a classical monolithic fluid-structure interaction approach, a comprehensive model of the human lung can be established that will be useful to gain new insights into respiratory mechanics in health and disease. To illustrate the validity and versatility of the novel approach, three numerical examples including a patient-specific lung model are presented. The proposed algorithm proves its capability of computing clinically relevant airflow distribution and tissue strain data at a level of detail that is not yet achievable, neither with current imaging techniques nor with existing computational models. Copyright © 2016 John Wiley & Sons, Ltd.

  14. A generalized model for optimal transport of images including dissipation and density modulation

    KAUST Repository

    Maas, Jan

    2015-11-01

    © EDP Sciences, SMAI 2015. In this paper the optimal transport and the metamorphosis perspectives are combined. For a pair of given input images, geodesic paths in the space of images are defined as minimizers of a resulting path energy. To this end, the underlying Riemannian metric measures the rate of transport cost and the rate of viscous dissipation. Furthermore, the model is capable of dealing with strongly varying image contrast and explicitly allows for sources and sinks in the transport equations, which are incorporated in the metric related to the metamorphosis approach by Trouvé and Younes. In the non-viscous case with a source term, existence of geodesic paths is proven in the space of measures. The proposed model is explored on the range from merely optimal transport to strongly dissipative dynamics. For this model a robust and effective variational time discretization of geodesic paths is proposed. This requires minimizing a discrete path energy consisting of a sum of consecutive image matching functionals. These functionals are defined on corresponding pairs of intensity functions and on associated pairwise matching deformations. Existence of time discrete geodesics is demonstrated. Furthermore, a finite element implementation is proposed and applied to instructive test cases and to real images. In the non-viscous case this is compared to the algorithm proposed by Benamou and Brenier including a discretization of the source term. Finally, the model is generalized to define discrete weighted barycentres with applications to textures and objects.

  15. Habitability of super-Earth planets around other suns: models including Red Giant Branch evolution.

    Science.gov (United States)

    von Bloh, W; Cuntz, M; Schröder, K-P; Bounama, C; Franck, S

    2009-01-01

    The unexpected diversity of exoplanets includes a growing number of super-Earth planets, i.e., exoplanets with masses of up to several Earth masses and a similar chemical and mineralogical composition as Earth. We present a thermal evolution model for a 10 Earth-mass planet orbiting a star like the Sun. Our model is based on the integrated system approach, which describes the photosynthetic biomass production and takes into account a variety of climatological, biogeochemical, and geodynamical processes. This allows us to identify a so-called photosynthesis-sustaining habitable zone (pHZ), as determined by the limits of biological productivity on the planetary surface. Our model considers solar evolution during the main-sequence stage and along the Red Giant Branch as described by the most recent solar model. We obtain a large set of solutions consistent with the principal possibility of life. The highest likelihood of habitability is found for "water worlds." Only mass-rich water worlds are able to realize pHZ-type habitability beyond the stellar main sequence on the Red Giant Branch.

  16. Empirical Validation of a Thermal Model of a Complex Roof Including Phase Change Materials

    Directory of Open Access Journals (Sweden)

    Stéphane Guichard

    2015-12-01

    Full Text Available This paper deals with the empirical validation of a building thermal model of a complex roof including a phase change material (PCM). A mathematical model dedicated to PCMs based on the apparent heat capacity method was implemented in a multi-zone building simulation code, the aim being to increase the understanding of the thermal behavior of the whole building with PCM technologies. In order to empirically validate the model, the methodology is based both on numerical and experimental studies. A parametric sensitivity analysis was performed and a set of parameters of the thermal model has been identified for optimization. The use of the generic optimization program GenOpt®, coupled to the building simulation code, made it possible to determine an adequate set of parameters. We first present the empirical validation methodology and main results of previous work. We then give an overview of GenOpt® and its coupling with the building simulation code. Finally, once the optimization results are obtained, comparisons of the thermal predictions with measurements are found to be acceptable and are presented.
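    The apparent heat capacity method mentioned above folds the latent heat of the PCM into an effective capacity over the melting range. A common illustrative form (not necessarily the one used in the paper) is a smooth liquid-fraction function whose derivative carries the latent heat:

    ```latex
    c_{\mathrm{app}}(T) = c_p(T) + L \, \frac{\mathrm{d}f_{\ell}}{\mathrm{d}T},
    \qquad
    f_{\ell}(T) = \frac{1}{2}\left[1 + \tanh\!\left(\frac{T - T_m}{\Delta T}\right)\right]
    ```

    Here L is the latent heat of fusion, T_m the melting temperature, and ΔT the half-width of the melting range; the single-field energy equation then simply uses c_app in place of c_p.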

  17. Standardized Competencies for Parenteral Nutrition Order Review and Parenteral Nutrition Preparation, Including Compounding: The ASPEN Model.

    Science.gov (United States)

    Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi

    2016-08-01

    Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.

  18. Analytical and numerical modelling of thermoviscous shocks in their interactions in nonlinear fluids including dissipation

    DEFF Research Database (Denmark)

    Rasmussen, Anders Rønne; Sørensen, Mads Peter; Gaididei, Yuri Borisovich

    2010-01-01

    A wave equation that governs finite amplitude acoustic disturbances in a thermoviscous Newtonian fluid and includes nonlinear terms up to second order is proposed. The equation preserves the Hamiltonian structure of the fundamental fluid dynamical equations in the non-dissipative limit. An exact...... thermoviscous shock solution is derived. This solution is, in an overall sense, equivalent to the Taylor shock solution of the Burgers equation. However, in contrast to the Burgers equation, the model equation considered here is capable of describing waves propagating in opposite directions. Studies of head

  19. A model for Huanglongbing spread between citrus plants including delay times and human intervention

    Science.gov (United States)

    Vilamiu, Raphael G. d'A.; Ternes, Sonia; Braga, Guilherme A.; Laranjeira, Francisco F.

    2012-09-01

    The objective of this work was to present a compartmental deterministic mathematical model for representing the dynamics of HLB disease in a citrus orchard, including a delay for the disease's incubation phase in the plants and a delay for the nymphal stage of Diaphorina citri, the most important HLB insect vector in Brazil. Numerical simulations were performed to assess the possible impacts of human detection efficiency of symptomatic plants, as well as the influence of a long incubation period of HLB in the plant.
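    Purely as an illustration of the structure such a model can take (these are not the authors' equations), a delayed compartmental formulation with susceptible, incubating and symptomatic plants, an incubation delay τ, and removal of detected symptomatic trees at rate ρ might read:

    ```latex
    \frac{\mathrm{d}S}{\mathrm{d}t} = -\beta\, S(t)\, V(t), \qquad
    \frac{\mathrm{d}E}{\mathrm{d}t} = \beta\, S(t)\, V(t) - \beta\, S(t-\tau)\, V(t-\tau), \qquad
    \frac{\mathrm{d}I}{\mathrm{d}t} = \beta\, S(t-\tau)\, V(t-\tau) - \rho\, I(t)
    ```

    where V(t) is the infective vector density; a second delay on the nymphal stage of the vector would enter the equation governing V(t) in the same way.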

  20. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    Science.gov (United States)

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
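    The grammars compared in the paper are far richer than can be shown here, but the basic idea of dynamic programming over nested base pairs is captured by the classical Nussinov base-pair maximization recursion, sketched below as a self-contained toy (it ignores stacking energies and all nearest-neighbor features):

    ```python
    def nussinov_pairs(seq, min_loop=3):
        """Maximum number of nested Watson-Crick/GU pairs (Nussinov recursion)."""
        pair = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = dp[i + 1][j]                        # base i left unpaired
                for k in range(i + min_loop + 1, j + 1):   # base i pairs with some k
                    if (seq[i], seq[k]) in pair:
                        left = dp[i + 1][k - 1]
                        right = dp[k + 1][j] if k + 1 <= j else 0
                        best = max(best, 1 + left + right)
                dp[i][j] = best
        return dp[0][n - 1]

    print(nussinov_pairs("GGGAAAUCC"))   # expect 3 pairs for this toy hairpin
    ```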

  1. Including sugar cane in the agro-ecosystem model ORCHIDEE-STICS

    Science.gov (United States)

    Valade, A.; Vuichard, N.; Ciais, P.; Viovy, N.

    2010-12-01

    With 4 million ha currently grown for ethanol in Brazil only, approximately half the global bioethanol production in 2005 (Smeets 2008), and a devoted land area expected to expand globally in the years to come, sugar cane is at the heart of the biofuel debate. Indeed, ethanol made from biomass is currently the most widespread option for alternative transportation fuels. It was originally promoted as a carbon neutral energy resource that could bring energy independence to countries and local opportunities to farmers, until attention was drawn to its environmental and socio-economical drawbacks. It is still not clear to which extent it is a solution or a contributor to climate change mitigation. Dynamic Global Vegetation models can help address these issues and quantify the potential impacts of biofuels on ecosystems at scales ranging from on-site to global. The global agro-ecosystem model ORCHIDEE describes water, carbon and energy exchanges at the soil-atmosphere interface for a limited number of natural and agricultural vegetation types. In order to integrate agricultural management to the simulations and to capture more accurately the specificity of crops' phenology, ORCHIDEE has been coupled with the agronomical model STICS. The resulting crop-oriented vegetation model ORCHIDEE-STICS has been used so far to simulate temperate crops such as wheat, corn and soybean. As a generic ecosystem model, each grid cell can include several vegetation types with their own phenology and management practices, making it suitable to spatial simulations. Here, ORCHIDEE-STICS is altered to include sugar cane as a new agricultural Plant functional Type, implemented and parametrized using the STICS approach. An on-site calibration and validation is then performed based on biomass and flux chamber measurements in several sites in Australia and variables such as LAI, dry weight, heat fluxes and respiration are used to evaluate the ability of the model to simulate the specific

  2. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today’s energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures to plan the path of the borehole, is crucial in order to minimize economic and ecological risks. Before that, the placement, as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: First, the structures in highly ambiguous seismic data are interpreted in the time domain. Second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or in an unphysical velocity model in many cases. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, by integrating the interpretation of the seismic data using an interactive horizon extraction technique based on piecewise global optimization with velocity modeling. Computing and visualizing the effects of changes to the interpretation and velocity model on the depth-converted model, on the fly enables an integrated feedback loop that enables a completely new connection of the seismic data in time domain, and well data in depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  3. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    Science.gov (United States)

    de Carvalho, Elias Cesar Araujo; Batilana, Adelia Portero; Claudino, Wederson; Reis, Luiz Fernando Lima; Schmerling, Rafael A; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    With the exponential expansion of clinical trials conducted in (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Two sites in Sao Paolo and Rio Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow evaluation of potential research subjects prior to signature of informed consent, visit to obtain subject́s informed consent, regular data collection sessions following study protocol and closure of study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of standardized process for data registration at source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with policy model demonstrates a reduction of the rework problem. Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. The clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and manage professionals to reduce near miss events and save time/cost. Clinical trial

  4. Web-accessible molecular modeling with Rosetta: The Rosetta Online Server that Includes Everyone (ROSIE).

    Science.gov (United States)

    Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J

    2018-01-01

    The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format, one that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping small molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.

  5. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    Science.gov (United States)

    Hartman, Douglas J

    2015-06-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance/improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in the diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for increased ways to enhance/improve the workflow for surgical pathologists. Additional unforeseen changes to the pathologist workflow may accompany the introduction of whole-slide imaging technology to the routine diagnostic work. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Formalizing an integrative, multidisciplinary cancer therapy discovery workflow

    Science.gov (United States)

    McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy

    2014-01-01

    Although many clinicians and researchers work to understand cancer, there has been limited success to effectively combine forces and collaborate over time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390

  7. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
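    As a toy illustration of the attribute-mappings-plus-filters idea (the paper's specification language is not reproduced here), a transformation's logical provenance can be declared as a map from output attributes to the input attributes they derive from, plus a predicate selecting the contributing input tuples; backward tracing then just applies both:

    ```python
    # Toy backward provenance tracing over attribute mappings and filters.
    # Hypothetical spec format; the paper's language is richer than this sketch.
    spec = {
        "T1": {"maps": {"region": "city", "total": "amount"},
               "filter": lambda row: row["amount"] > 0},
    }

    def trace(transform, output_attr, input_rows):
        """Return the source attribute and the input rows contributing to an output attribute."""
        s = spec[transform]
        src_attr = s["maps"].get(output_attr)
        rows = [r for r in input_rows if s["filter"](r)]
        return src_attr, rows

    inputs = [{"city": "Oslo", "amount": 10}, {"city": "Lund", "amount": -2}]
    print(trace("T1", "total", inputs))  # -> ('amount', [{'city': 'Oslo', 'amount': 10}])
    ```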

  8. ESO Reflex: a graphical workflow engine for data reduction

    Science.gov (United States)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.

  9. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    Science.gov (United States)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net} that was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes ( VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/} and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to make changes to these or create workflows of their own. Python scripts and IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available.

  10. MHD model including small-scale perturbations in a plasma with temperature variations

    International Nuclear Information System (INIS)

    Kuvshinov, B.N.; Mikhailovskii, A.B.

    1996-01-01

    The possibility is studied of using a hydrodynamic model to describe a magnetized plasma with density and temperature variations on scales that are arbitrary with respect to the ion Larmor radius. It is shown that the inertial component of the transverse ion thermal flux should be taken into account. This component is found from the collisionless kinetic equation. It can also be obtained from the equations of the Grad type. A set of two-dimensional hydrodynamic equations for ions is obtained with this component taken into account. These equations are used to derive model hydrodynamic expressions for the density and temperature variations. It is shown that, for large-scale perturbations (when the wavelengths are longer than the ion Larmor radius), the expressions derived coincide with the corresponding kinetic expressions and, for perturbations on sub-Larmor scales (when the wavelengths are shorter than the Larmor radius), they agree qualitatively. Hydrodynamic dispersion relations are derived for several types of drift waves with arbitrary wavenumbers. The range of applicability of the MHD model is determined from a comparison of these dispersion relations with the kinetic ones. It is noted that, on the basis of results obtained, drift effects can be included in numerical MHD codes for studying plasma instabilities in high-temperature regimes in tokamaks

  11. Integrated Sachs-Wolfe effect in a quintessence cosmological model: Including anisotropic stress of dark energy

    International Nuclear Information System (INIS)

    Wang, Y. T.; Xu, L. X.; Gui, Y. X.

    2010-01-01

    In this paper, we investigate the integrated Sachs-Wolfe effect in the quintessence cold dark matter model with constant equation of state and constant speed of sound in the dark energy rest frame, including dark energy perturbations and their anisotropic stress. Compared with the ΛCDM model, we find that the integrated Sachs-Wolfe (ISW) power spectra are affected by the different background evolutions and by dark energy perturbations. As we change the speed of sound from 1 to 0 in the quintessence cold dark matter model with given state parameters, it is found that the inclusion of dark energy anisotropic stress makes the variation of the magnitude of the ISW source uncertain, due to the anticorrelation between the speed of sound and the ratio of the dark energy density contrast to the dark matter density contrast in the ISW source term. Thus, the magnitude of the ISW source term is governed by the competition between the change in the factor (1 + (3/2)ĉ_s²) and the change in the ratio δ_de/δ_m as ĉ_s² varies.

  12. A Hydrological Concept including Lateral Water Flow Compatible with the Biogeochemical Model ForSAFE

    Directory of Open Access Journals (Sweden)

    Giuliana Zanchi

    2016-03-01

    Full Text Available The study presents a hydrology concept developed to include lateral water flow in the biogeochemical model ForSAFE. The hydrology concept was evaluated against data collected at Svartberget in the Vindeln Research Forest in Northern Sweden. The results show that the new concept allows simulation of a saturated and an unsaturated zone in the soil as well as water flow that reaches the stream comparable to measurements. The most relevant differences compared to streamflow measurements are that the model simulates a higher base flow in winter and lower flow peaks after snowmelt. These differences are mainly caused by the assumptions made to regulate the percolation at the bottom of the simulated soil columns. The capability for simulating lateral flows and a saturated zone in ForSAFE can greatly improve the simulation of chemical exchange in the soil and export of elements from the soil to watercourses. Such a model can help improve the understanding of how environmental changes in the forest landscape will influence chemical loads to surface waters.

  13. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    Science.gov (United States)

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as an opportunity and a risk for clinical workflows, health IT must undergo a continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means for providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. Their hospitals were assigned to reference groups of similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project.

  14. Adaptation of a Bioinformatics Microarray Analysis Workflow for a Toxicogenomic Study in Rainbow Trout.

    Directory of Open Access Journals (Sweden)

    Sophie Depiereux

    Full Text Available Sex steroids play a key role in triggering sex differentiation in fish, the use of exogenous hormone treatment leading to partial or complete sex reversal. This phenomenon has attracted attention since the discovery that even low environmental doses of exogenous steroids can adversely affect gonad morphology (ovotestis development) and induce reproductive failure. Modern genomic-based technologies have enhanced opportunities to find out mechanisms of action (MOA) and identify biomarkers related to the toxic action of a compound. However, high throughput data interpretation relies on statistical analysis, species genomic resources, and bioinformatics tools. The goals of this study are to improve the knowledge of feminisation in fish by the analysis of molecular responses in the gonads of rainbow trout fry after chronic exposure to several doses (0.01, 0.1, 1 and 10 μg/L) of ethynylestradiol (EE2) and to offer target genes as potential biomarkers of ovotestis development. We successfully adapted a bioinformatics microarray analysis workflow elaborated on human data to a toxicogenomic study using rainbow trout, a fish species lacking accurate functional annotation and genomic resources. The workflow allowed us to obtain lists of genes supposed to be enriched in true positive differentially expressed genes (DEGs), which were subjected to over-representation analysis methods (ORA). Several pathways and ontologies, mostly related to cell division and metabolism, sexual reproduction and steroid production, were found significantly enriched in our analyses. Moreover, two sets of potential ovotestis biomarkers were selected using several criteria. The first group displayed specific potential biomarkers belonging to pathways/ontologies highlighted in the experiment. Among them, the early ovarian differentiation gene foxl2a was overexpressed. The second group, which was highly sensitive but not specific, included the DEGs presenting the highest fold change and

  15. Modelling and control of a microgrid including photovoltaic and wind generation

    Science.gov (United States)

    Hussain, Mohammed Touseef

    The extensive increase of distributed generation (DG) penetration and the existence of multiple DG units at the distribution level have introduced the notion of the micro-grid. This thesis develops a detailed non-linear and small-signal dynamic model of a microgrid that includes PV, wind and conventional small scale generation along with their power electronics interfaces and the filters. The models developed evaluate the amount of generation mix from various DGs for satisfactory steady state operation of the microgrid. In order to understand the interaction of the DGs with the microgrid system, initially two simpler configurations were considered. The first one consists of a microalternator, PV and their electronics, and the second system consists of a microalternator and a wind system, each connected to the power system grid. Nonlinear and linear state space models of each microgrid are developed. Small signal analysis showed that large participation of PV/wind can drive the microgrid to the brink of the unstable region without adequate control. Non-linear simulations are carried out to verify the results obtained through small-signal analysis. The role of the extent of generation mix of a composite microgrid consisting of wind, PV and conventional generation was investigated next. The findings of the smaller systems were verified through nonlinear and small signal modeling. A central supervisory capacitor energy storage controller interfaced through a STATCOM was proposed to monitor and enhance the microgrid operation. The potential of various control inputs to provide additional damping to the system has been evaluated through decomposition techniques. The signals identified to have damping contents were employed to design the supervisory control system. The controller gains were tuned through an optimal pole placement technique. Simulation studies demonstrate that the STATCOM voltage phase angle and PV inverter phase angle were the best inputs for enhanced stability boundaries.
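    The small-signal analysis referred to above follows the standard pattern of linearizing the nonlinear model about an operating point and inspecting the eigenvalues of the state matrix; in generic form (not specific to this thesis):

    ```latex
    \Delta\dot{x} = A\,\Delta x + B\,\Delta u,
    \qquad
    A = \left.\frac{\partial f}{\partial x}\right|_{(x_0,\,u_0)},\quad
    B = \left.\frac{\partial f}{\partial u}\right|_{(x_0,\,u_0)},
    \qquad
    \text{stability} \iff \operatorname{Re}\,\lambda_i(A) < 0 \;\;\forall i
    ```

    Increasing the PV/wind share changes the entries of A and can move some eigenvalues toward the right half-plane, which is consistent with the finding above that large PV/wind participation can push the microgrid toward the unstable region.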

  16. A surplus production model including environmental effects: Application to the Senegalese white shrimp stocks

    Science.gov (United States)

    Thiaw, Modou; Gascuel, Didier; Jouffre, Didier; Thiaw, Omar Thiom

    2009-12-01

    In Senegal, two stocks of white shrimp (Penaeus notialis) are intensively exploited, one in the north and another in the south. We used surplus production models including environmental effects to analyse their changes in abundance over the past 10 years and to estimate their Maximum Sustainable Yield (MSY) and the related fishing effort (EMSY). First, yearly abundance indices were estimated from commercial statistics using GLM techniques. Then, two environmental indices were alternatively tested in the model: the coastal upwelling intensity from wind speeds provided by the SeaWiFS database and the primary production derived from satellite infrared images of chlorophyll a. Models were fitted, with or without the environmental effect, to the 1996-2005 time series. They express stock abundance and catches as functions of the fishing effort and the environmental index (when considered). For the northern stock, fishing effort and abundance fluctuate over the period without any clear trends. The model based on the upwelling index explains 64.9% of the year-to-year variability. It shows that the stock was slightly overexploited in 2002-2003 and is now close to full exploitation. Stock abundance strongly depends on environmental conditions; consequently, the MSY estimate varies from 300 to 900 tons according to the upwelling intensity. For the southern stock, fishing effort has strongly increased over the past 10 years, while abundance has been reduced 4-fold. The environment has a significant effect on abundance but only explains a small part of the year-to-year variability. The best fit is obtained using the primary production index (R² = 0.75), and the stock is now significantly overfished regardless of environmental conditions. MSY varies from 1200 to 1800 tons according to environmental conditions. Finally, in northern Senegal, the upwelling is highly variable from year to year and constitutes the major factor determining productivity. In the south, hydrodynamic
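    A generic Schaefer-type surplus production model with a multiplicative environmental effect on carrying capacity conveys the structure of such models; the sketch below is only illustrative (the parameters, the exact functional form and the fitting procedure used in the paper are not reproduced):

    ```python
    def schaefer_with_environment(B0, r, K, q, efforts, env_index, gamma=1.0):
        """Biomass dynamics B[t+1] = B + r*B*(1 - B/K_t) - C_t, with K_t = K*env**gamma and C_t = q*E_t*B."""
        B, biomass, catches = B0, [], []
        for E, V in zip(efforts, env_index):
            K_t = K * V ** gamma          # environment scales the carrying capacity
            C = q * E * B                 # catch under fishing effort E
            B = max(B + r * B * (1.0 - B / K_t) - C, 1e-6)
            biomass.append(B)
            catches.append(C)
        return biomass, catches

    # Toy run with assumed values: steadily rising effort, fluctuating environmental index
    efforts = [100 + 20 * t for t in range(10)]
    env_index = [1.0, 1.2, 0.8, 1.1, 0.9, 1.0, 1.3, 0.7, 1.0, 1.1]
    biomass, catches = schaefer_with_environment(2000.0, 0.8, 3000.0, 0.001, efforts, env_index)
    print([round(b) for b in biomass])
    ```

    For this Schaefer form, MSY = r*K_t/4 and E_MSY = r/(2q), so the MSY estimate moves with the environmental index, mirroring the environment-dependent MSY ranges reported above.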

  17. A Prudent Approach to Fair Use Workflow

    Directory of Open Access Journals (Sweden)

    Karey Patterson

    2018-02-01

    Full Text Available This poster will outline a new highly efficient workflow for the management of copyright materials that is prudent and accommodates generally and legally accepted Fair Use limits. The workflow gives library or copyright staff an easy means to keep on top of their copyright obligations, manage licenses, and review and adjust schedules, while remaining efficient enough to cope with large numbers of requests to use materials. The poster details speed and efficiency gains for professors and library staff while reducing legal exposure.

  18. SwinDeW-C: A Peer-to-Peer Based Cloud Workflow System

    Science.gov (United States)

    Liu, Xiao; Yuan, Dong; Zhang, Gaofeng; Chen, Jinjun; Yang, Yun

    Workflow systems are designed to support the process automation of large-scale business and scientific applications. In recent years, many workflow systems have been deployed on high performance computing infrastructures such as cluster, peer-to-peer (p2p), and grid computing (Moore, 2004; Wang, Jie, & Chen, 2009; Yang, Liu, Chen, Lignier, & Jin, 2007). One of the driving forces is the increasing demand for large-scale instance- and data/computation-intensive workflow applications (large-scale workflow applications for short), which are common in both eBusiness and eScience application areas. Typical examples (detailed in Section 13.2.1) include the transaction-intensive nation-wide insurance claim application process and the data- and computation-intensive pulsar searching process in astrophysics. Generally speaking, instance-intensive applications are processes that need to be executed a large number of times sequentially within a very short period, or concurrently with a large number of instances (Liu, Chen, Yang, & Jin, 2008; Liu et al., 2010; Yang et al., 2008). Therefore, large-scale workflow applications normally require the support of high performance computing infrastructures (e.g. advanced CPU units, large memory space and high-speed networks), especially when the workflow activities are themselves data and computation intensive. In the real world, to accommodate such requirements, expensive computing infrastructure such as supercomputers and data servers is bought, installed, integrated and maintained at huge cost by system users

  19. A rational workflow for sequential virtual screening of chemical libraries on searching for new tyrosinase inhibitors.

    Science.gov (United States)

    Le-Thi-Thu, Huong; Casanola-Martín, Gerardo M; Marrero-Ponce, Yovani; Rescigno, Antonio; Abad, Concepcion; Khan, Mahmud Tareq Hassan

    2014-01-01

    Tyrosinase is a bifunctional, copper-containing enzyme widely distributed in the phylogenetic tree. This enzyme is involved in the production of melanin and some other pigments in humans, animals and plants, including skin pigmentation in mammals and the browning process in plants and vegetables. Tyrosinase inhibitors have therefore attracted the attention of the scientific community, owing to their broad applications in the food, cosmetic, agricultural and medicinal fields for avoiding the undesirable effects of abnormal melanin overproduction. However, the search for novel chemicals with antityrosinase activity demands more efficient tools to speed up the tyrosinase inhibitor discovery process. This chapter focuses on the different components of a predictive modeling workflow for the identification and prioritization of potential new compounds with activity against the tyrosinase enzyme. Two chemical structure libraries, the Spectrum Collection and DrugBank, are used in this attempt to combine different virtual screening and data mining techniques in a sequential manner, helping to avoid the usually expensive and time-consuming traditional methods. The sequential steps summarized here comprise drug-likeness filters, similarity searching, classification and potency QSAR multi-classifier systems, molecular interaction modeling, and similarity/diversity analysis. Finally, the methodologies shown here provide a rational workflow for virtual screening hit analysis and selection, a promising drug discovery strategy for use in the target identification phase.
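
    As an illustration of the first two stages named above (drug-likeness filtering followed by similarity searching), the sketch below uses RDKit with a Lipinski-style filter and Morgan-fingerprint Tanimoto similarity. The SMILES strings, cut-offs and function names are illustrative assumptions and do not reproduce the chapter's actual workflow or libraries.

```python
# Hedged sketch of the early stages of a sequential virtual screening
# workflow: a Lipinski drug-likeness filter followed by Tanimoto similarity
# searching against a known tyrosinase inhibitor (kojic acid). The library
# entries and the similarity cut-off are placeholders.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem, Descriptors, Lipinski

def passes_lipinski(mol):
    return (Descriptors.MolWt(mol) <= 500 and Descriptors.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5 and Lipinski.NumHAcceptors(mol) <= 10)

def screen(library_smiles, reference_smiles, sim_cutoff=0.3):
    ref = Chem.MolFromSmiles(reference_smiles)
    ref_fp = AllChem.GetMorganFingerprintAsBitVect(ref, 2, nBits=2048)
    hits = []
    for smi in library_smiles:
        mol = Chem.MolFromSmiles(smi)
        if mol is None or not passes_lipinski(mol):
            continue                              # drop unparsable or non-drug-like entries
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        sim = DataStructs.TanimotoSimilarity(ref_fp, fp)
        if sim >= sim_cutoff:
            hits.append((smi, sim))               # candidates passed on to the QSAR classifiers
    return sorted(hits, key=lambda x: -x[1])

# Kojic acid as the reference inhibitor; the two library molecules are made up.
print(screen(["c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O"], "OCC1=CC(=O)C(O)=CO1"))
```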

  20. Deconstruction of Interhospital Transfer Workflow in Large Vessel Occlusion: Real-World Data in the Thrombectomy Era.

    Science.gov (United States)

    Ng, Felix C; Low, Essie; Andrew, Emily; Smith, Karen; Campbell, Bruce C V; Hand, Peter J; Crompton, Douglas E; Wijeratne, Tissa; Dewey, Helen M; Choi, Philip M

    2017-07-01

    Interhospital transfer is a critical component in the treatment of acute anterior circulation large vessel occlusive stroke patients transferred for mechanical thrombectomy. Real-world data for benchmarking and theoretical modeling are limited. We sought to characterize the transfer workflow from primary stroke center (PSC) to comprehensive stroke center after the publication of positive thrombectomy trials. Consecutive patients transferred from 3 high-volume PSCs to a single comprehensive stroke center between January 2015 and August 2016 were included in a retrospective study. Factors associated with key time metrics were analyzed, with emphasis on PSC intrahospital workflow. Sixty-seven patients were identified. Median age was 74 years (interquartile range [IQR], 63.5-78) and median National Institutes of Health Stroke Scale score was 17 (IQR, 12-21). Median transfer time, measured from PSC door to comprehensive stroke center door, was 128 minutes (IQR, 107-164), of which 82.8% was spent at PSCs (door-in-door-out [DIDO]; 106 minutes; IQR, 86-143). The lengthiest component of DIDO was computed-tomography-to-retrieval-request (median 59.5 minutes; IQR, 44-83). 37.3% of patients had DIDO exceeding 120 minutes. DIDO times differed significantly between PSCs (P=0.01). In multivariate analyses, rerecruiting the initial ambulance crew for transfer (P workflow represents a major opportunity to expedite mechanical thrombectomy and improve patient outcomes. © 2017 American Heart Association, Inc.

  1. 3-D FEM Modeling of fiber/matrix interface debonding in UD composites including surface effects

    International Nuclear Information System (INIS)

    Pupurs, A; Varna, J

    2012-01-01

    Fiber/matrix interface debond growth is one of the main mechanisms of damage evolution in unidirectional (UD) polymer composites. Because, in polymer composites, the fiber strain to failure is smaller than that of the matrix, multiple fiber breaks occur at random positions when high mechanical stress is applied to the composite. The energy released by each fiber break is usually larger than that needed to create the break itself; therefore, partial debonding of the fiber/matrix interface is typically observed. Thus the stiffness reduction of the UD composite results both from the fiber breaks and from the interface debonds. The aim of this paper is to analyze debond growth in carbon fiber/epoxy and glass fiber/epoxy UD composites using fracture mechanics principles, by calculating the energy release rate G_II. A 3-D FEM model is developed for calculating the energy release rate of fiber/matrix interface debonds at different locations in the composite, including the composite surface region where the stress state differs from that in the bulk composite. In the model, an individual partially debonded fiber is surrounded by a matrix region and embedded in a homogenized composite.

  2. Challenges of including nitrogen effects on decomposition in earth system models

    Science.gov (United States)

    Hobbie, S. E.

    2011-12-01

    Despite the importance of litter decomposition for ecosystem fertility and carbon balance, key uncertainties remain about how this fundamental process is affected by nitrogen (N) availability. Nevertheless, resolving such uncertainties is critical for mechanistic inclusion of such processes in earth system models, towards predicting the ecosystem consequences of increased anthropogenic reactive N. Towards that end, we have conducted a series of experiments examining nitrogen effects on litter decomposition. We found that both substrate N and externally supplied N (regardless of form) accelerated the initial decomposition rate. Faster initial decomposition rates were linked to the higher activity of carbohydrate-degrading enzymes associated with externally supplied N and the greater relative abundances of Gram negative and Gram positive bacteria associated with green leaves and externally supplied organic N (assessed using phospholipid fatty acid analysis, PLFA). By contrast, later in decomposition, externally supplied N slowed decomposition, increasing the fraction of slowly decomposing litter and reducing lignin-degrading enzyme activity and relative abundances of Gram negative and Gram positive bacteria. Our results suggest that elevated atmospheric N deposition may have contrasting effects on the dynamics of different soil carbon pools, decreasing mean residence times of active fractions comprising very fresh litter, while increasing those of more slowly decomposing fractions including more processed litter. Incorporating these contrasting effects of N on decomposition processes into models is complicated by lingering uncertainties about how these effects generalize across ecosystems and substrates.

  3. Exergoeconomic performance optimization for a steady-flow endoreversible refrigeration model including six typical cycles

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lingen; Kan, Xuxian; Sun, Fengrui; Wu, Feng [College of Naval Architecture and Power, Naval University of Engineering, Wuhan 430033 (China)

    2013-07-01

    The operation of a universal steady-flow endoreversible refrigeration cycle model, consisting of a constant thermal-capacity heating branch, two constant thermal-capacity cooling branches and two adiabatic branches, is viewed as a production process with exergy as its output. The finite-time exergoeconomic performance optimization of the refrigeration cycle is investigated by taking the profit rate as the optimization objective. The relations between the profit rate and the temperature ratio of the working fluid, between the COP (coefficient of performance) and the temperature ratio of the working fluid, and the optimal relation between the profit rate and the COP of the cycle are derived. The focus of this paper is the compromise between economics (profit rate) and the utilization factor (COP) for endoreversible refrigeration cycles, found by searching for the optimal COP at maximum profit, which is termed the finite-time exergoeconomic performance bound. Moreover, performance analysis and optimization of the model are carried out in order to investigate the effect of the cycle process on the performance of the cycles using a numerical example. The results obtained herein include the performance characteristics of endoreversible Carnot, Diesel, Otto, Atkinson, Dual and Brayton refrigeration cycles.

  4. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is, the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrence of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an F_ST outlier method (the FDIST approach in arlequin) and compared their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
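
    The core of a genotype-environment association test of the kind samβada performs can be pictured as a logistic model of genotype presence against an environmental variable, compared with a constant-only model. The sketch below uses simulated data and statsmodels; it illustrates the idea only and is not the samβada implementation, which adds multivariate models, population-structure covariates and spatial indicators.

```python
# Hedged sketch of a single-locus genotype-environment association test:
# a logistic model of genotype presence/absence against an environmental
# variable is compared with a constant-only model via a likelihood-ratio test.
# Data are simulated; variable names and effect sizes are assumptions.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
env = rng.normal(size=n)                         # e.g. temperature at sampling sites
p = 1.0 / (1.0 + np.exp(-(0.3 + 1.2 * env)))     # locus truly associated with env
genotype = rng.binomial(1, p)

full = sm.Logit(genotype, sm.add_constant(env)).fit(disp=0)
null = sm.Logit(genotype, np.ones((n, 1))).fit(disp=0)

lr = 2.0 * (full.llf - null.llf)                 # likelihood-ratio statistic
p_value = stats.chi2.sf(lr, df=1)
print(f"G = {lr:.2f}, p = {p_value:.3g}")        # small p suggests a candidate locus
```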

  5. CFD simulations and reduced order modeling of a refrigerator compartment including radiation effects

    International Nuclear Information System (INIS)

    Bayer, Ozgur; Oskay, Ruknettin; Paksoy, Akin; Aradag, Selin

    2013-01-01

    Highlights: ► Free convection in a refrigerator is simulated including radiation effects. ► Heat rates are affected drastically when radiation effects are considered. ► 95% of the flow energy can be represented by using one spatial POD mode. - Abstract: Considering the engineering problem of natural convection in domestic refrigerator applications, this study aims to simulate the fluid flow and temperature distribution in a single commercial refrigerator compartment by using experimentally determined temperature values as the specified constant-wall-temperature boundary conditions. The free convection in refrigerator applications is treated as a three-dimensional (3D), turbulent, transient and coupled non-linear flow problem. The radiation heat transfer mode is also included in the analysis. According to the results, taking radiation effects into consideration does not change the temperature distribution inside the refrigerator significantly; however, the heat rates are affected drastically. The flow inside the compartment is further analyzed with a reduced order modeling method called Proper Orthogonal Decomposition (POD), and the energy contents of several spatial and temporal modes that exist in the flow are examined. The results show that approximately 95% of all the flow energy can be represented by using only one spatial mode
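
    The reduced order modeling step mentioned above can be summarized as a snapshot POD: stack flow-field snapshots into a matrix, subtract the mean, and take the SVD, with squared singular values giving the energy content per mode. The sketch below uses synthetic data; the matrix sizes and variable names are assumptions, not the refrigerator CFD fields.

```python
# Hedged sketch of snapshot Proper Orthogonal Decomposition (POD) via SVD,
# used to rank the energy content of flow modes. Snapshot data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_points, n_snapshots = 2000, 60
# Synthetic snapshots: one dominant spatial structure plus small fluctuations.
base = rng.normal(size=n_points)
snapshots = np.outer(base, rng.normal(size=n_snapshots)) \
            + 0.05 * rng.normal(size=(n_points, n_snapshots))

mean_field = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_field                   # POD is applied to the fluctuations

# Columns of U are spatial POD modes; singular values give the modal energy.
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)
print("energy captured by mode 1: %.1f%%" % (100 * energy[0]))
print("cumulative energy, first 3 modes: %.1f%%" % (100 * energy[:3].sum()))
```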

  6. A workflow for the 3D visualization of meteorological data

    Science.gov (United States)

    Helbig, Carolin; Rink, Karsten

    2014-05-01

    In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed, and big data sets are produced as a result of the simulations. Combining various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the Weather Research and Forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data, based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and to detect areas of interest such as regions of convection or wind turbulence. Then, subsets of the data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether the visualizations are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. In addition, they have

  7. Improved Screening Mammogram Workflow by Maximizing PACS Streamlining Capabilities in an Academic Breast Center.

    Science.gov (United States)

    Pham, Ramya; Forsberg, Daniel; Plecha, Donna

    2017-04-01

    The aim of this study was to perform an operational improvement project targeted at the breast imaging reading workflow for mammography examinations at an academic medical center with its associated breast centers and satellite sites. Through careful analysis of the current workflow, two major issues were identified: stockpiling of paperwork and multiple worklists. Both issues were considered to cause significant delays to the start of interpreting screening mammograms. Four workflow changes were suggested (scanning of paperwork, worklist consolidation, use of chat functionality, and tracking of case distribution among trainees) and implemented in July 2015. Timestamp data were collected 2 months before (May-Jun) and after (Aug-Sep) the implemented changes. Generalized linear models were used to analyze the data. The results showed significant improvements for the interpretation of screening mammograms. The average time elapsed to open a case was reduced from 70 to 28 min (60% decrease, p workflow for diagnostic mammograms at large unaltered even with an increased volume of mammography examinations (31% increase, from 4344 examinations in May-Jun to 5678 in Aug-Sep). In conclusion, targeted efforts to improve the breast imaging reading workflow for screening mammograms in a teaching environment provided significant performance improvements without affecting the workflow of diagnostic mammograms.

  8. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP based applications, targeted for clusters and next generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has not pr...

  9. Impact of workflow on the use of the Surgical Safety Checklist: a qualitative study.

    Science.gov (United States)

    Gillespie, Brigid M; Marshall, Andrea P; Gardiner, Therese; Lavin, Joanne; Withers, Teresa K

    2016-11-01

    Regardless of the benefits associated with the Surgical Safety Checklist, adherence across its three phases remains inconsistent. The aim of this study was to systematically identify issues around workflow that impact surgical teams' ability to use the Surgical Safety Checklist in a large tertiary facility in Queensland, Australia. An observational audit of 10 surgical teams and 33 semi-structured interviews with 70 participants from nursing, medicine and the community were conducted. Data were collected during 2014-2015. Inductive and deductive approaches were used to analyse field observations and interview transcripts. The domain 'impact of workflow on checklist utilization' was identified. Within this domain, seven categories illustrated the causal conditions that determined the ways in which workflow influenced checklist use. These categories included: 'busy doing the task'; 'clashing task priorities'; 'being pressured, running out of time'; 'adapting processes to work patterns'; 'doubling up on work'; 'a domino effect, leading to delays' and 'reality of the workflow'. One of the greatest systemic challenges to checklist use in surgery is workflow. Process changes in the way that surgical safety checklists are used need to incorporate the temporal demands of the workflow. Any changes made must ensure the process is reliable, is easily embedded into existing work routines and is not disruptive. © 2016 Royal Australasian College of Surgeons.

  10. Complexity metrics for workflow nets

    NARCIS (Netherlands)

    Lassen, K.B.; Aalst, van der W.M.P.

    2009-01-01

    Process modeling languages such as EPCs, BPMN, flow charts, UML activity diagrams, Petri nets, etc., are used to model business processes and to configure process-aware information systems. It is known that users have problems understanding these diagrams. In fact, even process engineers and system

  11. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    Process modeling languages such as EPCs, BPMN, flow charts, UML activity diagrams, Petri nets, etc. are used to model business processes and to configure process-aware information systems. It is known that users have problems understanding these diagrams. In fact, even process engineers and system...

  12. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business

  13. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  14. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  15. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  16. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  17. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be way beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  18. A Thermal Evolution Model of the Earth Including the Biosphere, Continental Growth and Mantle Hydration

    Science.gov (United States)

    Höning, D.; Spohn, T.

    2014-12-01

    By harvesting solar energy and converting it to chemical energy, photosynthetic life plays an important role in the energy budget of Earth [2]. This leads to alterations of chemical reservoirs that eventually affect the Earth's interior [4]. It has further been speculated [3] that the formation of continents may be a consequence of the evolution of life. A steady state model [1] suggests that the Earth without its biosphere would evolve to a steady state with smaller continent coverage and a dryer mantle than is observed today. We present a model including (i) parameterized thermal evolution, (ii) continental growth and destruction, and (iii) mantle water regassing and outgassing. The biosphere enhances the production rate of sediments, which eventually are subducted. These sediments are assumed to (i) carry water to depth, bound in stable mineral phases, and (ii) have the potential to suppress shallow dewatering of the underlying sediments and crust due to their low permeability. We run a Monte Carlo simulation for various initial conditions and treat as successful all parameter combinations that result in the fraction of continental crust coverage observed for present-day Earth. Finally, we simulate the evolution of an abiotic Earth using the same set of parameters but a reduced rate of continental weathering and erosion. Our results suggest that the origin and evolution of life could have stabilized the large continental surface area of the Earth and its wet mantle, leading to the relatively low mantle viscosity we observe at present. Without photosynthetic life on our planet, the Earth would be geodynamically less active due to a dryer mantle, and would have a smaller fraction of continental coverage than observed today. References: [1] Höning, D., Hansen-Goos, H., Airo, A., Spohn, T., 2014. Biotic vs. abiotic Earth: A model for mantle hydration and continental coverage. Planetary and Space Science 98, 5-13. [2] Kleidon, A., 2010. Life, hierarchy, and the

  19. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Full Text Available Abstract Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous
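
    To make the idea of re-usable workflow operators concrete, the sketch below expresses three of the operators named above (sequence, conditional, iteration) as small Python combinators. The function names and the toy pipeline are illustrative assumptions, not the GPIPE/PISE implementation, which defines pipelines in XML.

```python
# Hedged sketch of workflow operators (sequence, conditional, iteration)
# expressed as re-usable Python combinators; an illustration of the idea only.
from typing import Callable

Step = Callable[[object], object]

def sequence(*steps: Step) -> Step:
    """Run steps one after another, feeding each output to the next step."""
    def run(data):
        for step in steps:
            data = step(data)
        return data
    return run

def conditional(predicate: Callable[[object], bool], then: Step, otherwise: Step) -> Step:
    """Branch the workflow on a data-dependent condition."""
    return lambda data: then(data) if predicate(data) else otherwise(data)

def iterate(step: Step, until: Callable[[object], bool], max_iter: int = 100) -> Step:
    """Repeat a step until a stopping condition holds (or max_iter is hit)."""
    def run(data):
        for _ in range(max_iter):
            if until(data):
                break
            data = step(data)
        return data
    return run

# Toy usage: normalise sequences, branch on list size, then trim until small.
pipeline = sequence(
    lambda seqs: [s.upper() for s in seqs],                    # parse/normalise
    conditional(lambda seqs: len(seqs) > 1,
                lambda seqs: seqs + ["ALIGNED"],               # stand-in for an alignment step
                lambda seqs: seqs),
    iterate(lambda seqs: seqs[:-1], until=lambda seqs: len(seqs) <= 2),
)
print(pipeline(["acgt", "acga", "acgc"]))
```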

  20. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort.

    Science.gov (United States)

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A; Fells, James I; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

    The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article is focused exclusively on affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Using all the complex structures prepared in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking still remains very challenging; that the knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention are considerably important for ranking. Knowledge of the mode of action and protein flexibility along with visualization tools that depict polar and hydrophobic maps are very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high level blinded predictions.

  1. Workflows and performances in the ranking prediction of 2016 D3R Grand Challenge 2: lessons learned from a collaborative effort

    Science.gov (United States)

    Gao, Ying-Duo; Hu, Yuan; Crespo, Alejandro; Wang, Deping; Armacost, Kira A.; Fells, James I.; Fradera, Xavier; Wang, Hongwu; Wang, Huijun; Sherborne, Brad; Verras, Andreas; Peng, Zhengwei

    2018-01-01

    The 2016 D3R Grand Challenge 2 includes both pose and affinity or ranking predictions. This article is focused exclusively on affinity predictions submitted to the D3R challenge from a collaborative effort of the modeling and informatics group. Our submissions include ranking of 102 ligands covering 4 different chemotypes against the FXR ligand binding domain structure, and the relative binding affinity predictions of the two designated free energy subsets of 15 and 18 compounds. Using all the complex structures prepared in the same way allowed us to cover many types of workflows and compare their performances effectively. We evaluated typical workflows used in our daily structure-based design modeling support, which include docking scores, force field-based scores, QM/MM, MMGBSA, MD-MMGBSA, and MacroModel interaction energy estimations. The best performing methods for the two free energy subsets are discussed. Our results suggest that affinity ranking still remains very challenging; that the knowledge of more structural information does not necessarily yield more accurate predictions; and that visual inspection and human intervention are considerably important for ranking. Knowledge of the mode of action and protein flexibility along with visualization tools that depict polar and hydrophobic maps are very useful for visual inspection. QM/MM-based workflows were found to be powerful in affinity ranking and are encouraged to be applied more often. The standardized input and output enable systematic analysis and support methodology development and improvement for high level blinded predictions.

  2. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    Science.gov (United States)

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk (1), the daily post-treatment DIPS data were analyzed to determine whether an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logic, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
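
    A possible reading of the adaptive-replan check described above is sketched below: daily post-treatment correction vectors are aggregated per patient and compared with the planned margin using a van Herk-style recipe (M = 2.5*Sigma + 0.7*sigma). The per-patient use of that recipe, the threshold and the data are assumptions for illustration, not the clinic's actual decision rule.

```python
# Hedged sketch of an adaptive-replan flag driven by daily post-treatment
# correction vectors. Sigma/sigma here are per-patient proxies (mean and
# spread of the shifts); thresholds and data are illustrative only.
import numpy as np

def adaptive_replan_needed(daily_shifts_mm, planned_margin_mm=5.0):
    """daily_shifts_mm: (n_fractions, 3) post-treatment shifts in mm (L-R, S-I, A-P)."""
    shifts = np.asarray(daily_shifts_mm, dtype=float)
    systematic = shifts.mean(axis=0)          # per-axis mean displacement (Sigma proxy)
    random_sd = shifts.std(axis=0, ddof=1)    # per-axis spread (sigma proxy)
    required = 2.5 * np.abs(systematic) + 0.7 * random_sd
    return bool(np.any(required > planned_margin_mm)), required

shifts = [[1.2, -0.5, 2.1], [1.8, -0.2, 2.6], [2.0, 0.1, 3.0], [1.5, -0.4, 2.4]]
flag, required = adaptive_replan_needed(shifts)
print("re-plan suggested:", flag, "required margin per axis (mm):", required.round(2))
```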

  3. Performing Workflows in Pervasive Environments Based on Context Specifications

    OpenAIRE

    Xiping Liu; Jianxin Chen

    2010-01-01

    Workflow performance consists of the performance of activities and of the transitions between activities. With the fast development of varied computing devices, activities in workflows and transitions between activities can be performed in pervasive ways, which means that workflow performance needs to migrate from traditional computing environments to pervasive environments. Performing workflows in pervasive environments requires taking account of the context information which affects b...

  4. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings the skills of computer scientists and engineers together to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  5. The impact of missing sensor information on surgical workflow management.

    Science.gov (United States)

    Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas

    2013-09-01

    Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. Changes in SWFMS behavior caused by wrong or missing data from the sensor system that tracks the progress of a surgical intervention were tested. The individual surgical process models (iSPMs) from 100 different cataract procedures by 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps needed to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that using a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.

  6. A sub-circuit MOSFET model with a wide temperature range including cryogenic temperature

    Energy Technology Data Exchange (ETDEWEB)

    Jia Kan; Sun Weifeng; Shi Longxing, E-mail: jiakan.01@gmail.com [National ASIC System Engineering Research Center, Southeast University, Nanjing 210096 (China)

    2011-06-15

    A sub-circuit SPICE model of a MOSFET for low temperature operation is presented. Two resistors are introduced for the freeze-out effect, and the explicit behavioral models are developed for them. The model can be used in a wide temperature range covering both cryogenic temperature and regular temperatures. (semiconductor devices)

  7. Including Overweight or Obese Students in Physical Education: A Social Ecological Constraint Model

    Science.gov (United States)

    Li, Weidong; Rukavina, Paul

    2012-01-01

    In this review, we propose a social ecological constraint model to study inclusion of overweight or obese students in physical education by integrating key concepts and assumptions from ecological constraint theory in motor development and social ecological models in health promotion and behavior. The social ecological constraint model proposes…

  8. WS-VLAM: A GT4 based workflow management system

    NARCIS (Netherlands)

    Wibisono, A.; Vasyunin, D.; Korkhov, V.; Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.

    2007-01-01

    Generic Grid middleware, e.g., Globus Toolkit 4 (GT4), provides basic services for scientific workflow management systems to discover, store and integrate workflow components. Using the state of the art Grid services can advance the functionality of workflow engine in orchestrating distributed Grid

  9. Optimal resource assignment in workflows for maximizing cooperation

    NARCIS (Netherlands)

    Kumar, Akhil; Dijkman, R.M.; Song, Minseok; Daniel, Fl.; Wang, J.; Weber, B.

    2013-01-01

    A workflow is a team process since many actors work on various tasks to complete an instance. Resource management in such workflows deals with assignment of tasks to workers or actors. In team formation, it is necessary to ensure that members of a team are compatible with each other. When a workflow

  10. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B.; Sánchez, G.

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  11. LHCb: LHCbDirac is a DIRAC extension to support LHCb specific workflows

    CERN Multimedia

    Stagni, Federico

    2012-01-01

    We present LHCbDIRAC, an extension of the DIRAC community Grid solution to handle the LHCb specificities. The DIRAC software was developed for many years within LHCb only; nowadays it is generic software, used by many scientific communities worldwide. Each community wanting to take advantage of DIRAC has to develop an extension containing all the necessary code for handling their specific cases. LHCbDIRAC is an actively developed extension, implementing the LHCb computing model and workflows. LHCbDIRAC extends DIRAC to handle all the distributed computing activities of LHCb. Such activities include real data processing (reconstruction, stripping and streaming), Monte-Carlo simulation and data replication. Other activities are group and user analysis, data management, resources management and monitoring, data provenance, and accounting for user and production jobs. LHCbDIRAC also provides extensions of the DIRAC interfaces, including a secure web client, python APIs and CLIs. While DIRAC and LHCbDIRAC f...

  12. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    International Nuclear Information System (INIS)

    Ambrosini, W.; Pucciarelli, A.; Borroni, I.

    2015-01-01

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction in the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in literature are also performed. Further analyses provided promising results concerning the ability of the model in reproducing the trend of friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting

  13. Measurement network design including traveltime determinations to minimize model prediction uncertainty

    NARCIS (Netherlands)

    Janssen, G.M.C.M.; Valstar, J.R.; Zee, van der S.E.A.T.M.

    2008-01-01

    Traveltime determinations have found increasing application in the characterization of groundwater systems. No algorithms are available, however, to optimally design sampling strategies including this information type. We propose a first-order methodology to include groundwater age or tracer arrival

  14. Validation on groundwater flow model including sea level change. Modeling on groundwater flow in coastal granite area

    International Nuclear Information System (INIS)

    Hasegawa, Takuma; Miyakawa, Kimio

    2009-01-01

    It is important to verify that a groundwater flow model reproduces pressure head, water chemistry, and groundwater age. However, water chemistry and groundwater age are considered to be influenced by historical events. In this study, sea level change during the glacial-interglacial cycle was taken into account for simulating salinity and groundwater age in a coastal granite area. As a result of the simulation, salinity movement could not keep up with the sea level changes, and a mixing zone formed below the fresh-water zone. This mixing zone was observed in the field measurements, and the observed salinities agreed with the simulated results that included sea level change. The simulated residence time including sea level change is one-tenth of that at steady state. The reason is that the saline water was washed out during regression and modern seawater infiltrated during transgression. As mentioned above, considering sea level change is important for reproducing salinity and helium age in coastal areas. (author)

  15. Integration of implant planning workflows into the PACS infrastructure

    Science.gov (United States)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific s